This question evaluates proficiency in end-to-end latency analysis, observability and instrumentation (client and server), distributed tracing, performance engineering, and SLI/SLO definition.
A user request flows through: Browser → CDN → Load Balancer → API Gateway → Microservices → Caches/Databases → Third‑party services → Response to Browser. Your task is to analyze and reduce end‑to‑end latency without harming reliability or throughput.
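A useful first step for a flow like this is to attribute latency to each hop and rank hops by their share of the total, so optimization effort goes where it pays off. The sketch below does that; the hop names and millisecond values are illustrative assumptions, as would be reconstructed from trace spans in a real system:

```typescript
// Hypothetical per-hop latency samples (ms) for one request; all names
// and numbers are illustrative, not measured values.
interface Hop { name: string; ms: number; }

const hops: Hop[] = [
  { name: "Browser→CDN (TLS + TTFB)", ms: 40 },
  { name: "CDN→Load Balancer", ms: 5 },
  { name: "LB→API Gateway", ms: 3 },
  { name: "Gateway→Microservices", ms: 120 },
  { name: "Caches/Databases", ms: 35 },
  { name: "Third-party services", ms: 200 },
  { name: "Response→Browser", ms: 30 },
];

// Rank hops by their share of total latency (a simple latency budget).
function latencyBudget(hops: Hop[]): { name: string; ms: number; pct: number }[] {
  const total = hops.reduce((sum, h) => sum + h.ms, 0);
  return [...hops]
    .sort((a, b) => b.ms - a.ms)
    .map(h => ({ name: h.name, ms: h.ms, pct: +(100 * h.ms / total).toFixed(1) }));
}
```

With these sample numbers the third-party call dominates, which suggests tactics like caching, timeouts with fallbacks, or moving the call off the critical path.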
Assume modern browsers (Navigation/Resource/Long Tasks APIs), HTTP/2+, and backend microservices where you can add distributed tracing and metrics.
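On the client, the Navigation Timing API exposes the timestamps needed to split a page load into phases (DNS, connect, time to first byte, download, processing). A minimal sketch of that decomposition follows; in a browser the entry would come from `performance.getEntriesByType("navigation")[0]`, but here an illustrative sample object is used so the sketch runs anywhere:

```typescript
// Subset of PerformanceNavigationTiming fields (ms since navigation start).
interface NavTiming {
  startTime: number;
  domainLookupStart: number; domainLookupEnd: number;
  connectStart: number; connectEnd: number;
  requestStart: number; responseStart: number; responseEnd: number;
  loadEventEnd: number;
}

// Break a navigation into the phases that matter for latency analysis.
function phases(t: NavTiming) {
  return {
    dns: t.domainLookupEnd - t.domainLookupStart,
    connect: t.connectEnd - t.connectStart,     // TCP + TLS handshake
    ttfb: t.responseStart - t.requestStart,     // network + server time
    download: t.responseEnd - t.responseStart,
    processing: t.loadEventEnd - t.responseEnd, // parse, script, layout
    total: t.loadEventEnd - t.startTime,
  };
}

// Illustrative sample values standing in for a real navigation entry.
const sample: NavTiming = {
  startTime: 0,
  domainLookupStart: 5, domainLookupEnd: 25,
  connectStart: 25, connectEnd: 80,
  requestStart: 80, responseStart: 260, responseEnd: 310,
  loadEventEnd: 700,
};
```

The same idea extends to subresources via the Resource Timing API, and to main-thread stalls via a `PerformanceObserver` on Long Tasks entries.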