Design an A/B Testing Platform
Company: Affirm
Role: Software Engineer
Category: System Design
Difficulty: hard
Interview Round: Onsite
Design an internal A/B testing platform for a large consumer product.
The platform should let product managers and engineers create experiments, define variants, choose targeting rules, set rollout percentages, and start, stop, or ramp experiments safely. For each incoming user request, the system should deterministically assign the user to a variant and keep the assignment sticky across sessions. The platform must log exposures and collect downstream events so teams can compute business metrics such as click-through rate, conversion, retention, and revenue.
Discuss:
- Core requirements and APIs
- Experiment configuration and versioning
- Traffic allocation, bucketing, and mutual exclusion
- Exposure logging and event ingestion
- Metric computation and reporting
- Real-time monitoring, ramp-up, rollback, and guardrails
- Statistical correctness concerns such as sample ratio mismatch, bot filtering, and avoiding double counting
- Scalability, reliability, and data freshness trade-offs
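One concrete guardrail worth sketching is the sample ratio mismatch (SRM) check from the statistical-correctness bullet: compare observed exposure counts per variant against the configured split with a chi-square goodness-of-fit test. A minimal stdlib-only sketch (the function name and the hard-coded critical value are illustrative assumptions):

```python
def srm_check(observed: dict[str, int],
              expected_ratio: dict[str, float],
              critical_value: float = 3.841) -> bool:
    """Chi-square goodness-of-fit test for sample ratio mismatch (SRM).

    `observed` maps variant -> exposure count; `expected_ratio` maps
    variant -> configured traffic share (summing to 1.0). Returns True if
    the observed split is consistent with the configuration, False if the
    mismatch is statistically significant. The default critical value
    3.841 corresponds to p = 0.05 with one degree of freedom (a
    two-variant experiment); use a larger value for more variants.
    """
    total = sum(observed.values())
    stat = 0.0
    for variant, ratio in expected_ratio.items():
        expected = total * ratio
        stat += (observed.get(variant, 0) - expected) ** 2 / expected
    return stat < critical_value
```

A failing SRM check usually signals a bucketing, logging, or bot-filtering bug rather than a real effect, so platforms typically block metric readouts on it rather than report results.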
Quick Answer: This question evaluates a candidate's ability to design a scalable, reliable experimentation and analytics platform. It tests competencies in distributed systems architecture, deterministic user allocation and stickiness, exposure logging and event ingestion, metric computation, and statistical validity for A/B testing.