Design a content moderation system
Company: TikTok
Role: Software Engineer
Category: System Design
Difficulty: medium
Interview Round: Technical Screen
Design a **content moderation system** for a large-scale product where users can submit content (e.g., text posts, images, videos, comments, or messages).
Your design should cover:
- **What is being moderated:** user-generated content before/after publishing.
- **Policy outcomes:** allow, block, label, downrank, age-gate, or send to human review.
- **Scale/latency goals:** handle high QPS; some content needs near-real-time decisions.
- **Abuse resistance:** adversarial users trying to evade detection.
- **Human-in-the-loop:** reviewer tooling, queues, SLAs, and escalation.
- **Auditing/appeals:** traceability and user appeals.
Provide an end-to-end architecture and justify key trade-offs (precision/recall, latency vs. quality, cost, and reliability).
Quick Answer: Use a tiered pipeline. On submission, run cheap synchronous checks (hash matching against known-bad content, lightweight text/image classifiers) to allow or block obvious cases within the latency budget, and enqueue everything else for deeper asynchronous ML analysis. A policy engine maps model scores and policy rules to outcomes (allow, block, label, downrank, age-gate, human review); uncertain or appealed items flow into prioritized human-review queues with SLAs and escalation paths, and every decision is logged immutably for auditing and appeals. Tune thresholds per harm category to trade precision against recall, and make the synchronous path degrade gracefully (fail open or closed per policy) under load.
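As a minimal illustration of the policy-engine step, the sketch below maps per-category classifier scores to the outcomes listed in the prompt. The category names, thresholds, and helper names are assumptions for the example, not part of the question; a real system would load per-policy thresholds from configuration.

```python
from enum import Enum

class Outcome(Enum):
    ALLOW = "allow"
    BLOCK = "block"
    LABEL = "label"
    DOWNRANK = "downrank"
    AGE_GATE = "age_gate"
    HUMAN_REVIEW = "human_review"

# Hypothetical per-category thresholds: block only at high confidence
# (favoring precision), and route the uncertain middle band to human
# review (favoring recall). Each entry: (block_above, review_above, soft_action).
THRESHOLDS = {
    "violence": (0.95, 0.70, Outcome.AGE_GATE),
    "spam":     (0.90, 0.60, Outcome.DOWNRANK),
    "misinfo":  (0.98, 0.75, Outcome.LABEL),
}

def decide(scores: dict[str, float]) -> Outcome:
    """Return the most severe outcome across all harm categories."""
    outcomes = [Outcome.ALLOW]
    for category, score in scores.items():
        block_t, review_t, soft = THRESHOLDS[category]
        if score >= block_t:
            return Outcome.BLOCK           # clear violation: stop immediately
        if score >= review_t:
            outcomes.append(Outcome.HUMAN_REVIEW)
        elif score >= review_t - 0.2:
            outcomes.append(soft)          # borderline: apply a soft action
    # Severity ordering for combining outcomes across categories.
    order = [Outcome.ALLOW, Outcome.LABEL, Outcome.DOWNRANK,
             Outcome.AGE_GATE, Outcome.HUMAN_REVIEW, Outcome.BLOCK]
    return max(outcomes, key=order.index)
```

In an interview, this threshold table is a natural place to discuss the precision/recall trade-off: raising `block_above` reduces false positives at the cost of more human-review load.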