Design a content moderation system for a large-scale product where users can submit content (e.g., text posts, images, videos, comments, or messages).
Your design should cover:
- What is being moderated: user-generated content before/after publishing.
- Policy outcomes: allow, block, label, downrank, age-gate, or send to human review.
- Scale/latency goals: handle high QPS; some content needs near-real-time decisions.
- Abuse resistance: adversarial users trying to evade detection.
- Human-in-the-loop: reviewer tooling, queues, SLAs, and escalation.
- Auditing/appeals: decision traceability and user appeals.
Provide an end-to-end architecture and justify key trade-offs (precision/recall, latency vs. quality, cost, and reliability).
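To make the policy-outcome and precision/recall framing concrete, here is a minimal decision-layer sketch. All names and thresholds (`block_at`, `review_at`, `downrank_at`) are illustrative assumptions for discussion, not a prescribed answer:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Outcome(Enum):
    """The policy outcomes the design must support."""
    ALLOW = auto()
    BLOCK = auto()
    LABEL = auto()
    DOWNRANK = auto()
    AGE_GATE = auto()
    HUMAN_REVIEW = auto()

@dataclass
class Verdict:
    outcome: Outcome
    score: float   # classifier risk score in [0, 1]
    reason: str    # reason code, needed for auditing and appeals

def decide(score: float, *, block_at: float = 0.95,
           review_at: float = 0.70, downrank_at: float = 0.40) -> Verdict:
    """Map a risk score to an outcome (thresholds are illustrative):
    automate only at high confidence, escalate the gray zone to humans."""
    if score >= block_at:
        return Verdict(Outcome.BLOCK, score, "auto_block_high_confidence")
    if score >= review_at:
        return Verdict(Outcome.HUMAN_REVIEW, score, "gray_zone_escalation")
    if score >= downrank_at:
        return Verdict(Outcome.DOWNRANK, score, "reduced_distribution")
    return Verdict(Outcome.ALLOW, score, "below_all_thresholds")
```

Raising `block_at` trades recall for precision (fewer wrongful blocks, more missed abuse), while widening the `review_at`..`block_at` band trades reviewer cost for decision quality; for example, `decide(0.8)` lands in the gray zone and returns `Outcome.HUMAN_REVIEW`.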