Without a documented content policy, users cannot know what the platform prohibits, moderators have no authoritative standard to apply consistently, and the platform has no defensible basis for enforcement actions. DSA Art. 14 requires platforms to publish their terms and conditions, including any content restrictions, in clear and unambiguous language. For platforms directed to children, COPPA (16 CFR Part 312) further requires clear, conspicuous disclosure of the operator's practices. A policy vague enough to read as 'be nice' cannot support consistent enforcement and exposes the platform to claims of arbitrary or biased moderation, which is both a reputational and a legal liability.
Severity is high because the absence of a documented content policy creates a DSA compliance gap and removes the legal basis for consistent moderation enforcement.
Create a public content policy page at src/app/community-guidelines/page.tsx that explicitly names the prohibited content categories and the enforcement action taken when each is violated. Document the enforcement tiers: warning, content removal, temporary suspension, permanent ban. Link the page from the signup flow, the site footer, and every moderation notification email. A generic 'be nice' statement with no enumerated categories does not satisfy this check.
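One way to keep the page and the enforcement logic consistent is to define the policy as typed data that the guidelines page renders. The sketch below is a minimal, hypothetical module (e.g. alongside src/app/community-guidelines/page.tsx); the category names and file layout are illustrative assumptions, not the platform's actual policy.

```typescript
// Hypothetical policy data module. Category entries below are
// illustrative placeholders; the real platform must enumerate its own.

export type EnforcementTier =
  | "warning"
  | "content-removal"
  | "temporary-suspension"
  | "permanent-ban";

export interface PolicyCategory {
  id: string;
  title: string;
  description: string;          // what the category prohibits, in plain language
  firstAction: EnforcementTier; // action on a first confirmed violation
}

// The four documented tiers, in escalating order.
export const enforcementTiers: EnforcementTier[] = [
  "warning",
  "content-removal",
  "temporary-suspension",
  "permanent-ban",
];

export const policyCategories: PolicyCategory[] = [
  {
    id: "harassment", // example category, not an exhaustive list
    title: "Harassment and bullying",
    description: "Targeted abuse, threats, or intimidation of another user.",
    firstAction: "content-removal",
  },
  {
    id: "spam", // example category, not an exhaustive list
    title: "Spam and deceptive content",
    description: "Bulk unsolicited content, scams, or misleading links.",
    firstAction: "warning",
  },
];
```

Rendering the page from this data means the published guidelines, the moderation notification emails, and any automated enforcement all read the same source of truth, so the documented tiers cannot drift from what moderators actually apply.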
Check: community-moderation-safety.policy-transparency.policy-documented
Severity: high
Failure message: "No content policy page. Users don't know what's prohibited or what happens if they violate rules."
Passing evidence: src/app/community-guidelines/page.tsx documenting prohibited content categories and enforcement actions.