Collecting reports without a tool to review them is operationally equivalent to collecting no reports at all. Under DSA Art. 16, platforms must process notices of reported content in a timely, diligent manner — storing reports in a database that only a developer can query via SQL satisfies neither the letter nor the spirit of that requirement. Without a moderation interface, backlogs grow unbounded, harmful content stays visible indefinitely, and moderators cannot triage by severity or content type. This is the enforcement gap between detection and action.
Critical because without a reviewable queue, collected reports are inert — no moderator can act on them, so the reporting system provides no actual safety improvement.
Build a moderation dashboard at src/app/admin/moderation/page.tsx that surfaces pending reports with full context and at least two action buttons (approve, remove). Minimum viable interface:
// Server Components cannot attach onClick handlers; each action is
// therefore a <form> bound to a Server Action (approve / remove are
// assumed to be functions declared elsewhere with 'use server').
export default async function ModerationQueue() {
  const reports = await db.reports.findMany({
    where: { status: 'pending' },
    orderBy: { createdAt: 'asc' },
    include: { content: true },
  });
  return (
    <table>
      <thead>
        <tr><th>Content</th><th>Reason</th><th>Reported At</th><th>Actions</th></tr>
      </thead>
      <tbody>
        {reports.map((r) => (
          <tr key={r.id}>
            <td>{r.content?.preview}</td>
            <td>{r.reason}</td>
            <td>{r.createdAt.toISOString()}</td>
            <td>
              <form action={approve.bind(null, r.id)}>
                <button type="submit">Approve</button>
              </form>
              <form action={remove.bind(null, r.id)}>
                <button type="submit">Remove</button>
              </form>
            </td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
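The approve/remove actions above reduce to a simple state transition, sketched here as a pure function. The types and the `hideContent` flag are illustrative assumptions, not the app's actual schema; real Server Actions would wrap this logic in database updates.

```typescript
// Illustrative shapes only; the real schema lives in the app's db layer.
type Report = { id: string; status: 'pending' | 'resolved'; contentId: string };
type Decision = 'approve' | 'remove';

// Every decision resolves the report; only 'remove' also hides the content.
function resolveReport(report: Report, decision: Decision) {
  return {
    report: { ...report, status: 'resolved' as const },
    hideContent: decision === 'remove',
  };
}
```

Keeping the transition pure makes it easy to unit-test the moderation rules separately from the database layer.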
Gate the route with moderator-role middleware; a raw DB view without action buttons does not satisfy this check.
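The role gate can be sketched as a pure check that middleware applies before the page renders. The session shape and the role names ('moderator', 'admin') are assumptions about the app's auth layer, not a prescribed design.

```typescript
// Assumed session shape; the real one comes from the app's auth library.
type Session = { userId: string; role: string } | null;

// True only for logged-in users holding a moderation-capable role.
function canAccessModeration(session: Session): boolean {
  return session !== null && (session.role === 'moderator' || session.role === 'admin');
}
```

In a Next.js app this check would typically live in middleware.ts with `export const config = { matcher: ['/admin/:path*'] }`, redirecting unauthorized users to a login page before any admin route is served.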
ID: community-moderation-safety.report-enforcement.moderation-queue
Severity: critical
What to look for: Look for an admin/moderator dashboard or interface. Check if there's a moderation queue showing reported content, automated flags, or pending review items. Verify that moderators can view, approve, or take action on reported content.
Pass criteria: A moderation interface exists accessible to moderators/admins. It displays reported content, reasons for reports, and options to take at least 2 distinct actions (e.g., delete, hide, approve, escalate). Quote the actual route or component path of the moderation interface. Enumerate all action types available in the queue and verify it shows pending items with timestamps.
Fail criteria: No moderator interface or queue exists. Reports are collected but not reviewed. A raw database view without action buttons does not count as pass.
Skip (N/A) when: Never — moderation is essential.
Detail on fail: "Reports are stored in the database but there's no admin interface to review them. Moderators have no way to see or act on reports."
Cross-reference: Compare with community-moderation-safety.report-enforcement.user-reporting — reports must be both collected (user-reporting) and reviewable (this check).
Remediation: Build a moderation dashboard at src/app/admin/moderation/page.tsx that displays pending reports alongside approve and remove actions, for example:
'use client';
// onClick handlers require a Client Component; usePendingReports, approve,
// and remove are assumed app-level helpers (data hook and mutation calls).
export default function ModerationQueue() {
  const reports = usePendingReports();
  return (
    <table>
      <thead>
        <tr><th>Content</th><th>Reporter</th><th>Reason</th><th>Actions</th></tr>
      </thead>
      <tbody>
        {reports.map((r) => (
          <tr key={r.id}>
            <td>{r.contentPreview}</td>
            <td>{r.reporterName}</td>
            <td>{r.reason}</td>
            <td>
              <button onClick={() => approve(r.id)}>Approve</button>
              <button onClick={() => remove(r.id)}>Remove</button>
            </td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}