Without a moderation log, enforcement becomes arbitrary and unauditable. ISO 27001:2022 A.5.35 requires that organizations document accountability measures — a moderation log is the primary accountability artifact for content enforcement. Without it, there is no way to detect bias (one moderator removing content the others would approve), no way to identify moderator error rates, and no defensible record if a user challenges a moderation decision in court or through a DSA complaint. Platforms with no moderation audit trail also cannot train new moderators, because there is no historical record of what decisions were made and why.
Severity is low because absent logging degrades moderation accountability and consistency but does not directly expose user data or enable security attacks.
Create a moderation_logs table and write a record for every enforcement action. Add the logging helper to src/lib/moderationLog.ts:

import { db } from './db'; // your database client

export async function logModerationAction(params: {
  action: 'remove' | 'approve' | 'ban' | 'warn' | 'restore';
  contentId: string;
  reason: string;
  moderatorId: string;
  metadata?: Record<string, unknown>;
}) {
  await db.moderationLogs.create({
    ...params,
    createdAt: new Date(),
  });
}
A log entry must capture at minimum: action, reason, moderatorId, and timestamp. A log that records only the action but omits the reason does not satisfy this check. Call logModerationAction from every moderation endpoint before returning a response.
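A minimal sketch of wiring the helper into an endpoint. The in-memory `db` stub, the `removeContent` handler, and the entry type are hypothetical stand-ins so the sketch runs without infrastructure; the point is that the log write happens inside the enforcement path, not as an afterthought.

```typescript
// Hypothetical in-memory stand-in for the real database client.
type ModerationLogEntry = {
  action: 'remove' | 'approve' | 'ban' | 'warn' | 'restore';
  contentId: string;
  reason: string;
  moderatorId: string;
  createdAt: Date;
};

const logs: ModerationLogEntry[] = [];

const db = {
  moderationLogs: {
    async create(entry: ModerationLogEntry): Promise<void> {
      logs.push(entry);
    },
  },
};

async function logModerationAction(
  params: Omit<ModerationLogEntry, 'createdAt'>
): Promise<void> {
  await db.moderationLogs.create({ ...params, createdAt: new Date() });
}

// Example endpoint body: log the action before returning, so every
// enforcement decision leaves an audit record even if the response
// is never sent.
async function removeContent(contentId: string, reason: string, moderatorId: string) {
  // ...perform the actual removal here...
  await logModerationAction({ action: 'remove', contentId, reason, moderatorId });
  return { ok: true };
}
```

Logging before the response (rather than fire-and-forget after it) keeps the audit trail complete even when the request fails mid-flight.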
ID: community-moderation-safety.policy-transparency.consistency-fairness
Severity: low
What to look for: Check if there's a moderation log documenting decisions (reasons, outcomes, moderators). Look for patterns: are similar violations treated the same way? Is there bias visible in enforcement? Check if multiple moderators make decisions or if one person controls everything.
Pass criteria: A moderation log exists documenting decisions with at least 4 fields: action, reason, moderator ID, and timestamp. Count the number of fields tracked per moderation action. Similar violations are treated consistently.
Fail criteria: No moderation log. Decisions appear arbitrary or inconsistent. One person makes all decisions with no oversight. A log that records only the action but not the reason does not count as pass.
Skip (N/A) when: Platform has only 1 moderator or fewer than 100 active users.
Detail on fail: "No moderation log. Decisions appear arbitrary. It's unclear why some content was deleted and other similar content was allowed."
Remediation: Maintain a moderation log tracking all actions at src/lib/moderationLog.ts:
async function logModerationAction({ action, contentId, reason, moderatorId }) {
  await db.moderationLogs.create({
    action, // 'remove', 'approve', 'ban', 'warn', or 'restore'
    contentId,
    reason,
    moderatorId,
    createdAt: new Date(),
  });
}
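The bias check described under "What to look for" can be automated as a simple aggregation over the log. This is an illustrative sketch: the `LogRow` shape mirrors the schema above, and the `removalRates` helper is a hypothetical name, not part of any existing API.

```typescript
type LogRow = { action: string; moderatorId: string };

// Per-moderator removal rate over the log. A wide spread between
// moderators handling similar volumes is a signal of inconsistent
// enforcement worth a human review.
function removalRates(rows: LogRow[]): Map<string, number> {
  const totals = new Map<string, { removes: number; all: number }>();
  for (const row of rows) {
    const t = totals.get(row.moderatorId) ?? { removes: 0, all: 0 };
    t.all += 1;
    if (row.action === 'remove') t.removes += 1;
    totals.set(row.moderatorId, t);
  }
  const rates = new Map<string, number>();
  for (const [moderatorId, t] of totals) {
    rates.set(moderatorId, t.removes / t.all);
  }
  return rates;
}
```

Grouping additionally by violation category (when the log records one) makes the comparison fairer, since moderators assigned different queues see different base rates.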