GDPR Art. 20 (data portability) and CCPA §1798.100 both require that users receive a complete, machine-readable copy of their data upon request, within defined timeframes. An export that omits message history, follower lists, or activity logs is a partial response that fails compliance. Regulators specifically audit export completeness — the ICO and CNIL have both taken enforcement action against platforms that provided incomplete or unprocessable archives. Beyond compliance, denying users access to their own data creates lock-in and erodes trust.
High because incomplete or missing data export is a direct violation of GDPR Art. 20 and CCPA §1798.100, with documented regulatory enforcement history.
Build a background job that assembles all user data categories — profile, posts, comments, messages (participant's side only), followers, following, settings, activity logs — into a JSON or ZIP archive, then stores a signed download URL and notifies the user by email. Track export jobs in a data_export_jobs table to enforce a 72-hour internal SLA (comfortably within the one-month statutory deadline of GDPR Art. 12(3)).
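The job-tracking table can be sketched as a TypeScript record type plus a small helper that computes the SLA deadline. The field names here are assumptions chosen to match the queries below, not a fixed schema:

```typescript
// Sketch of a data_export_jobs row; field names are illustrative
// and should be adapted to the actual schema.
type ExportJobStatus = 'pending' | 'processing' | 'completed' | 'failed';

interface DataExportJob {
  id: string;
  userId: string;
  status: ExportJobStatus;
  requestedAt: Date;
  completedAt: Date | null;
  downloadUrl: string | null;
}

// The 72-hour internal SLA, measured from the request timestamp.
const SLA_HOURS = 72;

function slaDeadline(requestedAt: Date): Date {
  return new Date(requestedAt.getTime() + SLA_HOURS * 60 * 60 * 1000);
}

// A job is overdue if it has not completed by its SLA deadline.
function isOverdue(job: DataExportJob, now: Date): boolean {
  return job.status !== 'completed' && now.getTime() > slaDeadline(job.requestedAt).getTime();
}
```

A scheduled task can run `isOverdue` over pending jobs to alert operators before the SLA is breached.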
// src/jobs/generateDataExport.ts
async function generateDataExport(jobId: string) {
  const job = await db.dataExportJob.findUnique({ where: { id: jobId } });
  if (!job) throw new Error(`Export job ${jobId} not found`);
  await db.dataExportJob.update({ where: { id: jobId }, data: { status: 'processing' } });

  // Gather data categories in parallel; messages cover both sides of
  // conversations the user participated in.
  const [profile, posts, messages, followers] = await Promise.all([
    db.user.findUnique({ where: { id: job.userId } }),
    db.post.findMany({ where: { authorId: job.userId } }),
    db.message.findMany({ where: { OR: [{ senderId: job.userId }, { recipientId: job.userId }] } }),
    db.follow.findMany({ where: { followingId: job.userId } }), // rows for users who follow this user
  ]);

  const archive = { profile, posts, messages, followers, exportedAt: new Date().toISOString() };
  const zipPath = await createZip(archive);
  await db.dataExportJob.update({
    where: { id: jobId },
    data: { status: 'completed', completedAt: new Date(), downloadUrl: generateSignedUrl(zipPath) },
  });
}
ID: community-privacy-controls.data-rights.data-export-completeness
Severity: high
What to look for: Enumerate every relevant item. Examine data export functionality. Check API endpoints or UI that initiate export requests. Verify the system generates an archive (JSON, CSV, ZIP, or similar) containing the user's data. Check the database schema for export job tracking. Verify the export window: the 72-hour target is an internal SLA (GDPR Art. 12(3) allows up to one month, so 72 hours is a stricter self-imposed bound).
Pass criteria: All of the following conditions are met. Data export endpoint exists. Export returns a complete archive including profile data, posts, comments, followers/following, messages (participant's side), settings, and activity logs. Archive is machine-readable (JSON or CSV). Export completes within 72 hours and can be downloaded or emailed.
Fail criteria: Export endpoint missing or returns incomplete data. Export takes longer than 72 hours. Archive format is not machine-readable (e.g., HTML, screenshot of data). Some data categories are omitted.
Skip (N/A) when: Never — data portability is a requirement.
Detail on fail: Describe what's missing. Example: "Export endpoint exists but only includes profile and posts. Messages, followers list, and activity logs are excluded." or "Export generation takes 5+ days; no 72-hour SLA documented."
Remediation: Implement comprehensive data export with a 72-hour SLA:
// Trigger export request
async function requestDataExport(userId: string) {
  const job = await db.dataExportJob.create({
    data: {
      userId,
      status: 'pending',
      requestedAt: new Date(),
    },
  });
  // Queue the background job so the HTTP request returns immediately
  await exportQueue.add('generate-data-export', { jobId: job.id });
  return job;
}
// Background job — model names below are illustrative; adjust to your schema
async function generateDataExport(jobId: string) {
  const job = await db.dataExportJob.findUnique({ where: { id: jobId } });
  if (!job) throw new Error(`Export job ${jobId} not found`);
  const userId = job.userId;

  const user = await db.user.findUnique({ where: { id: userId } });
  const posts = await db.post.findMany({ where: { authorId: userId } });
  const comments = await db.comment.findMany({ where: { authorId: userId } });
  const followers = await db.user.findMany({
    where: { following: { some: { followingId: userId } } },
  });
  const following = await db.follow.findMany({ where: { followerId: userId } });
  const messages = await db.message.findMany({
    where: { OR: [{ senderId: userId }, { recipientId: userId }] },
  });
  const activityLogs = await db.activityLog.findMany({ where: { userId } });
  const settings = await db.userSettings.findFirst({ where: { userId } });

  const archive = {
    profile: user, // strip credentials and internal fields before export
    posts,
    comments,
    followers: followers.map(f => ({ id: f.id, username: f.username })),
    following,
    messages,
    settings,
    activityLogs,
    exportedAt: new Date().toISOString(),
  };
  const zipPath = await createZip(archive);
  await db.dataExportJob.update({
    where: { id: jobId },
    data: {
      status: 'completed',
      completedAt: new Date(),
      downloadUrl: generateSignedUrl(zipPath),
    },
  });
}
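Because the fail criteria above flag exports that silently omit categories, it can help to validate the assembled archive before marking the job complete. This is a sketch; the category list mirrors the one in the description and is an assumption, not a fixed contract:

```typescript
// Categories the archive must contain to pass the completeness check.
// This list mirrors the data categories named in the description above.
const REQUIRED_CATEGORIES = [
  'profile', 'posts', 'comments', 'messages',
  'followers', 'following', 'settings', 'activityLogs',
] as const;

// Returns the categories that are absent or null/undefined; an empty
// result means the archive is complete enough to ship.
function missingCategories(archive: Record<string, unknown>): string[] {
  return REQUIRED_CATEGORIES.filter(
    (key) => !(key in archive) || archive[key] === undefined || archive[key] === null,
  );
}
```

Inside generateDataExport, a guard such as `if (missingCategories(archive).length > 0) throw new Error(...)` would fail the job loudly instead of delivering a partial archive.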