All 22 checks with why-it-matters prose, severity, and cross-references to related audits.
Without a documented lawful basis for each processing activity, every data collection your application performs is potentially unlawful under GDPR Art. 6. Regulators do not need to prove harm — the absence of a documented basis is itself the violation. Marketing emails sent to users who signed up only to receive the service violate Art. 6 directly: the 'contract' basis covers service delivery, not promotional campaigns, and each processing purpose is a separate legal act. The distinction between consent, contract, and legitimate interest is not cosmetic — 'legitimate interest' requires a documented balancing test, and skipping it leaves you exposed to a GDPR Art. 5(1)(a) breach finding that can carry fines of up to 4% of global annual turnover.
Why this severity: Critical because undocumented lawful basis renders every processing activity potentially unlawful under GDPR Art. 6, and regulators can levy maximum-tier fines without requiring proof of actual harm to individuals.
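One way to make the documentation requirement concrete is a small in-code register of processing activities, each tied to an Art. 6 basis. This is an illustrative sketch, not part of the audit itself; the `ProcessingActivity` shape, entry names, and `validateRegister` helper are all assumptions.

```typescript
// Hypothetical register of processing activities, each with a documented
// lawful basis under GDPR Art. 6(1). All names here are illustrative.
type LawfulBasis =
  | "consent"
  | "contract"
  | "legal_obligation"
  | "vital_interests"
  | "public_task"
  | "legitimate_interest";

interface ProcessingActivity {
  name: string;
  purpose: string;
  basis: LawfulBasis;
  // Legitimate interest (Art. 6(1)(f)) needs a documented balancing test.
  balancingTestRef?: string;
}

const activities: ProcessingActivity[] = [
  { name: "account_signup", purpose: "create and operate the user account", basis: "contract" },
  { name: "marketing_email", purpose: "promotional campaigns", basis: "consent" },
];

// Return the names of legitimate-interest activities missing their balancing test.
function validateRegister(register: ProcessingActivity[]): string[] {
  return register
    .filter((a) => a.basis === "legitimate_interest" && !a.balancingTestRef)
    .map((a) => a.name);
}
```

A register like this makes the "separate legal act per purpose" point mechanical: marketing has its own entry with its own basis, rather than riding on the signup contract.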
gdpr-readiness.lawful-basis.lawful-basis-documented

GDPR Art. 5(1)(c) and Art. 25 (privacy by design) require that only data strictly necessary for the stated purpose is collected. Collecting a phone number on a SaaS tool with no SMS or calling feature, or a date of birth on a product with no age gate, is not a grey area — it is excess collection with no lawful purpose. This creates real risk: every field you collect is a field that can be breached, subpoenaed, or scraped. Excess collection also signals to regulators that data minimization was never considered, which compounds penalties for other violations. AI-built apps are particularly prone to schema bloat because code generators default to comprehensive field lists rather than minimal ones.
Why this severity: Critical because collecting personal data with no active purpose violates GDPR Art. 5(1)(c) directly, and unnecessary fields expand the breach surface with no compensating benefit to users.
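A minimal guard against schema bloat is to lint collected fields against the fields the stated purpose actually needs. The field lists below are assumptions for illustration only.

```typescript
// Hypothetical allow-list: fields a basic account signup actually requires.
const neededForAccountSignup = new Set(["email", "password", "display_name"]);

// Return any submitted or schema fields with no documented purpose.
function excessFields(schemaFields: string[]): string[] {
  return schemaFields.filter((f) => !neededForAccountSignup.has(f));
}
```

Running a check like this against a generated schema surfaces the phone-number and date-of-birth fields before they ever reach production.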
gdpr-readiness.lawful-basis.data-minimization

GDPR Art. 5(1)(b) prohibits using data for purposes incompatible with the original collection purpose. When a user hands you their email to receive a password reset link, they are not consenting to receive a marketing newsletter — these are legally distinct acts under Art. 6 and Art. 7. Sending marketing email to all registered users on the basis of the signup 'contract' is one of the most common GDPR violations in AI-built apps, and it is one regulators can identify from a single test signup. Passing account data to enrichment services like Clearbit or advertising audiences on Facebook without disclosure compounds the violation into an Art. 13 breach (no notice at collection).
Why this severity: High because repurposing data without separate consent directly violates Art. 5(1)(b) and Art. 6(4), and marketing violations are routinely flagged by regulators in enforcement actions without requiring a formal complaint.
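In code, purpose limitation reduces to a gate: no marketing send unless the user granted that specific purpose. The `ConsentRecord` shape and purpose strings here are hypothetical.

```typescript
// Hypothetical per-user consent record listing granted processing purposes.
interface ConsentRecord {
  userId: string;
  purposes: string[]; // e.g. ["service_delivery", "marketing"]
}

// The signup 'contract' covers service delivery only; marketing needs its
// own consented purpose (Art. 5(1)(b), Art. 6(1)(a)).
function canSendMarketing(record: ConsentRecord): boolean {
  return record.purposes.includes("marketing");
}
```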
gdpr-readiness.lawful-basis.purpose-limitation

GDPR Articles 13 and 14 impose a positive obligation to inform users at the point of data collection — not just in a footer-buried privacy policy. A signup form that submits an email address with no indication of what happens to it, how long it is kept, or why it is needed violates Art. 13(1) on first contact. Cookie banners that present an 'Accept' button without describing which cookies fire or for how long similarly fail Art. 13 and the ePrivacy Directive Art. 5(3). Regulators treat missing point-of-collection notices as independently sanctionable from other violations, not merely as context for other failures.
Why this severity: High because omitting point-of-collection disclosure violates the transparency principle under Art. 5(1)(a) and the specific disclosure obligations of Art. 13, independently of any other GDPR breach.
gdpr-readiness.lawful-basis.collection-transparency

GDPR Art. 15 grants every EU data subject the right to receive a copy of all personal data held about them, within one month of the request (Art. 12(3)). An application with no data export feature and no DSAR process is not compliant — not having a formal process is itself a violation, regardless of whether any user has actually made a request. When a subject access request does arrive (and they will, especially as GDPR awareness grows), scrambling to manually assemble a data export from multiple tables is both error-prone and legally risky. Incomplete exports — for example, returning profile fields but omitting activity history or consent records — also violate Art. 15.
Why this severity: High because the absence of any DSAR mechanism makes it structurally impossible to comply with Art. 15's one-month fulfillment deadline, leaving the controller in ongoing breach for every day the gap exists.
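An Art. 15 export routine can be sketched as a single function that deliberately pulls from every table holding the user's data, not just the profile row. The `Db` shape and table names below are assumptions for illustration.

```typescript
// Hypothetical in-memory view of the tables holding personal data.
interface Db {
  profiles: Record<string, { name: string; email: string }>;
  events: { userId: string; type: string; at: string }[];
  consents: { userId: string; purpose: string; at: string }[];
}

// Assemble a complete Art. 15 export: profile, activity history, and
// consent records together, so nothing is silently omitted.
function buildAccessExport(db: Db, userId: string) {
  return {
    profile: db.profiles[userId] ?? null,
    activityHistory: db.events.filter((e) => e.userId === userId),
    consentRecords: db.consents.filter((c) => c.userId === userId),
  };
}
```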
gdpr-readiness.user-rights.right-to-access

GDPR Art. 17 — the 'right to be forgotten' — requires permanent deletion of personal data when the user requests it and no overriding legal basis for retention exists. A soft-delete that sets a `deleted_at` flag without a subsequent purge job leaves PII sitting indefinitely in your database, which is a direct Art. 17 violation every day after the grace period expires. Cascade failures are equally common: deleting the user row while leaving activity logs, uploaded files, and messages orphaned with the original `user_id` and any associated PII is not erasure. Regulators do not accept 'the data is inaccessible through the UI' as equivalent to deletion.
Why this severity: High because failing to implement actual erasure — not just soft-delete — keeps PII in your database beyond the user's exercise of their Art. 17 right, constituting ongoing unlawful retention under Art. 5(1)(e).
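The purge-job pattern can be sketched as: find users soft-deleted longer than the grace period, then hard-delete the user row and every related table keyed by that user. The `Store` shape, table names, and grace-period parameter are illustrative assumptions.

```typescript
// Hypothetical store with a soft-deleted users table plus related tables.
interface Store {
  users: { id: string; deletedAt?: string }[];
  logs: { userId: string; detail: string }[];
  files: { userId: string; path: string }[];
}

// Hard-delete users whose soft-delete is older than the grace period,
// cascading to related rows so nothing is orphaned with the old user_id.
function purgeExpired(store: Store, now: Date, graceDays: number): string[] {
  const cutoff = now.getTime() - graceDays * 86_400_000; // ms per day
  const expired = store.users
    .filter((u) => u.deletedAt && Date.parse(u.deletedAt) < cutoff)
    .map((u) => u.id);
  const gone = new Set(expired);
  store.users = store.users.filter((u) => !gone.has(u.id));
  store.logs = store.logs.filter((l) => !gone.has(l.userId));
  store.files = store.files.filter((f) => !gone.has(f.userId));
  return expired;
}
```

In a real system the same logic would run as a scheduled job against the database, with foreign keys or explicit cascade lists covering every PII-bearing table.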
gdpr-readiness.user-rights.right-to-erasure

GDPR Art. 20 grants data subjects the right to receive data they have 'provided' in a structured, commonly used, machine-readable format. This is not satisfied by a PDF, an HTML page, or a JSON blob full of abbreviated field names like `ts`, `uid`, and `evt`. The portability right exists specifically so users can take their data to a competing service — which requires a format that a human or automated import tool can interpret without a technical manual. Failing to provide a self-documenting, versioned export also makes your own DSAR compliance fragile: if the schema changes and old exports are unreadable, you have no durable evidence trail.
Why this severity: Medium because the portability gap is a discrete Art. 20 violation but does not by itself put users at immediate harm — however, it compounds with Art. 15 (access) gaps when no export exists at all.
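The fix is mechanical: map abbreviated internal fields to self-describing names and stamp the export with a schema version. The internal `RawEvent` shape below mirrors the `ts`/`uid`/`evt` example from the text; the output field names and version string are assumptions.

```typescript
// Internal event row with the abbreviated field names from the text.
interface RawEvent { ts: number; uid: string; evt: string }

// Produce a self-documenting, versioned Art. 20 export for one user.
function toPortableExport(events: RawEvent[], userId: string) {
  return {
    schemaVersion: "2024-01", // lets old exports remain interpretable
    events: events
      .filter((e) => e.uid === userId)
      .map((e) => ({
        occurredAt: new Date(e.ts).toISOString(),
        eventType: e.evt,
      })),
  };
}
```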
gdpr-readiness.user-rights.right-to-portability

GDPR Art. 16 and Art. 5(1)(d) require that personal data be kept accurate and that data subjects can correct inaccurate information. A profile page that displays personal data as read-only — no edit form, no save button — removes the user's ability to exercise this right. This is a quiet but persistent violation: the user's name or contact details may be wrong, and they have no mechanism to fix it. Email address changes carry additional security weight: allowing an email swap without verification exposes accounts to takeover, which creates a compounding GDPR breach risk on top of the rectification failure.
Why this severity: Medium because the violation is a clear but non-urgent Art. 16 gap — users cannot rectify inaccuracies, but the immediate harm is limited unless stale data drives automated decisions that affect the user.
gdpr-readiness.user-rights.right-to-rectification

GDPR Art. 18 grants users the right to request restriction of processing in four specific circumstances: contested accuracy, unlawful processing they oppose erasing, data needed for legal claims, or pending objection under Art. 21. This right is rarely exercised but must be possible to honor. An application with no restriction mechanism and no documented process cannot respond to an Art. 18 request within the legally required one-month window. Crucially, a restriction flag must actually prevent the data from being used in analytics exports, marketing sends, and third-party pipelines — a flag that exists in the database but is never checked in application code is not compliance.
Why this severity: Low because Art. 18 requests are infrequent in practice for most SaaS products, but the absence of any mechanism means zero ability to comply when a request arrives, which is a clear-cut regulatory gap.
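The "flag that is actually checked" point can be sketched as a filter applied at the boundary where data leaves the system. The `User` shape and pipeline name are hypothetical.

```typescript
// Hypothetical user row carrying an Art. 18 restriction flag.
interface User { id: string; email: string; restricted: boolean }

// Every outbound pipeline (marketing sends, analytics exports, third-party
// syncs) must exclude restricted users, not merely store the flag.
function selectForMarketing(users: User[]): User[] {
  return users.filter((u) => !u.restricted);
}
```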
gdpr-readiness.user-rights.right-to-restrict

GDPR Art. 7 and Art. 4(11) define valid consent as freely given, specific, informed, and unambiguous — requiring an affirmative act. Pre-ticked boxes are explicitly prohibited. A cookie banner with a single 'Accept All' button and no per-category breakdown fails both the granularity requirement and the specificity requirement: users cannot meaningfully consent to 'analytics' and 'advertising' as a single undifferentiated mass. The ePrivacy Directive Art. 5(3) separately requires consent before setting non-essential cookies. Analytics scripts that fire unconditionally on page load — before the user has interacted with any banner — are violating both frameworks simultaneously, and this is detectable by any regulator with a browser dev tools panel.
Why this severity: Critical because pre-ticked non-essential cookie categories and unconditional script loading constitute unlawful processing under GDPR Art. 6(1)(a) — consent is the claimed basis but was never validly obtained.
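Granular opt-in reduces to a per-category consent state where every non-essential category starts off. The category names and `grant` helper below are illustrative assumptions.

```typescript
// Per-category consent state; categories are illustrative.
type CookieCategory = "essential" | "analytics" | "advertising";
type ConsentState = Record<CookieCategory, boolean>;

// Art. 4(11): no pre-ticked boxes, so non-essential categories default off.
const defaultConsent: ConsentState = {
  essential: true, // strictly necessary cookies need no consent
  analytics: false,
  advertising: false,
};

// Record an affirmative act for exactly one non-essential category.
function grant(
  state: ConsentState,
  category: Exclude<CookieCategory, "essential">
): ConsentState {
  return { ...state, [category]: true };
}
```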
gdpr-readiness.consent-management.granular-opt-in

GDPR Art. 7(1) and Art. 5(2) (accountability) require that controllers be able to demonstrate that consent was validly obtained. A bare boolean in `localStorage` — `accepted: true` — is not evidence of anything: it records no timestamp, no consent version, no record of which categories were accepted, and is trivially forged or deleted. When a regulator or data subject challenges whether consent was given for a specific purpose at a specific time, you need a server-side audit trail tied to a versioned consent notice. Without it, you cannot meet the Art. 5(2) accountability obligation even if you collected consent in good faith.
Why this severity: High because unstructured or versionless consent records fail the Art. 7(1) burden-of-proof requirement — you cannot demonstrate lawful consent was given for each processing purpose, which undermines every consent-based processing activity.
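A serviceable server-side consent record carries a timestamp, the notice version the user actually saw, and the categories granted. The `ConsentEvent` shape and field names below are assumptions.

```typescript
// Hypothetical server-side consent audit record, as opposed to a bare
// localStorage boolean: it can demonstrate who consented to what, when,
// under which version of the consent notice (Art. 7(1), Art. 5(2)).
interface ConsentEvent {
  userId: string;
  noticeVersion: string; // which consent text the user saw
  grantedCategories: string[];
  recordedAt: string; // ISO timestamp, set server-side
}

function recordConsent(
  userId: string,
  noticeVersion: string,
  grantedCategories: string[],
  now: Date = new Date()
): ConsentEvent {
  return { userId, noticeVersion, grantedCategories, recordedAt: now.toISOString() };
}
```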
gdpr-readiness.consent-management.consent-records

GDPR Art. 7(3) states explicitly that withdrawal of consent must be as easy as giving it. If a user accepted analytics cookies in one click from a banner, withdrawal cannot require navigating to three nested settings menus, sending an email, or hunting for a link that only appears during the initial visit. This symmetry requirement is not ambiguous — it is black-letter GDPR text. Applications that bury consent management after the initial banner interaction effectively make consent irrevocable in practice, which invalidates the consent entirely under Art. 4(11) (consent must be freely withdrawable).
Why this severity: Medium because difficult withdrawal does not immediately expose data — it invalidates previously given consent under Art. 7(3), converting all consent-based processing into unlawful processing for affected users.
gdpr-readiness.consent-management.easy-withdrawal

Placing a compliant consent banner on your page while loading Google Analytics, Facebook Pixel, or Hotjar unconditionally in `layout.tsx` is security theater — the compliance UI exists but the data collection proceeds regardless of what the user clicks. ePrivacy Directive Art. 5(3) and GDPR Art. 6(1)(a) both require that non-essential cookies and tracking technologies activate only after the user's informed consent. Unconditional script loading in root layout files is the most common technical GDPR violation in Next.js applications, and it is trivially detectable from the browser network panel. Google Tag Manager compounds this risk because tags added via the GTM UI bypass the codebase entirely and may fire before any consent check.
Why this severity: Medium because unconditional tracking script loading makes all consent-based processing unlawful regardless of banner UX quality — the scripts process data before any consent signal is evaluated.
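The gating logic can be isolated from any framework: given the consent state, compute which script URLs may load, and render or inject only those. The category-to-script mapping below is an illustrative assumption, not a prescribed configuration.

```typescript
// Hypothetical mapping from consent category to the tracking scripts it gates.
const scriptsByCategory: Record<string, string[]> = {
  analytics: ["https://www.googletagmanager.com/gtag/js"],
  advertising: ["https://connect.facebook.net/en_US/fbevents.js"],
};

// Only scripts whose category was affirmatively consented may load;
// an empty or missing consent state loads nothing non-essential.
function scriptsToLoad(consented: Record<string, boolean>): string[] {
  return Object.entries(scriptsByCategory)
    .filter(([category]) => consented[category] === true)
    .flatMap(([, urls]) => urls);
}
```

In a Next.js app, a function like this would drive which script tags the layout renders, instead of hard-coding them unconditionally in the root layout.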
gdpr-readiness.consent-management.conditional-script-loading

GDPR Art. 7(3) and Art. 13(3) require fresh consent whenever the purposes or conditions of processing change materially. A user who consented to analytics-only cookies in January did not consent to session recording you added in March. If the stored consent record has no version, your application cannot distinguish old consent from new consent — and will treat the January acceptance as valid for the March tracking, which it is not. This is a silent compliance gap: nothing breaks, but every user who visited before your tracking expansion is being processed without valid consent for the new purposes.
Why this severity: Low because the failure only manifests when tracking purposes change — but when it does, it silently converts all previously consented users into users without valid consent for the new processing.
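With versioned consent records, the staleness check is a comparison: re-prompt whenever the notice version changed or a purpose was added since the user consented. The `StoredConsent` shape is an assumption.

```typescript
// Hypothetical stored consent: the notice version and purposes in force
// at the time the user consented.
interface StoredConsent { noticeVersion: string; purposes: string[] }

// Stale consent is treated as no consent: re-prompt on version change or
// on any purpose added since the original acceptance (Art. 7(3), 13(3)).
function needsReconsent(stored: StoredConsent | null, current: StoredConsent): boolean {
  if (!stored) return true;
  if (stored.noticeVersion !== current.noticeVersion) return true;
  return current.purposes.some((p) => !stored.purposes.includes(p));
}
```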
gdpr-readiness.consent-management.re-consent-on-change

GDPR Art. 28 requires a written Data Processing Agreement between a controller and any processor that handles personal data on its behalf. This is not optional even when the processor is a major vendor — Stripe, SendGrid, Vercel, Sentry, and Auth0 all offer DPAs, but they must be actively accepted through account settings; they do not apply by default. Without accepted DPAs, there is no contractual basis governing how these processors handle user data, no obligation for them to assist with DSARs or breach notification, and no mechanism to compel data deletion on contract termination. Regulators have fined controllers specifically for missing DPAs with cloud vendors.
Why this severity: Medium because absent DPAs do not immediately expose user data but leave every third-party processing relationship uncontracted under Art. 28, making the controller fully liable for processor behavior without any contractual recourse.
gdpr-readiness.data-processing.data-processing-agreements

GDPR Chapter V (Arts. 44–50) prohibits transferring EU personal data outside the EU/EEA unless a lawful transfer mechanism is in place — adequacy decision, Standard Contractual Clauses, or Binding Corporate Rules. Most AI-built applications transfer EU data to the US routinely: a US-hosted database, US-based analytics, a US email provider. Each transfer without a documented mechanism is an independent violation. Post-Schrems II, relying on 'the vendor is reputable' is not a legal basis. Standard Contractual Clauses are included in most major vendor DPAs — but only if those DPAs are accepted. Undocumented transfers are also a disclosure failure under Art. 13(1)(f).
Why this severity: Low because most transfers are covered by SCCs embedded in accepted vendor DPAs — the primary risk is documentation failure rather than actual data exposure, but undocumented transfers remain an independent Art. 44 violation.
gdpr-readiness.data-processing.cross-border-safeguards

GDPR Art. 13(1)(e) requires controllers to name the recipients or categories of recipients of personal data at the point of collection. Art. 28(3) requires that processor contracts specify which subprocessors are authorized. A privacy policy that says 'we share data with trusted third-party service providers' names nobody and satisfies neither requirement. Users cannot exercise their rights intelligently — including objecting to specific processing — if they do not know which companies receive their data. Regulators have issued guidance specifically requiring named subprocessors, not category descriptions, for transparency obligations to be met.
Why this severity: Low because the missing list is a transparency and documentation failure under Arts. 13 and 28 rather than an active harm — but it is independently sanctionable and blocks meaningful exercise of user rights.
gdpr-readiness.data-processing.sub-processor-list

GDPR Art. 35 mandates a Data Protection Impact Assessment before carrying out high-risk processing — systematic profiling with significant effects, large-scale processing of special category data (health, biometrics, racial or ethnic origin), automated decision-making with legal consequences, or large-scale monitoring. A recommendation engine, credit scoring algorithm, or health data platform that launches without a DPIA is not just non-compliant — it may be subject to a mandatory consultation with the supervisory authority under Art. 36 before processing can begin. Most SaaS products do not trigger Art. 35, but failing to document that assessment explicitly leaves you unable to demonstrate the exemption.
Why this severity: Info because most standard SaaS products do not trigger Art. 35 thresholds — the finding is about documenting the assessment and completing the DPIA when triggers do apply, not an immediate data exposure risk.
gdpr-readiness.data-processing.dpia-high-risk

GDPR Art. 33 requires that a personal data breach be notified to the competent supervisory authority within 72 hours of the controller becoming aware of it — not 72 hours after they feel ready, not 72 hours after legal review, 72 hours flat. Without a documented breach notification procedure specifying who is responsible, what constitutes 'becoming aware,' and which supervisory authority is relevant, this deadline is structurally unachievable. Art. 34 separately requires notifying affected individuals without undue delay when the breach poses high risk to their rights. An organization that discovers a breach and has no runbook will spend the 72 hours figuring out process instead of executing it.
Why this severity: Low because breach notification is a procedural obligation that only activates at breach time — but missing the 72-hour window after a breach is one of the most commonly cited standalone GDPR violations, carrying significant independent fines.
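The 72-hour clock itself is trivial arithmetic, which is exactly why a runbook should compute and track it rather than leave it to ad-hoc judgment during an incident. The helper names below are illustrative.

```typescript
// Art. 33 deadline: 72 hours from the moment of becoming aware of the breach.
function notificationDeadline(becameAware: Date): Date {
  return new Date(becameAware.getTime() + 72 * 3600 * 1000);
}

// A runbook or incident tracker can flag the deadline as it approaches.
function isOverdue(becameAware: Date, now: Date): boolean {
  return now.getTime() > notificationDeadline(becameAware).getTime();
}
```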
gdpr-readiness.breach-accountability.breach-notification

GDPR Art. 30 requires controllers to maintain written records of processing activities (ROPA) covering processing purposes, data categories, recipients, retention periods, and security measures. Though organizations under 250 employees are generally exempt unless processing is systematic, non-occasional, or involves special category data, the practical effect is that most SaaS products should maintain a ROPA — and those that cannot demonstrate the exemption on request are treated as non-exempt. In a breach or regulatory investigation, the ROPA is the first document a DPA requests. Absence of any data register also makes it impossible to correctly scope DSARs, breach notifications, or deletion requests.
Why this severity: Info because ROPA is a documentation obligation rather than a direct data protection control — the immediate harm is regulatory and reputational rather than a risk to user data.
gdpr-readiness.breach-accountability.records-of-processing

GDPR Art. 37 mandates designation of a Data Protection Officer for public authorities, organizations engaged in large-scale systematic monitoring, or those processing special category data at scale. Most B2B and B2C SaaS products do not meet these thresholds — but the absence of any privacy contact creates a separate problem: Art. 13(1)(a) and (b) require controllers to provide their own contact details and, where a DPO has been designated, the DPO's. If data subjects have no documented way to reach someone about privacy matters, they cannot exercise their Art. 15–22 rights in practice. A privacy email address in the footer is minimal but required; not having one is independently sanctionable.
Why this severity: Info because mandatory DPO requirements are rarely triggered for standard SaaS — the primary gap is usually missing documentation of the exemption assessment and absence of a reachable privacy contact.
gdpr-readiness.breach-accountability.dpo-if-required

GDPR Arts. 12, 13, and 14 require a privacy policy that is accessible, written in plain language, and covers specific mandatory disclosures — controller identity, legal bases, retention periods, data subject rights (including the right to lodge a complaint with a supervisory authority), and subprocessor names. A policy that requires authentication to access violates the accessibility requirement immediately: prospective users and regulators must be able to read it. An undated policy cannot demonstrate whether it was in effect before or after a particular data collection event. Vague language like 'we share with trusted partners' without naming anyone fails the specificity requirements of Art. 13(1)(e) independently of other gaps.
Why this severity: Info because the policy is a documentation and transparency obligation — its absence or inadequacy is independently sanctionable but does not itself expose user data, though it signals systemic GDPR non-compliance to regulators.
gdpr-readiness.breach-accountability.accessible-privacy-policy

Run this audit in your AI coding tool (Claude Code, Cursor, Bolt, etc.) and submit results here for scoring and benchmarks.