All 22 checks with why-it-matters prose, severity, and cross-references to related audits.
Without an analytics SDK wired up, you cannot answer basic product questions: which acquisition channels convert, which pages drive signups, which features get used. You are shipping to production blind. Marketing spend gets allocated on gut feel, pricing experiments cannot be measured, and conversion regressions go undetected until revenue drops. Installing a package without wiring a measurement ID is worse than not installing it — it signals the job is done when zero data is flowing.
Why this severity: High because zero observability blocks every downstream growth, conversion, and retention decision the product depends on.
marketing-analytics.core-analytics.script-present

Single-page applications do not issue new HTTP requests on navigation, so analytics SDKs see exactly one pageview per session — the landing page. Every subsequent route the user visits is invisible. Funnel analysis breaks: you cannot tell which internal pages drive conversion, which articles users read, or where they drop off. Bounce rate looks artificially perfect because no second pageview ever fires. This is the most common analytics failure mode in Next.js, Remix, and SvelteKit apps.
Why this severity: High because missing route-change tracking silently discards the majority of pageview data in any SPA.
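A minimal sketch of the fix, framework-agnostic: subscribe to route changes (e.g. Next.js `usePathname` in an effect, or a history listener) and forward each new path through a tracker that dedupes re-renders of the same route. `sendPageview` is a hypothetical stand-in for your SDK's pageview call.

```typescript
// Route-change pageview tracker: fires once per distinct path, ignoring
// repeated notifications for the same route (common with React re-renders).
function createRouteTracker(sendPageview: (path: string) => void) {
  let lastPath: string | null = null;
  return (path: string) => {
    if (path === lastPath) return; // same route — do not double-count
    lastPath = path;
    sendPageview(path);
  };
}
```

Wire the returned function to your router's navigation event; the landing-page view and every subsequent client-side navigation then produce exactly one pageview each.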
marketing-analytics.core-analytics.page-view-tracking

Analytics SDKs reach for `window`, `document`, and `localStorage` during initialization. When that code runs inside a React Server Component or at module top-level in an SSR file, the server throws `ReferenceError: window is not defined` and the page either 500s or renders a broken shell. Users see a white screen; crawlers see an error; Vercel logs fill with the same stack trace. The analytics tool you installed to observe the site is actively preventing it from rendering.
Why this severity: High because SSR-time crashes take entire pages offline and the fix is small but mandatory.
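The mandatory fix is small: gate initialization behind a browser check so the SDK never touches `window` on the server. A minimal sketch, with `initSdk` standing in for your SDK's actual init call:

```typescript
// Returns true only when browser globals actually exist at runtime.
// Reading via globalThis avoids referencing `window` directly, which
// would itself throw in some strict server environments.
function isBrowser(): boolean {
  const g = globalThis as Record<string, unknown>;
  return typeof g.window !== "undefined" && typeof g.document !== "undefined";
}

// Initialize only in the browser; a no-op during SSR.
function safeInit(initSdk: () => void): boolean {
  if (!isBrowser()) return false; // server render — skip entirely
  initSdk();
  return true;
}
```

In Next.js the same effect is usually achieved by initializing inside a `"use client"` component's `useEffect`, which only ever runs in the browser.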
marketing-analytics.core-analytics.analytics-initialized-client-side

Duplicate initialization doubles every event. Pageview counts inflate, conversion rates halve against truth, session counts get split across two cookies, and funnel steps show impossible transitions because the same user is being tracked as two identities. Once bad data is in GA4 or PostHog you cannot retroactively deduplicate it — every historical report based on that period is permanently corrupted. A/B test significance calculations lie. Board metrics lie.
Why this severity: Medium because duplicate events corrupt historical data irreversibly but do not break user-facing functionality.
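A minimal sketch of an idempotent client factory: however many times the app calls it (layout re-mounts, duplicate imports, hot reloads), the SDK is constructed exactly once. `create` is a hypothetical stand-in for your SDK's constructor.

```typescript
type Client = Record<string, unknown>;

// Module-level singleton: the first caller constructs the client,
// every later caller gets the same instance back.
let instance: Client | null = null;

function getAnalytics(create: () => Client): Client {
  if (instance === null) {
    instance = create(); // runs at most once per page lifetime
  }
  return instance;
}
```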
marketing-analytics.core-analytics.single-analytics-instance

UTM parameters are how you tell paid search, paid social, email, and partner referrals apart. Lose them and every marketing dollar collapses into a single undifferentiated `(direct) / (none)` bucket, making channel ROI impossible to calculate. If UTMs are stripped by a client-side redirect before the analytics SDK captures them, or if you chose an SDK that does not read them automatically, every campaign you spent money on looks identical to organic traffic in the dashboard.
Why this severity: High because lost campaign attribution makes marketing spend impossible to measure or allocate by channel.
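For SDKs that do not read campaign parameters automatically, a minimal first-touch capture looks like the sketch below. Only the pure parsing step is shown; in real browser code the result would typically be persisted (e.g. to sessionStorage) before any redirect can strip the query string.

```typescript
// The five standard UTM parameters recognized by most analytics platforms.
const UTM_KEYS = ["utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"];

// Extract whichever UTM parameters are present on a landing URL.
function extractUtms(url: string): Record<string, string> {
  const params = new URL(url).searchParams;
  const utms: Record<string, string> = {};
  for (const key of UTM_KEYS) {
    const value = params.get(key);
    if (value) utms[key] = value; // keep only parameters actually present
  }
  return utms;
}
```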
marketing-analytics.core-analytics.utm-parameter-handling

When marketing traffic lands on `www.example.com` and converts on `app.example.com`, missing cross-domain configuration resets the session at the subdomain boundary. The user who clicked your Google Ad and signed up is recorded as two separate visitors: one that bounced on the marketing site, one that appeared from `(direct) / (none)` on the app. Campaign ROI gets understated, paid channels look unprofitable, and budgets move away from the channels that actually work.
Why this severity: Low because the issue only surfaces when multiple domains exist and affects attribution rather than core tracking.
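For gtag.js specifically, cross-domain measurement is configured through the `linker` setting. A hedged configuration sketch — the measurement ID and domain list are placeholders to replace with your own:

```typescript
// Config fragment, not standalone code: gtag is provided by the GA4 snippet.
declare function gtag(...args: unknown[]): void;

gtag("config", "G-XXXXXXX", {
  linker: {
    // Both sides of the boundary must be listed so the linker parameter
    // is appended to cross-domain links and the session survives the hop.
    domains: ["www.example.com", "app.example.com"],
  },
});
```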
marketing-analytics.core-analytics.cross-domain-tracking

The CTA click is the single most valuable event on a marketing site — it is the moment intent converts to action. Without a tracked click event, you cannot measure CTA copy experiments, cannot tell whether the hero or pricing page drives more signups, cannot debug funnels that drop off between click and form submission, and cannot compare variant performance. You see the downstream signup but lose the entire top of the funnel that produced it.
Why this severity: High because untracked primary CTAs make every conversion optimization experiment unmeasurable.
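One common pattern is attribute-driven tracking: instrumented elements carry data attributes, and a single delegated click listener derives the event payload from them. A minimal sketch of the payload step — the attribute names and the `cta_click` event name are assumptions, not a platform convention:

```typescript
type Attrs = Record<string, string | undefined>;

// Build a tracking payload from an element's data attributes.
// Returns null for elements that are not instrumented.
function ctaPayload(attrs: Attrs) {
  const label = attrs["data-cta"];
  if (!label) return null; // not a tracked CTA — ignore the click
  return {
    name: "cta_click",
    properties: {
      cta_label: label,
      cta_location: attrs["data-cta-location"] ?? "unknown",
    },
  };
}
```

In a real page, one click listener on `document` would call this with the clicked element's attributes and forward non-null payloads to the SDK, so new CTAs get tracked by adding attributes rather than new handlers.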
marketing-analytics.event-conversion-tracking.cta-click-events

Forms are conversion endpoints. When the contact form, newsletter signup, demo request, or waitlist form fires no event on submit, you lose the ability to compare form conversion rates across pages, detect forms that suddenly stop converting, separate submission intent from successful delivery, and attribute leads back to their acquisition channel. A form with a silent submit handler is a leak in the measurement pipeline between visitor and customer.
Why this severity: High because untracked form submissions sever the final link between marketing traffic and qualified leads.
marketing-analytics.event-conversion-tracking.form-submission-events

A conversion goal is what turns analytics from a pageview counter into a revenue measurement tool. Without a configured goal — a dedicated success page, a named `purchase` or `sign_up` event, a webhook-triggered completion event — GA4 has nothing to optimize against, Google Ads smart bidding has nothing to learn from, and your dashboards cannot answer the one question that matters: how many visitors became customers. Conversion reports stay empty.
Why this severity: High because without configured goals the analytics platform cannot measure the outcome the business exists to produce.
marketing-analytics.event-conversion-tracking.conversion-goal-configured

Scroll depth tells you whether visitors actually read the page or bounced after the hero. Without it, a visitor who scrolls past every pricing tier, reads every testimonial, and leaves looks identical to one who bounced after two seconds. You cannot distinguish content that holds attention from content that fails, cannot identify the fold where engagement drops, and cannot quantify the impact of page length experiments. Long-form marketing pages become uninstrumented.
Why this severity: Medium because missing scroll data degrades content performance analysis but does not break core conversion tracking.
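The standard implementation fires each depth milestone (25/50/75/100%) at most once per page. A minimal sketch of the milestone logic, with `send` as a hypothetical transport; in real code the returned function would be called from a (throttled) scroll listener with the current scroll percentage:

```typescript
// Fire each milestone exactly once as scroll progress crosses it.
function createScrollDepthTracker(
  send: (milestone: number) => void,
  milestones: number[] = [25, 50, 75, 100],
) {
  const fired = new Set<number>();
  return (percent: number) => {
    for (const m of milestones) {
      if (percent >= m && !fired.has(m)) {
        fired.add(m);
        send(m); // reported at most once per page view
      }
    }
  };
}
```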
marketing-analytics.event-conversion-tracking.scroll-depth-tracking

Form abandonment tracking distinguishes a form nobody starts from a form everyone starts but abandons at field three. Without it you cannot find the field that kills your signup flow — the one asking for phone number, or tax ID, or company size — and you cannot prioritize which friction to remove. Tracking only the final submit gives you the denominator of conversion but never the numerator of drop-off, which is where the optimization gains live.
Why this severity: Low because abandonment tracking improves optimization precision but its absence does not break primary conversion measurement.
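A minimal sketch of the core state machine: remember the last field the user focused, and if the page is left without a submit, report that field as the abandonment point. In real code `onLeave` would be wired to `visibilitychange` or `beforeunload`; `send` is a hypothetical transport.

```typescript
// Track which field a visitor was on when they abandoned the form.
function createAbandonTracker(send: (lastField: string) => void) {
  let lastField: string | null = null;
  let submitted = false;
  return {
    onFocus(field: string) { lastField = field; },   // wire to field focus events
    onSubmit() { submitted = true; },                // successful submit — no abandon
    onLeave() {
      // called when the user leaves the page
      if (!submitted && lastField) send(lastField);  // name the drop-off field
    },
  };
}
```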
marketing-analytics.event-conversion-tracking.form-abandonment-tracking

Without user properties, every analytics report averages free users, paid users, enterprise accounts, and trial signups into a single number that describes no actual cohort. You cannot compare activation between plans, cannot isolate power-user behavior, cannot segment by signup date to measure cohort quality, and cannot feed PLG motion with account-level data. Anonymous event streams produce vanity metrics; identified event streams produce business intelligence.
Why this severity: Info because segmentation enables advanced analysis but the core event pipeline functions without it.
marketing-analytics.event-conversion-tracking.custom-dimensions-segmentation

Deploying cookie-based analytics (GA4, Mixpanel, Amplitude, Segment) without a consent banner violates GDPR Art. 6 and Art. 7 for EU/EEA visitors and CCPA §1798.120 for California residents. Regulators treat the absence of a consent mechanism as a strict liability violation — no harm needs to be proved. DPA enforcement actions have resulted in fines up to 4% of global annual turnover. Beyond legal exposure, analytics data collected without lawful basis cannot be legally used and may need to be deleted entirely.
Why this severity: Critical because collecting persistent tracking cookies without user consent is a per-visit GDPR violation with direct regulatory fine exposure under Art. 83.
marketing-analytics.privacy-compliance.consent-banner-present

A consent banner that exists as a UI component but does not actually block analytics from firing provides zero legal protection. Under GDPR Art. 7, consent must be freely given before processing begins — not after. This is the most common AI-built project failure: analytics loads unconditionally in the root layout, the banner is purely decorative, and every page view constitutes an unlawful processing event under GDPR and ePrivacy Directive Art. 5(3). Data regulators consider cosmetic consent banners an aggravating factor in enforcement.
Why this severity: Critical because analytics firing before consent is granted means every page load is an unlawful processing event under GDPR Art. 6, regardless of whether the banner UI is present.
marketing-analytics.privacy-compliance.analytics-gated-by-consent

Passing email addresses or full names as analytics user identifiers or event properties violates GDPR Art. 5(1)(c) (data minimisation) and OWASP A02 (Cryptographic Failures / data exposure). Most analytics platforms (GA4, Mixpanel) explicitly prohibit PII in their terms of service and will terminate accounts that send it. Beyond compliance, PII in analytics creates a data breach vector: analytics dashboards are typically accessible to more people than your production database, and data export APIs make the PII recoverable by anyone with dashboard access.
Why this severity: Critical because PII in analytics events is both a GDPR Art. 5(1)(c) violation and a breach of analytics platform ToS — CWE-359 exposure that creates an additional data breach surface outside your production security controls.
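A common defense is a scrub step in front of every `track` call: drop known PII keys and redact values that look like email addresses even when they appear under a safe-looking key. A minimal sketch — the key list and regex are illustrative starting points, not an exhaustive PII detector:

```typescript
// Naive email shape: something@something.tld — deliberately broad.
const EMAIL_RE = /[^\s@]+@[^\s@]+\.[^\s@]+/;

// Property keys that should never reach an analytics platform.
const PII_KEYS = new Set(["email", "name", "full_name", "phone"]);

function scrubPii(props: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(props)) {
    if (PII_KEYS.has(key.toLowerCase())) continue; // drop PII keys outright
    if (typeof value === "string" && EMAIL_RE.test(value)) {
      out[key] = "[redacted]"; // catch emails hiding in free-text fields
    } else {
      out[key] = value;
    }
  }
  return out;
}
```

For user identifiers, the equivalent move is sending an opaque internal ID (or a salted hash) instead of the email itself.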
marketing-analytics.privacy-compliance.no-pii-in-analytics

GDPR Art. 13 and CCPA §1798.100 require disclosure of what data is collected, why, and by whom — at the point of collection. A missing or unlinked privacy or cookie policy leaves users unable to exercise their rights (access, deletion, opt-out) and exposes operators to regulatory complaints. Under GDPR, the absence of a privacy notice is itself a violation independent of whether any data was misused. Consent management platforms auto-generate cookie declarations, but only if a policy page is present and linked where users can find it.
Why this severity: Medium because the absence of a privacy or cookie policy is a standalone GDPR Art. 13 disclosure violation, but does not by itself cause data exfiltration or immediate user harm.
marketing-analytics.privacy-compliance.cookie-declaration

GDPR Art. 7(3) and CCPA §1798.120 both grant users the right to withdraw consent as easily as they granted it. A consent banner that can only accept — with no mechanism to subsequently revoke — fails this requirement. In practice this means users who later change their mind have no path to opt out short of clearing their browser storage, which most won't do. DPA enforcement has specifically targeted "roach motel" consent implementations where acceptance is one click but withdrawal requires contacting support.
Why this severity: Medium because a missing opt-out mechanism is a GDPR Art. 7(3) violation that prevents users from exercising a granted right, but does not directly cause data capture beyond what was already consented.
marketing-analytics.privacy-compliance.opt-out-mechanism

Debug mode sends extra diagnostic events, enables verbose console output, routes events to GA4 DebugView, and can expose internal event schemas and user property names in the browser console. Shipped to production it pollutes the real data stream with test events, inflates event counts by double-digit percentages, and leaks implementation detail visible in DevTools. Every downstream report based on that data is contaminated for as long as debug mode stayed on.
Why this severity: High because unguarded debug mode silently corrupts production analytics data and leaks implementation detail.
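The guard is one line: derive the debug flag from the build environment instead of hardcoding it, so it can never ship enabled. With gtag.js the flag would then be passed as `{ debug_mode: true }` on the config call; the env check is the part that matters. A minimal sketch:

```typescript
// Debug mode is on for every environment except production builds.
function debugEnabled(nodeEnv: string | undefined): boolean {
  return nodeEnv !== "production"; // always off in production
}
```

Called as `debugEnabled(process.env.NODE_ENV)` in a Node-bundled app.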
marketing-analytics.data-quality.debug-mode-disabled-production

Mixed analytics event naming conventions — `camelCase` alongside `snake_case` alongside `PascalCase` — cause silent query failures in analytics dashboards. A funnel that joins `buttonClicked` with `button_clicked` misses events and produces artificially low conversion numbers. This is an observability issue (ISO/IEC 25010 maintainability): when event names are inconsistent, adding a new tracking point requires reading existing code to guess which convention applies, and mistakes go undetected until conversion data looks wrong in production.
Why this severity: Low because inconsistent naming degrades analytics data quality and developer experience without creating a security or availability risk.
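Beyond a written convention, a cheap enforcement point is a normalizer in the tracking wrapper that coerces every event name to one convention (snake_case here) before it is sent. A minimal sketch:

```typescript
// Normalize camelCase, PascalCase, kebab-case, and spaced names to snake_case.
function normalizeEventName(name: string): string {
  return name
    .replace(/([a-z0-9])([A-Z])/g, "$1_$2") // split camelCase/PascalCase boundaries
    .replace(/[\s-]+/g, "_")                // spaces and hyphens become underscores
    .toLowerCase();
}
```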
marketing-analytics.data-quality.event-naming-convention

Google Universal Analytics (UA- tracking IDs) stopped processing data for standard properties on July 1, 2023, and was fully shut down on July 1, 2024. Any codebase still referencing `analytics.js` or a `UA-XXXXXXXX-X` tracking ID is collecting zero data — the service is gone. Operators may not notice for months because the UI still renders and no error is thrown. Beyond UA, running analytics SDKs two or more major versions behind current (CWE-1357) means missing security patches and compatibility fixes, and relying on deprecated API surfaces that platforms remove without warning.
Why this severity: Low because an outdated SDK version degrades observability and may miss security patches (CWE-1357), but does not directly expose user data or create an availability risk.
marketing-analytics.data-quality.analytics-version-current

Session recording tools (PostHog, Hotjar, Microsoft Clarity) with no explicit sampling rate default to recording every session. On a medium-to-large project this creates unexpected data volume, processing costs, and storage costs that compound as traffic grows. From an ISO/IEC 25010 performance-efficiency perspective, unsampled session recording on high-traffic pages can also introduce measurable page weight from the recording SDK. This is an informational signal — default settings are not a crisis, but an intentional sampling configuration is a sign the project has thought through its analytics operational costs.
Why this severity: Info because unsampled session recording creates cost and data-volume risk at scale but does not cause immediate user-facing or security harm on most projects.
marketing-analytics.data-quality.sampling-rate-configured

When the same tracking script loads twice — typically `gtag.js` loaded directly plus GA4 nested inside a Google Tag Manager container — every pageview, every event, every conversion fires twice. Event and conversion counts double, conversion-rate calculations become untrustworthy, session durations break because the same user appears to hit consecutive pageviews milliseconds apart, and automated bidding on ad platforms optimizes against phantom volume. The data contamination is permanent for the affected window.
Why this severity: Info because redundant scripts corrupt data but the condition is rare and easy to remediate once spotted.
marketing-analytics.data-quality.no-redundant-scripts