All 26 checks with why-it-matters prose, severity, and cross-references to related audits.
Apple has required PrivacyInfo.xcprivacy for all iOS app updates since May 1, 2024. A missing or incomplete manifest means Apple's automated binary validation rejects your upload before a human reviewer ever sees it. Beyond the submission block, an incomplete manifest misrepresents your app's data practices to users and regulators — violating GDPR Art.5(1) transparency and data-minimisation principles and CCPA §1798.100 disclosure requirements. If your app or any of its third-party pods uses Required Reason APIs (UserDefaults, FileManager, SystemBootTime, DiskSpace) without declaring them, every update you ship is an unverifiable compliance gap.
Why this severity: Critical because a missing or incomplete PrivacyInfo.xcprivacy causes binary rejection on every iOS submission — there is no workaround or grace period.
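If the manifest is the blocker, the fix is a declaration file. A minimal fragment, assuming your code reads UserDefaults only for its own preferences (reason code CA92.1); verify the category and reason codes against Apple's Required Reason API list for your actual usage:

```xml
<!-- PrivacyInfo.xcprivacy (fragment): declares UserDefaults as a
     Required Reason API. CA92.1 covers accessing your own app's
     defaults; check the reason code matches what your code does. -->
<key>NSPrivacyAccessedAPITypes</key>
<array>
  <dict>
    <key>NSPrivacyAccessedAPIType</key>
    <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
    <key>NSPrivacyAccessedAPITypeReasons</key>
    <array>
      <string>CA92.1</string>
    </array>
  </dict>
</array>
```

The aggregate manifest must cover your own code plus every pod that lacks its own PrivacyInfo.xcprivacy.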
app-store-privacy-data.privacy-declarations.no-privacy-manifest

Google Play enforces Data Safety form accuracy through ongoing policy reviews and user reports, not just initial submission. A mismatch between what your code collects and what the form declares is grounds for a policy strike and app removal under GDPR Art.13 and CCPA §1798.100 disclosure obligations. Common misses: apps that initialise AdMob or react-native-firebase without declaring device ID sharing with Google, or crash reporters left undeclared because they feel like infrastructure rather than data collection. Google distinguishes between data you collect and data third-party SDKs collect on your behalf — both must be declared.
Why this severity: High because an inaccurate Data Safety form results in policy strikes that escalate to app removal and can trigger regulatory scrutiny under GDPR and CCPA.
app-store-privacy-data.privacy-declarations.data-safety-form-accuracy

Apple reviews nutrition label accuracy both during the review process and in response to user reports filed through the App Store. Mismatches like declaring analytics as 'Data Not Linked to You' while calling `setUserId()` expose you to enforcement and rejection on every subsequent update. Under Apple's App Privacy guidelines and GDPR Art.13, users have a right to accurate disclosure of what data is collected and how it is linked to their identity. An inaccurate label is not a minor oversight — Apple has removed apps and forced resubmission for label inaccuracies.
Why this severity: High because nutrition label mismatches trigger rejection on resubmission and expose the developer to Apple enforcement actions and potential GDPR Art.13 violations.
app-store-privacy-data.privacy-declarations.privacy-nutrition-labels

Missing or inadequate NS*UsageDescription strings cause automatic binary rejection — apps without them cannot be installed on TestFlight, let alone the App Store. Apple enforces this at the binary validation stage before human review begins. Beyond rejection, an empty or generic string like 'We need camera access' violates GDPR Art.5(1)(b) (purpose limitation) by failing to communicate the specific feature requiring access, and CCPA §1798.100 by withholding disclosure of why personal data is collected. Every permission your code requests must be matched with a substantive, feature-specific explanation.
Why this severity: Critical because missing NS*UsageDescription keys cause automatic binary rejection — the app cannot be distributed on TestFlight or the App Store under any circumstance.
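A passing string is feature-specific. A sketch of the difference, with a hypothetical barcode-scanning feature as the example:

```xml
<!-- Info.plist (fragment). A generic string like "We need camera
     access" fails the substantive-explanation bar; name the feature. -->
<key>NSCameraUsageDescription</key>
<string>Scans product barcodes with the camera so you can add items to your shopping list.</string>
```

Apply the same pattern to every NS*UsageDescription key your permission requests require.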
app-store-privacy-data.privacy-declarations.ns-usage-descriptions-complete

Apple and Google both require that third-party SDK data collection appear in your app's privacy declarations — even data your own code never touches directly. SDKs like AppsFlyer, AdMob, and Meta Audience Network collect device identifiers, IP addresses, and app-usage signals that must be declared under GDPR Art.28 (processor relationships), GDPR Art.13 (disclosure), CCPA §1798.140(t) (sale/sharing disclosure), and COPPA §312.5 for children's data. On iOS, initialising tracking SDKs before ATT authorisation is obtained compounds the violation — the SDK reads IDFA without user consent, which is grounds for immediate rejection.
Why this severity: High because undeclared SDK data collection creates policy violations on both platforms simultaneously and may constitute a GDPR Art.28 breach for undisclosed data processor relationships.
app-store-privacy-data.privacy-declarations.third-party-sdk-declarations

Apple began requiring privacy manifests in third-party SDKs in May 2024. Pods at versions below their compliance threshold ship without a PrivacyInfo.xcprivacy, which means your app's aggregate privacy manifest will be incomplete — Apple flags this during binary validation. This is a supply-chain compliance failure: you inherit the compliance gap from the dependency (CWE-1357, SSDF PW.4), and GDPR Art.28 requires you to ensure your data processors are themselves compliant. An outdated pod version is an audit trail showing you did not exercise due diligence.
Why this severity: Medium because outdated SDK versions with missing privacy manifests cause Apple to flag submissions, but the violation is fixable with a version bump rather than an architectural change.
app-store-privacy-data.privacy-declarations.sdk-version-compliance

The Linked/Not Linked classification is one of Apple's most scrutinised nutrition label fields. Declaring analytics as 'Data Not Linked to You' while calling `setUserId()`, `identify()`, or `Sentry.setUser()` misrepresents your data practices to every user who reads your App Store listing. Under GDPR Art.13 and Art.4(1), data is 'personal' the moment it can be linked to an identifiable person — calling `setUserId()` crosses that threshold and obligates accurate disclosure. Apple reviews this classification both during initial review and in response to user-filed reports.
Why this severity: Low because the mis-classification is a disclosure error rather than a data breach, but it creates cumulative legal and reputational risk when discovered during enforcement.
app-store-privacy-data.privacy-declarations.data-linked-classifications

Background location is the single most scrutinised permission on both iOS and Android. Apple manually reviews every app that requests 'Always' location access, and Google Play requires a separate background location declaration form plus a valid use-case explanation. Under GDPR Art.5(1)(c), collecting precise background location when coarse foreground location would suffice violates the data-minimisation principle (CWE-359). CCPA §1798.100 treats geolocation as sensitive personal information requiring explicit disclosure. Apps that silently collect location in background services or push notification handlers — without user-visible features justifying it — face rejection, removal, and regulatory exposure.
Why this severity: Medium because background location without justification triggers manual review on both platforms and violates GDPR data-minimisation, but exploitation requires a device already running the app.
app-store-privacy-data.privacy-declarations.location-usage-justification

Apple's ATT framework requires explicit user authorisation before any app reads the IDFA or allows tracking SDKs to access cross-app identifiers. Initialising AdMob, Meta, or AppsFlyer in `AppDelegate.application(_:didFinishLaunchingWithOptions:)` before the ATT prompt resolves means those SDKs read IDFA before consent — a direct violation of Apple's policy, GDPR Art.7 (consent must precede processing), and CCPA §1798.120 (right to opt out of sale/sharing). Apple has rejected apps for this pattern since iOS 14.5 and continues to enforce it on every submission. Missing `NSUserTrackingUsageDescription` in `Info.plist` causes the app to crash when the ATT dialog is triggered.
Why this severity: Critical because ATT violations cause immediate binary rejection, and reading IDFA before consent is grounds for enforcement action by Apple and regulators under GDPR Art.7.
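The required ordering can be sketched as plain sequencing logic. The status names mirror `ATTrackingManager.AuthorizationStatus`; `requestATT` and `initializeAdSDKs` are hypothetical stand-ins for the real prompt and your SDK setup:

```typescript
// ATT statuses mirroring ATTrackingManager.AuthorizationStatus.
type ATTStatus = "notDetermined" | "restricted" | "denied" | "authorized";

// Tracking SDKs may touch the IDFA only after an affirmative answer;
// "notDetermined" means the prompt has not resolved, so init must wait.
function mayInitializeTrackingSDKs(status: ATTStatus): boolean {
  return status === "authorized";
}

// Hypothetical app-startup flow: await the ATT result before any
// tracking SDK runs, instead of initialising in didFinishLaunching.
async function startApp(
  requestATT: () => Promise<ATTStatus>,
  initializeAdSDKs: () => void, // stand-in for AdMob/Meta/AppsFlyer setup
): Promise<void> {
  const status = await requestATT();
  if (mayInitializeTrackingSDKs(status)) {
    initializeAdSDKs();
  }
  // On denial, fall back to SKAdNetwork-only measurement.
}
```

The design point is that consent resolution is an awaited precondition of SDK initialisation, not a parallel event.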
app-store-privacy-data.tracking-advertising.att-before-tracking

The `NSUserTrackingUsageDescription` wording is a legal representation of your tracking scope that users see before deciding whether to grant ATT consent. Understating the scope — claiming 'for measurement' when the SDK also performs cross-app behavioural targeting — misleads users and violates GDPR Art.5(1)(a) (lawfulness, fairness, transparency) and CCPA §1798.100. Apple cross-checks the description against the App Store nutrition labels 'Data Used to Track You' section, and mismatches trigger rejection. Declaring SKAdNetwork-only measurement while simultaneously passing IDFA to the same ad network makes the privacy-preserving claim false.
Why this severity: High because misrepresenting the tracking scope in the usage description is a policy violation that triggers rejection and grounds for app removal if discovered post-publication.
app-store-privacy-data.tracking-advertising.att-label-consistency

Displaying an ATT prompt but reading the tracking identifier before the user responds — or transmitting a device-persistent ID to a third party before ATT resolves — defeats the purpose of the consent mechanism entirely. Under GDPR Art.7, consent must precede processing; under CCPA §1798.120, users have the right to opt out before sharing occurs. OWASP 2021 A04 (Insecure Design) applies when consent architecture is structurally bypassed. Apple's App Review detects this pattern with instrumentation: they see SDK network calls before the ATT dialog appears in their testing, and apps fail review for it.
Why this severity: High because transmitting tracking identifiers before ATT consent is obtained constitutes pre-consent data collection that violates Apple policy, GDPR Art.7, and CCPA §1798.120.
app-store-privacy-data.tracking-advertising.no-tracking-before-consent

Privacy-preserving attribution via SKAdNetwork allows iOS ad campaigns to measure install effectiveness without requiring individual user tracking. Apps that have ad SDKs but neither SKAdNetwork configured nor ATT properly implemented are relying on IDFA-based cross-app tracking with no fallback — meaning they generate zero attribution data for the majority of users who deny ATT consent (typically 60–80% on consumer apps). This is both a compliance gap (GDPR Art.25 data protection by design, CCPA §1798.120) and a business problem: campaigns cannot be measured for most of your iOS users.
Why this severity: Medium because the absence of privacy-preserving attribution does not itself constitute a violation, but signals an incomplete consent infrastructure that likely has ATT compliance gaps.
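Enabling SKAdNetwork is an `Info.plist` registration per ad network. A fragment, using Google's published identifier as the example; each network documents its own:

```xml
<!-- Info.plist (fragment): register each ad network's SKAdNetwork ID
     so attribution postbacks work for users who deny ATT. -->
<key>SKAdNetworkItems</key>
<array>
  <dict>
    <key>SKAdNetworkIdentifier</key>
    <string>cstr6suwn9.skadnetwork</string>
  </dict>
  <!-- one dict per additional network / mediation partner -->
</array>
```

Mediation setups typically need dozens of entries; most networks publish a ready-made list.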
app-store-privacy-data.tracking-advertising.privacy-preserving-attribution

Test ad unit IDs left in production builds display test ads instead of real ads, generating zero revenue while consuming real user impressions. More critically, a missing `GADApplicationIdentifier` in `Info.plist` or `AndroidManifest.xml` causes the AdMob SDK to crash at initialisation — your app fails immediately on launch for all users running production builds. For Kids category apps, unconfigured `tagForChildDirectedTreatment` violates COPPA §312.5 and GDPR Art.8 by allowing behavioural targeting of children, which carries FTC penalties up to $50,120 per violation per day.
Why this severity: Low because test IDs cause revenue loss and configuration crashes rather than security breaches, but the crash from a missing Application ID is a complete production failure.
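The crash case is a missing manifest entry. A sketch of the Android side, with a placeholder value you must replace with your real AdMob Application ID (the iOS equivalent is `GADApplicationIdentifier` in `Info.plist`):

```xml
<!-- AndroidManifest.xml (fragment, inside <application>): the AdMob
     Application ID must be present or the SDK throws at initialisation.
     The value below is a placeholder, not a real ID. -->
<meta-data
    android:name="com.google.android.gms.ads.APPLICATION_ID"
    android:value="ca-app-pub-xxxxxxxxxxxxxxxx~yyyyyyyyyy"/>
```

Keep test ad unit IDs behind a debug-build flag so they can never reach a release binary.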
app-store-privacy-data.tracking-advertising.ad-sdk-compliance

Apple has required in-app account deletion for all apps that allow account creation since June 2022 (App Store Review Guideline 5.1.1). Apps that route users to an email address or external form for deletion are rejected on every submission, not just the first. Beyond the rejection risk, failing to provide in-app deletion violates GDPR Art.17 (right to erasure) and CCPA §1798.105 (right to deletion) — both of which require that deletion be initiatable by the user without unreasonable friction. 'Email us to delete your account' does not satisfy either regulation.
Why this severity: Critical because Apple rejects every submission that creates user accounts without in-app deletion — there is no exception or deferral, and GDPR Art.17 requires erasure be technically possible.
app-store-privacy-data.data-handling.in-app-account-deletion

GDPR Art.17 requires that deletion requests be completed — not just acknowledged — within 30 days of a verified request. CCPA §1798.105 allows 45 days. Apps that soft-delete (set a `deleted_at` timestamp) without a scheduled hard-delete job leave personal data in the database indefinitely, making every policy claiming 'data deleted upon request' a false statement. Equally critical: third-party analytics and ad SDKs retain user data independently — if your deletion flow does not call their opt-out or data-deletion APIs, that data persists in external systems even after your database is clean.
Why this severity: High because a deletion mechanism that does not complete within the regulatory timeframe creates direct GDPR Art.17 and CCPA §1798.105 liability regardless of intent.
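The gap between soft delete and hard delete can be made auditable. A minimal sketch, assuming a `deletedAt` soft-delete timestamp; the record shape and job wiring are hypothetical:

```typescript
// Hypothetical record shape: soft delete sets deletedAt; the hard
// delete must follow within the regulatory window (30 days under GDPR).
interface UserRecord {
  id: string;
  deletedAt: Date | null; // set by the in-app "Delete Account" flow
}

const HARD_DELETE_WINDOW_DAYS = 30;
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// Records whose soft-delete timestamp is older than the window are
// overdue. A scheduled job should purge them (and call third-party
// SDK deletion APIs) before this list is ever non-empty.
function overdueForHardDelete(records: UserRecord[], now: Date): UserRecord[] {
  return records.filter(
    (r) =>
      r.deletedAt !== null &&
      now.getTime() - r.deletedAt.getTime() > HARD_DELETE_WINDOW_DAYS * MS_PER_DAY,
  );
}
```

Running this check as an alert turns "we delete on request" from a policy claim into a monitored invariant.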
app-store-privacy-data.data-handling.deletion-timeframe-compliance

GDPR Art.5(1)(c) requires data minimisation — collecting only what is strictly necessary for a specified purpose. Requesting contact list access in an app with no contact-sharing feature, or logging 40+ analytics event types for an app with a narrow feature set, is over-collection that creates unnecessary legal exposure without any product benefit. Excessive analytics events are a common attack surface for data leak scenarios: each event payload is another opportunity for inadvertent PII capture. Store reviewers also flag permission requests without a demonstrable feature need, resulting in rejection.
Why this severity: Info because over-collection is a risk signal rather than an active exploit — it increases legal surface area and reviewer scrutiny without constituting a breach by itself.
app-store-privacy-data.data-handling.data-collection-minimization

An in-app 'Delete Account' button that calls a stub function or only deletes the auth record — leaving profile tables, user content, and uploaded files intact — creates GDPR Art.17 and CCPA §1798.105 liability while giving users false confidence their data is gone. The more dangerous failure: an unauthenticated deletion endpoint (OWASP A01, CWE-284, CWE-639) allows any caller to delete any user's account by guessing or enumerating user IDs — a complete access-control failure that turns a privacy feature into a weaponisable IDOR vulnerability.
Why this severity: Medium because an incomplete server-side deletion endpoint creates regulatory liability and, if unauthenticated, becomes an IDOR vulnerability enabling account destruction at scale.
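Both failure modes are avoidable at the design level. A sketch with hypothetical names: derive the deletion target from the authenticated session rather than a client-supplied ID, and enumerate every store the cascade must cover:

```typescript
// Hypothetical authorisation check: a caller may delete only their own
// account. Deriving the target from the session (never from a
// client-supplied path parameter) removes the IDOR class entirely.
function canDeleteAccount(sessionUserId: string | null, targetUserId: string): boolean {
  return sessionUserId !== null && sessionUserId === targetUserId;
}

// Hypothetical cascade list: deleting only the auth record leaves
// personal data behind in every other store.
const DELETION_CASCADE = ["auth", "profiles", "user_content", "uploads"] as const;
```

An endpoint built this way fails closed: no session, no deletion; a mismatched ID, no deletion.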
app-store-privacy-data.data-handling.server-side-deletion-api

AsyncStorage in React Native is unencrypted plaintext on every platform — its contents are readable from a jailbroken iOS device or from any app on Android versions below 4.4. Storing auth tokens, JWTs, or user PII in AsyncStorage means a single compromised device exposes session credentials that can be replayed to impersonate the user with full account access. OWASP 2021 A02 (Cryptographic Failures), CWE-312 (Cleartext Storage of Sensitive Information), HIPAA §164.312(a)(2)(iv), and NIST SP 800-218 PW.4 all require sensitive data at rest to be encrypted using platform-provided mechanisms (Keychain on iOS, Keystore on Android).
Why this severity: High because auth tokens in AsyncStorage are accessible to any process on a compromised device, enabling session hijacking and full account takeover without a network attack.
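One way to enforce this systematically is to route writes by key sensitivity. A sketch with assumed key patterns; "secure-store" stands in for Keychain/Keystore access (e.g. via a library such as react-native-keychain):

```typescript
// Hypothetical storage router: anything credential- or identity-like
// goes to the platform secure store, never to plaintext AsyncStorage.
// The pattern list is illustrative, not exhaustive.
const SENSITIVE_KEY_PATTERN = /(token|jwt|password|secret|session|ssn|email)/i;

type StorageTarget = "secure-store" | "async-storage";

function storageTargetFor(key: string): StorageTarget {
  return SENSITIVE_KEY_PATTERN.test(key) ? "secure-store" : "async-storage";
}
```

A single wrapper like this also gives the audit one place to check instead of every call site.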
app-store-privacy-data.data-handling.sensitive-data-encryption

GDPR Art.13(2)(a) requires disclosure of 'the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period.' Without a retention disclosure, every EU user your app collects data from is covered by an incomplete privacy notice — a regulatory gap that multiplies with each new user. Beyond GDPR, App Store Connect and Google Play Console both require a live, accessible privacy policy URL before you can submit — an incomplete policy causes submission failure even if it exists.
Why this severity: Low because absent retention disclosure is a regulatory gap that creates legal risk at scale but does not enable direct data exfiltration or account compromise.
app-store-privacy-data.data-handling.data-retention-disclosure

An inaccurate age rating misdirects parental controls and content filtering on both platforms. Apple and Google use age ratings to enforce parental control features: a gambling mechanic in a 4+ app bypasses every parental restriction a parent has set. COPPA §312.2 and CCPA §1798.120(c) extend protections to users under 13 — an under-rated app with data collection features triggers child-privacy obligations the developer may not have implemented. Apple rejects apps with gambling or casino mechanics rated below 17+; Google will restrict or remove apps where the rating does not match detected content.
Why this severity: Low because rating mismatches are correctable during review, but a Kids category app with gambling mechanics or ad SDKs is an escalation to COPPA violation territory.
app-store-privacy-data.children-sensitive.age-rating-accuracy

COPPA §312.5 prohibits collecting personal information from children under 13 without verifiable parental consent. FTC fines for COPPA violations reach $50,120 per violation per day — and each user interaction that collects a persistent identifier counts as a separate violation. GDPR Art.8 (GDPR-K) sets the EU threshold at 16 in most member states. Apple permanently removes apps that violate Kids category policies; reinstatement is not guaranteed. Initialising Firebase Analytics or a behavioural ad SDK in a Kids category app without child-safe configuration is a textbook COPPA violation — it is not a configuration oversight, it is an illegal act at scale.
Why this severity: High because COPPA violations carry FTC civil penalties per violation per day and result in permanent App Store removal, not just rejection.
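The child-safe configuration is a deliberate set of flags applied before any ad loads. A purely illustrative sketch; the field names echo AdMob's request configuration (`tagForChildDirectedTreatment`, `maxAdContentRating`), but the builder itself is hypothetical:

```typescript
// Hypothetical builder for the ad-request settings a Kids category app
// must apply before loading any ad. Field names echo AdMob's
// RequestConfiguration; this object is illustrative, not the SDK API.
interface AdRequestConfig {
  tagForChildDirectedTreatment: boolean;
  maxAdContentRating: "G" | "PG" | "T" | "MA";
  personalizedAdsAllowed: boolean;
}

function kidsAppAdConfig(): AdRequestConfig {
  return {
    tagForChildDirectedTreatment: true, // COPPA flag must be set
    maxAdContentRating: "G",            // child-appropriate creatives only
    personalizedAdsAllowed: false,      // no behavioural targeting of children
  };
}
```

The point is that the safe state is constructed in one place, not scattered across call sites where one missed flag re-enables targeting.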
app-store-privacy-data.children-sensitive.kids-category-coppa-gdpr

Behavioural advertising targeting minors is prohibited under COPPA §312.2, GDPR Art.8, CCPA §1798.120(c), and the UK Children's Code. These are not soft guidelines — they are enforced with fines, app removal, and in the US, FTC consent decrees. Apps that serve interest-based ads to all users without age verification treat every minor as an adult for advertising purposes. The failure is structural: if the ad SDK does not have a 'user may be a minor' code path, and the app does not pass age data to the SDK, behavioural targeting of minors is happening by default for every minor who uses the app.
Why this severity: High because serving behavioural ads to minors without age verification violates COPPA, GDPR Art.8, and CCPA §1798.120(c) simultaneously — each serving event is a potential separate violation.
app-store-privacy-data.children-sensitive.no-behavioral-ads-minors

Age gates that present an 'Are you 21?' Yes/No dialog are trivially bypassed by any child with a tap. Platform policies reject these as inadequate verification: Apple requires that alcohol, gambling, and adult content apps have a 17+ rating AND demonstrate real age verification. COPPA §312.5 and CCPA §1798.120(c) require that data collection from minors be gated on verifiable parental consent — a client-side yes/no check satisfies neither. A gambling feature with a 12+ rating and no age gate will be rejected by Apple and may trigger Play Store removal.
Why this severity: Medium because a trivially bypassable age gate does not protect minors from restricted content and fails platform review, but exploitation requires a child to actively use the app.
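A defensible gate starts from a collected date of birth rather than a toggle. A minimal sketch of the age computation, which should run server-side (stronger schemes add document or payment-card verification):

```typescript
// Age in whole years on a given date, accounting for whether the
// birthday has occurred yet that year.
function ageOn(date: Date, dob: Date): number {
  let age = date.getFullYear() - dob.getFullYear();
  const hadBirthday =
    date.getMonth() > dob.getMonth() ||
    (date.getMonth() === dob.getMonth() && date.getDate() >= dob.getDate());
  if (!hadBirthday) age -= 1;
  return age;
}

// Server-side gate: a yes/no dialog proves nothing, a computed age at
// least forces the claim to be stated as a date of birth.
function meetsMinimumAge(dob: Date, minAge: number, now: Date): boolean {
  return ageOn(now, dob) >= minAge;
}
```

Evaluating the gate on the server means a modified client cannot simply skip it.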
app-store-privacy-data.children-sensitive.age-gate-restricted-content

COPPA §312.5 and §312.7 prohibit using ad networks in children's apps that collect persistent identifiers or engage in behavioural targeting unless those networks are COPPA-certified. Using AdMob mediation in a Kids app without explicitly disabling non-certified mediated networks means you cannot guarantee which network actually served an ad — and if a non-certified network served it, you are liable. The FTC has pursued enforcement actions against app developers for exactly this configuration gap, resulting in consent decrees and civil penalties.
Why this severity: Info because the certification status of ad SDKs is an advisory finding — the hard pass/fail is covered by the COPPA/GDPR-K check — but uncertified networks in Kids apps are a direct FTC enforcement risk.
app-store-privacy-data.children-sensitive.coppa-certified-ad-sdks

Mixed-audience apps — educational apps, family apps, messaging platforms rated E for Everyone — carry COPPA and GDPR Art.8 obligations for their minor users even when the app is not in the Kids category. Without parental controls, a child's account is functionally identical to an adult's: they can access the same data collection, make the same purchases, and interact with the same social features. Parental control mechanisms are also a market differentiator: parents choose apps that give them oversight, and their absence is a reason to avoid an app for family use.
Why this severity: Info because the absence of parental controls is a risk signal and market gap rather than an active violation — the hard compliance requirements are covered by the COPPA/GDPR-K check.
app-store-privacy-data.children-sensitive.parental-controls-mixed-audiences

The EU Digital Markets Act (Regulation (EU) 2022/1925) grants EU users new rights around alternative payment processing on iOS as of 2024, creating new technical requirements and App Store review processes for apps targeting EEA users. Apps with existing GDPR consent flows also face ongoing obligations: GDPR Art.30 requires a Records of Processing Activities (RoPA), Art.37 may require a Data Protection Officer for apps processing data at scale, and Art.6 requires a documented legal basis for every processing activity. Undocumented legal bases are among the most common enforcement findings in GDPR audits.
Why this severity: Info because EU regulatory signals require awareness and documentation rather than immediate code changes, but undocumented processing activities create enforcement risk that grows with user volume.
app-store-privacy-data.risk-indicators.eu-dma-payment-rules

Run this audit in your AI coding tool (Claude Code, Cursor, Bolt, etc.) and submit results here for scoring and benchmarks.