All 22 checks with why-it-matters prose, severity, and cross-references to related audits.
Over-permissioned apps are a primary reason users uninstall or leave one-star reviews. Under GDPR Art. 5(1)(c) (data minimisation) and CCPA §1798.100, you may only collect data necessary for declared purposes — declaring a CAMERA permission when no camera feature exists is both a regulatory red flag and an app store rejection risk. Apple's permission best-practices guidelines make explicit that reviewers check for permissions without corresponding feature code. On Android, Google Play surfaces declared permissions on your store listing; users see every permission before they download. Unused permissions also expand your attack surface: a compromised SDK can silently exploit a declared permission even if your own code never calls it.
Why this severity: Critical because an unused declared permission grants third-party SDKs a silent attack vector and exposes the developer to GDPR data-minimisation violations and app store rejection without any corresponding user benefit.
mobile-permissions-privacy.permission-requests.necessary-permissions

Apple has required a Privacy Manifest (`PrivacyInfo.xcprivacy`) for all iOS app submissions since May 2024. An app without one — or with an incomplete one — is rejected at review. Beyond compliance, the manifest forces an audit of every SDK that touches protected APIs (NSUserDefaults, file timestamps, disk space, system boot time, active keyboard). GDPR Art. 13 requires informing users of the legal basis and purpose of data collection before collection begins; the Privacy Manifest is Apple's enforcement mechanism for that disclosure at the SDK layer. Missing reason codes for third-party SDKs (e.g., Firebase, Amplitude) are the most common rejection cause, because those SDKs access protected APIs on your behalf even if your own code does not.
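As a minimal sketch, a `PrivacyInfo.xcprivacy` declaring one required-reason API might look like the fragment below. The category and reason code shown (`NSPrivacyAccessedAPICategoryUserDefaults` with reason `CA92.1`, reading your own app's `NSUserDefaults`) are real values from Apple's required-reason API list; your app's actual entries depend on which protected APIs your code and your SDKs touch, and each SDK ships its own manifest for its own accesses.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App does not use data for ATT-style cross-app tracking -->
    <key>NSPrivacyTracking</key>
    <false/>
    <!-- Each protected API the app touches needs a declared reason code -->
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>
```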
Why this severity: Critical because a missing or incomplete Privacy Manifest causes immediate App Store rejection and signals unchecked SDK data access that violates GDPR Art. 13 disclosure requirements.
mobile-permissions-privacy.permission-requests.privacy-manifest-ios

Users who see an unexpected system permission dialog — with no context about why an app needs access — deny at significantly higher rates than users who receive an explanation first. iOS only allows one native permission request per permission type: if the user denies the cold prompt, the app can never ask again via the system dialog. GDPR Art. 7 requires consent to be freely given, informed, and specific; a system dialog with no preceding rationale does not satisfy the informed requirement. Google Play's permission UX guidelines similarly require that the app explain the purpose before the native prompt fires. A denied permission that could have been granted represents a permanently lost capability — the user must manually navigate to Settings to re-enable it.
Why this severity: High because a denied permission from a cold, unexplained prompt is permanent on iOS and permanently degrades the feature — the only recovery requires the user to find and change a system-level setting.
mobile-permissions-privacy.permission-requests.permission-rationale

Requesting all permissions on app launch is one of the most reliable ways to tank user retention. Apple's Human Interface Guidelines and GDPR Art. 5(1)(c) both require that data access be tied to a specific user action and purpose — not collected speculatively. When a user sees a camera permission request before they have navigated anywhere near a camera feature, they have no frame of reference for why the app needs it, so they deny. On Android, Google Play reviews flag apps that request permissions at launch without clear justification, and repeated violations result in policy strikes. Any permission requested in a root-level `useEffect(() => {}, [])` is a request that was never triggered by user intent.
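A point-of-use flow with a preceding rationale can be sketched as below. The `PermissionGateway` interface is a hypothetical seam; in a real app, `check` and `request` would be backed by a library such as react-native-permissions, and `showRationale` by your own modal. The important properties are that the function is invoked from a user action (e.g. a camera button's `onPress`), not a root-level `useEffect`, and that the one-shot native prompt only fires after the user has seen an explanation.

```typescript
// Hypothetical seam standing in for a permissions library plus a rationale UI.
type PermissionStatus = 'granted' | 'denied' | 'blocked';

interface PermissionGateway {
  check(permission: string): Promise<PermissionStatus>;
  showRationale(message: string): Promise<boolean>; // did the user tap "Continue"?
  request(permission: string): Promise<PermissionStatus>; // fires the native dialog
}

// Called from the camera button's onPress handler, so the request is tied
// to user intent rather than app launch.
async function requestCameraAtPointOfUse(
  gw: PermissionGateway,
): Promise<PermissionStatus> {
  const current = await gw.check('camera');
  if (current === 'granted') return current;
  // 'blocked' means the native dialog will never show again; the caller
  // should guide the user to Settings instead of re-requesting.
  if (current === 'blocked') return current;
  // Explain the purpose *before* the one-shot native prompt fires.
  const proceed = await gw.showRationale(
    'The camera is used to scan receipts. Nothing is stored without your approval.',
  );
  // User declined the rationale: the native prompt stays unspent.
  if (!proceed) return 'denied';
  return gw.request('camera');
}
```

Declining the in-app rationale costs nothing: the native dialog is never consumed, so the app can ask again later with better context.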
Why this severity: High because launch-time permission requests eliminate the user's ability to understand the purpose of the request, directly reducing grant rates and creating GDPR Art. 5(1)(c) data-minimisation violations.
mobile-permissions-privacy.permission-requests.point-of-use

Accessing device hardware through undocumented native paths instead of the platform's official permission API means the app bypasses the OS-level permission gate entirely. OWASP Mobile Top 10 M1 (Improper Credential Usage) includes improper access to device hardware as a credential misuse vector. CWE-272 (Least Privilege Violation) is directly triggered when an app reads from a sensor without first verifying it has been granted access. Beyond correctness, bypassing `react-native-permissions` or `expo-permissions` means your code will behave unpredictably when users revoke permissions mid-session, because the native module has no revocation-aware wrapper — the call will silently fail or crash instead of returning a denied status.
Why this severity: High because bypassing the standard permission API means the app can silently access hardware after a user revokes permission mid-session, with no graceful fallback — a direct OWASP A01 and CWE-272 violation.
mobile-permissions-privacy.permission-requests.sensitive-permissions-api

Requesting always-on background location when only foreground location is needed is one of the most common grounds for App Store rejection and a clear GDPR Art. 5(1)(c) data-minimisation violation. Apple's always authorization (declared with the `NSLocationAlwaysAndWhenInUseUsageDescription` key) triggers a two-step authorization flow that explicitly warns users their location is tracked in the background — users decline this at far higher rates than when-in-use. On Android 10+, background location requires an additional `ACCESS_BACKGROUND_LOCATION` permission that triggers a separate, more alarming system dialog. Apps requesting always-on location for features like map display or restaurant search — where when-in-use is sufficient — face Google Play policy strikes and user complaints.
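The decision rule can be made explicit in code rather than left implicit in each call site. The feature names below are illustrative; the point is that only features that genuinely operate while the app is not in the foreground justify the always/background grant, and on Android 10+ the background permission should be requested as a separate second step after the foreground grant.

```typescript
// Illustrative feature taxonomy: pick the least-privileged location
// permission per feature rather than requesting "always" globally.
type LocationFeature = 'map-display' | 'nearby-search' | 'geofenced-reminders';

function locationPermissionFor(
  feature: LocationFeature,
): 'when-in-use' | 'always' {
  // Geofencing fires while the app is closed; map display and nearby
  // search only run with the app in the foreground.
  return feature === 'geofenced-reminders' ? 'always' : 'when-in-use';
}
```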
Why this severity: High because requesting always-on location for features that only need foreground location violates GDPR Art. 5(1)(c), triggers heightened Apple and Google review scrutiny, and causes significantly higher permission denial rates.
mobile-permissions-privacy.permission-requests.location-granularity

AsyncStorage on React Native is unencrypted, stored in plaintext on the device filesystem, and accessible to any process with root access or via backup extraction on Android. Storing auth tokens, API keys, or PII in AsyncStorage violates GDPR Art. 32 (security of processing) and NIST SP 800-53 SC-28 (protection of information at rest). On a jailbroken iOS device or a rooted Android device — which a meaningful fraction of users run — the AsyncStorage database file is readable without authentication. OWASP Mobile Top 10 M2 (Insecure Data Storage) lists unencrypted AsyncStorage for tokens as a critical vulnerability. A compromised auth token allows full account takeover; a compromised API key enables billing fraud.
Why this severity: Critical because auth tokens in plaintext AsyncStorage are recoverable by any process with filesystem access on rooted/jailbroken devices, enabling complete account takeover — a direct OWASP A02 and GDPR Art. 32 violation.
mobile-permissions-privacy.privacy-compliance.no-cleartext-sensitive

Platform secure storage (iOS Keychain, Android Keystore) is backed by hardware security modules on modern devices — credentials stored there cannot be extracted even with root access because the decryption key never leaves the secure enclave. CWE-312 (Cleartext Storage of Sensitive Information) and CWE-522 (Insufficiently Protected Credentials) both apply when an app stores tokens outside these APIs. OWASP A02 and NIST SC-28 require encryption at rest for all authentication credentials. Apps without Keychain/Keystore usage have no encrypted credential store by definition — every stored secret is as secure as the device filesystem, which is not secure at all on rooted or cloned devices. This is the class of vulnerability that drives credential-stuffing campaigns after device theft.
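One way to enforce this is to route all token reads and writes through a small seam so AsyncStorage never appears near credentials. The `SecureStore` interface below is a hypothetical abstraction; in a real app it would be backed by react-native-keychain (whose `setGenericPassword` / `getGenericPassword` calls store values in the iOS Keychain or the Android Keystore) rather than a plaintext file.

```typescript
// Hypothetical seam over platform secure storage. The production
// implementation would delegate to react-native-keychain or an
// equivalent Keychain/Keystore-backed library.
interface SecureStore {
  set(key: string, value: string): Promise<void>;
  get(key: string): Promise<string | null>;
}

async function saveAuthToken(store: SecureStore, token: string): Promise<void> {
  // Never AsyncStorage.setItem('token', ...): that file is plaintext on disk.
  await store.set('auth_token', token);
}

async function loadAuthToken(store: SecureStore): Promise<string | null> {
  return store.get('auth_token');
}
```

The seam also makes the rule lintable: a grep for `AsyncStorage` in auth code should return nothing.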
Why this severity: Critical because the total absence of Keychain or Keystore usage means no credential is encrypted at rest, making stored tokens recoverable via filesystem access on any rooted or backed-up device — a direct CWE-312 and OWASP A02 violation.
mobile-permissions-privacy.privacy-compliance.keychain-keystore

Collecting analytics or behavioral data from users in the EU without a consent UI violates GDPR Art. 7 (conditions for consent) and the ePrivacy Directive Art. 5(3). In California, CCPA §1798.100 requires disclosure and opt-out rights before selling or sharing personal information. Apps that initialize analytics SDKs unconditionally — before checking persisted consent — are non-compliant regardless of whether they display a banner later. Google Play's Data Safety section and Apple's App Store privacy label are public-facing disclosures; regulators and users can compare them against actual SDK behavior. A consent UI that does not gate SDK initialization is cosmetically compliant but legally exposed: the DPA enforcement record consistently shows that the banner must actually stop data flow, not just record a preference.
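The gating requirement can be sketched as follows, with hypothetical names: the SDK's init call lives behind a function that is only reached after a persisted, affirmative consent record has been read. The Firebase/Amplitude init itself would sit behind `initAnalyticsSdk`.

```typescript
// Hypothetical consent record and dependency seam. The key property:
// the analytics SDK cannot initialize unless a previously persisted,
// affirmative consent record exists.
type ConsentRecord = { analytics: boolean; recordedAt: string } | null;

interface ConsentDeps {
  readPersistedConsent(): Promise<ConsentRecord>;
  initAnalyticsSdk(): void; // real SDK init (Firebase, Amplitude, ...) goes here
}

async function bootAnalytics(deps: ConsentDeps): Promise<boolean> {
  const consent = await deps.readPersistedConsent();
  if (consent?.analytics !== true) {
    // No record, or consent refused: SDK stays un-initialized, no data flows.
    return false;
  }
  deps.initAnalyticsSdk();
  return true;
}
```

Note the default: absence of a record is treated as refusal, so a fresh install sends nothing until the banner is answered.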
Why this severity: High because analytics SDK initialization before consent is persisted and checked is a direct GDPR Art. 7 violation, and regulators have fined companies specifically for consent UIs that do not actually stop data collection.
mobile-permissions-privacy.privacy-compliance.consent-ui

GDPR Art. 17 (right to erasure) and CCPA §1798.105 (right to delete) both create legal obligations for apps that collect personal data from users in the EU or California. Google Play Data Safety now explicitly asks whether users can request data deletion, and apps that answer 'yes' in the Data Safety form but provide no mechanism are flagged by reviewers. An 'account deletion' that only signs the user out — without deleting server-side records — is not deletion: it is de-authentication. If a user files a GDPR deletion request and your app's only mechanism is sign-out, you are non-compliant. The backend deletion must remove or anonymize all personal data: profile, activity logs, uploaded content, and third-party data processor records.
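The distinction between deletion and de-authentication can be captured in the call order, sketched here with hypothetical names: server-side erasure happens first and must succeed before the local sign-out, so a network failure cannot leave the user believing their data is gone.

```typescript
// Hypothetical seam: server-side erasure plus local sign-out.
interface DeletionDeps {
  // e.g. a backend endpoint that erases profile, activity logs, uploaded
  // content, and notifies third-party data processors.
  deleteServerSideData(): Promise<void>;
  // Clear tokens and cached state on-device.
  signOutLocally(): Promise<void>;
}

async function deleteAccount(deps: DeletionDeps): Promise<void> {
  // If this rejects, the caller must surface the failure instead of
  // pretending deletion succeeded: sign-out alone is de-authentication.
  await deps.deleteServerSideData();
  await deps.signOutLocally();
}
```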
Why this severity: High because lacking a data deletion mechanism creates direct legal liability under GDPR Art. 17 and CCPA §1798.105, and Google Play Data Safety discrepancies can result in listing removal.
mobile-permissions-privacy.privacy-compliance.data-deletion

GDPR Art. 13, CCPA §1798.100, and COPPA §312.4 all require that a privacy policy be provided to users before or at the time of data collection. Apple's App Store Review Guidelines (5.1.1) mandate a privacy policy URL in app metadata for any app that collects user or device data — apps without one are rejected at review. Google Play similarly requires a privacy policy link in the store listing for apps with any data collection. A placeholder URL (e.g., `https://example.com/privacy`) satisfies neither the legal disclosure requirement nor the app store technical requirement — reviewers check that the URL resolves to actual policy content. COPPA additionally requires specific disclosures when an app may be used by children under 13.
Why this severity: High because a missing privacy policy URL causes App Store and Google Play rejection at submission, and its absence during data collection creates direct GDPR Art. 13 and CCPA §1798.100 violations.
mobile-permissions-privacy.privacy-compliance.privacy-policy-linked

Crash reporting SDKs by default capture the full application state at the time of a crash — including Redux stores, navigation state, and any API response bodies that were in memory. If your Redux state holds an auth token or your last API response includes user PII, that data ships to Sentry, Crashlytics, or Bugsnag's servers without any user consent or awareness. GDPR Art. 32 requires appropriate technical measures to protect personal data during processing — sending PII to a third-party crash analytics service without scrubbing is a data transfer that must be disclosed and minimized. CWE-532 (Insertion of Sensitive Information Into Log File) applies directly. OWASP A09 (Security Logging and Monitoring Failures) includes over-logging as a failure mode.
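A recursive scrubber wired into the crash SDK's pre-send hook (Sentry's `beforeSend` option is one real example of such a hook) is a common mitigation. The key list below is illustrative, not exhaustive; audit your own state shape for what actually needs redaction.

```typescript
// Illustrative deny-list of key fragments to redact before a crash
// event leaves the device.
const SENSITIVE_KEYS = ['token', 'authorization', 'password', 'email', 'ssn'];

// Walk the event payload and replace values under sensitive keys.
function scrub(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(scrub);
  if (value !== null && typeof value === 'object') {
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value)) {
      out[k] = SENSITIVE_KEYS.some((s) => k.toLowerCase().includes(s))
        ? '[REDACTED]'
        : scrub(v);
    }
    return out;
  }
  return value;
}
```

With Sentry, this would be registered as `Sentry.init({ beforeSend: (event) => scrub(event) as typeof event })`; other crash SDKs offer comparable hooks.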
Why this severity: Medium because PII in crash logs requires authenticated access to the crash analytics dashboard to exploit, but represents a GDPR Art. 32 violation and creates data processor obligations with the crash reporting vendor.
mobile-permissions-privacy.privacy-compliance.crash-logs-sanitized

GDPR Art. 28 requires a Data Processing Agreement with every third-party vendor that processes personal data on your behalf. Apple's Privacy Manifest requirement (since May 2024) means that any third-party SDK accessing protected APIs must be declared — if the SDK is not in your iOS privacy manifest, Apple rejects the submission. Google Play Data Safety requires disclosure of every SDK that collects or shares data, including the data types and purposes. An SDK that is configured to collect aggressively (e.g., full session replay, uncapped event logging) without a user-facing opt-out mechanism can create CCPA violations independently of your own app's consent UI — the SDK acts on your behalf, and its data collection is attributed to you.
Why this severity: Medium because an undisclosed data-handling SDK creates Google Play Data Safety inaccuracies and GDPR Art. 28 violations, but exploitation requires the SDK to actually transmit data — not merely be present in dependencies.
mobile-permissions-privacy.privacy-compliance.sdk-privacy

GDPR Art. 13 requires that data subjects be informed of the categories of data collected and the purposes of processing at the time of collection — not buried in a long-form privacy policy users will never read. CCPA §1798.100 similarly requires 'notice at collection' that discloses data types before or at the point where data is collected. An app that collects location data and sends it to a backend without ever explaining this to the user in context — even if the privacy policy mentions it — fails the informed-at-collection requirement. Google Play Data Safety's prominence means that mismatches between declared data collection and actual SDK behavior are publicly visible to users before they install.
Why this severity: Medium because undisclosed data collection violates GDPR Art. 13 notice requirements and CCPA §1798.100 notice-at-collection, but the immediate data-breach risk is lower than cleartext storage or missing consent gating.
mobile-permissions-privacy.data-handling.collection-disclosure

iOS 14 introduced limited photo library access: users can grant access to a specific selection of photos rather than their entire library. An app that requests full photo library access for a feature that only needs one photo — such as a profile picture upload — forces users to choose between granting access to their entire camera roll or denying the feature entirely. This is a GDPR Art. 5(1)(c) data-minimisation violation. Apple's App Store reviewers check whether apps use the full library permission when limited access would suffice; requesting more than necessary is grounds for a guideline violation. CWE-272 (Least Privilege Violation) applies when an app acquires more access than its operation requires.
Why this severity: Low because the over-permission is scoped to photo access rather than more sensitive data types, but it remains a GDPR Art. 5(1)(c) data-minimisation violation and a common App Store Review guideline failure.
mobile-permissions-privacy.data-handling.photo-library-limited

GDPR Art. 7(3) grants users the right to withdraw consent at any time, and withdrawal must be as easy as giving consent. CCPA §1798.120 grants California consumers the right to opt out of the sale or sharing of personal information. An analytics opt-out toggle that exists in Settings but does not actually stop data collection is deceptive — and regulators have specifically targeted this pattern, where consent signals are recorded but not enforced. The ePrivacy Directive Art. 5(3) further requires that tracking cookies or equivalent tracking mechanisms require prior consent in EU member states. An analytics SDK that runs regardless of the user's stored preference treats user choice as cosmetic.
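A working opt-out does two things in one handler, sketched here with hypothetical names: it persists the preference so it survives restarts, and it flips the SDK's runtime collection switch so withdrawal takes effect in the current session. Recording the choice without enforcing it is exactly the pattern described above.

```typescript
// Hypothetical seam: persisted preference plus the SDK's runtime
// collection switch (most analytics SDKs expose some enable/disable call).
interface OptOutDeps {
  persistPreference(enabled: boolean): Promise<void>;
  setSdkCollectionEnabled(enabled: boolean): void;
}

async function setAnalyticsEnabled(
  deps: OptOutDeps,
  enabled: boolean,
): Promise<void> {
  await deps.persistPreference(enabled); // survives restarts
  deps.setSdkCollectionEnabled(enabled); // takes effect this session
}
```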
Why this severity: Low because the user retains visibility of the broken opt-out, and enforcement requires regulators to test the feature — but a non-functional opt-out is a direct GDPR Art. 7(3) and CCPA §1798.120 violation.
mobile-permissions-privacy.data-handling.analytics-opt-out

Biometric data (Face ID scans, fingerprint templates) is classified as sensitive biometric data under GDPR Art. 9 and is subject to heightened processing restrictions. The platform's official `LocalAuthentication` API (iOS) and `BiometricPrompt` (Android) are specifically designed so that biometric data never leaves the secure enclave — they return only a boolean pass/fail. Custom biometric implementations that attempt to capture or process biometric data directly violate GDPR Art. 9 and OWASP A07 (Identification and Authentication Failures). CWE-287 (Improper Authentication) applies when biometric auth lacks proper fallback to passcode, leaving users locked out on unenrolled devices. An implementation without hardware-availability and enrollment checks crashes on devices without biometric sensors.
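The pre-flight checks can be sketched as below. The `BiometricApi` interface is a stub with roughly the shape that expo-local-authentication exposes (`hasHardwareAsync`, `isEnrolledAsync`, `authenticateAsync`); note that only a boolean pass/fail ever crosses the boundary, never biometric data itself.

```typescript
// Stub with roughly the shape of the platform biometric wrappers.
interface BiometricApi {
  hasHardware(): Promise<boolean>;
  isEnrolled(): Promise<boolean>;
  authenticate(): Promise<boolean>; // pass/fail only; no biometric data crosses this
}

type AuthOutcome = 'biometric-ok' | 'fallback-to-passcode' | 'biometric-failed';

async function authenticateUser(api: BiometricApi): Promise<AuthOutcome> {
  // Devices without a sensor, or without enrollment, go straight to the
  // passcode path instead of crashing or locking the user out.
  if (!(await api.hasHardware()) || !(await api.isEnrolled())) {
    return 'fallback-to-passcode';
  }
  return (await api.authenticate()) ? 'biometric-ok' : 'biometric-failed';
}
```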
Why this severity: Medium because a missing LocalAuthentication API creates crash risks on devices without biometric sensors and, for custom implementations, potential GDPR Art. 9 violations around biometric data processing.
mobile-permissions-privacy.data-handling.biometric-auth-api

Apple's App Tracking Transparency (ATT) framework, introduced in iOS 14.5, requires explicit user authorization before an app can access the Identifier for Advertisers (IDFA) and use it for cross-app tracking. GDPR Art. 7(3) and CCPA §1798.120 extend this right to reset or limit tracking to all users in regulated regions. An app that displays ads but provides no mechanism for users to reset their advertising ID or access ad preference settings treats the user as a passive tracking target with no control. Google Play Data Safety requires disclosure of advertising ID usage; apps that collect it without providing reset capability face Data Safety accuracy flags. Apple ATT compliance is enforced at the SDK layer — ad SDKs that bypass ATT face App Store removal.
Why this severity: Low because the failure is a missing user control rather than active data exfiltration, but Apple ATT non-compliance and GDPR Art. 7(3) right-to-withdraw violations can result in ad SDK termination and app removal.
mobile-permissions-privacy.data-handling.ad-id-reset

A permission denial is a user's explicit choice to restrict access — an app that crashes in response to that choice punishes the user for exercising their privacy preference. This is both a UX failure and a reliability issue captured by CWE-391 (Unchecked Error Condition) and ISO 25010 reliability fault-tolerance requirements. On iOS, once a user denies a permission, the native dialog never appears again — the app must handle the BLOCKED state and guide the user to Settings. An app that shows a generic error message with no Settings deep-link leaves the user with no path forward. The pattern of crashing on denial also causes 1-star reviews specifically citing privacy concerns, compounding the trust damage.
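Denial handling can be sketched as a small dispatcher over the three states. The UI seam is hypothetical; in React Native, the real Settings deep-link behind `openSettings` would be `Linking.openSettings()`. The BLOCKED branch is the critical one, since the native dialog will never reappear and Settings is the only recovery path.

```typescript
// Hypothetical UI seam; openSettings would call Linking.openSettings()
// in a real React Native app.
type Status = 'granted' | 'denied' | 'blocked';

interface DenialUi {
  showInlineFallback(msg: string): void; // degrade the feature, don't crash
  offerSettingsLink(open: () => void): void;
  openSettings(): void;
}

function handlePermissionStatus(status: Status, ui: DenialUi): void {
  switch (status) {
    case 'granted':
      return; // proceed with the feature
    case 'denied':
      // The user said no this session; keep the feature degraded but usable.
      ui.showInlineFallback('Camera is off. You can enable it whenever you need it.');
      return;
    case 'blocked':
      // A generic error dead-ends the user; the deep-link gives a path forward.
      ui.offerSettingsLink(() => ui.openSettings());
      return;
  }
}
```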
Why this severity: Medium because permission-denial crashes create immediate user-facing failures and, for users exercising GDPR consent withdrawal, punish a legally protected user action with an application crash.
mobile-permissions-privacy.graceful-degradation.permission-denial-handling

An unhandled promise rejection from a permission-gated API call causes a visible red screen in development and a silent crash in production — neither outcome is acceptable. CWE-391 (Unchecked Error Condition) and CWE-755 (Improper Handling of Exceptional Conditions) directly describe this pattern. Apps that call `Geolocation.getCurrentPosition()`, `Contacts.getAll()`, or `Camera.takePicture()` without a preceding permission check assume the permission is permanently granted — but users can revoke permissions mid-session in iOS and Android Settings. The revocation takes effect immediately: the next call to the native API will fail with an authorization error, and if the call is not wrapped in a try-catch, the app crashes. This is the leading cause of crash loops reported after users update their privacy settings.
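The defensive pattern is check-then-call, with the call itself still wrapped, since a revocation can land between the check and the native call. The parameter names below are hypothetical stand-ins for a permissions-library check and a native call such as `Geolocation.getCurrentPosition()`.

```typescript
// Hypothetical stand-ins for a permission check and a native location call.
type PermStatus = 'granted' | 'denied' | 'blocked';

async function getPositionSafely(
  checkPermission: () => Promise<PermStatus>,
  getCurrentPosition: () => Promise<{ lat: number; lon: number }>,
): Promise<{ lat: number; lon: number } | null> {
  // Re-check authorization immediately before use; a grant at install
  // time says nothing about the current session.
  if ((await checkPermission()) !== 'granted') return null;
  try {
    return await getCurrentPosition();
  } catch {
    // Revocation can race the check; degrade to null, never crash.
    return null;
  }
}
```

Callers treat `null` as "feature unavailable" and render the degraded state, so a revoked permission produces a disabled feature rather than a crash loop.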
Why this severity: Medium because a crash on permission denial is reproducible by any user who revokes a permission mid-session — a common action — causing complete feature unavailability and crash reports that surface in App Store reviews.
mobile-permissions-privacy.graceful-degradation.no-crash-on-denial

Assuming a permission granted at app install remains in effect for the lifetime of the app is incorrect on both iOS and Android. Users can revoke permissions at any time in system Settings — the next API call after revocation will fail with an authorization error. OWASP A01 (Broken Access Control) includes improper assumption of access grants as a sub-class. CWE-272 (Least Privilege Violation) applies when code accesses hardware without verifying current authorization state. Beyond correctness, skip-the-check patterns are fragile across OS upgrades: iOS 15 introduced 'last week you used' notifications that prompt users to reconsider permissions, and Android 11 introduced auto-reset for apps unused for months — both of which silently revoke permissions that code was assuming were permanent.
Why this severity: Low because the failure mode — accessing a revoked permission — produces a permission-denied error rather than data exposure, but uncaught errors cascade into crashes that are the top user complaint for permission-gated features.
mobile-permissions-privacy.graceful-degradation.permission-status-check

Apple Privacy Nutrition Labels (introduced in iOS 14) and Google Play Data Safety (required since 2022) are public-facing disclosures that users read before installing. A privacy label that lists a permission the app does not use — or omits a permission the app does use — is a misrepresentation that App Store and Play Store reviewers actively check. Under GDPR Art. 13, the information provided to users must be accurate and complete at the time of collection; a label that claims the app does not access location while the app requests location permission is a false disclosure. Google Play explicitly states that apps with inaccurate Data Safety forms may have their updates rejected or listings suspended. Phantom permissions (declared but unused) also appear on the store listing and reduce install conversion.
Why this severity: Low because the privacy label is a disclosure rather than a data collection mechanism, but inaccurate labels are grounds for Play Store suspension and GDPR Art. 13 false-disclosure findings.
mobile-permissions-privacy.graceful-degradation.app-store-privacy-label

Run this audit in your AI coding tool (Claude Code, Cursor, Bolt, etc.) and submit results here for scoring and benchmarks.
Open Mobile Permissions & Privacy Audit