A stray robots: { index: false } in app/layout.tsx or a middleware-injected X-Robots-Tag: noindex header can deindex the entire site overnight without surfacing any error at deploy time; the most common cause is a staging guard that was never environment-gated. Once pages drop from the index, recovery takes weeks even after the tag is removed, and organic revenue vanishes in the meantime.
Severity is medium: an unintended noindex can deindex the site, but it is detectable and reversible within a recrawl cycle.
Gate every noindex directive on process.env.VERCEL_ENV !== 'production' and keep the guard co-located with the directive. In app/layout.tsx:
export const metadata = {
  robots:
    process.env.VERCEL_ENV === 'production'
      ? undefined
      : { index: false, follow: false },
}
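The same gate applies to the middleware-injected X-Robots-Tag header mentioned above. A minimal sketch of the decision logic, factored into a plain function so it can be unit-tested; the helper name robotsHeaderValue and the reliance on VERCEL_ENV are assumptions, not part of the rule:

```typescript
// Hypothetical helper: compute the X-Robots-Tag value for middleware to set.
// Returns undefined in production so no header is injected at all.
function robotsHeaderValue(vercelEnv: string | undefined): string | undefined {
  return vercelEnv === 'production' ? undefined : 'noindex, nofollow';
}
```

In middleware you would call this with process.env.VERCEL_ENV and only set the header when the result is defined, keeping the guard co-located with the directive.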
Add a CI check that greps for unguarded noindex strings before merge.
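Such a CI check can be sketched as a small Node helper that flags files containing a noindex directive with no environment guard anywhere in the same file. This is a deliberately coarse file-level heuristic, not a definitive implementation; the patterns and the VERCEL_ENV/NODE_ENV guard names are assumptions to adapt to your codebase:

```typescript
// Hypothetical CI guard: detect noindex directives that are not accompanied
// by an environment check. Matching is file-level and text-based (coarse but
// cheap); wire it up to your repo's file walker and fail the build on hits.
const NOINDEX_PATTERNS: RegExp[] = [
  /index:\s*false/,                          // robots: { index: false }
  /content=["']noindex/,                     // <meta name="robots" content="noindex">
  /X-Robots-Tag["']?\s*[,:]\s*["']noindex/,  // header injection in middleware
];

const ENV_GUARD = /(VERCEL_ENV|NODE_ENV)/;   // assumed guard variable names

// Returns true when the source contains a noindex directive but no
// environment guard anywhere in the file.
function hasUnguardedNoindex(source: string): boolean {
  const hasNoindex = NOINDEX_PATTERNS.some((pattern) => pattern.test(source));
  return hasNoindex && !ENV_GUARD.test(source);
}
```

A file-level match tolerates false negatives (a guard elsewhere in the file that does not actually wrap the directive), so treat it as a tripwire rather than a proof of safety.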
Rule: seo-advanced.crawlability.no-unintended-noindex
Severity: medium
Scan for: robots: { index: false }, <meta name="robots" content="noindex">, and X-Robots-Tag: noindex across every page component and middleware. For each occurrence, check whether it is guarded by an environment check (e.g., process.env.NODE_ENV !== 'production'). Enumerate any noindex directives on production public pages.
Fail when: noindex directives exist on production public pages.
Pass when: any noindex usage is guarded by environment checks or applies only to legitimately non-indexable pages (admin, auth). Report: "X noindex directives found; 0 affect production public pages."
Example failure: a noindex directive on a production public page without an environment guard.
Edge case: a noindex directive in a shared layout component (e.g., app/layout.tsx) applies to all pages, even if individual pages don't explicitly set it.
Example finding: "Blog listing page in app/blog/page.tsx has robots: { index: false } without environment check".
Fix: in app/blog/page.tsx or the affected page component, make the noindex conditional on environment.