Client-only rendering of product descriptions, article bodies, and pricing leaves Googlebot with empty shells — the rendering budget times out before your JavaScript finishes, so the page gets indexed with zero substantive content. User-agent-based cloaking goes further and earns manual action penalties under Google's Spam Policies, which can deindex the entire domain. Either failure mode silently collapses organic traffic on your highest-value commerce and content routes.
Critical because cloaking triggers manual penalties that deindex entire domains and tank organic revenue.
Move critical content into Server Components or static generation so it exists in the HTML source before hydration. Never branch rendering on req.headers['user-agent']. Use app/products/[slug]/page.tsx as a server component:
export default async function ProductPage({ params }: { params: { slug: string } }) {
  const product = await fetchProduct(params.slug)
  return <article><h1>{product.name}</h1><p>{product.description}</p></article>
}
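One quick way to verify the result: fetch the route's raw HTML (no JavaScript execution) and confirm the critical text appears outside of script payloads. This is a minimal sketch; rawHtmlContains is a hypothetical helper, and stripping script bodies first avoids false positives from JSON hydration data embedded in the page.

```typescript
// Check whether a phrase appears in server-rendered markup.
// Script bodies are stripped first so data embedded for client-side
// hydration doesn't count as crawlable content.
function rawHtmlContains(html: string, phrase: string): boolean {
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, "");
  return withoutScripts.includes(phrase);
}
```

For example, `rawHtmlContains(await (await fetch(url)).text(), product.description)` should be true for a correctly server-rendered product page.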
ID: seo-advanced.crawlability.no-cloaking
Severity: critical
What to look for: Enumerate all pages that render critical content (product descriptions, article text, pricing information). For each page, check whether the content is present in the raw HTML source before JavaScript execution. Count all instances of user-agent detection patterns (e.g., navigator.userAgent, req.headers['user-agent'] comparisons, bot-detection middleware) and any conditional rendering that serves different content to bots vs. users.
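The counting step above can be sketched as a simple source scanner. The pattern list is illustrative, not exhaustive — real bot-detection middleware may hide behind helper libraries that plain regexes won't catch:

```typescript
// Count user-agent sniffing patterns in a source file's text.
const CLOAKING_PATTERNS: RegExp[] = [
  /navigator\.userAgent/g,
  /req\.headers\[['"]user-agent['"]\]/g,
  /headers\(\)\.get\(['"]user-agent['"]\)/g, // Next.js App Router style
];

function countCloakingPatterns(source: string): number {
  return CLOAKING_PATTERNS.reduce(
    (total, pattern) => total + (source.match(pattern) ?? []).length,
    0,
  );
}
```

A nonzero count is a flag for manual review, not automatic proof of cloaking — user-agent checks for analytics or logging are legitimate; only ones that change rendered content fail this rule.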
Pass criteria: At least 90% of critical pages serve their content in the initial server-rendered HTML (via SSR or SSG), so Googlebot can index it without executing JavaScript. Zero user-agent detection patterns that serve different content to bots and users. Report: "X of Y critical pages have server-rendered content; 0 cloaking patterns found."
Fail criteria: More than 10% of critical content is only rendered client-side with JavaScript that Googlebot may not execute, or at least 1 user-agent-based cloaking pattern detected.
Do NOT pass when: Content appears server-rendered but uses display:none or visibility:hidden CSS that hides it from users while showing it to crawlers — this is reverse cloaking.
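The reverse-cloaking check can be approximated by scanning the server-rendered markup for inline styles that hide content from users. This is a heuristic sketch only — a thorough audit must also inspect external stylesheets and computed styles, which a regex cannot see:

```typescript
// Flag markup whose inline style hides text from users while leaving
// it visible to crawlers parsing the raw HTML source.
const HIDDEN_STYLE = /style\s*=\s*["'][^"']*(display\s*:\s*none|visibility\s*:\s*hidden)/i;

function hasInlineHiddenContent(html: string): boolean {
  return HIDDEN_STYLE.test(html);
}
```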
Skip (N/A) when: The project is a fully static site (no JavaScript-heavy content) generated at build time.
Cross-reference: For JavaScript bundle optimization that affects crawler rendering, the Performance Deep Dive audit covers bundle analysis in detail.
Detail on fail: "Product description rendered only after JavaScript loads — not present in HTML source for Googlebot" or "1 user-agent detection pattern in middleware serves minimal content to search engines".
Remediation: Googlebot can execute JavaScript, but rendering is queued and not guaranteed to complete — and serving bots different content outright violates search engine spam policies. Render critical content server-side in app/products/[slug]/page.tsx:
// Server Component — content is in the HTML source before hydration
export default function ProductPage() {
  return (
    <div>
      <h1>Product Name</h1>
      <p>This description is in the HTML source.</p>
    </div>
  )
}