AI crawlers from OpenAI, Anthropic, Google, and Perplexity overwhelmingly fetch raw HTML without executing JavaScript. A single-page app that renders its content in useEffect presents them with an empty <div id="root"></div> and nothing to index or cite. The same empty shell also hurts search rankings and first-paint metrics, compounding the visibility loss across SEO and GEO surfaces alike.
High: client-only rendering makes primary content completely invisible to non-JS-executing crawlers on every affected page.
Move primary content into server components and reserve 'use client' for interactive widgets only. In Next.js App Router, server components are the default — keep headings, body copy, and product descriptions in them, then wrap only buttons, forms, or modals in client components. Edit src/app/page.tsx and any route rendering marketing content.
export default function Page() {
  return <main><h1>Your Product</h1><p>Description here...</p></main>
}
ID: geo-readiness.ai-crawler-access.content-server-rendered
Severity: high
What to look for: Determine whether main page content (headings, body text, product descriptions) is present in the initial HTML response. Count all page-level route files (e.g., page.tsx, page.jsx). For Next.js App Router: server components are SSR by default. Check whether page-level components that contain primary content have 'use client' directives AND fetch/render their content entirely client-side. For other frameworks: check if content is in static HTML files, server-rendered templates, or only appears after JavaScript execution. Count how many content pages are server-rendered vs. client-only.
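Detecting the 'use client' directive can be done mechanically when counting client-only page files. A minimal sketch (the helper name is an assumption, not part of any existing tool):

```typescript
// Hypothetical helper: a Next.js App Router page opts into client
// rendering when 'use client' is the first statement in the module.
function isClientOnlyPage(source: string): boolean {
  const head = source.trimStart();
  return head.startsWith("'use client'") || head.startsWith('"use client"');
}
```

Apply it to each page.tsx/page.jsx source to produce the server-rendered vs. client-only counts; a directive alone is not enough to fail, so the surrounding check must still confirm the content itself is fetched client-side.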
Pass criteria: Count all public content page routes. At least 90% of public content pages must have primary content (headings, body text, meta tags) present in the server-rendered HTML. For Next.js App Router: pages using server components (default) pass. Pages with 'use client' that contain only interactive UI elements (forms, modals) while the main content is in server components also pass. Do NOT pass when the homepage or any primary marketing page renders its main content entirely client-side, even if other pages are server-rendered.
Fail criteria: Primary content on at least 1 key page (homepage, marketing, blog, docs) is only rendered client-side — the initial HTML response is an empty shell (<div id="root"></div>) that requires JavaScript to populate. AI crawlers typically do not execute JavaScript. Report: "X of Y content pages are client-only rendered".
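The empty-shell condition can be approximated by stripping scripts and tags from the raw HTML response and checking whether any visible text remains. A rough sketch, assuming regex-based extraction is acceptable for this heuristic:

```typescript
// Returns true when the raw HTML body contains no visible text,
// i.e. only a mount point div and script tags. Regex parsing is a
// heuristic approximation, not a full HTML parser.
function looksLikeEmptyShell(html: string): boolean {
  const body = html.match(/<body[^>]*>([\s\S]*)<\/body>/i)?.[1] ?? html;
  const withoutScripts = body.replace(/<script[\s\S]*?<\/script>/gi, "");
  const visibleText = withoutScripts.replace(/<[^>]+>/g, "").trim();
  return visibleText.length === 0;
}
```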
Skip (N/A) when: Project uses a static site generator (Astro, Hugo, Jekyll, Eleventy) where all content is pre-rendered by definition.
Detail on fail: "Homepage content is entirely client-rendered — initial HTML contains only a mount point div. 0 of 1 key pages have SSR content. AI crawlers will see no content." or "3 of 5 blog pages use 'use client' and fetch all content via useEffect — no SSR fallback"
Remediation: AI crawlers typically fetch the raw HTML response without executing JavaScript. Move content rendering to the server:
In Next.js App Router, use server components (the default) for content:
// This is a server component by default — content is in the HTML
export default function Page() {
  return <main><h1>Your Product</h1><p>Description here...</p></main>
}
If you need interactivity, keep the content in a server component and wrap only the interactive parts in 'use client' components.
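The server-plus-island split described above can be sketched as follows; BuyButton and its file path are hypothetical names chosen for illustration:

```typescript
// app/page.tsx: server component by default, so headings and copy
// land in the initial HTML response.
// BuyButton is a hypothetical client component holding the only interactivity.
import BuyButton from "./BuyButton";

export default function Page() {
  return (
    <main>
      <h1>Your Product</h1>
      <p>Description here...</p>
      <BuyButton /> {/* only this subtree hydrates on the client */}
    </main>
  );
}

// app/BuyButton.tsx would contain the client boundary, e.g.:
// 'use client';
// export default function BuyButton() {
//   return <button onClick={() => console.log("clicked")}>Buy</button>;
// }
```

This keeps crawl-visible content server-rendered while shipping client JavaScript only for the interactive island.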