Search engines discover URLs by crawling links, which works for small sites but leaves entire sections of larger sites — blog archives behind pagination, product detail pages behind filter UIs, dynamically routed pages with no inbound links — effectively invisible until the crawler happens across them weeks or months later. A sitemap hands Google, Bing, and downstream AI crawlers (Perplexity, ChatGPT, Claude) an explicit list of canonical URLs with lastModified timestamps, which is the single fastest way to get new content indexed and to signal when existing content has been updated. AI coding tools skip sitemap generation because it is a convention, not a default — Next.js's app/sitemap.ts file is purely opt-in, and scaffolds never include one. The failure case this catches is the empty sitemap: a file that exists but contains only the homepage, which is worse than missing because search engines then treat every other URL as less important than the one listed.
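For reference, the explicit list of canonical URLs and lastModified timestamps described above takes this shape on the wire, following the sitemaps.org protocol (URLs here are illustrative placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/hello-world</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

A healthy sitemap lists every canonical URL, not just the homepage; `<lastmod>` is what lets crawlers re-fetch only the pages that changed.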
Severity is low because Google can still discover most linked pages via crawling, but sitemaps measurably accelerate indexing of dynamic routes, and a sitemap is a trivial two-file addition in every mainstream framework.
For Next.js App Router, create app/sitemap.ts:

import type { MetadataRoute } from 'next'

export default function sitemap(): MetadataRoute.Sitemap {
  return [{ url: 'https://example.com', lastModified: new Date(), priority: 1 }]
}
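The static entry above only covers the homepage, which is exactly the empty-sitemap failure case. A sketch of extending it to dynamic routes follows; `getAllPosts` is a hypothetical data helper standing in for your CMS or database query, and the `SitemapEntry` type is a local stand-in for Next.js's `MetadataRoute.Sitemap` so the example is self-contained:

```typescript
// Minimal local stand-in for Next.js's MetadataRoute.Sitemap entry shape.
type SitemapEntry = { url: string; lastModified: Date; priority?: number }

// Hypothetical data helper — replace with your real CMS/database query.
async function getAllPosts(): Promise<{ slug: string; updatedAt: Date }[]> {
  return [{ slug: 'hello-world', updatedAt: new Date('2024-01-10') }]
}

export default async function sitemap(): Promise<SitemapEntry[]> {
  const base = 'https://example.com'
  const posts = await getAllPosts()
  return [
    // Static pages first, with the homepage at top priority.
    { url: base, lastModified: new Date(), priority: 1 },
    // One entry per dynamic route, carrying its real update timestamp.
    ...posts.map((p) => ({
      url: `${base}/blog/${p.slug}`,
      lastModified: p.updatedAt,
    })),
  ]
}
```

Because the function is async, Next.js awaits it at build or request time, so the sitemap stays in sync with content that has no inbound links.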
Deeper remediation guidance and cross-reference coverage for this check live in the seo-fundamentals Pro audit; run that after applying this fix for a more exhaustive pass on the same topic.
Check id: project-snapshot.seo.has-sitemap
Severity: low
Detects: app/sitemap.ts, app/sitemap.xml, pages/sitemap.xml.tsx, public/sitemap.xml, or framework plugins (next-sitemap, @astrojs/sitemap, vite-plugin-sitemap)
Pass message: "Sitemap source: {file or plugin}; URLs included: N (estimated)."
Fail message: "No sitemap.ts, sitemap.xml, or sitemap plugin detected."