All 22 checks with why-it-matters prose, severity, and cross-references to related audits.
Search engines use the title tag as the primary signal for what a page is about and display it as the clickable headline in results. Pages without a title tag, or pages that all share one identical title, look interchangeable to Google and compete with each other for the same queries. The result is lost organic traffic, cannibalized rankings across your own pages, and poor click-through rates on the few results that do surface.
Why this severity: Critical because missing or duplicate titles directly prevent indexing and kill organic discovery across the entire site.
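A minimal sketch of this check in TypeScript. The `Page` shape and function names are illustrative, not part of any audit tooling; the regex extraction is deliberately simple and a real crawler would parse the DOM.

```typescript
// Sketch: flag pages whose <title> is missing or shared with another page.
// `Page`, `extractTitle`, and `auditTitles` are illustrative names.
interface Page {
  url: string;
  html: string;
}

function extractTitle(html: string): string | null {
  const match = html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
  return match ? match[1].trim() : null;
}

function auditTitles(pages: Page[]): { missing: string[]; duplicated: Map<string, string[]> } {
  const missing: string[] = [];
  const byTitle = new Map<string, string[]>();
  for (const page of pages) {
    const title = extractTitle(page.html);
    if (!title) {
      missing.push(page.url);
      continue;
    }
    byTitle.set(title, [...(byTitle.get(title) ?? []), page.url]);
  }
  // Keep only titles shared by two or more pages.
  const duplicated = new Map([...byTitle].filter(([, urls]) => urls.length > 1));
  return { missing, duplicated };
}
```

Grouping by title text is what surfaces the "every page is called Home" failure mode described above.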
seo-fundamentals.meta-tags.title-present

When a page has no meta description, Google synthesizes a snippet from random on-page text — often a navigation fragment, a cookie banner, or a stray sentence that destroys click-through rate. A hand-written description controls what prospects read below your title in results, drives qualified clicks, and becomes the default preview when the page is shared without Open Graph tags. Missing descriptions cede that 160-character sales pitch to whatever the crawler grabs first.
Why this severity: High because a missing description silently hands copywriting control to an algorithm and depresses click-through on every query.
seo-fundamentals.meta-tags.meta-description

Missing or misconfigured viewport metadata causes mobile browsers to render your page at desktop width, then shrink it — text becomes unreadable without zooming, tap targets collapse, and Google's mobile-first index penalizes the page in rankings. Worse, setting `user-scalable=no` with `maximum-scale=1` violates WCAG 2.2 SC 1.4.4 (Resize Text), which requires users to be able to scale text to 200% without loss of content. Sites with pinch-to-zoom disabled routinely fail WCAG audits and see elevated bounce rates from mobile visitors.
Why this severity: High because a missing or broken viewport tag directly degrades mobile usability and triggers ranking demotion in Google's mobile-first index, harming organic traffic at scale.
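The zoom-blocking directives called out above can be caught by parsing the viewport meta tag's `content` value. A sketch, with illustrative function names; the checks are heuristics, not the full spec:

```typescript
// Sketch: parse a viewport meta `content` value and flag zoom-blocking settings.
// `parseViewport` and `viewportProblems` are illustrative names.
function parseViewport(content: string): Record<string, string> {
  const params: Record<string, string> = {};
  for (const pair of content.split(",")) {
    const [key, value] = pair.split("=").map((s) => s.trim());
    if (key && value !== undefined) params[key.toLowerCase()] = value.toLowerCase();
  }
  return params;
}

function viewportProblems(content: string): string[] {
  const p = parseViewport(content);
  const problems: string[] = [];
  if (p["width"] !== "device-width") problems.push("width should be device-width");
  if (p["user-scalable"] === "no" || p["user-scalable"] === "0")
    problems.push("user-scalable=no blocks pinch zoom (WCAG 1.4.4)");
  const maxScale = parseFloat(p["maximum-scale"] ?? "");
  if (!Number.isNaN(maxScale) && maxScale < 2)
    problems.push("maximum-scale < 2 prevents 200% text zoom (WCAG 1.4.4)");
  return problems;
}
```

The healthy baseline, `width=device-width, initial-scale=1`, passes cleanly; the WCAG-violating combination from the prose above trips both zoom checks.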
seo-fundamentals.meta-tags.viewport-configured

Without a declared charset, browsers fall back to heuristics — usually Windows-1252 or Latin-1 on legacy configs — and render multi-byte UTF-8 content as mojibake. Smart quotes, em dashes, emoji, and non-English characters turn into garbage sequences like `â€™` (a curly apostrophe) or `Ã©` (an é). The Google crawler follows the same fallback and indexes the corrupted text, which then shows up in search results. This is also a minor XSS signal: ambiguous encoding lets attackers smuggle script tags through characters the browser misinterprets.
Why this severity: Medium because modern frameworks usually inject it automatically, but when missing it breaks rendering and indexing immediately.
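The fallback mechanics are easy to demonstrate in Node. Node's `latin1` is ISO-8859-1, a near-twin of the Windows-1252 fallback browsers actually use; the three UTF-8 bytes of a curly apostrophe become three separate one-byte characters:

```typescript
// Demonstration: UTF-8 bytes for a curly apostrophe, reinterpreted as Latin-1.
const utf8Bytes = Buffer.from("\u2019", "utf8"); // ’ encodes as E2 80 99
const mojibake = utf8Bytes.toString("latin1");   // three one-byte characters

console.log(utf8Bytes.length); // 3 — one character became three bytes
console.log(mojibake.length);  // 3 — and three separate characters
// mojibake.charCodeAt(0) is 0xE2 ("â"); the next two bytes, 0x80 and 0x99,
// are C1 control codes in ISO-8859-1 that Windows-1252 displays as "€" and "™",
// producing the familiar "â€™" garbage sequence.
```

A single `<meta charset="utf-8">` in the first 1024 bytes of the document prevents the guess entirely.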
seo-fundamentals.meta-tags.charset-declared

The same page reached via `/about`, `/about/`, `/about?ref=twitter`, and `https://www.yoursite.com/about` is four separate URLs to Google, and PageRank splits across all of them. Duplicate indexing dilutes ranking signals, triggers canonical-selection on Google's side (often picking a URL you did not intend), and on e-commerce sites causes faceted URLs to consume crawl budget that should go to product pages. A canonical tag consolidates all signals to the version you choose.
Why this severity: Medium because duplicate-URL dilution compounds silently — no error surfaces, but ranking authority leaks on every crawl.
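The consolidation logic behind a canonical URL can be sketched with the standard `URL` API. The chosen host, the stripped parameter list, and the helper name are all assumptions for illustration — every site picks its own canonical form:

```typescript
// Sketch: collapse URL variants into one canonical form.
// TRACKING_PARAMS and the non-www preference are illustrative choices.
const TRACKING_PARAMS = new Set(["ref", "utm_source", "utm_medium", "utm_campaign"]);

function canonicalize(raw: string): string {
  const url = new URL(raw);
  url.protocol = "https:";
  url.hostname = url.hostname.replace(/^www\./, ""); // prefer the non-www host
  for (const key of [...url.searchParams.keys()]) {
    if (TRACKING_PARAMS.has(key)) url.searchParams.delete(key);
  }
  url.pathname = url.pathname.replace(/\/+$/, "") || "/"; // drop trailing slash
  url.hash = "";
  return url.toString();
}
```

All four variants from the paragraph above collapse to a single string, which is exactly the value the `<link rel="canonical">` tag should carry.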
seo-fundamentals.meta-tags.canonical-url

The `lang` attribute on the `<html>` element is the primary signal search engines use to serve your page to the correct language audience. Without it, Google may incorrectly categorize or exclude your content from language-segmented search results, reducing organic reach in your target market. Screen readers depend on the `lang` value to select the right pronunciation engine; an absent or wrong language code causes WCAG 2.2 SC 3.1.1 (Language of Page) to fail, potentially exposing your product to accessibility compliance liability.
Why this severity: Low because incorrect language declaration rarely causes functional breakage but does reduce SEO precision and introduces an accessibility compliance gap that compounds across all pages.
seo-fundamentals.meta-tags.lang-attribute

Google displays roughly 50-60 characters of the title in desktop results and 40-50 on mobile before truncating with an ellipsis. A six-character title like "Home" wastes every pixel of headline real estate and signals zero keyword intent; a 90-character title gets sliced mid-word, so the meaningful half of your headline never reaches the user. Either extreme depresses click-through rate even when you rank on page one.
Why this severity: Info because the page is still indexed and discoverable — only the CTR and pixel efficiency of the SERP listing are degraded.
seo-fundamentals.meta-tags.title-length

The SERP description snippet is your only free 150-character ad copy above the fold for an organic listing. A 40-character description leaves empty space that Google may refill with scraped text you did not write; a 240-character description gets truncated mid-sentence, often on a comma or preposition, so the reader never sees the call to action. Both failure modes leak clicks to competitors whose descriptions fit the viewport cleanly.
Why this severity: Info because the description is still present and the page still indexes — the cost is measured in click-through rate, not visibility.
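The title and description budgets described above reduce to a simple range check. The cutoffs below are the rough character ranges quoted in the prose, not official limits — Google truncates by pixel width, so treat them as heuristics:

```typescript
// Sketch: character-count budgets for SERP display. Thresholds are heuristic
// approximations of the ranges discussed above, not documented Google limits.
type LengthVerdict = "too-short" | "ok" | "too-long";

function checkSerpLength(text: string, min: number, max: number): LengthVerdict {
  if (text.length < min) return "too-short";
  if (text.length > max) return "too-long";
  return "ok";
}

const checkTitle = (t: string) => checkSerpLength(t, 15, 60);
const checkDescription = (d: string) => checkSerpLength(d, 70, 160);
```

"Home" lands in `too-short`, a 240-character description in `too-long`; both are Info-level findings because the page still indexes either way.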
seo-fundamentals.meta-tags.description-length

Google treats the H1 as the primary topical signal for a page — stronger than the title tag in some ranking experiments because the H1 sits in body content rather than head metadata. A page with zero H1s gives the crawler nothing to anchor semantic relevance against, so it falls back to the largest visible text, which is often a navigation element or a sidebar heading. A page with multiple H1s (one from the layout, one from the content) dilutes the signal and breaks the WCAG 2.2 SC 1.3.1 (Info and Relationships) expectation that assistive technology relies on to let users jump to the page's main subject.
Why this severity: Critical because the H1 is a top-three ranking signal and a hard accessibility requirement for screen-reader navigation.
seo-fundamentals.content-structure.h1-present

Search engines parse heading tags to build a content outline for your page — a skipped level (H1 → H3) tells crawlers the structure is broken, which can depress keyword relevance signals for the skipped content. WCAG 2.2 SC 1.3.1 (Info and Relationships) and SC 2.4.6 (Headings and Labels) require heading levels to convey actual document structure, not just visual styling. A heading hierarchy violation in a screen reader context causes users to jump unexpectedly between content levels, breaking navigation for keyboard-only and assistive technology users. This also flags on automated accessibility audits, which matters increasingly for enterprise and government procurement.
Why this severity: High because heading hierarchy violations degrade both crawler content parsing and assistive technology navigation simultaneously, compounding SEO and accessibility impact across every page that has the defect.
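The skipped-level detection is a straightforward walk over heading levels in document order. A sketch with illustrative names; the regex extraction is simplistic and a real audit would parse the DOM:

```typescript
// Sketch: extract heading levels in document order, then flag skips.
// `headingLevels` and `hierarchySkips` are illustrative names.
function headingLevels(html: string): number[] {
  return [...html.matchAll(/<h([1-6])[\s>]/gi)].map((m) => Number(m[1]));
}

function hierarchySkips(levels: number[]): string[] {
  const skips: string[] = [];
  for (let i = 1; i < levels.length; i++) {
    // Descending (h3 back to h2) is legal; only ascending jumps are violations.
    if (levels[i] > levels[i - 1] + 1) {
      skips.push(`h${levels[i - 1]} -> h${levels[i]} skips a level`);
    }
  }
  return skips;
}
```

Note the asymmetry in the rule: dropping back up the outline (H3 followed by H2) closes a section and is fine; only jumping deeper by more than one level breaks the outline.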
seo-fundamentals.content-structure.heading-hierarchy

Images without meaningful alt text are invisible to search engine crawlers — Google Image Search cannot index them, and they contribute no keyword signal to the surrounding page content. This is also the most frequently failed WCAG criterion: WCAG 2.2 SC 1.1.1 (Non-text Content) requires every content image to have a text alternative that conveys equivalent information. Screen reader users hear nothing for images missing alt, or hear raw filenames like "IMG_2034.jpg" that provide no useful context. In regulated industries (healthcare, finance, education), missing alt text on content images is an ADA liability vector that has resulted in class-action settlements.
Why this severity: High because missing alt text simultaneously removes image content from search indexing and creates an ADA/WCAG 1.1.1 accessibility barrier that exposes the product to legal liability.
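Both failure modes above — no alt attribute at all, and a filename masquerading as alt text — are mechanically detectable. A rough sketch; the regexes only handle double-quoted attributes, and a production audit should use a real HTML parser:

```typescript
// Sketch: flag <img> tags with no alt attribute or a filename-shaped alt value.
// FILENAME_ALT and badAltImages are illustrative names; regex parsing is rough.
const FILENAME_ALT = /^[\w-]+\.(jpe?g|png|gif|webp|svg)$/i;

function badAltImages(html: string): string[] {
  const offenders: string[] = [];
  for (const [tag] of html.matchAll(/<img\b[^>]*>/gi)) {
    const altMatch = tag.match(/\balt\s*=\s*"([^"]*)"/i);
    if (!altMatch) {
      offenders.push(tag); // no alt attribute at all
    } else if (FILENAME_ALT.test(altMatch[1].trim())) {
      offenders.push(tag); // alt="IMG_2034.jpg" is as useless as no alt
    }
    // Note: an explicit empty alt="" passes — it correctly marks decorative images.
  }
  return offenders;
}
```

The explicit `alt=""` escape hatch matters: WCAG distinguishes decorative images (empty alt, skipped by screen readers) from content images (missing alt, a violation).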
seo-fundamentals.content-structure.image-alt-text

Search engines use semantic HTML elements — `<main>`, `<nav>`, `<article>`, `<section>`, `<header>`, `<footer>` — to identify the primary content region, navigation landmarks, and page sections. A layout built entirely from `<div>` elements provides no structural signal; crawlers treat every block of text as equally weighted. WCAG 2.2 SC 1.3.1 (Info and Relationships) requires that structure conveyed visually is also expressed in the markup — missing `<main>` means keyboard users and screen reader users cannot jump directly to the main content, breaking a core navigation pattern.
Why this severity: Medium because semantic HTML absence degrades both crawler content weighting and assistive technology landmark navigation, but unlike broken links or missing meta tags, it does not block indexing outright.
seo-fundamentals.content-structure.semantic-html

Search crawlers discover pages by following internal links from pages they already know. An orphaned page — one with no inbound link from anywhere else on the site — is invisible to the crawler unless it is listed in the sitemap, and even then it will rank poorly because PageRank flows through links. Internal links also pass anchor-text context that tells Google what a destination page is about; a homepage with no nav and no body links signals that the site has no content depth worth crawling.
Why this severity: Low because the impact scales with site size — a single-page site is unaffected, but a multi-page site silently orphans subpages.
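Orphan detection reduces to building an inbound-link map over the site graph. A sketch under assumed inputs — `Site` maps each known URL to the internal hrefs already extracted from that page:

```typescript
// Sketch: list pages that no other page links to. `Site` is an assumed shape;
// links are assumed pre-extracted and already normalized to site-relative paths.
type Site = Map<string, string[]>; // url -> internal links found on that page

function orphanPages(site: Site, homepage: string): string[] {
  const inbound = new Set<string>([homepage]); // the homepage is reachable by definition
  for (const links of site.values()) {
    for (const href of links) inbound.add(href);
  }
  return [...site.keys()].filter((url) => !inbound.has(url));
}
```

This flags pages with zero inbound links; a stricter audit would also run a reachability traversal from the homepage, since a page linked only from another orphan is just as invisible to a crawler.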
seo-fundamentals.content-structure.internal-links

A missing robots.txt means crawlers operate without guidance: Googlebot and other search engines will crawl everything by default, including staging pages, admin routes, or duplicate parameter URLs that dilute crawl budget and can create duplicate-content penalties. A misconfigured `Disallow: /` blocks all crawlers from the entire site, causing complete de-indexing — the site disappears from search results within days. RFC 9309 formalizes the robots.txt protocol; non-compliance means crawler behavior is undefined across different search engines.
Why this severity: High because a missing robots.txt wastes crawl budget on non-canonical URLs, and a `Disallow: /` misconfiguration causes complete search engine de-indexing of the site.
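The catastrophic `Disallow: /` case can be caught before deploy by parsing the file. A sketch that handles only the simple grouped format, not the full RFC 9309 grammar (for example, it treats only the most recent `User-agent` line as opening a group):

```typescript
// Sketch: detect a blanket `Disallow: /` under the wildcard user agent.
// `blocksEverything` is an illustrative name; parsing is deliberately minimal.
function blocksEverything(robotsTxt: string): boolean {
  let inWildcardGroup = false;
  for (const rawLine of robotsTxt.split(/\r?\n/)) {
    const line = rawLine.split("#")[0].trim(); // strip comments
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim(); // rejoin so URLs with colons survive
    switch (field.trim().toLowerCase()) {
      case "user-agent":
        inWildcardGroup = value === "*";
        break;
      case "disallow":
        if (inWildcardGroup && value === "/") return true;
        break;
    }
  }
  return false;
}
```

An empty `Disallow:` (allow everything) and scoped rules like `Disallow: /admin` pass; only the site-wide block trips the check.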
seo-fundamentals.discoverability.robots-txt

A sitemap gives Googlebot, Bingbot, and AI crawlers an explicit inventory of every URL you want indexed along with each page's last-modified timestamp. Without one, discovery relies entirely on internal linking — which fails for deep routes, newly published pages, and orphaned content. On a new site the sitemap is often the difference between a week and two months before the first crawl. It also speeds reindexing: updating `lastModified` on a page prompts the crawler to revisit, bypassing the standard rediscovery delay.
Why this severity: High because missing sitemaps directly delay indexing of new pages and block efficient recrawling on established sites.
seo-fundamentals.discoverability.sitemap-exists

The `Sitemap:` directive in `robots.txt` is how crawlers you have never heard of — Bing, Yandex, DuckDuckGo, Perplexity, ChatGPT's crawler, and every GEO/AI indexer — discover your sitemap without you submitting it through a dashboard. Google Search Console covers Googlebot alone. If the directive is missing, those crawlers fall back to link discovery only, and any orphaned route never gets indexed outside Google. One line of config unlocks distribution across every search engine and AI system that obeys the standard.
Why this severity: Medium because Google still finds the sitemap via Search Console, but non-Google crawlers and AI indexers miss it entirely.
seo-fundamentals.discoverability.sitemap-in-robots

An unconditional `noindex` tag is the single most destructive SEO defect possible: it removes the page from every search engine, permanently, until the directive is lifted and the page is recrawled. Common failure modes include a staging-site `robots: { index: false }` copied to production, a tutorial snippet left in `app/layout.tsx` that poisons every child route, and CMS preview flags that leak into published content. Recovery is not instant — even after removing the tag, pages often take weeks to return to prior rankings.
Why this severity: Critical because a single unconditional noindex directive at the layout level can deindex the entire site.
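The standard guard against the copied-from-staging failure mode is to derive the robots directive from the deployment environment rather than hard-coding it. A sketch; `DeployEnv` and `robotsFor` are illustrative names, and in a Next.js app the returned object would feed the `robots` field of the root layout's metadata:

```typescript
// Sketch: never hard-code `index: false`; derive it from the deploy environment.
// `DeployEnv` and `robotsFor` are illustrative, not a framework API.
type DeployEnv = "production" | "preview" | "development";

function robotsFor(env: DeployEnv): { index: boolean; follow: boolean } {
  // Only production builds are indexable; previews and staging stay hidden,
  // and the noindex can never leak into production by copy-paste.
  const isProd = env === "production";
  return { index: isProd, follow: isProd };
}
```

Because the directive is computed, the dangerous state (production with `index: false`) becomes unrepresentable instead of merely unlikely.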
seo-fundamentals.discoverability.no-noindex-production

Hash-based routing (`/#/about`) is invisible to every search crawler: the entire site is treated as a single URL because servers do not receive anything after the `#`. Query-string routing (`/?page=about`) is indexable but ambiguous — Google often merges parameterized URLs into one canonical, splits PageRank across permutations, and rate-limits crawling. Clean path-based URLs are also more shareable, more memorable, and easier to parse in analytics reports.
Why this severity: Low because modern frameworks default to clean URLs — only legacy SPAs and misconfigured custom routers trigger this.
seo-fundamentals.discoverability.clean-urls

Open Graph tags control the preview card that renders when your URL is pasted into Slack, Discord, LinkedIn, Facebook, iMessage, WhatsApp, and every other platform that implements the OG protocol. Without `og:image` specifically, the link unfurls as a plain text snippet — no thumbnail, no visual hook — and plain-text links earn roughly a third to a half of the click-throughs that image cards do in social feeds. Launch tweets, blog shares, and newsletter links all degrade simultaneously.
Why this severity: High because missing OG tags silently cripple every share event on every platform — the cost is invisible until you ship.
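Verifying a page against a minimal Open Graph set is a simple diff. A sketch; the required list is an assumption (the OG protocol itself names `og:title`, `og:type`, `og:image`, and `og:url` as the basics), and the regex assumes `property` appears before `content` in each tag:

```typescript
// Sketch: diff a page's Open Graph properties against a minimal required set.
// REQUIRED_OG is an assumed baseline; the regex parsing is deliberately rough.
const REQUIRED_OG = ["og:title", "og:description", "og:image", "og:url"];

function missingOgTags(html: string): string[] {
  const present = new Set(
    [...html.matchAll(/<meta\s+property\s*=\s*"(og:[^"]+)"[^>]*>/gi)]
      .map((m) => m[1].toLowerCase())
  );
  return REQUIRED_OG.filter((tag) => !present.has(tag));
}
```

An empty result means every share target that speaks the OG protocol gets a full preview card instead of a bare link.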
seo-fundamentals.social-sharing.og-tags

Facebook, LinkedIn, and iMessage upscale OG images below 1200x630 to fit their card templates, which produces blurry thumbnails on retina displays — exactly the screens the most active social users are reading on. Images under 600x315 are rejected entirely on some platforms and fall back to no-image cards. Specifying `width` and `height` in metadata also lets scrapers render the correct aspect ratio without fetching the image first, which matters for chat platforms that render previews synchronously.
Why this severity: Medium because shares still unfurl with smaller images, but the preview quality is degraded on every retina device.
seo-fundamentals.social-sharing.og-image-size

Twitter/X reads its own `twitter:*` meta tags first and falls back to Open Graph when they are absent. An explicit `twitter:card` set to `summary_large_image` yields the full-width image card that dominates the feed; relying on the OG fallback works, but denies you the ability to use a different image or headline on Twitter than on Facebook — a frequent want when your Twitter audience responds to different framing than your LinkedIn audience. Missing both OG and Twitter tags produces a bare URL with no preview, which performs measurably worse on the algorithm.
Why this severity: Low because Twitter/X falls back to Open Graph tags, so shares still unfurl — only Twitter-specific customization is lost.
seo-fundamentals.social-sharing.twitter-card

The favicon appears in browser tabs, bookmark bars, history menus, the Android and iOS home screens when a user adds the site, and — increasingly — in Google search results next to the domain name. A missing favicon displays a generic globe icon or a blank square, which signals "unfinished" to every visitor and reduces the likelihood that a user recognizes your tab among ten others. It is a five-minute fix that contributes to every single session's brand recall.
Why this severity: Low because the site still functions without a favicon — only brand recognition and tab/bookmark UX are degraded.
seo-fundamentals.social-sharing.favicon-present