Core Web Vitals targets are only useful if you know whether you're meeting them in production. Lab scores from a single Lighthouse run miss real-user variance across device classes, connection speeds, and geographic regions. Without monitoring (a reportWebVitals hook, web-vitals instrumentation, or Lighthouse CI asserting thresholds), regressions introduced by new features or third-party scripts go undetected until Google Search Console flags a ranking demotion. ISO/IEC 25010 time-behaviour quality cannot be assured without a measurement mechanism. Teams that don't measure CWV routinely discover, on their first Lighthouse audit, that they have been failing for months.
Severity is info because absent monitoring doesn't immediately harm users, but it does guarantee that performance regressions go undetected until they have already damaged search rankings.
Add CWV measurement at two levels: CI assertions to catch regressions before deploy, and real-user instrumentation to catch regressions in production.
npm install --save-dev @lhci/cli
// .lighthouserc.json
{
"ci": {
"assert": {
"assertions": {
"largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
"cumulative-layout-shift": ["error", { "maxNumericValue": 0.1 }],
"interaction-to-next-paint": ["warn", { "maxNumericValue": 200 }]
}
}
}
}
// pages/_app.tsx — real-user measurement (Pages Router; the App Router equivalent is the useReportWebVitals hook from 'next/web-vitals')
import type { NextWebVitalsMetric } from 'next/app'
export function reportWebVitals(metric: NextWebVitalsMetric) {
  // Send to your analytics destination
  console.log(metric.name, metric.value)
}
Run npx lhci autorun in CI to block deploys that regress LCP or CLS past threshold. The INP assertion is set to warn because Lighthouse's default navigation runs involve no user interaction and therefore can't compute INP; total-blocking-time is the usual lab proxy.
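The numeric thresholds asserted above correspond to Google's three-bucket metric ratings. A minimal TypeScript sketch of that bucketing (threshold values are from Google's published CWV definitions; the helper name is hypothetical):

```typescript
// Hypothetical helper: bucket a metric value using Google's published
// CWV thresholds (good <= lower bound, poor > upper bound).
type Rating = 'good' | 'needs-improvement' | 'poor'

const THRESHOLDS: Record<string, [number, number]> = {
  LCP: [2500, 4000], // milliseconds
  CLS: [0.1, 0.25],  // unitless layout-shift score
  INP: [200, 500],   // milliseconds
}

function rateMetric(name: string, value: number): Rating {
  const [good, poor] = THRESHOLDS[name] ?? [Infinity, Infinity]
  if (value <= good) return 'good'
  if (value <= poor) return 'needs-improvement'
  return 'poor'
}
```

The same bounds can drive dashboards or alerting on the values your reportWebVitals handler collects.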
ID: marketing-page-speed.core-web-vitals.cwv-monitoring
Severity: info
What to look for: Check for CWV measurement or CI assertion configuration: Lighthouse CI config files (.lighthouserc.json, .lighthouserc.yaml, lighthouserc.js), performance budget files (budget.json), web-vitals npm package usage (web-vitals in package.json), a Next.js reportWebVitals export in pages/_app.tsx or a useReportWebVitals hook call (App Router), or Google Search Console integration evidence. Count all instances found and enumerate each.
Pass criteria: Any of the above is present — the project has a mechanism to measure or assert CWV in CI or production. At least 1 implementation must be confirmed.
Fail criteria: No CWV measurement, budget, or CI assertion configured. The team has no visibility into whether CWV thresholds are being met.
Skip (N/A) when: Never — observability into CWV applies to all marketing sites.
Detail on fail: "No Lighthouse CI config, performance budget, or web-vitals instrumentation found. CWV status is unmeasured."
Remediation: Add CWV measurement at two levels:
# 1. CI assertion with Lighthouse CI
npm install --save-dev @lhci/cli
// .lighthouserc.json
{
"ci": {
"assert": {
"assertions": {
"largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
"cumulative-layout-shift": ["error", { "maxNumericValue": 0.1 }],
"interaction-to-next-paint": ["warn", { "maxNumericValue": 200 }]
}
}
}
}
// 2. Real-user measurement with web-vitals
// pages/_app.tsx (Next.js Pages Router; in the App Router use the useReportWebVitals hook from 'next/web-vitals')
import type { NextWebVitalsMetric } from 'next/app'
export function reportWebVitals(metric: NextWebVitalsMetric) {
  console.log(metric.name, metric.value) // replace with your analytics destination
}