Non-runnable code examples are an AI-slop hallmark: imports that point at fictitious packages, function names that never existed, parameter signatures that drifted two versions ago. When a developer pastes your example and it throws Cannot find module or TypeError: x is not a function, they lose trust in every other claim in your docs. This reference-integrity defect wastes evaluator time and signals an unmaintained project, tanking adoption and inviting support tickets for problems your docs created.
Severity is high because broken examples destroy trust on first paste and block integration outright.
Replace placeholder imports like from 'your-package' with the actual published package name, verify every function and parameter name against the current source, and run each example in a scratch file before committing. A simple validator:
node -e "const { createClient } = require('your-actual-package'); console.log(typeof createClient)"
If the import throws, the example is wrong. Wire a docs-test CI job against docs/**/*.md to catch drift before it ships.
ID: developer-documentation.content-quality.runnable-examples
Severity: high
What to look for: Examine code examples throughout the documentation. Check that they use actual import paths from the package (not pseudocode like import { thing } from 'your-lib'), reference real function names that exist in the codebase, and could be copied into a file and executed without modification (aside from credentials/config). This is the "paste test" -- could a developer paste this code and have it work? Before evaluating, extract and quote the first code example you find and assess whether it would run without modification if pasted into a fresh environment. Count all instances found and enumerate each.
Pass criteria: Code examples use the package's actual import paths and function names. Examples are syntactically valid and could be run after installing the package (modulo environment-specific config like API keys). Variable names and function signatures match the actual API. At least 1 implementation must be confirmed.
Fail criteria: Examples use placeholder imports (from 'your-package'), reference functions that don't exist or have been renamed, use incorrect parameter names, or are pseudocode that cannot be executed. Common AI-generated pattern: examples that look plausible but import from modules that don't exist. Do NOT pass if examples require undocumented environment variables, missing imports, or unavailable dependencies to run.
Skip (N/A) when: The project has no code examples in its documentation.
Detail on fail: Example: "README example imports from 'your-package' instead of the actual package name 'acme-sdk'" or "Example calls client.search() but the actual API method is client.query()" or "Code examples use Python 2 syntax but the project requires Python 3.10+"
Remediation: Update examples to use real package names and verify they work. The fastest way to validate: copy each example into a test file and run it:
# Smoke-test the import from the README example (substitute your real package name)
node -e "const { createClient } = require('your-actual-package'); console.log(typeof createClient)"
If the import fails, the example is wrong.