Sequential await calls inside a loop over user-supplied data are simultaneously a latency bomb and a cost bomb: 100 items × 200ms each = 20 seconds of wall time, and each iteration holds an HTTP connection open. When the input array is user-controlled with no length cap — for example req.body.urls.forEach(url => fetch(url)) — the number of outbound requests is directly attacker-controlled. CWE-770 applies: an unbounded array fed into a loop of external calls gives an attacker leverage over your third-party API rate limits, egress costs, and server-side connection count. Even internal N+1 patterns inflate database query counts and slow down all concurrent users.
High: a user-controlled array length maps directly to an unbounded number of external API calls, enabling cost abuse and self-inflicted rate-limit bans.
Replace sequential await-in-loop patterns with bounded concurrency using p-map. Combine with a schema-level array length cap to prevent the input from growing arbitrarily.
import { z } from 'zod'
import pMap from 'p-map'

const ScrapeSchema = z.object({
  urls: z.array(z.string().url()).max(50),
})

export async function POST(req: Request) {
  const { urls } = ScrapeSchema.parse(await req.json())
  const results = await pMap(
    urls,
    async (url) => {
      const res = await fetch(url, { signal: AbortSignal.timeout(5000) })
      return res.text()
    },
    { concurrency: 5 },
  )
  return Response.json({ results })
}
The concurrency cap of 5 prevents saturating your outbound connection pool regardless of input size.
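If adding the p-map dependency is not an option, the same bounded-concurrency behavior can be sketched with a small worker pool. The helper name mapWithConcurrency is hypothetical, not a library API; this is a minimal sketch of the technique, not a drop-in p-map replacement (it has no error aggregation or abort handling):

```typescript
// Hypothetical helper (not from any library): N workers pull the next
// index from a shared counter, so at most `concurrency` promises are
// in flight at once. Single-threaded JS makes `next++` safe between awaits.
async function mapWithConcurrency<T, R>(
  items: readonly T[],
  fn: (item: T) => Promise<R>,
  concurrency: number,
): Promise<R[]> {
  const results: R[] = new Array(items.length)
  let next = 0
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++
      results[i] = await fn(items[i])
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(concurrency, items.length) }, worker),
  )
  return results
}
```

Results land at their original indices regardless of completion order, so callers see the same ordering guarantee p-map provides.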
ID: ai-slop-cost-bombs.unbounded-operations.n-plus-one-loop-warning
Severity: high
What to look for: Walk source files for fetch(, axios., ky., got(, or third-party SDK method calls inside for, while, forEach, .map(, .filter(, or .reduce( blocks. For each match, count all loop-bound API call sites and verify each call is either batched — wrapped in Promise.all([...]) or Promise.allSettled([...]), or run through a bounded-concurrency helper such as pMap(items, fn, { concurrency: ... }) — or paced, with the loop body containing a delay (await new Promise(r => setTimeout(r, ...))) or a rate-limiter call (bottleneck.schedule(, pThrottle().
Pass criteria: 100% of loop-bound external API calls are either batched or paced. Report: "X loop-bound API calls inspected, Y batched/paced, 0 sequential-unbounded."
Fail criteria: At least 1 sequential await fetch( (or equivalent) inside a loop with no pacing.
Do NOT pass when: A loop iterates over a user-supplied array (req.body.urls.forEach(url => fetch(url))) — even if there's a small delay, an unbounded array is an unbounded number of calls.
Skip (N/A) when: No external API call patterns found in source files.
Detail on fail: "2 unbounded API loops: req.body.urls.forEach(url => fetch(url)) in src/app/api/scrape/route.ts (1000-item array becomes 1000 sequential requests), users.map(u => axios.get(...)) in src/lib/sync.ts (sequential N+1 fetch)"
Remediation: Sequential awaits in loops compound — 100 items × 200ms each = 20 seconds of latency, AND your origin gets hammered with serial requests. Use bounded concurrency:
// Bad: sequential, unbounded
for (const url of urls) {
  await fetch(url)
}

// Good: bounded concurrency
import pMap from 'p-map'
await pMap(urls, async (url) => fetch(url), { concurrency: 5 })
Always validate that the input array length is bounded — req.body.urls should go through schema validation with .max(50).
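The batched path the check accepts can also be satisfied without p-map by processing the array in fixed-size chunks and awaiting each chunk's Promise.all before starting the next. The helper name mapInChunks is hypothetical; this is a sketch of the chunking technique, which is simpler than a worker pool but idles until the slowest item in each chunk finishes:

```typescript
// Hypothetical helper (not a library API): fire `chunkSize` calls at
// once, wait for the whole chunk, then move to the next chunk, so at
// most `chunkSize` requests are ever in flight.
async function mapInChunks<T, R>(
  items: readonly T[],
  fn: (item: T) => Promise<R>,
  chunkSize: number,
): Promise<R[]> {
  const results: R[] = []
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize)
    results.push(...(await Promise.all(chunk.map(fn))))
  }
  return results
}
```

Either way, the in-flight cap only bounds concurrency, not total work; the schema-level .max(50) on the input array is still what bounds the total number of calls.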