An API with no batch endpoints forces consumers to implement N+1 request loops for common multi-item operations: deleting 50 selected items requires 50 sequential DELETE calls. Each request adds latency, each adds load, and on high-latency connections the cumulative cost is severe. The direct impact is iso-25010:2011 performance-efficiency.time-behaviour: an operation that could complete in one round-trip takes N round-trips because the API surface was not designed for the usage pattern the client actually requires.
Low, because missing batch endpoints degrade performance for multi-item operations without causing correctness failures, although the impact scales with collection size.
Add batch endpoints for the most common multi-item operations your UI already supports. Always enforce a maximum batch size:
// Batch delete: DELETE /api/items/batch
// Body: { ids: ["id1", "id2", "id3"] }
// Batch create: POST /api/items/batch
// Body: { items: [{ name: "A" }, { name: "B" }] }
// Enforce a maximum in every batch handler:
const MAX_BATCH_SIZE = 100
if (ids.length > MAX_BATCH_SIZE) {
  return apiError('BATCH_TOO_LARGE', `Maximum ${MAX_BATCH_SIZE} items per batch`, 400)
}
Prioritize batch endpoints for operations the UI already performs in loops — look for for...of or .map() around fetch() calls in your client code as the signal.
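To make the before/after concrete, here is a minimal client-side sketch: the N+1 loop it replaces is shown as a comment, and the batched version chunks ids so no single request exceeds the server's limit. The /api/items/batch endpoint, the chunkIds helper, and the DELETE-with-body shape are illustrative assumptions, not part of any specific codebase:

```javascript
// Anti-pattern being replaced (N requests, one per id):
// for (const id of ids) await fetch(`/api/items/${id}`, { method: 'DELETE' })

const MAX_BATCH_SIZE = 100

// Split ids into chunks of at most `size` so each request stays under the
// server's enforced maximum.
function chunkIds(ids, size = MAX_BATCH_SIZE) {
  const chunks = []
  for (let i = 0; i < ids.length; i += size) {
    chunks.push(ids.slice(i, i + size))
  }
  return chunks
}

// One DELETE per chunk instead of one per item.
async function batchDelete(ids, fetchImpl = fetch) {
  for (const chunk of chunkIds(ids)) {
    await fetchImpl('/api/items/batch', {
      method: 'DELETE',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ ids: chunk })
    })
  }
}
```

Under these assumptions, deleting 250 items costs three round-trips instead of 250.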
ID: api-design.developer-ergonomics.batch-operations
Severity: low
What to look for: Identify scenarios where a client would need to make multiple sequential requests to accomplish a common task. Look for: (1) UI patterns that operate on multiple items (bulk select, batch delete, mass update) with no corresponding batch API endpoint; (2) Detail endpoints that require fetching a list then individually fetching each item; (3) Related resources that require separate requests per item (fetching a user's orders requires GET /users/:id then GET /orders?userId=:id for each). Check whether bulk/batch endpoints exist for the most common multi-item operations.
Pass criteria: Count all multi-item workflows in the application (bulk select, batch delete, mass update). Either (a) batch endpoints exist for at least one common multi-item operation (batch create, batch update, batch delete), or (b) the API design genuinely doesn't require batch operations (all common workflows operate on single resources). Where batch endpoints exist, they enforce a maximum batch size of at most 100 items.
Fail criteria: Common workflows require N+1 requests that could be served by a batch endpoint, e.g. for (const id of ids) await fetch(`/api/items/${id}`, { method: 'DELETE' }). Client code shows loops making sequential API calls for what should be a single operation.
Skip (N/A) when: The API exclusively operates on single resources by design, with no multi-item workflows. Signal: no UI or client code suggesting multi-select or bulk operations.
Detail on fail: Describe the N+1 pattern (e.g., "Deleting selected items requires individual DELETE /api/items/:id calls. The UI supports multi-select but no batch delete endpoint exists. Deleting 50 items requires 50 requests."). Max 500 chars.
Remediation: Add batch endpoints for common multi-item operations:
// Batch delete:
// DELETE /api/items/batch
// Body: { ids: ["id1", "id2", "id3"] }
// Batch create:
// POST /api/items/batch
// Body: { items: [{ name: "A" }, { name: "B" }] }
// Always enforce a maximum:
const MAX_BATCH_SIZE = 100
if (ids.length > MAX_BATCH_SIZE) {
  return apiError('BATCH_TOO_LARGE', `Maximum ${MAX_BATCH_SIZE} items per batch`, 400)
}
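The size check above can be factored into a reusable validator so every batch handler applies it consistently. This is a sketch under assumptions: the validateBatch name and its { ok, code, status } return shape are hypothetical, and the apiError usage shown in the comment mirrors the snippet above rather than any specific framework:

```javascript
const MAX_BATCH_SIZE = 100

// Hypothetical validator: reject empty and oversized batch payloads
// before any per-item work happens.
function validateBatch(ids) {
  if (!Array.isArray(ids) || ids.length === 0) {
    return { ok: false, code: 'BATCH_EMPTY', status: 400 }
  }
  if (ids.length > MAX_BATCH_SIZE) {
    return { ok: false, code: 'BATCH_TOO_LARGE', status: 400 }
  }
  return { ok: true }
}

// In a handler:
// const check = validateBatch(ids)
// if (!check.ok) return apiError(check.code, `Maximum ${MAX_BATCH_SIZE} items per batch`, check.status)
```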