Duplicate JSON keys in NestJS: why it’s dangerous, what others miss, and how @pas7/nestjs-strict-json fixes it
Most Node/Nest stacks silently accept duplicate keys in JSON (last-one-wins). That’s a real security and correctness footgun. Here’s the threat model, competition landscape, and why PAS7 Studio built @pas7/nestjs-strict-json.

What you'll get from this post
This is a practical deep-dive for backend engineers: threat model, how duplicate keys behave in the wild, what popular tools do (and don’t), and the approach we took in @pas7/nestjs-strict-json.
• Why duplicate keys exist at all, and why standards warn you about them. [1]
• Why “last-one-wins” parsing is a correctness + security risk. [2][3]
• Competition overview: what libraries like secure-json-parse solve (and what they don’t). [4]
• How @pas7/nestjs-strict-json fits into a NestJS app without turning your API into a science project.
• Recommended rollout strategy: strict mode without breaking production clients overnight.
Why duplicate keys are a real problem (not pedantry)
JSON object member names are *supposed* to be unique for predictable behavior. The JSON standard explicitly warns that when names aren’t unique, implementations can behave unpredictably, and some will keep the last value. That’s not hypothetical — it happens in real parsers and real APIs. [1]
The footgun is simple: different layers may interpret the same payload differently. Your gateway might see the first value, your app might see the last value, your logging might capture something else, and your signature/verification step can be tricked if it serializes/normalizes differently. This class of issues is well documented in security research on JSON interoperability. [3]
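You can see last-one-wins in two lines of Node. This is a sketch with an illustrative key; V8’s JSON.parse applies it to any duplicate member name:

```typescript
// V8's JSON.parse silently applies last-one-wins for duplicate member names.
const payload = JSON.parse('{"isAdmin": false, "isAdmin": true}');

console.log(payload.isAdmin);      // true: the first value is discarded with no warning
console.log(Object.keys(payload)); // ["isAdmin"]: one key survives, no trace of the duplicate
```

No error, no warning, no flag to opt out: by the time your code sees `payload`, the ambiguity is already gone.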
Threat model: where duplicate keys hurt you
Duplicate keys are a classic “looks harmless” input trick. Here are common failure modes we’ve seen teams run into:
• Authorization confusion: a proxy or middleware checks one value, the app uses another (e.g., role / scope / isAdmin).
• Validation bypass: schema validation evaluates one representation, business logic consumes another.
• Audit/logging mismatch: incident response becomes harder when logs don’t match the effective parsed payload.
• WAF / cache / signature inconsistencies: security tools normalize differently than runtime parsers (interoperability bugs). [3]
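The authorization-confusion case is easy to simulate. The toy `firstWins` reader below stands in for a component whose parser keeps the first occurrence (some parsers behave this way); it only handles a flat object of string values and is purely illustrative:

```typescript
// Toy "first-wins" reader for a flat JSON object of string values,
// standing in for a gateway/WAF that keeps the first duplicate.
function firstWins(text: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const m of text.matchAll(/"([^"]+)"\s*:\s*"([^"]*)"/g)) {
    if (!(m[1] in out)) out[m[1]] = m[2]; // keep only the first occurrence
  }
  return out;
}

const raw = '{"role": "user", "role": "admin"}';
console.log(firstWins(raw).role);  // "user":  what a first-wins gateway would authorize
console.log(JSON.parse(raw).role); // "admin": what a last-wins Node app actually uses
```

Same bytes on the wire, two different effective payloads: that disagreement is the whole attack.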
Competition and why it doesn't fully cover this
There are excellent libraries that harden JSON parsing, but many focus on a different attack surface than duplicate keys.
secure-json-parse (Fastify ecosystem)
This is a widely used drop-in parser focused on prototype poisoning risks like __proto__ and constructor.prototype. It’s great for that specific category, but it’s not “duplicate-keys strictness” by default. Think of it as *prototype safety*, not *semantic uniqueness* of keys. [4]
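To make the distinction concrete, here is the category secure-json-parse targets, sketched with plain JSON.parse and a deliberately vulnerable merge. `naiveMerge` is our own illustration of the classic pattern, not code from any library:

```typescript
// JSON.parse itself is safe: "__proto__" becomes an ordinary own property.
const evil = JSON.parse('{"__proto__": {"polluted": true}}');

// The damage happens later, in naive recursive merges: the classic
// prototype-pollution pattern that secure-json-parse is designed to block.
function naiveMerge(dst: any, src: any): any {
  for (const key of Object.keys(src)) {
    if (src[key] && typeof src[key] === "object") {
      if (!dst[key]) dst[key] = {};
      naiveMerge(dst[key], src[key]); // dst["__proto__"] resolves to Object.prototype
    } else {
      dst[key] = src[key];
    }
  }
  return dst;
}

naiveMerge({}, evil);
console.log(({} as any).polluted); // true: every plain object now inherits "polluted"
```

Note that none of this involves duplicate keys: a payload can pass a prototype-safety check and still carry the ambiguity described above.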
Typical NestJS stacks (Express/Fastify defaults)
Most setups rely on standard JSON parsing behavior and then apply validation (DTO/class-validator/zod). But if duplicates are collapsed during parsing, validators only see the final shape — which can hide the trick that the payload contained duplicates in the first place.
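You can see why validators are blind to this: by the time any schema rule runs, the duplicate has already vanished (key and values here are illustrative):

```typescript
const raw = '{"amount": -1, "amount": 100}';
const body = JSON.parse(raw);

// A "non-negative amount" rule passes, because only the last value survived:
console.log(body.amount >= 0); // true

// And nothing in the parsed object betrays that a duplicate was ever present:
console.log(Object.keys(body)); // ["amount"]
```

Any check that wants to catch the duplicate has to run against the raw body, before parsing collapses it.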
Custom middleware / manual checks
Teams sometimes patch this with ad-hoc middleware. That usually leads to inconsistent behavior across routes, brittle integration with adapters (Express vs Fastify), and missing test coverage. We wanted a clean, reusable Nest-native approach.
@pas7/nestjs-strict-json
Our focus: detect and reject JSON with duplicate keys early and consistently, with predictable behavior across your NestJS app. Same goal as “strict input contracts”: fail fast, fail loudly, and keep observability sane.
What the standard actually says
RFC 8259 warns that object member names should be unique, and when they aren’t, the behavior is not interoperable. In other words: “it parses” is not the bar — *predictability across tooling* is the bar. [1]
Security research goes further: differences in parsing/normalization across components can create real vulnerabilities (JSON interoperability issues). Duplicate keys are a frequent ingredient in these mismatches. [3]
How @pas7/nestjs-strict-json helps in practice
The intent is simple: ensure request JSON is a *single unambiguous map* (no duplicate keys) before it reaches controllers, DTO validation, or business logic.
• Reject duplicate keys early (before validation/business logic).
• Keep behavior consistent across environments (local, staging, production).
• Make failures explicit and debuggable (clear error surface instead of silent overwrites).
• Nest-friendly integration (a drop-in that feels like Nest, not a random script).
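Conceptually, a duplicate-key check must look at the raw request text, because JSON.parse collapses duplicates before any application code runs. The minimal scanner below is our own sketch of the idea, not the library’s actual implementation: it tracks member names per object scope and flags the second occurrence of a name in the same object:

```typescript
// Minimal duplicate-key scanner over raw JSON text. Escape sequences are
// kept raw rather than decoded, which is good enough for a sketch.
function hasDuplicateKeys(text: string): boolean {
  const stack: Array<Set<string> | null> = []; // a Set per object, null per array
  let expectKey = false; // are we positioned before an object member name?
  let i = 0;

  const readString = (): string => {
    let s = "";
    i++; // skip opening quote
    while (i < text.length) {
      const c = text[i];
      if (c === "\\") { s += text[i + 1]; i += 2; continue; }
      if (c === '"') { i++; return s; }
      s += c;
      i++;
    }
    throw new SyntaxError("unterminated string");
  };

  while (i < text.length) {
    const c = text[i];
    if (c === "{") { stack.push(new Set()); expectKey = true; i++; }
    else if (c === "[") { stack.push(null); expectKey = false; i++; }
    else if (c === "}" || c === "]") { stack.pop(); expectKey = false; i++; }
    else if (c === ",") {
      // only an object scope expects a member name after a comma
      expectKey = stack[stack.length - 1] instanceof Set;
      i++;
    } else if (c === '"') {
      const name = readString();
      const top = stack[stack.length - 1];
      if (expectKey && top instanceof Set) {
        if (top.has(name)) return true; // second occurrence in the same object
        top.add(name);
      }
      expectKey = false;
    } else { i++; } // ':', whitespace, numbers, true/false/null
  }
  return false;
}
```

In a real Nest app a check like this would run against the raw body before the framework’s JSON parsing; one common integration point for that is the `verify` callback of Express’s body parser.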
Rollout strategy (don't break prod by accident)
If you already have clients in the wild, strict input checks should be rolled out carefully. A safe path:
• Start in report-only mode (log occurrences of duplicates) for a short window.
• Contact/patch the top offending clients; duplicates are often accidental serializer bugs.
• Enable strict rejection gradually per route group (auth/admin first).
• Make it a platform policy: “duplicate keys are invalid input”. Cite it in your API contract.
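The report-only-then-enforce idea can be sketched as a tiny toggle. The mode names and the crude detector below are hypothetical illustrations, not the package’s real configuration surface (check its README for that):

```typescript
type StrictJsonMode = "report" | "enforce"; // hypothetical mode names

// Crude stand-in detector: flags repeated key names anywhere in the text,
// so it can over-report across nesting levels. Fine for an illustration.
const looksLikeDuplicateKeys = (raw: string): boolean => {
  const keys = [...raw.matchAll(/"([^"\\]+)"\s*:/g)].map((m) => m[1]);
  return new Set(keys).size !== keys.length;
};

function checkPayload(raw: string, mode: StrictJsonMode, log: (msg: string) => void): void {
  if (!looksLikeDuplicateKeys(raw)) return;
  if (mode === "report") {
    // Phase 1: collect evidence, let the request through unchanged.
    log(`duplicate JSON keys observed: ${raw.slice(0, 80)}`);
    return;
  }
  // Phase 2: enforce the contract with an explicit 400-style failure.
  throw new Error("Bad Request: duplicate JSON keys are invalid input");
}
```

Per-route-group rollout then becomes a matter of wiring `mode` from configuration, flipping the auth/admin group to "enforce" first.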
About PAS7 Studio (why we open-sourced it)
We build performance-first web products and backend systems, and we keep running into the same issue: teams spend real time debugging “impossible states” caused by ambiguous input. Strict JSON is a small guardrail that prevents a whole class of issues.
That’s why we open-sourced @pas7/nestjs-strict-json: to make strictness easy, consistent, and testable in NestJS projects.
Sources and references
Key standards and security references used in this post.
FAQ
Does the JSON standard really allow “last key wins”?
The JSON standard warns that non-unique names lead to unpredictable, non-interoperable behavior; many implementations accept duplicates and keep the last value. That mismatch is exactly why it’s risky. [1][2]
Doesn’t secure-json-parse already cover this?
secure-json-parse is primarily about prototype poisoning protection (__proto__/constructor.prototype). It’s a great layer for that category, but it’s not the same as enforcing unique keys in JSON objects. [4]
Won’t DTO validation catch duplicate keys?
Validation usually runs *after* parsing, so it often only sees the final object shape after duplicates are collapsed. Strict JSON validation aims to catch the ambiguity earlier.
How do we enable strict mode without breaking existing clients?
Start with report-only logging of duplicate-key payloads, fix the top clients, then enable strict rejection gradually per route group. Treat it as an API contract rule.
Want safer NestJS APIs? Start with strict input contracts
If your backend is “secure by DTO”, strict parsing is the missing first step: validate after you ensure the payload itself is unambiguous.