Duplicate Keys in JSON

Learn how JSON parsers handle duplicate keys in objects, why the behavior is undefined by the spec, and best practices for avoiding duplicate key issues in your data.

Detailed Explanation

The JSON specification (RFC 8259) states that object keys "SHOULD be unique" but does not strictly forbid duplicate keys. This means a document like {"name": "Alice", "name": "Bob"} is technically valid JSON that most parsers will accept without error, but the behavior when duplicates are encountered is not standardized across implementations.

What the spec says:

RFC 8259 Section 4 states: "The names within an object SHOULD be unique." The use of "SHOULD" (rather than "MUST") in RFC terminology means that duplicates are allowed but not recommended. Both RFC 8259 and its predecessor RFC 7159 explicitly note that when names are not unique, "the behavior of software that receives such an object is unpredictable."
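The permissive "SHOULD" is easy to observe in practice. Python's standard json module, for example, accepts a duplicate-key document without any complaint:

```python
import json

# A document with a duplicate key — technically valid JSON.
doc = '{"name": "Alice", "name": "Bob"}'

# CPython's json module parses it without error; the last value wins.
parsed = json.loads(doc)
print(parsed)  # {'name': 'Bob'}
```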

How different parsers handle duplicates:

  • JavaScript (JSON.parse): The last value wins. JSON.parse('{"a":1,"a":2}') produces {a: 2}.
  • Python (json.loads): The last value wins, same behavior as JavaScript.
  • Java (Jackson): By default, the last value wins. Can be configured to throw an error.
  • Go (encoding/json): The last value wins when unmarshaling into a map.
  • jq (command-line tool): The last value wins.
  • Some XML-to-JSON converters: a source rather than a handler — they may emit duplicate keys when converting XML with repeated element names, leaving the consuming parser to apply one of the behaviors above.

While "last value wins" is the most common behavior, it is not guaranteed. Some parsers or validators may throw an error, and some may keep the first value instead. Relying on any specific behavior is unsafe.
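Python's `json.loads` illustrates both sides of this: the default dict result silently keeps the last value, while the `object_pairs_hook` parameter exposes every key/value pair so duplicates remain visible:

```python
import json

doc = '{"a": 1, "a": 2}'

# Default behavior: duplicates collapse silently, last value wins.
print(json.loads(doc))  # {'a': 2}

# object_pairs_hook receives every pair in order, duplicates included.
pairs = json.loads(doc, object_pairs_hook=list)
print(pairs)  # [('a', 1), ('a', 2)]
```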

Security implications:

Duplicate keys have been exploited in security vulnerabilities. When two systems parse the same JSON document with duplicate keys but keep different values (first vs. last), they can disagree about the document's content. This parsing differential has been used in HTTP request smuggling, authentication bypasses, and authorization flaws. An attacker might craft JSON with {"admin": false, "admin": true}, knowing that the authorization check reads the first value while the application uses the second.
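A minimal sketch of such a parsing differential in Python: a hypothetical authorization layer that keeps the first value for each key disagrees with an application layer using the default parser (last value wins) about the very same payload:

```python
import json

payload = '{"admin": false, "admin": true}'

def first_value_wins(pairs):
    """Build a dict that keeps the FIRST value for each key."""
    result = {}
    for key, value in pairs:
        result.setdefault(key, value)  # later duplicates are ignored
    return result

# Hypothetical authorization check: keeps the first value...
auth_view = json.loads(payload, object_pairs_hook=first_value_wins)
# ...while the application uses the default parser: last value wins.
app_view = json.loads(payload)

print(auth_view)  # {'admin': False}
print(app_view)   # {'admin': True}
```

The two components have now reached opposite conclusions from one document, which is exactly the gap an attacker exploits.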

How duplicate keys occur:

Duplicates rarely appear in hand-written JSON. They most commonly arise from:

  • Merging multiple JSON objects programmatically without deduplication
  • XML-to-JSON conversion of documents with repeated elements
  • Code generation bugs that serialize the same field twice
  • Manual editing where a key is added without noticing it already exists
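The merging case is the easiest to reproduce. A sketch: splicing JSON text together can produce a duplicate key, whereas merging the parsed objects cannot, because dictionary keys are unique by construction:

```python
import json

a = '{"name": "Alice"}'
b = '{"name": "Bob"}'

# Naive text-level merge: strip the braces and concatenate.
merged_text = "{" + a[1:-1] + ", " + b[1:-1] + "}"
print(merged_text)  # {"name": "Alice", "name": "Bob"} — duplicate key

# Merging the parsed objects instead enforces uniqueness (last wins).
merged = {**json.loads(a), **json.loads(b)}
print(json.dumps(merged))  # {"name": "Bob"}
```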

Common mistakes developers make:

The biggest mistake is not checking for duplicate keys at all, on the assumption that they cannot occur. Another is relying on the "last value wins" behavior as a feature — this produces fragile code that may break when a different parser or library version is used. Some developers even use duplicate keys deliberately as a hack for "commenting" JSON, repeating a key so that the first occurrence holds a comment and the last holds the real value ({"key": "this is a comment", "key": "actual value"}). This depends entirely on last-value-wins behavior and fails under any parser that keeps the first value or rejects duplicates outright.

Best practices:

Treat duplicate keys as errors in your application. Use a JSON linter or validator that flags duplicates (many do, despite the spec's permissiveness). When generating JSON programmatically, use Map or dictionary data structures that enforce key uniqueness. If you are parsing untrusted JSON, validate for duplicate keys as a security measure. In code review, watch for manual JSON construction that could accidentally produce duplicates.
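In Python, `object_pairs_hook` turns the "treat duplicates as errors" policy into a one-function change — a minimal sketch:

```python
import json

def reject_duplicates(pairs):
    """object_pairs_hook that raises on any repeated key."""
    result = {}
    for key, value in pairs:
        if key in result:
            raise ValueError(f"duplicate key in JSON object: {key!r}")
        result[key] = value
    return result

# Clean documents parse normally...
print(json.loads('{"a": 1, "b": 2}', object_pairs_hook=reject_duplicates))

# ...while duplicates are rejected instead of silently collapsed.
try:
    json.loads('{"a": 1, "a": 2}', object_pairs_hook=reject_duplicates)
except ValueError as err:
    print(err)  # duplicate key in JSON object: 'a'
```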

Use Case

Adding a duplicate-key check to a JSON validation pipeline that processes third-party webhook payloads, preventing potential security issues from parsing differentials.
