Number Precision in JSON
Understand JSON number precision limits, IEEE 754 floating-point issues, and how large integers lose accuracy. Learn safe practices for financial and ID data in JSON.
Detailed Explanation
JSON numbers follow the syntax defined in RFC 8259: optional sign, integer part, optional decimal fraction, and optional exponent. Crucially, the JSON specification does not impose any limits on the range or precision of numbers. However, most real-world parsers represent numbers as IEEE 754 double-precision floating-point values, which introduces precision limits that every developer must understand.
IEEE 754 double-precision limits:
JavaScript's Number type (and most JSON parsers) uses 64-bit IEEE 754 doubles, which provide:
- Integer precision: Exact representation up to 2^53 - 1 (9,007,199,254,740,991), known as Number.MAX_SAFE_INTEGER
- Maximum value: Approximately 1.8 x 10^308
- Smallest positive value: Approximately 5 x 10^-324
- Decimal precision: Approximately 15-17 significant digits
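These limits are directly observable from the built-in constants on JavaScript's Number type:

```javascript
// IEEE 754 double-precision limits, as exposed by JavaScript's Number type.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1)
console.log(Number.MAX_VALUE);        // 1.7976931348623157e+308
console.log(Number.MIN_VALUE);        // 5e-324 (smallest positive subnormal)
console.log(Number.EPSILON);          // ~2.22e-16, the gap between 1 and the next double
```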
The integer precision problem:
Integers with magnitude greater than 2^53 cannot all be represented exactly, so they may be silently rounded when parsed as JavaScript numbers:
JSON.parse('{"id": 9007199254740993}')
// Result: { id: 9007199254740992 } — off by one!
This is a critical issue for systems that use 64-bit integer IDs. Twitter famously encountered this problem and started including both numeric and string representations of tweet IDs in their API responses ("id" and "id_str").
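One way to catch this truncation at parse time is Number.isSafeInteger, which reports whether a value is within the exactly-representable integer range. A minimal sketch (the field name id is illustrative):

```javascript
// Flag IDs that may have been silently rounded during parsing.
// The "id" field name is illustrative, not a real API contract.
const parsed = JSON.parse('{"id": 9007199254740993}');
console.log(parsed.id);                       // 9007199254740992 — already rounded
console.log(Number.isSafeInteger(parsed.id)); // false — the value cannot be trusted
```

Note the limitation: by the time the check runs, precision is already lost; the check can only tell you to request a string-encoded ID instead.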
The decimal precision problem:
Floating-point arithmetic produces well-known rounding errors:
JSON.parse('{"price": 0.1}').price + JSON.parse('{"price": 0.2}').price
// Result: 0.30000000000000004
This is not a JSON issue but a fundamental property of IEEE 754 representation. Numbers like 0.1 and 0.2 cannot be represented exactly in binary floating-point.
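The inexactness becomes visible when more digits are printed than the default formatting shows, and it is why equality comparisons on computed decimals should use a tolerance:

```javascript
// 0.1 is stored as the nearest representable double, slightly above 0.1.
console.log((0.1).toPrecision(20));                         // 0.10000000000000000555
console.log(0.1 + 0.2 === 0.3);                             // false
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON);  // true — tolerance comparison
```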
Solutions for large integers:
- String encoding: Send large integers as strings: {"id": "9007199254740993"}
- BigInt (JavaScript): Use a reviver function to convert string IDs to BigInt values
- Custom parsers: Libraries like lossless-json and json-bigint parse large numbers without losing precision
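The reviver approach can be sketched as follows, assuming the API sends the ID as a string (the key name id is an assumption for illustration):

```javascript
// Upgrade a string-encoded 64-bit ID to BigInt during parsing.
// Assumes the server sends {"id": "9007199254740993"} — string, not number.
const json = '{"id": "9007199254740993", "name": "example"}';
const obj = JSON.parse(json, (key, value) =>
  key === 'id' ? BigInt(value) : value
);
console.log(obj.id);        // 9007199254740993n — no precision loss
console.log(typeof obj.id); // "bigint"
```

One caveat: JSON.stringify throws on BigInt values, so serializing such an object back requires a matching replacer that converts the BigInt to a string.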
Solutions for decimal precision:
- Integer representation: Store monetary values as the smallest unit (cents, not dollars): {"amount_cents": 1999} instead of {"amount": 19.99}
- String representation: {"amount": "19.99"} preserves the exact decimal representation
- Decimal libraries: Use libraries like decimal.js or big.js for arbitrary-precision arithmetic
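Working in integer cents keeps sums exact because integer arithmetic below 2^53 never rounds. A sketch with hypothetical line items:

```javascript
// Sum line items in integer cents; convert to dollars only for display.
const items = [{ amount_cents: 1999 }, { amount_cents: 550 }, { amount_cents: 1 }];
const totalCents = items.reduce((sum, item) => sum + item.amount_cents, 0);
console.log(totalCents);                    // 2550 — exact
console.log((totalCents / 100).toFixed(2)); // "25.50" — formatting only, not arithmetic
```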
Common mistakes developers make:
The most dangerous mistake is using JSON numbers for financial calculations without accounting for floating-point imprecision. Rounding errors of fractions of a cent accumulate over millions of transactions. Another mistake is using numeric types for large identifiers like Snowflake IDs (used by Discord, Twitter, and many distributed systems), which silently lose precision. Developers also sometimes send numbers with trailing zeros (1.50) expecting them to be preserved, but JSON parsers normalize 1.50 to 1.5.
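The trailing-zero normalization is easy to demonstrate: the number 1.50 and the number 1.5 are the same double, so the textual distinction cannot survive a round trip.

```javascript
// JSON round-tripping does not preserve trailing zeros or number formatting.
console.log(JSON.parse('{"qty": 1.50}').qty); // 1.5
console.log(JSON.stringify({ qty: 1.50 }));   // '{"qty":1.5}'
// If the textual form matters ("1.50" vs "1.5"), send it as a string.
```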
Best practices:
Represent monetary values as integers in the smallest currency unit. Use strings for IDs that may exceed 2^53. Document whether numeric fields are integers or decimals in your API schema. Test with boundary values around Number.MAX_SAFE_INTEGER. Consider using JSON Schema's format keyword to annotate the intended precision of numeric fields.
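A boundary test around Number.MAX_SAFE_INTEGER makes the truncation behavior concrete before it reaches production data:

```javascript
// Values at and just past the safe-integer boundary.
const max = Number.MAX_SAFE_INTEGER;        // 9007199254740991 (2^53 - 1)
console.log(Number.isSafeInteger(max));     // true
console.log(Number.isSafeInteger(max + 1)); // false — 2^53 is outside the safe range
console.log(max + 1 === max + 2);           // true — distinct values collapse to 2^53
```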
Use Case
Avoiding silent data corruption in a payment API by encoding transaction amounts as integer cents rather than floating-point dollars, preventing IEEE 754 rounding errors.