JSON Trailing Comma — Invisible SyntaxError at Position 284
Trailing comma after last property gives SyntaxError at pos 284.
- JSON has 6 value types: string (double quotes only), number, boolean (true/false), null, object, array. No Date, undefined, or function.
- Object: { "key": "value" } — keys MUST be double-quoted. No trailing comma after last property.
- Array: [1, 2, 3] — ordered list. Return [] for empty lists, not null (null crashes .forEach).
- Production trap: JSON.stringify() drops undefined values silently — your keys disappear with no error.
- Silent killer: price as "1999" (string) instead of 1999 (number) → "1999" + 499 = "1999499", not 2498. No error, just wrong math.
- Biggest mistake: adding comments to JSON (// comment) — JSON has no comment syntax. Strip comments before parsing.
Think of JSON like a standardised packing slip that every warehouse in the world agrees to read. It doesn't matter if the sender is a Python server in Berlin or a JavaScript app in Tokyo — as long as the packing slip follows the exact same format, both sides can unpack it without confusion. Objects are the labelled compartments on that slip ('quantity: 3, item: shoes'), and arrays are the numbered rows when you have multiple items. The format is ruthlessly strict — one missing comma or one wrong type of quote and the entire slip is rejected at the door.
A single trailing comma in a JSON config file took down a fintech startup's entire deployment pipeline for four hours. Not a logic error. Not a race condition. One comma after the last property in an object — invisible to the eye, fatal to the parser. That's the world you're stepping into.
JSON — JavaScript Object Notation — is the lingua franca of the modern web. REST APIs send it. Configuration files are written in it. Databases like MongoDB store it. Mobile apps receive it. You cannot build anything networked in 2026 without touching JSON constantly. The problem isn't that it's complicated. It's that it looks deceptively simple, so people get sloppy, and sloppy JSON doesn't degrade gracefully — it throws a hard error and stops everything cold.
By the end of this, you'll be able to write valid JSON from scratch without second-guessing yourself, read a raw JSON payload and instantly spot what's wrong with it, understand exactly why each syntax rule exists, and debug the specific parser errors that make junior devs spend an hour staring at code that looks perfectly fine.
What JSON Actually Is — and Why the Rules Are So Unforgiving
Before JSON existed, developers exchanged data using XML. It looked like HTML — verbose, nested tags everywhere, an absolute nightmare to parse and read. A simple user record might take 20 lines of XML. The same data in JSON takes 5. JSON was designed by Douglas Crockford in the early 2000s as a minimal, human-readable data format that any language could parse with a trivial amount of code.
The strictness isn't arbitrary bureaucracy. JSON is designed to be parsed by machines across every programming language on the planet. If the format allowed ambiguity — JavaScript-style trailing commas, single quotes, unquoted keys — every parser would need to handle edge cases differently. Two services would argue about what a payload means. Data corruption follows. The strict rules are what make universal interoperability possible.
JSON only knows six value types: strings, numbers, booleans (true/false), null, objects, and arrays. That's it. No dates. No functions. No undefined. No comments. If you're trying to put a JavaScript Date object directly into JSON, you're going to have a bad time — and we'll cover that. First, understand the two structural building blocks everything else is built from: objects and arrays.
A key insight: JSON is a data interchange format, not a programming language. It's meant to be read by machines, not written by hand. The strictness is a feature, not a bug. Every time you manually edit a JSON file, you're at risk of introducing syntax errors. Use tools (linters, formatters, schema validators) to help you.
{'name': 'Alice'} is perfectly fine in JavaScript code. But when you send that string to JSON.parse(), it throws SyntaxError: Unexpected token ' in JSON at position 1. JSON requires double quotes for keys and string values. No exceptions. If you're building a JSON string by hand, always use double quotes.
You might assume Node's require() for .json files tolerates single quotes. It doesn't — Node's JSON parser is strict, and it fails. A custom loader might accept the file in development, but a production environment using a strict parser will refuse to load the config.
JSON Objects: The Key-Value Store That Powers Every API Response
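A minimal sketch of the rule in action: the same data, parsed once with single quotes and once with double quotes.

```javascript
// Valid JavaScript object-literal syntax, but invalid JSON:
const singleQuoted = "{'name': 'Alice'}";

let parseError = null;
try {
  JSON.parse(singleQuoted); // throws — JSON requires double quotes
} catch (err) {
  parseError = err; // SyntaxError
}

// The same data with double quotes parses fine:
const doubleQuoted = '{"name": "Alice"}';
const user = JSON.parse(doubleQuoted);

console.log(parseError instanceof SyntaxError); // true
console.log(user.name); // Alice
```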
A JSON object is a collection of key-value pairs wrapped in curly braces. The key is always a double-quoted string. The value is any of the six legal JSON types. Key and value are separated by a colon. Pairs are separated by commas. The last pair gets no trailing comma — this is the rule that bites people constantly.
Why no trailing comma? Because JSON was designed to be a strict subset of a specific version of JavaScript from 2001. Trailing commas weren't valid JavaScript then. The spec was frozen, and that decision became permanent. Every JSON parser in every language since then has inherited this constraint. Like it or not, that's the deal.
Objects can nest inside objects. A user object can contain an address object. That address object can contain a geo object. There's no technical depth limit — but if you're nesting more than three or four levels deep in a production API response, that's a design smell. You're probably shipping more structure than the client needs, which wastes bandwidth and makes the payload harder to consume.
null vs missing key: There's a meaningful difference between "discountCode": null and not including the discountCode key at all. null means "this field exists and has no value" — the API is explicitly telling you that no discount was applied. A missing key means "we have no information about this field" — maybe the discount system didn't respond, or maybe the field is optional. Pick one convention per field and document it. Mixing them silently breaks consumers who check if (response.discountCode) expecting a falsy value.
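The distinction can be checked in code. A sketch using the in operator (hasOwnProperty behaves the same way here); the key and field names are illustrative:

```javascript
const withNull = JSON.parse('{"discountCode": null}');
const missing = JSON.parse('{}');

// A naive truthiness check cannot tell the two cases apart:
console.log(Boolean(withNull.discountCode)); // false
console.log(Boolean(missing.discountCode)); // false

// 'in' distinguishes "present but null" from "absent entirely":
const hasExplicitNull = "discountCode" in withNull; // true — field exists, no value
const hasMissingKey = "discountCode" in missing;    // false — no information at all
```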
Real incident: a fraud-scoring API sometimes omitted the fraudScore key and sometimes sent "fraudScore": null. The consuming code checked if (response.fraudScore > 0.7) — this threw Cannot read property 'fraudScore' of undefined when the key was missing. The fix: if (response.hasOwnProperty('fraudScore') && response.fraudScore > 0.7).
JSON Arrays: Ordered Lists and the Gotchas Hidden Inside Them
A JSON array is an ordered, comma-separated list of values wrapped in square brackets. The values don't have to be the same type — an array can hold strings, numbers, objects, other arrays, nulls, whatever you want. In practice, mixing types in a production array is a terrible idea because every consumer has to handle it defensively, but the spec allows it.
Arrays are zero-indexed. The first item is at index 0. This matters when you're debugging a parser error and the error message tells you the problem is 'at position 0 in the array' — it means the first element.
Where arrays get genuinely tricky in production is arrays of objects. This is the pattern behind almost every list endpoint in any REST API you'll ever call. A GET /products endpoint returns an array of product objects. A GET /orders endpoint returns an array of order objects. Each object in the array must individually follow all JSON object rules — double-quoted keys, no trailing commas, no comments. I've seen a team waste a full sprint debugging an import feature because a third-party vendor was sending an array where one object out of 500 had a trailing comma. The other 499 parsed fine. That one object silently corrupted the batch.
Empty array vs null: [] is an empty array. null is a null value. If your API returns "products": null when there are no results, every consumer has to add a null check before calling .forEach() or .map(). Someone will forget, causing TypeError: Cannot read properties of null (reading 'forEach') in production. Always return [] for empty lists. Always.
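A sketch of why [] is the safer contract. The null version crashes; the empty-array version degrades gracefully:

```javascript
const goodResponse = JSON.parse('{"products": []}');
const badResponse = JSON.parse('{"products": null}');

// [] supports iteration even when empty — this is safe:
const names = goodResponse.products.map((p) => p.name); // []

// null does not — this throws TypeError:
let crashed = false;
try {
  badResponse.products.map((p) => p.name);
} catch (err) {
  crashed = err instanceof TypeError;
}

// Defensive client-side fallback until the API is fixed:
const items = badResponse.products || [];
console.log(items.length); // 0
```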
"items": null for empty cart responses.response.items.map(item => ...) because 'the API documentation said items is always an array'.const items = response.items || [];The Exact JSON Errors That Break Production Code — and How to Fix Them
JSON errors aren't subtle. The parser hits something invalid and throws immediately with a SyntaxError. The problem is the error message tells you where in the string the parser gave up — not where the actual mistake is. If your JSON is 800 lines long and the error says 'position 4721', good luck finding it without knowing what to look for.
Here are the six mistakes I've personally seen break production systems. Not hypothetical mistakes — real incidents, real error messages, real fixes. These aren't sorted by frequency. They're sorted by how long it takes a junior developer to spot them without knowing they exist.
The nastiest one isn't a syntax error at all. It's type coercion — sending a price as the string '1999' instead of the number 1999. The JSON is perfectly valid, it parses without error, and then your checkout service calculates a total of '1999' + '499' = '1999499' instead of 2498. That specific bug caused a real e-commerce platform to charge customers the wrong amount for six hours before anyone noticed. No error. No alert. Just wrong numbers.
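The coercion trap is easy to reproduce: valid JSON, no errors, wrong total. A sketch showing the bug and the Number() fix at the ingestion boundary (field names are illustrative):

```javascript
// Valid JSON — parses without error — but price is a string:
const payload = JSON.parse('{"price": "1999", "shipping": 499}');

// + with a string operand concatenates instead of adding:
const wrongTotal = payload.price + payload.shipping; // "1999499"

// Explicit coercion at the boundary restores arithmetic:
const price = Number(payload.price);
const rightTotal = price + payload.shipping; // 2498
```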
The solution is schema validation. Tools like Zod (TypeScript), Ajv (JavaScript), or Pydantic (Python) validate the shape and types of JSON at the boundary before your business logic touches it. A simple z.object({ price: z.number() }) would have caught the string price immediately. Don't parse JSON and trust it. Validate it.
JSON.stringify() doesn't throw when it hits undefined, functions, or Symbol values — it silently drops them. If your object has a method or an undefined field, those keys disappear without a word. Always log the stringified output in development and compare it against the original object shape. The bug you can't see is worse than the bug that throws.
Real incident: a team relied on console.log(JSON.stringify(response)) for debugging. Their safe-stringify logging helper converts to null at the circular point, so the logs showed null for a nested object — but it was the logging that corrupted the data, not the API. The fix: use flatted or safe-json-stringify for debug logs, and never assume JSON.stringify() preserves your object structure. It drops undefined, functions, and Symbols, and it throws on circular references.
The Trailing Comma That Killed the Deployment Pipeline
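A sketch of both behaviours: the silent drop, and a replacer that converts undefined to null so the key survives. The object names here are illustrative:

```javascript
const order = {
  id: 42,
  discountCode: undefined,  // silently dropped by default
  applyDiscount() {},       // functions are dropped too
};

// The discountCode key disappears with no warning:
const lossy = JSON.stringify(order); // '{"id":42}'

// A replacer keeps the key by converting undefined to null
// (functions are still dropped — they have no JSON representation):
const preserved = JSON.stringify(order, (key, val) =>
  val === undefined ? null : val
); // '{"id":42,"discountCode":null}'
```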
The symptom: SyntaxError: Unexpected token } in JSON at position 284. The error points to the closing brace of a large JSON config file. The JSON looks correct to the human eye — all braces match, all quotes are double, keys are quoted. The file before the change: {"key": "value"}. The file after: {"key": "value",}. The trailing comma is invisible when you scan the file quickly.
JSON.parse() enforces the spec strictly. No trailing commas allowed, even though JavaScript allows them in object literals since ES5. The parser stopped at that comma and threw an error pointing to the closing brace of the entire object — not the comma itself.
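The failure is easy to reproduce in two lines. A sketch; note that the exact error wording varies by JavaScript engine, so don't match on it:

```javascript
const clean = '{"key": "value"}';
const withTrailingComma = '{"key": "value",}';

const parsedClean = JSON.parse(clean); // parses fine

let message = "";
try {
  JSON.parse(withTrailingComma); // strict parser rejects the comma
} catch (err) {
  message = err.message; // points past the comma, not at it
}
```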
The team never noticed during development because their local environment used a different config loader that was tolerant of trailing commas. The production environment used a strict JSON parser that failed. The fixes:
1. Added a CI validation step: jq empty config.json. This fails fast and points to the exact line of the trailing comma.
2. Configured the team's editors (VS Code) to show trailing commas as errors via setting json.trailingCommas to error.
3. Added a pre-commit hook: lint-staged runs jsonlint on all JSON files before commit.
4. Documented the rule: "JSON does not allow trailing commas. JavaScript object literals do. They are different syntaxes."
5. For all future config parsing, used a JSON5 parser that allows trailing commas in development, but enforced strict JSON in production.
The lessons:
- Trailing commas are invisible, fatal, and silent until the parser hits them. Lint them automatically.
- Error messages point to the closing bracket, not the offending comma — this makes manual debugging painful.
- Local and production environments must use the same JSON parser. A tolerant parser hides bugs until deploy.
- Add jq empty config.json to every CI pipeline that touches JSON files. It costs 100ms and saves hours.
- Use editor settings to mark trailing commas as errors. VS Code: "json.trailingCommas": "error".
Quick reference, from error message to fix:
- SyntaxError: Unexpected token } in JSON at position 284 — the error points to the closing brace, but the cause is a trailing comma before it. Delete the comma after the last property.
- SyntaxError: Unexpected token ' in JSON at position 1 — single quotes used as delimiters. Replace ' with " in keys and string values. Single quotes inside strings are fine — they're just characters, not delimiters.
- SyntaxError: Unexpected token n in JSON at position 1 — unquoted key: {name: "Alice"} should be {"name": "Alice"}. The error points to the first character of the unquoted key ('n').
- Keys missing after serialisation — JSON.stringify() silently drops keys with undefined values. Check if your source object contains undefined — replace with null if intentional absence, or "" if empty string is acceptable.
- Wrong arithmetic with no error — a number stored as a string ("1999" instead of 1999). Validate the schema with Zod or Ajv at the API boundary. Enforce that price fields are numbers, not strings. Add explicit Number() coercion on ingestion.
Key takeaways
- Trailing commas, single quotes, and unquoted keys are invisible to the eye and fatal to the parser. Lint every JSON file with jq empty in CI.
- undefined values in objects are silently dropped by JSON.stringify(). Your keys vanish with no error. Replace with null if you need the field to exist.
- Return [] for empty lists, not null. null crashes .forEach() and .map(); [] doesn't.
- A string "1999" instead of a number 1999 is valid JSON but corrupts arithmetic. Validate schemas at API boundaries with Zod or Ajv.
- JSON has no comment syntax. If you need documentation, use a JSON Schema $comment field.
Common mistakes to avoid
Adding a trailing comma after the last property in an object or the last element in an array
Fix: run a JSON linter (jq empty file.json) in CI to catch this automatically. Configure your editor to show trailing commas as errors.
Using single quotes for keys or string values instead of double quotes
Fix: a bulk replacement like sed "s/'/\"/g" file.json works — but be careful: it also replaces legitimate single quotes inside strings.
Leaving an unquoted key in an object
Valid: {"name": "Alice"}. Invalid: {name: "Alice"}. Many developers copy JavaScript object literals directly into JSON files — this is guaranteed to break production.
Putting comments inside JSON (// or /* */)
Fix: use a JSON Schema $comment keyword (allowed by the schema spec specifically for this purpose), or keep documentation in a separate markdown file. Never put // or /* in production JSON — it will cause a hard parse failure.
Passing objects with undefined values through JSON.stringify() without handling them
Fix: use a replacer — JSON.stringify(obj, (key, val) => val === undefined ? null : val). Or use null instead of undefined in the original object — null is preserved in JSON, undefined is not.
Storing numbers as strings in JSON (e.g., price as "1999" instead of 1999)
Why it breaks: "1999" + 499 = "1999499" instead of 2498. This is especially dangerous in financial systems. Fix: validate with z.object({ price: z.number() }). Never trust that a field from an external API is the right type — add explicit coercion: const price = Number(parsed.price).
Returning null for empty arrays instead of []
Why it breaks: calling .forEach() or .map() on the array crashes with TypeError: Cannot read properties of null (reading 'forEach') — a production outage at exactly the moment the list becomes empty. Fix: return [] for empty lists. If you must return null for legacy reasons, document it clearly and add a client-side fallback: const items = response.items || []. Better yet, fix the API to return [].
Interview Questions on This Topic
What's the difference between JSON.stringify() and JSON.parse() in terms of data loss? Walk me through the exact set of JavaScript values that survive a round-trip (stringify then parse) unchanged, which ones change type, and which ones disappear entirely.
JSON.stringify() converts a JavaScript value to a JSON string. JSON.parse() reverses the process. But not all values survive intact:
Survive unchanged: strings, numbers, booleans (true/false), null, arrays of these, objects with string keys and these values.
Change type:
- Date objects become ISO 8601 strings ("2024-01-01T00:00:00.000Z"). They don't become Date objects on parse — they stay strings. You must manually new Date(parsed.isoString).
- NaN, Infinity, -Infinity become null when stringified.
Disappear entirely (silent data loss):
- undefined values — the entire key-value pair is omitted, not included as null.
- Function objects — omitted entirely.
- Symbol values — omitted entirely.
Throw an error (no output):
- Circular references — TypeError: Converting circular structure to JSON.
- BigInt values — TypeError: Do not know how to serialize a BigInt (you can add a toJSON method to handle them).
The safe round-trip subset is JSON's six data types: string, number, boolean, null, object (with string keys), array. Anything else either changes type, disappears, or throws an error.
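The answer above can be verified in a few lines. A round-trip sketch showing the type change, the value change, and the silent drop:

```javascript
const original = {
  created: new Date(Date.UTC(2024, 0, 1)), // Date → ISO 8601 string
  score: NaN,                              // NaN → null
  note: undefined,                         // key silently dropped
  tags: ["a", "b"],                        // survives unchanged
};

const roundTripped = JSON.parse(JSON.stringify(original));

console.log(typeof roundTripped.created); // "string", not Date
console.log(roundTripped.score);          // null
console.log("note" in roundTripped);      // false — the key is gone

// Dates come back as strings and must be revived by hand:
const revived = new Date(roundTripped.created);
```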