Senior · 6 min · March 29, 2026

JSON Trailing Comma — Invisible SyntaxError at Position 284

A trailing comma after the last property gives a SyntaxError at position 284.

Naren · Founder
Plain-English first. Then code. Then the interview question.
 ● Production Incident 🔎 Debug Guide
Quick Answer
  • JSON has 6 value types: string (double quotes only), number, boolean (true/false), null, object, array. No Date, undefined, or function.
  • Object: { "key": "value" } — keys MUST be double-quoted. No trailing comma after last property.
  • Array: [1, 2, 3] — ordered list. Return [] for empty lists, not null (null crashes .forEach).
  • Production trap: JSON.stringify() drops undefined values silently — your keys disappear with no error.
  • Silent killer: price as "1999" (string) instead of 1999 (number) → "1999" + 499 = "1999499", not 2498. No error, just wrong math.
  • Biggest mistake: adding comments to JSON (// comment) — JSON has no comment syntax. Strip comments before parsing.
Plain-English First

Think of JSON like a standardised packing slip that every warehouse in the world agrees to read. It doesn't matter if the sender is a Python server in Berlin or a JavaScript app in Tokyo — as long as the packing slip follows the exact same format, both sides can unpack it without confusion. Objects are the labelled compartments on that slip ('quantity: 3, item: shoes'), and arrays are the numbered rows when you have multiple items. The format is ruthlessly strict — one missing comma or one wrong type of quote and the entire slip is rejected at the door.

A single trailing comma in a JSON config file took down a fintech startup's entire deployment pipeline for four hours. Not a logic error. Not a race condition. One comma after the last property in an object — invisible to the eye, fatal to the parser. That's the world you're stepping into.

JSON — JavaScript Object Notation — is the lingua franca of the modern web. REST APIs send it. Configuration files are written in it. Databases like MongoDB store it. Mobile apps receive it. You cannot build anything networked in 2026 without touching JSON constantly. The problem isn't that it's complicated. It's that it looks deceptively simple, so people get sloppy, and sloppy JSON doesn't degrade gracefully — it throws a hard error and stops everything cold.

By the end of this, you'll be able to write valid JSON from scratch without second-guessing yourself, read a raw JSON payload and instantly spot what's wrong with it, understand exactly why each syntax rule exists, and debug the specific parser errors that make junior devs spend an hour staring at code that looks perfectly fine.

What JSON Actually Is — and Why the Rules Are So Unforgiving

Before JSON existed, developers exchanged data using XML. It looked like HTML — verbose, nested tags everywhere, an absolute nightmare to parse and read. A simple user record might take 20 lines of XML. The same data in JSON takes 5. JSON was designed by Douglas Crockford in the early 2000s as a minimal, human-readable data format that any language could parse with a trivial amount of code.

The strictness isn't arbitrary bureaucracy. JSON is designed to be parsed by machines across every programming language on the planet. If the format allowed ambiguity — JavaScript-style trailing commas, single quotes, unquoted keys — every parser would need to handle edge cases differently. Two services would argue about what a payload means. Data corruption follows. The strict rules are what make universal interoperability possible.

JSON only knows six value types: strings, numbers, booleans (true/false), null, objects, and arrays. That's it. No dates. No functions. No undefined. No comments. If you're trying to put a JavaScript Date object directly into JSON, you're going to have a bad time — and we'll cover that. First, understand the two structural building blocks everything else is built from: objects and arrays.

A key insight: JSON is a data interchange format, not a programming language. It's meant to be read by machines, not written by hand. The strictness is a feature, not a bug. Every time you manually edit a JSON file, you're at risk of introducing syntax errors. Use tools (linters, formatters, schema validators) to help you.
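Because JSON.parse() throws a hard error on malformed input, the first defensive habit is to never call it on untrusted input without a try/catch. A minimal sketch of a safe-parse wrapper (the helper name safeJsonParse is my own, not a standard API):

```javascript
// Hypothetical helper — wraps JSON.parse so malformed input
// returns a fallback instead of crashing the caller.
function safeJsonParse(text, fallback = null) {
  try {
    return JSON.parse(text);
  } catch (err) {
    // err.message tells you where the parser gave up.
    console.error('Invalid JSON:', err.message);
    return fallback;
  }
}

console.log(safeJsonParse('{"ok": true}'));      // { ok: true }
console.log(safeJsonParse("{'bad': true}", {})); // {} — single quotes are invalid JSON
```

The fallback value is a design choice: returning null forces callers to handle failure explicitly, while returning a sensible default keeps the caller simple at the risk of masking bad input.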

io/thecodeforge/json/JsonValueTypes.js (JavaScript)
// io.thecodeforge — JavaScript tutorial

// The six legal JSON value types — know these cold.
// Every valid JSON document is built from exactly these primitives.

const validJsonValues = {
  // 1. String — MUST use double quotes. Single quotes = invalid JSON.
  productName: "Running Shoe",

  // 2. Number — integer or decimal, no quotes around it.
  priceInCents: 9999,
  weightKg: 0.85,

  // 3. Boolean — lowercase only. True/False with capitals = invalid JSON.
  inStock: true,
  onSale: false,

  // 4. Null — lowercase only. Represents intentional absence of a value.
  discountCode: null,

  // 5. Object — a nested collection of key-value pairs (covered next section).
  dimensions: {
    lengthCm: 30,
    widthCm: 12
  },

  // 6. Array — an ordered list of values (covered after objects).
  availableSizes: [7, 8, 9, 10, 11]
};

// Convert a JavaScript object to a JSON string — this is what gets sent over the wire.
const jsonString = JSON.stringify(validJsonValues, null, 2);
console.log(jsonString);

// Parse a JSON string back into a JavaScript object — this is what your API receives.
const parsedBack = JSON.parse(jsonString);
console.log(parsedBack.productName); // "Running Shoe"
console.log(typeof parsedBack.inStock); // "boolean" — not a string
Output
{
  "productName": "Running Shoe",
  "priceInCents": 9999,
  "weightKg": 0.85,
  "inStock": true,
  "onSale": false,
  "discountCode": null,
  "dimensions": {
    "lengthCm": 30,
    "widthCm": 12
  },
  "availableSizes": [
    7,
    8,
    9,
    10,
    11
  ]
}
Running Shoe
boolean
Single Quotes Are Valid JavaScript, Invalid JSON
Writing {'name': 'Alice'} is perfectly fine in JavaScript code. But when you send that string to JSON.parse(), it throws SyntaxError: Unexpected token ' in JSON at position 1. JSON requires double quotes for keys and string values. No exceptions. If you're building a JSON string by hand, always use double quotes.
Production Insight
A team committed a JSON config file with single quotes because "it worked in development".
It only worked because their dev setup loaded the file through a tolerant custom loader. Node's built-in JSON parser — the one require() uses for .json files — is strict and rejects single quotes. Production used that strict parser, and the config failed to load.
The fix? A JSON linter in CI, validating with the same strict parser production uses.
Rule: Never assume your production JSON parser is as tolerant as your development environment. Test with the same parser.
Key Takeaway
JSON has exactly 6 value types: string (double quotes), number, boolean (true/false), null, object, array.
No undefined. No functions. No comments. No trailing commas. No single quotes.
Strictness = interoperability across languages. Embrace it, don't fight it.

JSON Objects: The Key-Value Store That Powers Every API Response

A JSON object is a collection of key-value pairs wrapped in curly braces. The key is always a double-quoted string. The value is any of the six legal JSON types. Key and value are separated by a colon. Pairs are separated by commas. The last pair gets no trailing comma — this is the rule that bites people constantly.

Why no trailing comma? Because JSON was designed to be a strict subset of a specific version of JavaScript from 2001. Trailing commas weren't valid JavaScript then. The spec was frozen, and that decision became permanent. Every JSON parser in every language since then has inherited this constraint. Like it or not, that's the deal.

Objects can nest inside objects. A user object can contain an address object. That address object can contain a geo object. There's no technical depth limit — but if you're nesting more than three or four levels deep in a production API response, that's a design smell. You're probably shipping more structure than the client needs, which wastes bandwidth and makes the payload harder to consume.

null vs missing key: There's a meaningful difference between "discountCode": null and not including the discountCode key at all. null means "this field exists and has no value" — the API is explicitly telling you that no discount was applied. A missing key means "we have no information about this field" — maybe the discount system didn't respond, or maybe the field is optional. Pick one convention per field and document it. Mixing them silently breaks consumers who check if (response.discountCode) expecting a falsy value.
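The distinction is easy to demonstrate in code. A short sketch (the discountCode field name is illustrative):

```javascript
// Both payloads parse fine — the difference is whether the key exists at all.
const withNull = JSON.parse('{"discountCode": null}');
const missing = JSON.parse('{}');

console.log(withNull.discountCode); // null — the field exists, explicitly empty
console.log(missing.discountCode);  // undefined — the field was never sent

// Dot access can't tell them apart once you coerce to boolean.
// Use the `in` operator (or Object.hasOwn) to check for existence.
console.log('discountCode' in withNull); // true
console.log('discountCode' in missing);  // false
```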

io/thecodeforge/json/JsonObjects.js (JavaScript)
// io.thecodeforge — JavaScript tutorial

// Real-world scenario: a checkout service returns this order confirmation
// payload to the frontend after a successful purchase.

const orderConfirmation = {
  "orderId": "ORD-2024-88421",
  "status": "confirmed",
  "totalAmountCents": 15498,
  "currency": "USD",

  // Nested object — customer info lives inside the order object.
  "customer": {
    "customerId": "USR-10042",
    "email": "alex.morgan@example.com",
    "loyaltyTier": "gold"
  },

  // Nested object with its own nested object — shipping details.
  "shippingAddress": {
    "street": "742 Evergreen Terrace",
    "city": "Springfield",
    "stateCode": "IL",
    "postalCode": "62701",
    // Nested geo coordinates for the delivery routing service.
    "geo": {
      "latitude": 39.7817,
      "longitude": -89.6501
    }
  },

  // Boolean flag the frontend uses to decide whether to show a gift message UI.
  "isGiftOrder": false,

  // Null means no promo was applied — explicitly stated, not just absent.
  "promoCode": null
  // ^^^ No trailing comma here. This is the last property. Add one and JSON.parse blows up.
};

// Access nested properties using dot notation after parsing.
const jsonPayload = JSON.stringify(orderConfirmation);
const parsed = JSON.parse(jsonPayload);

console.log(parsed.customer.email);              // "alex.morgan@example.com"
console.log(parsed.shippingAddress.geo.latitude); // 39.7817
console.log(parsed.promoCode);                   // null
console.log(parsed.promoCode === null);           // true — null is explicit, not undefined
Output
alex.morgan@example.com
39.7817
null
true
null vs. Omitting the Key Entirely
There's a meaningful difference between setting a key to null and not including the key at all. null says 'this field exists and has no value.' A missing key says 'we have no information about this field.' In a product API, discountCode: null means 'no discount was applied.' Omitting discountCode entirely might mean the discount system didn't respond in time. Pick one convention per field and document it — mixing them silently breaks consumers.
Production Insight
A payment gateway API sometimes omitted the fraudScore key and sometimes sent "fraudScore": null.
A consumer used if (response.fraudScore > 0.7) — this threw Cannot read property 'fraudScore' of undefined when the key was missing.
The fix: check for key existence first: if (response.hasOwnProperty('fraudScore') && response.fraudScore > 0.7).
Rule: Document null vs missing in your API spec. Use the same convention across all endpoints. Add schema validation that enforces it.
Key Takeaway
Objects are key-value pairs: keys double-quoted, values any JSON type.
No trailing commas after the last property. This rule is non-negotiable.
null means 'explicitly absent'. Missing key means 'no information' — they are not the same.

JSON Arrays: Ordered Lists and the Gotchas Hidden Inside Them

A JSON array is an ordered, comma-separated list of values wrapped in square brackets. The values don't have to be the same type — an array can hold strings, numbers, objects, other arrays, nulls, whatever you want. In practice, mixing types in a production array is a terrible idea because every consumer has to handle it defensively, but the spec allows it.

Arrays are zero-indexed. The first item is at index 0. This matters when you're debugging a parser error and the error message tells you the problem is 'at position 0 in the array' — it means the first element.

Where arrays get genuinely tricky in production is arrays of objects. This is the pattern behind almost every list endpoint in any REST API you'll ever call. A GET /products endpoint returns an array of product objects. A GET /orders endpoint returns an array of order objects. Each object in the array must individually follow all JSON object rules — double-quoted keys, no trailing commas, no comments. I've seen a team waste a full sprint debugging an import feature because a third-party vendor was sending an array where one object out of 500 had a trailing comma. The other 499 parsed fine. That one object silently corrupted the batch.

Empty array vs null: [] is an empty array. null is a null value. If your API returns "products": null when there are no results, every consumer has to add a null check before calling .forEach() or .map(). Someone will forget, causing TypeError: Cannot read properties of null (reading 'forEach') in production. Always return [] for empty lists. Always.
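The crash and the defensive fallback, side by side (a minimal sketch):

```javascript
const emptyOk = JSON.parse('{"products": []}');
const emptyBad = JSON.parse('{"products": null}');

// [] is safe to iterate — the callback simply never runs.
emptyOk.products.forEach(p => console.log(p)); // prints nothing, no error

// null is not. Uncommenting the next line throws:
// emptyBad.products.forEach(p => console.log(p));
// → TypeError: Cannot read properties of null (reading 'forEach')

// Defensive fallback for when you don't control the API:
const products = emptyBad.products ?? [];
console.log(products.length); // 0 — safe either way
```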

io/thecodeforge/json/JsonArrays.js (JavaScript)
// io.thecodeforge — JavaScript tutorial

// Real-world scenario: a product catalog API returns a page of results.
// This is the most common JSON shape you'll encounter — an array of objects.

const catalogPage = {
  "page": 1,
  "pageSize": 3,
  "totalResults": 1482,

  // Array of objects — each element is a complete product record.
  "products": [
    {
      "productId": "SKU-001",
      "name": "Trail Runner X9",
      "priceInCents": 12999,
      "categories": ["footwear", "outdoor", "running"], // Array inside an object inside an array.
      "available": true
    },
    {
      "productId": "SKU-002",
      "name": "Merino Wool Sock Pack",
      "priceInCents": 2499,
      "categories": ["footwear", "accessories"],
      "available": true
    },
    {
      "productId": "SKU-003",
      "name": "Compression Sleeve",
      "priceInCents": 1799,
      "categories": ["recovery"],
      "available": false
      // No trailing comma — this is the last property in the last object.
    }
    // No trailing comma after the last object in the array either.
  ]
};

const jsonString = JSON.stringify(catalogPage, null, 2);
const parsed = JSON.parse(jsonString);

// Iterate over the array of objects — the bread and butter of frontend development.
parsed.products.forEach((product, index) => {
  // Template literals for readable output — note the zero-based index.
  console.log(`[${index}] ${product.name} — $${(product.priceInCents / 100).toFixed(2)} — ${product.available ? 'In Stock' : 'Out of Stock'}`);
});

// Safely access a nested array inside an object inside an array.
console.log(parsed.products[0].categories[1]); // "outdoor" — zero-indexed all the way down
Output
[0] Trail Runner X9 — $129.99 — In Stock
[1] Merino Wool Sock Pack — $24.99 — In Stock
[2] Compression Sleeve — $17.99 — Out of Stock
outdoor
Return [], Not null, for Empty Lists
When a list is empty, always return an empty array [] — never null or omit the key. If your products API returns "products": null when there are no results, every consumer has to add a null check before calling .forEach() or .map(), and someone will forget. That someone will cause an Uncaught TypeError: Cannot read properties of null (reading 'forEach') in production at 11pm on a Friday. Return []. Always.
Production Insight
A third-party API returned "items": null for empty cart responses.
The frontend team used response.items.map(item => ...) because 'the API documentation said items is always an array'.
It wasn't. Production crashed. The vendor changed their API without updating the docs.
The fix: defensive coding: const items = response.items || [];
Rule: Never trust external APIs to always return arrays. Add a fallback to [] on every external data fetch.
Key Takeaway
Array = ordered list, zero-indexed. Use for collections of the same type.
Empty list = [] not null. null crashes .forEach() and .map().
Trailing commas in arrays are also forbidden. Last element: no comma.

The Exact JSON Errors That Break Production Code — and How to Fix Them

JSON errors aren't subtle. The parser hits something invalid and throws immediately with a SyntaxError. The problem is the error message tells you where in the string the parser gave up — not where the actual mistake is. If your JSON is 800 lines long and the error says 'position 4721', good luck finding it without knowing what to look for.
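When the error only gives you a character position, you can map it to a line and column yourself. A small helper (my own, not a built-in — newer engines may already include line/column in the message, so the regex guard handles both cases):

```javascript
// Hypothetical helper: translate a JSON.parse error position
// into a 1-based line and column, useful for large files.
function positionToLineCol(text, position) {
  const before = text.slice(0, position);
  const line = before.split('\n').length;          // 1-based line number
  const col = position - before.lastIndexOf('\n'); // 1-based column
  return { line, col };
}

const badJson = '{\n  "name": "Alice",\n  "age": 30,\n}';
try {
  JSON.parse(badJson);
} catch (e) {
  // Exact message wording varies by engine version; extract the
  // position if the message exposes one, then map it ourselves.
  const posMatch = /position (\d+)/.exec(e.message);
  if (posMatch) {
    console.log(positionToLineCol(badJson, Number(posMatch[1])));
  }
}
```

For the trailing-comma example above, the parser stops at the closing brace (position 34), which maps to line 4, column 1.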

Here are the six mistakes I've personally seen break production systems. Not hypothetical mistakes — real incidents, real error messages, real fixes. These aren't sorted by frequency. They're sorted by how long it takes a junior developer to spot them without knowing they exist.

The nastiest one isn't a syntax error at all. It's type coercion — sending a price as the string "1999" instead of the number 1999. The JSON is perfectly valid, it parses without error, and then your checkout service calculates a total of "1999" + 499 = "1999499" instead of 2498. That specific bug caused a real e-commerce platform to charge customers the wrong amount for six hours before anyone noticed. No error. No alert. Just wrong numbers.

The solution is schema validation. Tools like Zod (TypeScript), Ajv (JavaScript), or Pydantic (Python) validate the shape and types of JSON at the boundary before your business logic touches it. A simple z.object({ price: z.number() }) would have caught the string price immediately. Don't parse JSON and trust it. Validate it.
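Zod and Ajv are the production-grade tools for this. The underlying idea can be sketched in a few lines of plain JavaScript (this validator and its field names are illustrative, not a substitute for a real schema library):

```javascript
// Illustrative boundary validator — real projects should use Zod or Ajv.
// Checks the shape and types of a parsed payload before business logic runs.
function validateOrder(payload) {
  const errors = [];
  if (typeof payload.orderId !== 'string') {
    errors.push(`orderId must be a string, got ${typeof payload.orderId}`);
  }
  if (typeof payload.itemPriceCents !== 'number') {
    errors.push(`itemPriceCents must be a number, got ${typeof payload.itemPriceCents}`);
  }
  if (errors.length > 0) {
    throw new TypeError(`Invalid order payload: ${errors.join('; ')}`);
  }
  return payload;
}

// The string price that silently corrupts checkout math now fails loudly:
try {
  validateOrder(JSON.parse('{"orderId": "ORD-1", "itemPriceCents": "1999"}'));
} catch (e) {
  console.log(e.message); // itemPriceCents must be a number, got string
}
```

The point is where the check runs: at the boundary, immediately after parsing, so bad types never reach arithmetic.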

io/thecodeforge/json/JsonErrorDiagnostics.js (JavaScript)
// io.thecodeforge — JavaScript tutorial

// Six real JSON mistakes with their exact error messages and fixes.
// Run each JSON.parse() call individually to see the error — they're isolated below.
// Note: the exact SyntaxError wording varies by engine and version. Older V8
// reports "Unexpected token } in JSON at position N"; newer V8 reports
// "Unexpected token '}' ... is not valid JSON". The position is the same.

// ─── MISTAKE 1: Trailing comma after the last property ───────────────────────
const trailingComma = '{"name": "Alice", "age": 30,}';
// SyntaxError: Unexpected token } in JSON at position 28
// Fix: Remove the comma after 30. JSON doesn't allow trailing commas. Ever.
try {
  JSON.parse(trailingComma);
} catch (e) {
  console.log('Mistake 1:', e.message);
}

// ─── MISTAKE 2: Single quotes instead of double quotes ───────────────────────
const singleQuotes = "{'name': 'Alice'}";
// SyntaxError: Unexpected token ' in JSON at position 1
// Fix: Replace all single quotes with double quotes.
try {
  JSON.parse(singleQuotes);
} catch (e) {
  console.log('Mistake 2:', e.message);
}

// ─── MISTAKE 3: Unquoted key ─────────────────────────────────────────────────
const unquotedKey = '{name: "Alice"}';
// SyntaxError: Unexpected token n in JSON at position 1
// Fix: Wrap the key in double quotes: {"name": "Alice"}
try {
  JSON.parse(unquotedKey);
} catch (e) {
  console.log('Mistake 3:', e.message);
}

// ─── MISTAKE 4: Comment inside JSON ──────────────────────────────────────────
const withComment = '{"name": "Alice" /* the admin user */}';
// SyntaxError: Unexpected token / in JSON at position 17
// Fix: JSON has no comment syntax. Strip all comments before parsing.
try {
  JSON.parse(withComment);
} catch (e) {
  console.log('Mistake 4:', e.message);
}

// ─── MISTAKE 5: undefined as a value ─────────────────────────────────────────
// undefined is a JavaScript concept — it doesn't exist in JSON.
// JSON.stringify() silently drops keys with undefined values.
const objectWithUndefined = {
  username: "alice",
  sessionToken: undefined // This key will vanish during serialisation.
};
const serialised = JSON.stringify(objectWithUndefined);
console.log('Mistake 5 — undefined silently dropped:', serialised);
// Output: {"username":"alice"} — sessionToken is GONE. No error. No warning.
// Fix: Use null for intentional absence. Use a default string like "" if the
// field must always be present.

// ─── MISTAKE 6: Type confusion — price as string instead of number ────────────
const orderA = JSON.parse('{"itemPriceCents": "1999"}'); // String — looks fine.
const orderB = JSON.parse('{"itemPriceCents": 499}');    // Number — correct.

// This is the silent killer. No parse error. Wrong result.
const wrongTotal = orderA.itemPriceCents + orderB.itemPriceCents;
console.log('Mistake 6 — string + number:', wrongTotal); // '1999499' — not 2498!

// Fix: Always validate and coerce types on ingestion.
const correctTotal = Number(orderA.itemPriceCents) + orderB.itemPriceCents;
console.log('Mistake 6 — fixed:', correctTotal); // 2498
Output
Mistake 1: Unexpected token '}' is not valid JSON
Mistake 2: Unexpected token '\'' is not valid JSON
Mistake 3: Unexpected token 'n' is not valid JSON
Mistake 4: Unexpected token '/' is not valid JSON
Mistake 5 — undefined silently dropped: {"username":"alice"}
Mistake 6 — string + number: 1999499
Mistake 6 — fixed: 2498
The Silent Killer: JSON.stringify() Eats Your Data
JSON.stringify() doesn't throw when it hits undefined, functions, or Symbol values — it silently drops them. If your object has a method or an undefined field, those keys disappear without a word. Always log the stringified output in development and compare it against the original object shape. The bug you can't see is worse than the bug that throws.
Production Insight
A team logged API responses using console.log(JSON.stringify(response)) for debugging.
The response object contained a circular reference (it held the request object, which referenced the response back). JSON.stringify() throws TypeError: Converting circular structure to JSON when it hits a cycle — so the debug logging itself was crashing.
They spent hours suspecting the API, but it was the logging call that was failing, not the API.
The fix: use a circular-safe stringifier like flatted or safe-json-stringify for debug logs.
Rule: Never assume JSON.stringify() preserves your object structure. It drops undefined, functions, and Symbols, and it throws outright on circular references.
Key Takeaway
Trailing commas → error points to closing brace. Always lint.
Single quotes/unquoted keys → error at position 1. Use double quotes everywhere.
undefined → key silently disappears. Replace with null if the field is optional, or omit the key entirely if not needed.
Type confusion → no error, wrong data. Validate schemas at API boundaries.
● Production Incident · Post-Mortem · Severity: high

The Trailing Comma That Killed the Deployment Pipeline

Symptom
Deployment pipeline fails with SyntaxError: Unexpected token } in JSON at position 284. The error points to the closing brace of a large JSON config file. The JSON looks correct to the human eye — all braces match, all quotes are double, keys are quoted.
Assumption
The team assumed the error was a network issue or a corrupted artifact. They re-ran the pipeline 15 times. They checked IAM permissions, S3 bucket policies, and Lambda timeouts. No one thought to look at the trailing comma because 'the file hasn't changed in months'.
Root cause
A developer added a new property to a JSON config file and accidentally left a trailing comma after the previous last property. The file before: {"key": "value"}. The file after: {"key": "value",}. The trailing comma is invisible when you scan the file quickly. JSON.parse() enforces the spec strictly. No trailing commas allowed, even though JavaScript allows them in object literals since ES5. The parser stopped at that comma and threw an error pointing to the closing brace of the entire object — not the comma itself. The team never noticed during development because their local environment used a different config loader that was tolerant of trailing commas. The production environment used a strict JSON parser that failed.
Fix
1. Added a JSON linting step to the CI pipeline: jq empty config.json. This fails fast and points to the exact line of the trailing comma.
2. Configured the team's editors: VS Code flags trailing commas in .json files as errors by default, so the team made sure config files used the json language mode, not jsonc (which tolerates them).
3. Added a pre-commit hook: lint-staged runs jsonlint on all JSON files before commit.
4. Documented the rule: "JSON does not allow trailing commas. JavaScript object literals do. They are different syntaxes."
5. For all future config parsing, used a JSON5 parser that allows trailing commas in development, but enforced strict JSON in production.
Key lesson
  • Trailing commas are invisible, fatal, and silent until the parser hits them. Lint them automatically.
  • Error messages point to the closing bracket, not the offending comma — this makes manual debugging painful.
  • Local and production environments must use the same JSON parser. A tolerant parser hides bugs until deploy.
  • Add jq empty config.json to every CI pipeline that touches JSON files. It costs 100ms and saves hours.
  • Use editor settings to mark trailing commas as errors. In VS Code, keep config files in the json language mode — jsonc tolerates trailing commas and will hide the bug.
Production Debug Guide · Symptom → Action mapping for common JSON parsing failures · 5 entries
Symptom · 01
SyntaxError: Unexpected token } in JSON at position 284 — error points to closing brace
Fix
The error is almost always a trailing comma before that closing brace. Scan the object for a comma after the last property. Use a JSON linter (jq, jsonlint) to find the exact line, not the position number.
Symptom · 02
SyntaxError: Unexpected token ' in JSON at position 1
Fix
Single quotes used instead of double quotes. JSON spec requires double quotes for keys and string values. Replace all ' with " in keys and string values. Single quotes inside strings are fine — they're just characters, not delimiters.
Symptom · 03
SyntaxError: Unexpected token n in JSON at position 1
Fix
Unquoted key. JSON keys must be wrapped in double quotes. {name: "Alice"} should be {"name": "Alice"}. The error points to the first character of the unquoted key ('n').
Symptom · 04
JSON string parses successfully, but data is missing some keys
Fix
The original object had undefined values. JSON.stringify() silently drops keys with undefined values. Check if your source object contains undefined — replace with null if intentional absence, or "" if empty string is acceptable.
Symptom · 05
API returns valid JSON but math operations produce wrong results (e.g., 1999 + 499 = 1999499)
Fix
A numeric field was serialised as a string ("1999" instead of 1999). Validate schema with Zod or Ajv at API boundary. Enforce that price fields are numbers, not strings. Add explicit Number() coercion on ingestion.
★ JSON Debug Cheat Sheet · First 5 minutes of JSON error diagnosis. Commands that find invisible mistakes.
Trailing comma — error points to closing brace
Immediate action
Run JSON through a linter that points to exact line
Commands
jq empty suspicious.json
jsonlint -v suspicious.json
Fix now
Remove comma after last property in object or last element in array.
Single quotes or unquoted keys
Immediate action
Validate JSON syntax with strict parser
Commands
echo '{"name": "Alice"}' | python -m json.tool
node -e "console.log(JSON.parse(process.argv[1]))" "$JSON_STRING"
Fix now
Replace ' with " for keys and string delimiters. Wrap keys in double quotes.
Undefined values silently dropped
Immediate action
Log stringified output and compare shapes
Commands
console.log('Before stringify:', originalObject); console.log('After stringify:', JSON.stringify(originalObject));
JSON.stringify(obj, (key, val) => val === undefined ? null : val)
Fix now
Replace undefined with null before stringification, or use replacer function.
Price corruption — string instead of number
Immediate action
Validate JSON schema at API boundary
Commands
curl -s $API_ENDPOINT | jq '.price' | head -5
jq 'if .price | type != "number" then "ERROR: price is not a number" else "OK" end' payload.json
Fix now
Add schema validation: Zod z.object({ price: z.number() }) or Ajv. Coerce on ingestion: const price = Number(parsed.price). Never trust types.
Comments inside JSON
Immediate action
Strip comments before parsing
Commands
grep -n '//' config.json
sed 's|//.*$||' config.json | jq empty
Fix now
Remove comments. JSON has no comment syntax. Use separate schema documentation.
JSON Object vs JSON Array
Aspect | JSON Object {} | JSON Array []
Structure | Key-value pairs — each value has a name | Ordered list — values accessed by numeric index
Key requirement | Keys must be double-quoted strings | No keys — position is the identifier
Order guarantee | Order not guaranteed by spec (though most parsers preserve it) | Order is guaranteed and meaningful
Access pattern | parsed.customer.email | parsed.products[0].name
Best for | A single entity with named properties (one user, one order) | Multiple entities of the same type (list of users, list of orders)
Empty state | {} — empty object, zero properties | [] — empty array, zero elements
Nested inside each other | Objects can contain arrays as property values | Arrays can contain objects as elements
Type mixing allowed | Each property can have a different type | Elements can be different types — but don't do it in production

Key takeaways

1
Trailing commas are invisible, fatal, and point to the wrong line in error messages. Lint them automatically with jq empty in CI.
2
Single quotes and unquoted keys are valid in JavaScript but invalid in JSON. Use double quotes everywhere. No exceptions.
3
undefined values in objects are silently dropped by JSON.stringify(). Your keys vanish with no error. Replace with null if you need the field to exist.
4
Return [] for empty lists, not null. null crashes .forEach() and .map(); [] doesn't.
5
A price as a string "1999" instead of a number 1999 is valid JSON but corrupts arithmetic. Validate schemas at API boundaries with Zod or Ajv.
6
JSON has no comment syntax. Comments cause parse failures. Use a separate documentation file or JSON Schema's $comment field.
7
The safe round-trip subset is JSON's six types: string, number, boolean, null, object, array. Everything else transforms or disappears.

Common mistakes to avoid

7 patterns

Adding a trailing comma after the last property in an object or the last element in an array

Symptom
SyntaxError: Unexpected token } in JSON at position N. The error message points to the closing brace, not the comma itself — making it hard to find.
Fix
Remove the trailing comma. Use a JSON linter (jq empty file.json) in CI to catch this automatically. Configure your editor to show trailing commas as errors.

Using single quotes for keys or string values instead of double quotes

Symptom
SyntaxError: Unexpected token ' in JSON at position 1. The parser chokes on the very first character.
Fix
Replace all single quotes with double quotes for keys and string delimiters. Single quotes inside strings (e.g., "O'Malley") are fine — they're just characters, not delimiters. A blind find-replace like sed "s/'/\"/g" file.json will also clobber those legitimate apostrophes inside string values, so prefer a JSON linter or a careful manual fix.

Leaving an unquoted key in an object

Symptom
SyntaxError: Unexpected token n in JSON at position 1. The error message shows the first character of the unquoted key.
Fix
Wrap all keys in double quotes. Valid: {"name": "Alice"}. Invalid: {name: "Alice"}. Many developers copy JavaScript object literals directly into JSON files — this is guaranteed to break production.

Putting comments inside JSON (// or /* */)

Symptom: SyntaxError: Unexpected token / in JSON at position N. JSON has no comment syntax — the parser sees the slash and throws.
Fix: Remove all comments. If you need to document a JSON file, use JSON Schema with the $comment keyword (allowed by the spec specifically for this purpose), or keep documentation in a separate markdown file. Never put // or /* in production JSON — it will cause a hard parse failure.
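A runnable sketch of both halves of this fix; "retries" is an illustrative field name, while $comment is the real JSON Schema keyword:

```javascript
// JSON.parse rejects comments outright:
let failed = false;
try {
  JSON.parse('{ "retries": 3 } // max retry count');
} catch (err) {
  failed = true; // SyntaxError: unexpected token after the closing brace
}
console.log(failed); // true

// The spec-friendly alternative: "$comment" is an ordinary string member
// that JSON Schema tools recognise and everything else safely ignores.
const config = JSON.parse(
  '{ "$comment": "max retries before the job is failed", "retries": 3 }'
);
console.log(config.retries); // 3
```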

Passing objects with undefined values through JSON.stringify() without handling them

Symptom: No error. The serialised JSON is missing keys that you expected to be present. Silent data loss.
Fix: Replace undefined with null before stringification using a replacer function: JSON.stringify(obj, (key, val) => val === undefined ? null : val). Or use null instead of undefined in the original object — null is preserved in JSON, undefined is not.
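The replacer from the fix above, runnable end to end (the object shape is illustrative):

```javascript
const user = { name: 'Alice', nickname: undefined, age: 30 };

// Default behaviour: the undefined key silently disappears.
console.log(JSON.stringify(user)); // {"name":"Alice","age":30}

// Replacer: map undefined to null so the key survives.
const keepKeys = (key, val) => (val === undefined ? null : val);
console.log(JSON.stringify(user, keepKeys)); // {"name":"Alice","nickname":null,"age":30}
```

Note the first output has no error and no warning; the only evidence of the loss is the missing key itself.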

Storing numbers as strings in JSON (e.g., price as "1999" instead of 1999)

Symptom: No parse error. The API appears to work. But arithmetic operations produce incorrect results: "1999" + 499 = "1999499" instead of 2498. This is especially dangerous in financial systems.
Fix: Enforce schema validation with Zod or Ajv at the API boundary. Define that price fields must be numbers: z.object({ price: z.number() }). Never trust that a field from an external API is the right type — add explicit coercion: const price = Number(parsed.price).
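A dependency-free sketch of coercion at the boundary; Zod or Ajv would do this declaratively, and parsePrice here is an illustrative helper, not a library function:

```javascript
function parsePrice(raw) {
  if (typeof raw === 'string' && raw.trim() === '') {
    throw new TypeError('price is empty'); // Number('') would be 0
  }
  const n = Number(raw);
  if (!Number.isFinite(n)) {
    throw new TypeError(`price is not numeric: ${raw}`);
  }
  return n;
}

const payload = JSON.parse('{"price": "1999", "shipping": 499}');

// Without validation: string concatenation, not addition.
console.log(payload.price + payload.shipping); // "1999499"

// With coercion at the boundary: correct arithmetic.
console.log(parsePrice(payload.price) + payload.shipping); // 2498
```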

Returning null for empty arrays instead of []

Symptom: Client code that expects .forEach() or .map() on the array crashes with TypeError: Cannot read properties of null (reading 'forEach'). Production outage at exactly the moment the list becomes empty.
Fix: Always return [] for empty lists. If you must return null for legacy reasons, document it clearly and add client-side fallback: const items = response.items || []. Better yet, fix the API to return [].
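A minimal client-side fallback, assuming a hypothetical response shape with an items field:

```javascript
// Some legacy APIs send null (or omit the key) when a list is empty.
function getItems(response) {
  // Nullish coalescing: null or undefined both fall back to [].
  return response.items ?? [];
}

console.log(getItems({ items: null }).length);      // 0
console.log(getItems({ items: [1, 2, 3] }).length); // 3
getItems({ items: null }).forEach(() => {});        // no TypeError
```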
INTERVIEW PREP · PRACTICE MODE

Interview Questions on This Topic

Q02 · Senior: You're receiving a JSON webhook from a third-party vendor that sometimes...
Q03 · Senior: Your team maintains a large JSON configuration file used by multiple ser...

Q01 of 03 · Senior

What's the difference between JSON.stringify() and JSON.parse() in terms of data loss? Walk me through the exact set of JavaScript values that survive a round-trip (stringify then parse) unchanged, which ones change type, and which ones disappear entirely.

ANSWER
JSON.stringify() converts a JavaScript value to a JSON string; JSON.parse() reverses the process. But not all values survive intact.

Survive unchanged: strings, numbers, booleans (true/false), null, arrays of these, and objects with string keys and these values.

Change type:
- Date objects become ISO 8601 strings ("2024-01-01T00:00:00.000Z"). They don't become Date objects on parse — they stay strings. You must manually call new Date(parsed.isoString).
- NaN, Infinity, and -Infinity become null when stringified.

Disappear entirely (silent data loss):
- undefined values — the entire key-value pair is omitted, not included as null.
- Function objects — omitted entirely.
- Symbol values — omitted entirely.
(Inside arrays, undefined, functions, and symbols are serialised as null rather than dropped, so array length is preserved.)

Throw an error (no output):
- Circular references — TypeError: Converting circular structure to JSON.
- BigInt values — TypeError: Do not know how to serialize a BigInt (you can add a toJSON method to handle them).

The safe round-trip subset is JSON's six data types: string, number, boolean, null, object (with string keys), array. Anything else either changes type, disappears, or throws an error.
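The whole answer can be verified in a few lines (a sketch to run in Node; the property names are illustrative):

```javascript
const original = {
  when: new Date('2024-01-01T00:00:00.000Z'), // becomes an ISO string
  ratio: NaN,                                 // becomes null
  nickname: undefined,                        // key disappears
  greet() { return 'hi'; },                   // key disappears
  count: 3                                    // survives unchanged
};

const roundTripped = JSON.parse(JSON.stringify(original));

console.log(typeof roundTripped.when);   // "string"
console.log(roundTripped.ratio);         // null
console.log('nickname' in roundTripped); // false
console.log('greet' in roundTripped);    // false
console.log(roundTripped.count);         // 3
```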
FAQ · 5 QUESTIONS

Frequently Asked Questions

1. Why does JSON.parse keep throwing SyntaxError even though my JSON looks correct?
2. What's the difference between a JSON object and a JSON array?
3. How do I handle a JavaScript Date object in JSON since JSON doesn't have a date type?
4. Can JSON handle circular references, and what happens in production if an object with one gets serialised?
5. What's the difference between `null` and omitting a key entirely in a JSON API response?
That's JS Basics. 6 min read · try the examples if you haven't.

16 / 16 · JS Basics
Previous: Object Methods in JavaScript
Next: Closures in JavaScript