# How to Fix 'JSON parsing error during development' in LangGraph (TypeScript)
When you see a JSON parsing error during development in a LangGraph TypeScript app, it usually means a node returned data that LangGraph tried to serialize or parse as JSON, but the payload wasn't valid JSON. In practice, this shows up during graph execution, checkpointing, tool calls, or when a model or node returns a string that looks like JSON but isn't actually parseable.
The fix is almost always in the shape of the value you return from a node, tool, or model wrapper. In TypeScript apps, the most common failure is returning a stringified payload where LangGraph expects a plain JSON-safe object (or the reverse).
## The Most Common Cause
The #1 cause is returning a non-JSON-safe value from a LangGraph node, then letting LangGraph try to persist or pass it through state.
Typical symptoms:
- `SyntaxError: Unexpected token ... in JSON at position ...`
- `Error: Failed to deserialize checkpoint`
- `InvalidUpdateError` when the graph expects structured state
- A message like "JSON parsing error during development" in your local logs
### Wrong vs right pattern

**Broken code:**

```ts
import { StateGraph } from "@langchain/langgraph";

type State = { result?: string };

const graph = new StateGraph<State>({
  channels: {
    result: { value: (x) => x, default: () => "" },
  },
});

graph.addNode("fetchData", async () => {
  const data = { ok: true, count: 3 };
  return JSON.stringify(data); // ❌ returns a string when downstream expects an object/state update
});
```

**Fixed code:**

```ts
import { StateGraph } from "@langchain/langgraph";

type State = { result?: { ok: boolean; count: number } };

const graph = new StateGraph<State>({
  channels: {
    result: { value: (x) => x, default: () => ({ ok: false, count: 0 }) },
  },
});

graph.addNode("fetchData", async (): Promise<Partial<State>> => {
  const data = { ok: true, count: 3 };
  return { result: data }; // ✅ plain JSON-serializable object
});
```
If your node returns `JSON.stringify(...)`, you are often double-encoding data. LangGraph wants structured values; don’t pre-stringify unless the next step explicitly needs a string.
A second version of this mistake is returning class instances, `Map`, `Set`, `Date`, or functions inside state. Those may look fine in TypeScript but they are not safe checkpoint payloads.
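You can see why these values are risky by checking what `JSON.stringify` actually does to them: `Map` and `Set` silently collapse to `{}`, functions are dropped, and a `Date` becomes a plain string that never turns back into a `Date` on read.

```ts
// What JSON.stringify actually does to values that "look fine" in TypeScript:
const state = {
  when: new Date("2024-01-01T00:00:00Z"), // serialized as an ISO string, but...
  cache: new Map([["a", 1]]),             // silently becomes {}
  seen: new Set([1, 2, 3]),               // silently becomes {}
  helper: () => 42,                       // silently dropped
};

const serialized = JSON.stringify(state);
// '{"when":"2024-01-01T00:00:00.000Z","cache":{},"seen":{}}'

const roundTripped = JSON.parse(serialized);
// roundTripped.when is now a string, not a Date; cache and seen lost their entries
```

Nothing throws here, which is the dangerous part: the data is lost quietly at checkpoint time and the error only surfaces later, somewhere downstream.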
## Other Possible Causes
### 1. Invalid JSON coming from an LLM response
If you ask the model for JSON and parse it directly, one extra comma breaks everything.
```ts
const text = await llm.invoke(prompt);
const parsed = JSON.parse(text.content as string); // ❌ fails if the model adds markdown fences
```

Fix by stripping fences and validating before parsing:

```ts
const raw = String(text.content ?? "");
const cleaned = raw.replace(/^```json\s*/i, "").replace(/```\s*$/i, "");
const parsed = JSON.parse(cleaned); // ✅ only after cleanup
```
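This cleanup can be bundled into one helper that never throws and a type guard that validates shape before the result touches graph state. A minimal sketch: `safeParseJson`, `Report`, and `isReport` are hypothetical names invented for this example, and the hand-written guard could be swapped for a Zod schema.

```ts
// Hypothetical helper: strip markdown fences, then parse, returning null instead of throwing.
function safeParseJson(raw: string): unknown {
  const cleaned = raw
    .trim()
    .replace(/^```(?:json)?\s*/i, "")
    .replace(/```\s*$/, "");
  try {
    return JSON.parse(cleaned);
  } catch {
    return null; // let the caller decide: retry the model, or fall back
  }
}

// Validate the shape before storing it in state (a Zod schema works here too).
type Report = { ok: boolean; count: number };

function isReport(value: unknown): value is Report {
  return (
    typeof value === "object" && value !== null &&
    typeof (value as Report).ok === "boolean" &&
    typeof (value as Report).count === "number"
  );
}

const parsedReport = safeParseJson('```json\n{ "ok": true, "count": 3 }\n```');
const update = isReport(parsedReport) ? { result: parsedReport } : {}; // only store validated data
```

Returning an empty update on bad output keeps the graph alive, which is usually what you want during development: the error stays visible in logs instead of corrupting state.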
### 2. Returning unsupported values in graph state
LangGraph checkpoints need serializable state. These values are common offenders:
```ts
return {
  userId,
  createdAt: new Date(), // risky depending on the serialization path
  cache: new Map(),      // ❌ not plain JSON
};
```

Use plain objects and ISO strings:

```ts
return {
  userId,
  createdAt: new Date().toISOString(),
  cache: {}, // ✅ plain object if you need persistence
};
```
### 3. Mismatched schema between node output and state definition
If your node returns one shape but the graph channel expects another, you can get runtime errors like:
- `InvalidUpdateError`
- `Cannot read properties of undefined`
- serialization failures during checkpoint writes
```ts
type State = { messages: string[] };

graph.addNode("badNode", async () => {
  return { message: "hello" }; // ❌ wrong key
});
```

Fix the key and type:

```ts
graph.addNode("goodNode", async (): Promise<Partial<State>> => {
  return { messages: ["hello"] }; // ✅ matches schema
});
```
### 4. Tool output is not normalized before returning to the graph
Tool calls often return raw text or SDK-specific objects.
```ts
const toolResult = await myTool.invoke(input);
return toolResult; // ❌ may include nested non-serializable fields
```

Normalize it first:

```ts
const toolResult = await myTool.invoke(input);
return {
  toolOutput: typeof toolResult === "string"
    ? toolResult
    : JSON.parse(JSON.stringify(toolResult)),
};
```
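One caveat: the `JSON.parse(JSON.stringify(...))` trick itself throws on circular references and `BigInt` values, which SDK response objects sometimes contain. A sketch of a normalizer that degrades gracefully instead (the `normalizeToolOutput` name is invented for this example):

```ts
// Hypothetical helper: coerce arbitrary tool output into a JSON-safe value.
function normalizeToolOutput(result: unknown): unknown {
  if (typeof result === "string") return result;
  try {
    // The round trip drops functions, converts Dates to strings, etc.
    return JSON.parse(JSON.stringify(result));
  } catch {
    // Circular references and BigInt land here; fall back to a plain string.
    return String(result);
  }
}
```

In practice you might also log the fallback case, so silently degraded tool outputs stay visible during development.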
## How to Debug It
- **Log the exact node output.**
  - Add a console log right before each `return`.
  - Check whether you're returning a string, an object, a class instance, or something with nested unsupported values.
- **Validate with `JSON.stringify` locally.**
  - If this throws, LangGraph will likely fail too.
  - Example: `const candidate = getStateUpdate(); console.log(JSON.stringify(candidate));`
  - If it fails here, fix the payload shape first.
- **Check the node signature against the graph state.**
  - Confirm the returned keys match your channel names.
  - Look for accidental wrapping like `{ result }` vs `{ results }`.
- **Temporarily remove persistence/checkpointing.**
  - If the error disappears without a checkpointer, the issue is serialization-related.
  - That narrows it down to non-serializable state, invalid LLM output being stored, or a mismatched update shape.
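The `JSON.stringify` check can be wrapped into a small dev-time guard you drop in front of each `return`, so the failure surfaces at the offending node instead of deep inside checkpointing. This is a sketch, and the `assertSerializable` name is invented here:

```ts
// Dev-time guard: walk a state update and throw a descriptive error at the
// exact path of the first value that would not survive JSON serialization.
function assertSerializable(value: unknown, label: string): void {
  const seen = new Set<object>();
  const check = (v: unknown, path: string): void => {
    if (v === null || v === undefined) return; // undefined props are dropped by JSON, which is tolerable
    const t = typeof v;
    if (t === "string" || t === "number" || t === "boolean") return;
    if (t === "function" || t === "bigint" || t === "symbol") {
      throw new Error(`${label}: non-JSON value (${t}) at ${path}`);
    }
    if (v instanceof Map || v instanceof Set || v instanceof Date) {
      throw new Error(`${label}: ${v.constructor.name} at ${path} will not survive JSON`);
    }
    if (seen.has(v as object)) throw new Error(`${label}: circular reference at ${path}`);
    seen.add(v as object);
    for (const [k, child] of Object.entries(v as object)) check(child, `${path}.${k}`);
    seen.delete(v as object); // allow the same object to appear in sibling branches
  };
  check(value, "$");
}
```

Call it as `assertSerializable(update, "fetchData")` just before returning from a node; unlike a plain `JSON.stringify` check, it also flags `Map`, `Set`, and `Date` values that would be lost silently rather than throwing.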
## Prevention

- Keep LangGraph state plain and boring: objects, arrays, strings, numbers, booleans.
- Parse model output only after cleaning it: strip markdown fences and validate against Zod or another schema before storing it in state.
- Add a serialization test for every node that returns persisted data:

  ```ts
  expect(() => JSON.stringify(nodeOutput)).not.toThrow();
  ```
If you want one rule to remember: LangGraph state should look like something you’d safely put into Redis or Postgres as JSON. If it wouldn’t survive that trip cleanly, it will eventually break during development.
## Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.