How to Fix 'JSON parsing error when scaling' in LangGraph (TypeScript)
When LangGraph throws a JSON parsing error when scaling, it usually means one of your nodes is returning data that can’t be serialized cleanly between steps. In practice, this shows up when you move from a single-node test to a multi-node graph, or when you start using StateGraph with checkpointing, streaming, or remote execution.
The common pattern is simple: something in your state is no longer plain JSON. That can be a Date, Map, class instance, undefined, circular object, or an AI message shape that doesn’t match the graph state schema.
The Most Common Cause
The #1 cause is returning non-JSON-serializable values from a node in a StateGraph. LangGraph expects state updates to be plain JSON-compatible objects because they get merged, persisted, and sometimes shipped across process boundaries.
Here’s the broken pattern:
| Broken | Fixed |
|---|---|
| Returns Date and class instances directly | Converts values to strings/plain objects |
| Mutates state with non-serializable fields | Returns a clean partial state update |
```typescript
import { StateGraph, Annotation } from "@langchain/langgraph";

class Customer {
  constructor(public id: string, public createdAt: Date) {}
}

const GraphState = Annotation.Root({
  customer: Annotation<Customer | null>(),
});

const graph = new StateGraph(GraphState)
  .addNode("loadCustomer", async () => {
    // ❌ Broken: Date + class instance inside graph state
    return {
      customer: new Customer("c_123", new Date()),
    };
  })
  .addEdge("__start__", "loadCustomer")
  .addEdge("loadCustomer", "__end__")
  .compile();
```
Fixed version:
```typescript
import { StateGraph, Annotation } from "@langchain/langgraph";

const GraphState = Annotation.Root({
  customer: Annotation<{ id: string; createdAt: string } | null>(),
});

const graph = new StateGraph(GraphState)
  .addNode("loadCustomer", async () => {
    // ✅ Fixed: plain JSON object only
    return {
      customer: {
        id: "c_123",
        createdAt: new Date().toISOString(),
      },
    };
  })
  .addEdge("__start__", "loadCustomer")
  .addEdge("loadCustomer", "__end__")
  .compile();
```
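To see why the broken version fails, run that state through a JSON round-trip, which is roughly what checkpointing and remote execution do to it. The class prototype and the `Date` are both lost (a plain Node sketch, no LangGraph needed):

```typescript
class Customer {
  constructor(public id: string, public createdAt: Date) {}
}

const state = { customer: new Customer("c_123", new Date()) };

// Simulate what persistence does: serialize, then deserialize.
const restored = JSON.parse(JSON.stringify(state));

console.log(restored.customer instanceof Customer); // false: the prototype is gone
console.log(typeof restored.customer.createdAt);    // "string": no longer a Date
```

Anything downstream that expects a real `Customer` or calls `createdAt.getTime()` now breaks, even though the first serialization step "worked".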
If you’re seeing errors like:
- `SyntaxError: Unexpected token o in JSON at position 1`
- `TypeError: Converting circular structure to JSON`
- `Error during serialization of state`
- `InvalidUpdateError` after a node returns data

this is usually the root cause.
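The first two errors are easy to reproduce in isolation, which helps confirm the diagnosis. `Unexpected token o` means an object was coerced to the string `"[object Object]"` somewhere before being parsed, and the circular error comes straight from `JSON.stringify`:

```typescript
// "Unexpected token o": an object was coerced to the string
// "[object Object]" and then handed to JSON.parse.
try {
  JSON.parse(`${{ id: 1 }}`); // parses the literal text "[object Object]"
} catch (err) {
  console.log(err instanceof SyntaxError); // true
}

// "Converting circular structure to JSON": the state references itself.
const update: any = { node: "loadCustomer" };
update.self = update;
try {
  JSON.stringify(update);
} catch (err) {
  console.log(err instanceof TypeError); // true
}
```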
Other Possible Causes
1. Returning full AI message objects instead of plain message content
If you’re using @langchain/core/messages, don’t shove arbitrary nested objects into state unless your reducer/schema expects them.
```typescript
// ❌ Broken
return {
  messages: [aiMessage], // aiMessage may contain metadata that breaks serialization
};

// ✅ Fixed
return {
  messages: [
    {
      role: "assistant",
      content: aiMessage.content,
    },
  ],
};
```
If you are using LangChain message classes end-to-end, make sure your graph state is typed for those exact message types and your persistence layer supports them.
2. Using undefined, functions, or symbols in state
JSON drops or rejects these values depending on where they appear.
```typescript
// ❌ Broken
return {
  result: undefined,
  transform: () => "x",
};

// ✅ Fixed
return {
  result: null,
};
```
Keep node outputs limited to:
- strings
- numbers
- booleans
- arrays
- plain objects
- `null`
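Note that `JSON.stringify` silently drops `undefined`, functions, and symbols from objects rather than throwing, so the value just vanishes from state without an obvious error:

```typescript
const update = {
  result: undefined,
  transform: () => "x",
  ok: null,
};

// undefined and function values are omitted from the serialized object.
console.log(JSON.stringify(update)); // '{"ok":null}'
```

That silent drop is why a field can "work" in-process and then be mysteriously missing after a checkpoint restore.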
3. Circular references from API responses or SDK objects
Some SDK response objects include internal references that explode during serialization.
```typescript
// ❌ Broken
const response = await someSdk.call();
return { response };

// ✅ Fixed
const response = await someSdk.call();
return {
  response: {
    id: response.id,
    status: response.status,
    data: response.data,
  },
};
```
Do not pass raw SDK instances into graph state unless you’ve verified they serialize cleanly.
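A minimal sketch of the DTO approach, simulating an SDK response that keeps a back-reference to its client (the field names here are illustrative, not from any real SDK):

```typescript
// Hypothetical shape: some real SDK responses hold internal back-references.
const raw: any = { id: "r_1", status: "ok", data: { total: 42 } };
raw._client = { lastResponse: raw }; // circular: raw -> _client -> raw

// Explicitly pick only the fields you need into a plain DTO.
function toResponseDto(res: any) {
  return { id: res.id, status: res.status, data: res.data };
}

console.log(JSON.stringify(toResponseDto(raw))); // '{"id":"r_1","status":"ok","data":{"total":42}}'
// JSON.stringify(raw) would throw: Converting circular structure to JSON
```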
4. Checkpointing with incompatible storage payloads
If you use a checkpointer like MemorySaver, Redis, Postgres, or custom persistence, the bug may appear only when scaling because the state is being saved/restored between runs.
```typescript
import { MemorySaver } from "@langchain/langgraph";

const checkpointer = new MemorySaver(); // fine for dev, but still needs serializable state

const app = graph.compile({ checkpointer });
```
If one node returns a non-serializable value, it may work in-memory for one step and fail later when the checkpoint tries to persist it.
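One defensive option is to validate every state update before it reaches the checkpointer, so the failure names the offending node instead of surfacing later inside persistence. A sketch (`ensureSerializable` is my own helper, not a LangGraph API):

```typescript
// Hypothetical helper: fail fast, naming the node, if an update
// can't survive a JSON round-trip.
function ensureSerializable<T>(nodeName: string, update: T): T {
  try {
    JSON.stringify(update);
  } catch (err) {
    throw new Error(`Node "${nodeName}" returned non-serializable state: ${err}`);
  }
  return update;
}

// Usage inside a node:
// return ensureSerializable("loadCustomer", { customer: { id: "c_123" } });
console.log(ensureSerializable("loadCustomer", { ok: true })); // { ok: true }
```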
How to Debug It
- Log each node's return value before it hits the graph
  - Wrap node outputs with `console.dir(value, { depth: null })`.
  - Look for `Date`, class instances, functions, nested SDK objects, or circular structures.
- Validate the payload with `JSON.stringify` locally
  - If this throws, LangGraph will likely fail too.
  - Test each node output individually: `JSON.stringify(nodeOutput);`
- Check your state schema
  - Make sure your `Annotation.Root(...)` types match what nodes actually return.
  - A mismatch like `string` vs object often surfaces as parsing/serialization noise later.
- Disable checkpointing temporarily
  - If the error disappears without a checkpointer, the issue is likely in persistence/serialization.
  - Re-enable it after fixing the returned shapes.
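The logging and validation steps above can be combined into a single wrapper around a node function (a sketch; `debugNode` is not part of LangGraph):

```typescript
// Hypothetical wrapper: log the node's output, then prove it serializes,
// before the graph or checkpointer ever sees it.
function debugNode<S, U>(name: string, fn: (state: S) => Promise<U>) {
  return async (state: S): Promise<U> => {
    const update = await fn(state);
    console.dir(update, { depth: null }); // inspect for Dates, classes, SDK objects
    JSON.stringify(update);               // throws here, pinpointing this node
    return update;
  };
}
```

Register it with `.addNode("loadCustomer", debugNode("loadCustomer", loadCustomer))` and remove the wrapper once the shapes are clean.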
Prevention
- Keep LangGraph state strictly JSON-safe.
- Convert all external objects into plain DTOs before returning them from nodes.
- Add a small serialization test for every node output: `expect(() => JSON.stringify(output)).not.toThrow();`
If you want one rule to follow: never return anything from a LangGraph node that you wouldn’t store directly in Redis as JSON. That catches most scaling-time parsing failures before they reach production.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.