# How to Fix 'chain execution stuck in production' in LangGraph (TypeScript)
When a LangGraph chain gets “stuck” in production, it usually means the graph never reaches a terminal condition or one of the nodes keeps re-entering the same state. In TypeScript, this often shows up as an execution that hangs until your request times out, or as `GraphRecursionError: Recursion limit reached` when the graph keeps looping.
Most of the time, this is not a LangGraph bug. It’s a state transition problem, a missing stop condition, or an async node that never resolves.
## The Most Common Cause
The #1 cause is a node that returns state in a way that keeps routing back to itself forever.
In LangGraph, this usually happens when your conditional edge never returns a terminal path like `END`, or your reducer keeps appending messages without changing the decision input.
### Broken vs fixed
| Broken pattern | Fixed pattern |
|---|---|
| Node keeps returning the same state shape and router always sends it back to the same node | Node updates a completion flag and router can exit to END |
| No explicit stop condition | Explicit stop condition based on state |
Broken:

```typescript
import { StateGraph, END } from "@langchain/langgraph";

type State = {
  messages: string[];
  done?: boolean;
};

// Channel/reducer configuration omitted for brevity.
const graph = new StateGraph<State>();

graph.addNode("worker", async (state) => {
  return {
    messages: [...state.messages, "processed"],
    // done is never set
  };
});

graph.addConditionalEdges("worker", (state) => {
  // Always routes back to worker; the router can never observe progress
  return "worker";
});

graph.setEntryPoint("worker");
```
Fixed:

```typescript
import { StateGraph, END } from "@langchain/langgraph";

type State = {
  messages: string[];
  done: boolean;
};

// Channel/reducer configuration omitted for brevity.
const graph = new StateGraph<State>();

graph.addNode("worker", async (state) => {
  const nextMessages = [...state.messages, "processed"];
  return {
    messages: nextMessages,
    done: true, // completion flag the router can observe
  };
});

graph.addConditionalEdges("worker", (state) => {
  return state.done ? END : "worker";
});

graph.setEntryPoint("worker");
```
The key difference is simple: the router must be able to observe progress. If every iteration looks identical to the router, you’ve built an infinite loop.
## Other Possible Causes
### 1. A node returns a never-resolving Promise
If an async tool call hangs, the whole chain appears stuck.
```typescript
// Broken
graph.addNode("fetchData", async () => {
  await new Promise(() => {}); // never resolves
  return { ok: true };
});

// Fixed
graph.addNode("fetchData", async () => {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 10_000);
  try {
    const res = await fetch("https://api.example.com/data", {
      signal: controller.signal,
    });
    return { ok: res.ok };
  } finally {
    clearTimeout(timeout);
  }
});
```
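If you don't want to manage an `AbortController` at every call site, the same idea can be factored into a small helper. This is a sketch in plain TypeScript; `withTimeout` is a hypothetical helper name, not a LangGraph API:

```typescript
// Generic timeout wrapper for node work: rejects if the wrapped
// promise does not settle within `ms` milliseconds.
// NOTE: `withTimeout` is a hypothetical helper, not part of LangGraph.
function withTimeout<T>(work: Promise<T>, ms: number, label = "node"): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms}ms`)),
      ms,
    );
    work.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}
```

Inside a node you would then write `await withTimeout(fetchData(), 10_000, "fetchData")`, so a hung call surfaces as an error instead of a stuck chain.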
### 2. Your reducer appends forever and blows up state
LangGraph state reducers should be deterministic and bounded. If you keep pushing into messages without trimming or stopping, execution can look stuck before it eventually fails.
```typescript
// Broken
messages: (prev, next) => [...prev, ...next]

// Fixed
messages: (prev, next) => {
  const merged = [...prev, ...next];
  return merged.slice(-20); // keep last N messages only
}
```
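To make the bounded reducer easy to unit-test, it can live as a standalone function. A minimal sketch; the `MAX_MESSAGES` cap of 20 is an assumed value you should tune for your app:

```typescript
const MAX_MESSAGES = 20; // assumed cap; tune for your app

// Bounded append reducer: merges new messages, then keeps only the
// most recent MAX_MESSAGES entries so state cannot grow without limit.
function boundedMessagesReducer(prev: string[], next: string[]): string[] {
  const merged = [...prev, ...next];
  return merged.slice(-MAX_MESSAGES);
}
```

Because it is a pure function, you can assert in a test that state stays bounded no matter how many iterations the graph runs.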
### 3. Conditional routing returns an invalid label
If your router returns a value that doesn’t match any edge, execution may fail with errors like:
- `InvalidUpdateError`
- `ValueError: Unknown node`
- Silent no-op behavior, depending on how you wired the graph
```typescript
// Broken
graph.addConditionalEdges("router", (state) => {
  return state.needsTool ? "tools" : "finish"; // "finish" not mapped to any node
});

// Fixed
graph.addConditionalEdges("router", (state) => {
  return state.needsTool ? "tools" : END;
});
```
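To catch this class of bug at runtime instead of via a confusing downstream error, you can wrap routers in a guard that fails fast on unknown labels. A plain-TypeScript sketch; `guardRouter` is a hypothetical helper, not part of LangGraph:

```typescript
type Router<S> = (state: S) => string;

// Wraps a routing function and throws immediately if it returns a
// label that is not in the set of known targets. Failing fast here
// beats a silent no-op or a stuck chain.
function guardRouter<S>(route: Router<S>, validTargets: Set<string>): Router<S> {
  return (state: S) => {
    const target = route(state);
    if (!validTargets.has(target)) {
      throw new Error(`Router returned unknown target "${target}"`);
    }
    return target;
  };
}
```

With LangGraph you would build the set from your node names plus `END`, e.g. `guardRouter(myRouter, new Set(["tools", END]))`.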
### 4. Streaming consumer never drains the iterator
This one shows up in production when you use .stream() but don’t consume it correctly.
```typescript
// app is the compiled graph: const app = graph.compile();

// Broken
const stream = await app.stream(input);
// nothing reads from stream -> request appears stuck

// Fixed
for await (const chunk of await app.stream(input)) {
  console.log(chunk);
}
```
If you’re using serverless functions or HTTP handlers, make sure the stream is actually iterated or piped to the response.
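One way to make that consumption explicit is a small drain helper that iterates any async iterable into a sink such as `res.write`. A sketch in plain TypeScript; `drainStream` is a hypothetical name:

```typescript
// Consumes an async iterable to completion, passing each chunk to a
// sink (e.g. res.write). If nothing iterates the stream, the graph
// run behind it never finishes -- this makes the consumption explicit.
async function drainStream<T>(
  stream: AsyncIterable<T>,
  sink: (chunk: T) => void,
): Promise<number> {
  let count = 0;
  for await (const chunk of stream) {
    sink(chunk);
    count++;
  }
  return count; // number of chunks consumed
}
```

In an HTTP handler this would look like `await drainStream(await app.stream(input), (c) => res.write(JSON.stringify(c)))`, and the returned chunk count gives you a cheap sanity check in logs.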
## How to Debug It
- Check whether you hit recursion protection
  - Look for `GraphRecursionError: Recursion limit reached`.
  - If you see it, your graph is looping. The fix is in routing logic, not infrastructure.
- Log every node transition
  - Print node name, input keys, and output keys.
  - You want to see whether state changes between iterations.
```typescript
graph.addNode("worker", async (state) => {
  console.log("[worker] input", { done: state.done, count: state.messages.length });
  const output = { ...state, done: true };
  console.log("[worker] output", { done: output.done, count: output.messages.length });
  return output;
});
```
- Temporarily lower the recursion limit
  - This makes loops fail fast instead of hanging forever.
  - Use it while debugging only.
```typescript
const result = await app.invoke(input, {
  recursionLimit: 5,
});
```
- Isolate external calls
  - Replace LLM/tool calls with hardcoded returns.
  - If the graph finishes with mocked nodes but hangs with real ones, the issue is network latency, retries, or an unresolved promise.
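A simple way to do that swap is a flag-switched stub next to the real call. A sketch, assuming a hypothetical `realSearch` tool and a `USE_MOCKS` environment variable (both names are illustrative):

```typescript
const USE_MOCKS = process.env.USE_MOCKS === "1"; // assumed debug switch

// Stand-in for your real external call.
async function realSearch(query: string): Promise<string> {
  const res = await fetch(`https://api.example.com/search?q=${encodeURIComponent(query)}`);
  return res.text();
}

// Mocked variant: returns instantly with a canned answer.
async function mockSearch(query: string): Promise<string> {
  return `mock result for "${query}"`;
}

// Nodes call `search`, so the whole graph can run against mocks.
const search = USE_MOCKS ? mockSearch : realSearch;
```

If the graph completes with `USE_MOCKS=1` but hangs without it, you have localized the problem to the real external call rather than the graph wiring.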
## Prevention
- Always design one explicit terminal path with `END`.
- Put timeouts on every external call inside nodes.
- Keep graph state small and bounded; trim message history early.
- Add logging around routers so you can see why each transition was chosen.
- Write one test that asserts termination for every major path in the graph.
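A termination test doesn't need the full framework: you can drive the node and router by hand with a step cap. A sketch with hypothetical names (`runUntilEnd` is not a LangGraph API, and `ENDMARK` stands in for LangGraph's `END` sentinel):

```typescript
type State = { messages: string[]; done: boolean };

const ENDMARK = "__end__"; // stand-in for LangGraph's END sentinel

// Hand-rolled runner: applies the node, then the router, until the
// router returns ENDMARK or we exceed maxSteps (which fails the test).
function runUntilEnd(
  node: (s: State) => State,
  router: (s: State) => string,
  initial: State,
  maxSteps = 10,
): { state: State; steps: number } {
  let state = initial;
  for (let steps = 1; steps <= maxSteps; steps++) {
    state = node(state);
    if (router(state) === ENDMARK) return { state, steps };
  }
  throw new Error(`graph did not terminate within ${maxSteps} steps`);
}
```

A test then asserts that `runUntilEnd(workerLogic, routerLogic, initialState)` returns instead of throwing, which pins down the termination guarantee before the graph ever reaches production.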
If you’re seeing chain execution stuck in production, treat it like a control-flow bug first. In LangGraph TypeScript apps, stuck chains are almost always caused by bad routing, unresolved async work, or unbounded state growth — and all three are fixable once you trace the node transitions end to end.
## Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit