How to Fix 'chain execution stuck during development' in LangChain (TypeScript)
When you see "chain execution stuck during development" in a LangChain TypeScript app, it usually means the chain is waiting on something that never resolves. In practice, this shows up when you have an async step that does not return, a tool call that hangs, or a callback/stream handler that blocks the run.
This is common during local development because the code path looks fine at a glance, but one async boundary is wrong. The result is a RunnableSequence, AgentExecutor, or custom chain that never completes.
The Most Common Cause
The #1 cause is an async function inside RunnableLambda, tool logic, or a custom chain step that forgets to return the value.
A very common pattern is wrapping work in async, performing the side effect, and never returning the final output. The step then resolves to undefined, so the next runnable either fails outright or stalls waiting for input in a shape it can use.
| Broken pattern | Fixed pattern |
|---|---|
| `async () => { await doWork(); }` | `async () => { const result = await doWork(); return result; }` |
| `RunnableLambda.from(async () => { ... })` without a `return` | `RunnableLambda.from(async () => { ...; return output; })` |
```typescript
import { RunnableLambda, RunnableSequence } from "@langchain/core/runnables";

const brokenStep = RunnableLambda.from(async (input: string) => {
  await fetch("https://example.com/audit", {
    method: "POST",
    body: JSON.stringify({ input }),
  });
  // Missing return here causes downstream steps to receive undefined
});

const fixedStep = RunnableLambda.from(async (input: string) => {
  await fetch("https://example.com/audit", {
    method: "POST",
    body: JSON.stringify({ input }),
  });
  return input.trim();
});

const chain = RunnableSequence.from([
  brokenStep,
  // downstream runnables may stall or fail with undefined input
  fixedStep,
]);
```
If this happens inside a tool, the failure can look like:
- `Error: Tool execution timed out`
- `TypeError: Cannot read properties of undefined`
- A run that never reaches `onChainEnd`
The fix is simple: every async runnable must resolve with a value.
```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Stand-in for whatever logging dependency you call.
declare function logQuery(query: string): Promise<void>;

export const brokenTool = tool(
  async ({ query }) => {
    await logQuery(query);
    // no return: the agent receives undefined
  },
  {
    name: "broken_tool",
    description: "Logs a query",
    schema: z.object({ query: z.string() }),
  }
);

export const fixedTool = tool(
  async ({ query }) => {
    await logQuery(query);
    return `Logged query: ${query}`;
  },
  {
    name: "fixed_tool",
    description: "Logs a query",
    schema: z.object({ query: z.string() }),
  }
);
```
Other Possible Causes
1. A model call is waiting on invalid streaming configuration
If you enable streaming but never consume the stream correctly, the chain can appear stuck.
```typescript
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  modelName: "gpt-4o-mini",
  streaming: true,
});
```
Fix by either consuming tokens with callbacks or disabling streaming during debugging.
```typescript
const llm = new ChatOpenAI({
  modelName: "gpt-4o-mini",
  streaming: false,
});
```
2. A recursive agent loop never terminates
This often shows up with AgentExecutor when the agent keeps calling tools and never reaches a final answer.
```typescript
import { AgentExecutor } from "langchain/agents";

const executor = AgentExecutor.fromAgentAndTools({
  agent,
  tools,
  maxIterations: 3,
});
```
If maxIterations is missing or too high, your agent may loop indefinitely. Set a hard cap while debugging.
3. A callback handler throws and blocks completion
Custom handlers attached through CallbackManager can break execution if they throw inside handleLLMNewToken, handleChainEnd, or similar methods.
```typescript
class BrokenHandler {
  handleLLMNewToken(token: string) {
    throw new Error(`Handler failed on token ${token}`);
  }
}
```
Make handlers defensive:
```typescript
class SafeHandler {
  handleLLMNewToken(token: string) {
    try {
      console.log(token);
    } catch (err) {
      console.error("Callback error", err);
    }
  }
}
```
4. You are awaiting a promise that never resolves
This happens in custom integrations more than in LangChain itself.
```typescript
const waitForever = new Promise(() => {});
await waitForever;
```
If you wrap external APIs, add timeouts:
```typescript
const timeout = <T>(promise: Promise<T>, ms = 10000) =>
  Promise.race([
    promise,
    new Promise<never>((_, reject) =>
      setTimeout(() => reject(new Error("Timeout waiting for dependency")), ms)
    ),
  ]);
```
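To demonstrate the effect, race a promise that never settles against a short deadline (the helper is repeated here so the snippet stands alone):

```typescript
// Same helper as above, repeated so this snippet is self-contained.
const timeout = <T>(promise: Promise<T>, ms = 10000) =>
  Promise.race([
    promise,
    new Promise<never>((_, reject) =>
      setTimeout(() => reject(new Error("Timeout waiting for dependency")), ms)
    ),
  ]);

// A dependency that hangs forever now surfaces as a catchable error.
timeout(new Promise<string>(() => {}), 100).catch((err) =>
  console.log((err as Error).message) // "Timeout waiting for dependency"
);
```

The hung promise is never resolved, but the race rejects after 100 ms, so the caller gets an error it can log, retry, or surface instead of hanging.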
How to Debug It
- Reduce the chain to one runnable
  - Replace your full pipeline with one `RunnableLambda`.
  - If it completes, add steps back one by one until it hangs.
- Log every boundary
  - Log before and after each async call.
  - If you see `before tool call` but not `after tool call`, that's your hang point.
- Disable streaming and callbacks
  - Turn off `streaming`, custom handlers, and tracing.
  - If the issue disappears, the bug is in your event handling layer.
- Set hard timeouts
  - Wrap external API calls and tool executions with timeouts.
  - This turns "stuck" into an actionable error like `Error: Timeout waiting for dependency` or `Tool execution timed out after 10000ms`.
Prevention
- Always return explicitly from async runnables, tools, and custom chain steps.
- Add timeouts around every external dependency:
  - HTTP calls
  - database queries
  - vector store requests
- During development:
  - keep `maxIterations` low on agents
  - disable streaming until the core flow works
If you build LangChain TypeScript apps this way, “stuck during development” stops being mysterious. It becomes a normal async bug with a short debug path.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.