# How to Fix 'chain execution stuck during development' in LlamaIndex (TypeScript)

If your LlamaIndex chain execution gets stuck during development, it usually means the agent/chain started running but never reached a terminal step. In TypeScript, this most often shows up when an async tool call never resolves, a callback handler waits forever, or your chain is missing its final await/return path.
In practice, this happens most often while wiring QueryEngine, ChatEngine, or custom tools into an Express route, Next.js API route, or a local script. The symptom is the same: no response, no thrown exception, just a hanging process.
## The Most Common Cause
The #1 cause is an async function that never resolves inside a tool or custom chain step.
This is common when you wrap external I/O — database calls, HTTP requests, file reads — and forget to return the promise result or accidentally create a deadlock by calling a callback-based API incorrectly.
### Broken vs fixed pattern
| Broken | Fixed |
|---|---|
| Tool handler starts work but never returns | Tool handler returns resolved value |
| Chain waits forever on unresolved promise | Chain completes and emits output |
```typescript
import { FunctionTool } from "llamaindex";

// Broken: fetch is neither awaited nor returned, so the tool resolves
// to undefined immediately while the request dangles in the background.
const brokenTool = FunctionTool.from(async () => {
  fetch("https://api.example.com/customer/123"); // missing await + return
});

// Fixed: await the request and return a value the agent can use.
const fixedTool = FunctionTool.from(async () => {
  const res = await fetch("https://api.example.com/customer/123");
  const data = await res.json();
  return JSON.stringify(data);
});
```
A more realistic failure shows up in query code that is itself correct:

```typescript
// This call looks fine, but if queryEngine depends on a tool whose
// promise never resolves, the await hangs here indefinitely.
const answer = await queryEngine.query({
  query: "Summarize policy 123",
});
console.log(answer.toString()); // never reached while the tool is stalled
```

The fix is not in this call but in the tool behind it: every tool handler must await its I/O and return a value.
If you see logs like:

```
Running tool: customerLookup
AgentWorker.step()
```

followed by nothing, you are almost always dealing with a stalled async boundary.
## Other Possible Causes

### 1. Missing await on the top-level chain call
If you fire-and-forget the promise in a serverless route or script, it can look like the chain is stuck because your process exits or your request never flushes.
```typescript
// Broken: the promise is never awaited, so the route or script
// moves on (or exits) before the chain finishes.
const responsePromise = queryEngine.query({ query: "What is covered?" });
console.log("done"); // logs before the actual result

// Fixed
const response = await queryEngine.query({ query: "What is covered?" });
console.log(response.toString());
```
### 2. Callback handler not completing
If you added a custom callback manager and forgot to signal completion, LlamaIndex can keep waiting for lifecycle events.
```typescript
import { CallbackManager } from "llamaindex";

// Broken: handler logs start but never closes out properly
const callbackManager = new CallbackManager({
  onEventStart: async (event) => {
    console.log("start", event);
  },
  onEventEnd: async (event) => {
    // missing implementation or throws silently
  },
});

// Fixed: ensure both start/end handlers resolve cleanly
const safeCallbackManager = new CallbackManager({
  onEventStart: async (event) => console.log("start", event),
  onEventEnd: async (event) => console.log("end", event),
});
```
### 3. Recursive tool loop with no stop condition

Agents can keep calling tools if your prompt or tool logic encourages repetition. You'll see repeated traces like `AgentWorker` / `ToolCall` / `AgentWorker` until the run times out.
```typescript
// Broken pattern: the prompt invites the model to call the tool again
const prompt = `
Use the search tool until you're fully certain.
If uncertain, call search again.
`;

// Fixed pattern: define a hard stop condition
const promptFixed = `
Use at most one search call.
If data is insufficient, say so.
`;
```
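A hard stop can also be enforced in code, independent of the prompt. The sketch below wraps a tool handler so it throws after a fixed number of invocations, turning a silent loop into a visible error. Both `withCallLimit` and `searchHandler` are hypothetical helpers, not LlamaIndex APIs:

```typescript
type ToolHandler = (input: string) => Promise<string>;

// Wrap a tool handler with a hard invocation cap. Once the cap is hit,
// the wrapper throws, which surfaces in the agent trace instead of
// letting the loop spin until an external timeout.
function withCallLimit(handler: ToolHandler, maxCalls = 3): ToolHandler {
  let calls = 0;
  return async (input: string) => {
    calls += 1;
    if (calls > maxCalls) {
      throw new Error(`Tool call limit of ${maxCalls} exceeded`);
    }
    return handler(input);
  };
}

// Hypothetical handler standing in for a real search tool.
const searchHandler: ToolHandler = async (query) => `results for: ${query}`;
const limitedSearch = withCallLimit(searchHandler, 2);
```

You would then pass `limitedSearch` to `FunctionTool.from` in place of the raw handler.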
### 4. Streaming response not consumed
If you use streaming APIs and never read the stream, some frameworks appear frozen because backpressure blocks completion.
```typescript
// Broken: the stream exists but is never consumed, so backpressure
// blocks completion.
const stream = await queryEngine.query({ query: "Explain claim status", stream: true });
```

```typescript
// Fixed: read the stream to completion
const stream = await queryEngine.query({ query: "Explain claim status", stream: true });
for await (const chunk of stream) {
  process.stdout.write(chunk.delta ?? "");
}
```
## How to Debug It

1. **Turn on verbose logging.** Enable LlamaIndex debug output and print every tool invocation. Look for the last successful event before the hang; if you stop at `AgentWorker`, inspect the next tool boundary.
2. **Isolate the chain from external I/O.** Replace real HTTP/database calls with hardcoded values. If the hang disappears, your issue is in the integration layer; reintroduce one dependency at a time.
3. **Test each tool directly.** Call your `FunctionTool` handler as a normal async function and confirm it returns within a timeout. If it hangs standalone, it will hang inside LlamaIndex too.
4. **Add explicit timeouts.** Wrap suspicious calls with `Promise.race`, so you get a real error instead of an endless wait.
```typescript
function withTimeout<T>(promise: Promise<T>, ms = 5000): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timeout after ${ms}ms`)), ms);
  });
  // Clear the timer either way so it can't keep the process alive.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```
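When the suspicious call is a `fetch`, you can go one step further than `Promise.race` and cancel the underlying request with the standard `AbortController` API, so the socket is actually released. This is a sketch; `fetchWithDeadline` is a hypothetical helper, not part of LlamaIndex:

```typescript
// Abort the underlying fetch when the deadline passes, instead of
// leaving the request dangling behind a lost promise.
async function fetchWithDeadline(url: string, ms = 5000): Promise<string> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    const res = await fetch(url, { signal: controller.signal });
    return await res.text();
  } finally {
    clearTimeout(timer); // don't let the timer keep the process alive
  }
}
```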
## Prevention

- Keep every tool function pure where possible: accept input, return output, no hidden side effects.
- Add timeouts around network and database calls before they enter LlamaIndex.
- Write one integration test per tool that asserts it resolves under load and failure conditions.
If you’re seeing chain execution stuck during development, don’t start by blaming LlamaIndex itself. In TypeScript projects, this is usually an unresolved promise, an incomplete callback lifecycle, or an agent prompt that allows infinite recursion.
## Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.