How to Fix 'chain execution stuck' in LlamaIndex (TypeScript)
When LlamaIndex reports "chain execution stuck", it usually means your workflow never reached a terminal step, or a step is waiting on something that never resolves. In TypeScript, this most often shows up when you mix async steps, forget to return from a handler, or create a loop in your chain wiring.
In practice, this error appears during query pipelines, agent workflows, or custom Workflow/Chain implementations where the runtime is waiting for a promise that never settles.
The Most Common Cause
The #1 cause is an async step that does not return a value or does not resolve properly. In LlamaIndex TypeScript workflows, each step needs to either return the next payload or explicitly complete; if you await something forever, or forget return, the engine keeps waiting and eventually reports that execution is stuck.
Here’s the broken pattern:
```typescript
import { Workflow } from "llamaindex";

const workflow = new Workflow();

workflow.addStep("retrieve", async (ctx) => {
  const docs = await ctx.get("docs");
  // BUG: missing return — the runtime never receives a transition
  await processDocs(docs);
});

workflow.addStep("respond", async (ctx) => {
  const result = await ctx.get("retrieved");
  return { answer: result };
});
```
And here’s the fixed version:
```typescript
import { Workflow } from "llamaindex";

const workflow = new Workflow();

workflow.addStep("retrieve", async (ctx) => {
  const docs = await ctx.get("docs");
  const processed = await processDocs(docs);
  // Return the payload and name the next step explicitly
  return {
    retrieved: processed,
    next: "respond",
  };
});

workflow.addStep("respond", async (ctx) => {
  const result = await ctx.get("retrieved");
  return { answer: result, done: true };
});
```
The difference is simple:
| Broken | Fixed |
|---|---|
| Step finishes without returning anything | Step returns the next payload |
| Runtime waits for a completion signal that never arrives | Runtime gets an explicit transition |
| Forgotten `return` in an async handler | Explicit `next` / `done` flow |
If you’re using a higher-level API like AgentWorkflow, the same rule applies: every tool call and intermediate step must resolve with a usable output.
Other Possible Causes
1. A promise that never resolves
A hanging network call, file read, or tool invocation will block the chain.
```typescript
// Broken: this promise never settles, so the step can never complete
await new Promise(() => {});
```
Fix it by adding timeouts around external calls:
```typescript
const timeout = <T>(p: Promise<T>, ms: number) =>
  Promise.race([
    p,
    new Promise<never>((_, reject) =>
      setTimeout(() => reject(new Error("timeout")), ms)
    ),
  ]);

await timeout(fetchData(), 5000);
```
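One caveat with the race-based helper above: its timer keeps running even after the wrapped promise settles, which can keep a Node.js process alive. Here is a sketch that clears the timer either way; the `withTimeout` name is my own, not part of the LlamaIndex API:

```typescript
// Assumption: a generic utility, not part of LlamaIndex.
function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const limit = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  // Clear the timer whichever promise wins, so it cannot keep the process alive
  return Promise.race([p, limit]).finally(() => {
    if (timer !== undefined) clearTimeout(timer);
  });
}
```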
2. Circular step routing
If step A routes to B and B routes back to A with no exit condition, LlamaIndex will keep executing until it detects the workflow is stuck.
```typescript
// Broken routing: A and B hand off to each other with no exit condition
if (needsMoreWork) return { next: "stepA" };
return { next: "stepB" };
```
Add a clear stop condition:
```typescript
// Fixed: cap the number of round trips
if (attempts >= 3) {
  return { done: true };
}
return { next: "stepB" };
```
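The `attempts` counter has to live somewhere that survives across iterations. A minimal self-contained sketch of the loop-guard pattern; the step names and the `route` function are illustrative, not LlamaIndex API:

```typescript
type Transition = { next: string } | { done: true };

// Illustrative router: loops back to "stepA" until a cap is hit, then terminates.
function route(state: { attempts: number; needsMoreWork: boolean }): Transition {
  state.attempts += 1;
  if (state.attempts >= 3) return { done: true };    // hard exit condition
  if (state.needsMoreWork) return { next: "stepA" }; // loop back
  return { next: "stepB" };                          // normal progress
}
```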
3. Tool function throws but error is swallowed
In agent flows, an exception inside a tool can look like a silent stall if you catch it and do nothing.
```typescript
// Broken: the error disappears and the chain appears to stall
try {
  await tool.call(input);
} catch (e) {
  // swallowed
}
```
Log and rethrow:
```typescript
try {
  await tool.call(input);
} catch (e) {
  console.error("Tool failed:", e);
  throw e; // let the workflow surface the failure instead of hanging
}
```
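If the same pattern repeats across many tools, a small wrapper keeps the logging consistent. This is a sketch of my own; the `Tool` shape here is an assumption, not the LlamaIndex tool interface:

```typescript
// Assumed minimal tool shape for illustration only.
type Tool = { name: string; call: (input: string) => Promise<string> };

// Run a tool, logging failures before rethrowing so the agent loop can react.
async function callToolLoudly(tool: Tool, input: string): Promise<string> {
  try {
    return await tool.call(input);
  } catch (e) {
    console.error(`Tool "${tool.name}" failed:`, e);
    throw e; // never swallow: a swallowed error looks like a stall
  }
}
```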
4. Context keys don’t match between steps
A step waits for data under one key, but another step writes to a different key name.
```typescript
// Broken: the producer writes "retrivedDocs" (note the typo)...
return { retrivedDocs: docs };

// ...while the consumer waits on "retrievedDocs", which never arrives
const docs = await ctx.get("retrievedDocs");
```
Make the contract explicit:
```typescript
type ChainState = {
  retrievedDocs: string[];
};

// With a shared state type, the typo becomes a compile-time error
return { retrievedDocs: docs };
```
How to Debug It
- **Check the last step that logged output.** Add logs before and after every step. If you see "entered step" but not "exiting step", that's your hang point.
- **Verify every async path returns.** Look for a missing `return` inside `async` handlers, and watch for branches like `if (condition) { await doWork(); }` that may end without returning state.
- **Inspect tool calls and external IO.** Wrap API calls with timeouts. If the stuck execution disappears after adding a timeout, your issue is outside LlamaIndex.
- **Reduce to a two-step workflow.** Remove tools, retrievers, memory, and retries; keep only one producer step and one consumer step. If the minimal version works, add components back one at a time until it breaks.
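The enter/exit logging above is easy to apply everywhere with a tiny wrapper around each handler. The names here (`traced`, `StepHandler`) are my own, not LlamaIndex API:

```typescript
type StepHandler<I, O> = (input: I) => Promise<O>;

// Wrap any async step so you always see "entered" and, crucially, "exiting".
function traced<I, O>(name: string, handler: StepHandler<I, O>): StepHandler<I, O> {
  return async (input: I) => {
    console.log(`entered step: ${name}`);
    try {
      const out = await handler(input);
      console.log(`exiting step: ${name}`);
      return out;
    } catch (e) {
      console.log(`step threw: ${name}`, e);
      throw e; // surface failures instead of stalling silently
    }
  };
}
```

A step that hangs will print "entered step" with no matching "exiting step", pinpointing the stall.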
Prevention
- Always make workflow transitions explicit: return `{ next: "stepName" }` or `{ done: true }`.
- Put timeouts on every external dependency: HTTP calls, vector DB queries, file reads, model invocations.
- Type your state object to catch mismatched keys like `retrivedDocs` vs `retrievedDocs` at compile time.
If you’re seeing chain execution stuck, start with the first thing I’d check in production code: an async handler that doesn’t return state. In TypeScript LlamaIndex workflows, that’s usually the real bug hiding behind the error message.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.