# How to Fix 'async event loop error' in LlamaIndex (TypeScript)

By Cyprian Aarons · Updated 2026-04-21

Tags: async-event-loop-error, llamaindex, typescript

If you’re seeing async event loop error in LlamaIndex TypeScript, it usually means you called an async API from a context that can’t properly await it. In practice, this shows up when initializing indices, querying engines, or loading documents inside code paths that mix sync and async incorrectly.

The common pattern is simple: you created a Promise, but never awaited it, or you invoked LlamaIndex inside a constructor, top-level sync function, or callback that swallows the async boundary. The result is usually a runtime failure like Error: Event loop is already running, UnhandledPromiseRejection, or a query that never resolves.

## The Most Common Cause

The #1 cause is calling LlamaIndex async methods without await, especially around VectorStoreIndex.fromDocuments(), index.asQueryEngine(), or queryEngine.query().

Here’s the broken pattern:

**Broken:**

```ts
import { Document, VectorStoreIndex } from "llamaindex";

const docs = [
  new Document({ text: "Insurance policy covers fire damage." }),
];

// ❌ fromDocuments is async
const index = VectorStoreIndex.fromDocuments(docs);

const queryEngine = index.asQueryEngine();
const response = queryEngine.query("What does the policy cover?");
console.log(response);
```

**Fixed:**

```ts
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  const docs = [
    new Document({ text: "Insurance policy covers fire damage." }),
  ];

  // ✅ await the async factory
  const index = await VectorStoreIndex.fromDocuments(docs);

  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query("What does the policy cover?");
  console.log(response.toString());
}

main().catch(console.error);
```

The broken version returns a Promise from `fromDocuments()`, then immediately calls methods on it as if it were a ready index. That’s where you’ll see errors like:

- `TypeError: index.asQueryEngine is not a function`
- `UnhandledPromiseRejectionWarning`
- `Error: Cannot read properties of undefined`
- `Error: Event loop is already running`

In LlamaIndex TypeScript, assume these are async unless the docs explicitly say otherwise:

- `VectorStoreIndex.fromDocuments()`
- `storageContextFromDefaults()`
- many retriever/query operations
- external model calls wrapped by agents or tools
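To see why the unawaited call fails with `TypeError: ... is not a function`, here's a minimal sketch of the same mechanics with no LlamaIndex dependency — `FakeIndex` is a hypothetical stand-in for `VectorStoreIndex`:

```ts
class FakeIndex {
  // Stand-in for index.asQueryEngine() in LlamaIndex.
  asQueryEngine() {
    return { query: async (q: string) => `answer to: ${q}` };
  }

  // Async factory, mirroring VectorStoreIndex.fromDocuments().
  static async fromDocuments(_docs: string[]): Promise<FakeIndex> {
    return new FakeIndex();
  }
}

// ❌ Without await: `pending` is a Promise<FakeIndex>, and a Promise has no
// asQueryEngine method, so calling it would throw a TypeError.
const pending = FakeIndex.fromDocuments(["doc"]);
console.log(typeof (pending as any).asQueryEngine); // "undefined"

// ✅ With await: the resolved index works as expected.
(async () => {
  const index = await FakeIndex.fromDocuments(["doc"]);
  const engine = index.asQueryEngine();
  console.log(await engine.query("What does the policy cover?"));
})();
```

The Promise wrapper is the whole bug: every method you expected lives on the resolved value, not on the Promise itself.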

## Other Possible Causes

### 1) Top-level code running in a sync entrypoint

If you call async LlamaIndex code directly at module scope, Node may not handle the lifecycle cleanly.

```ts
// ❌ Broken: module-scope call, Promise never awaited
const index = VectorStoreIndex.fromDocuments(docs);
const engine = index.asQueryEngine();
```

```ts
// ✅ Fixed: wrap startup in an async function
async function bootstrap() {
  const index = await VectorStoreIndex.fromDocuments(docs);
  const engine = index.asQueryEngine();
}

bootstrap().catch(console.error);
```

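If your project runs as ESM (`"type": "module"` in package.json), Node also supports top-level `await` in modules, so a bootstrap wrapper is optional. A minimal sketch — `loadDocs()` is a hypothetical stand-in for any async LlamaIndex setup call:

```ts
// In an ES module (.mts, or "type": "module"), top-level await is legal.
// loadDocs() stands in for an async setup call like fromDocuments().
async function loadDocs(): Promise<string[]> {
  return ["Insurance policy covers fire damage."];
}

// Module evaluation pauses here until the Promise resolves.
const docs = await loadDocs();
console.log(docs.length); // 1
```

Note this only works in ES modules; in CommonJS, top-level `await` is a syntax error, which is another reason to keep the module settings below consistent.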
### 2) Mixing CommonJS and ESM incorrectly

LlamaIndex TypeScript projects often run into module issues when `package.json` and `tsconfig.json` disagree — for example, `package.json` declaring ESM:

```json
{
  "type": "module"
}
```

while `tsconfig.json` compiles to CommonJS:

```json
{
  "compilerOptions": {
    "module": "CommonJS"
  }
}
```

That mismatch can produce runtime weirdness that looks like an event loop issue but is really a module-loading failure. Keep the two files aligned — `"type": "module"` in `package.json`, with a matching `tsconfig.json`:

```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext"
  }
}
```

### 3) Forgetting to await tool/agent execution

Agents and tools often return Promises. If you log them directly, you’ll get unresolved objects instead of answers.

```ts
// ❌ Broken: logs a pending Promise, not the answer
const response = agent.chat("Summarize this claim.");
console.log(response); // Promise { <pending> }
```

```ts
// ✅ Fixed
const response = await agent.chat("Summarize this claim.");
console.log(response.toString());
```

If you’re using classes like `OpenAIAgent`, `ReActAgent`, or tool runners, treat every call as async until proven otherwise.

### 4) Running inside a framework callback that isn’t async-safe

This happens in Express middleware, cron jobs, serverless handlers, or test hooks.

```ts
// ❌ Broken: res.json receives a pending Promise, which serializes to {}
app.get("/query", (req, res) => {
  const result = engine.query(req.query.q as string);
  res.json(result);
});
```

```ts
// ✅ Fixed
app.get("/query", async (req, res) => {
  const result = await engine.query(req.query.q as string);
  res.json({ answer: result.toString() });
});
```

If your handler doesn’t return or await the promise chain correctly, Node can tear down work before LlamaIndex finishes.
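In Express 4, a rejection thrown inside an async handler never reaches the error middleware on its own; a small wrapper that forwards it to `next()` is a common fix. Here's a sketch in plain TypeScript — the `Req`/`Res`/`Next` types are simplified stand-ins for Express's, so the snippet runs without the express package:

```ts
// Simplified stand-ins for Express's Request/Response/NextFunction types.
type Req = { query: Record<string, string> };
type Res = { json: (body: unknown) => void };
type Next = (err?: unknown) => void;
type Handler = (req: Req, res: Res, next: Next) => unknown;

// Wrap an async handler so a rejected promise reaches the error
// middleware via next(err) instead of dying as an unhandled rejection.
const asyncHandler = (fn: Handler): Handler => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

// Usage sketch: app.get("/query", asyncHandler(async (req, res) => { ... }))
const handler = asyncHandler(async (_req, res) => {
  res.json({ answer: "ok" });
});
```

Express 5 propagates async errors automatically, but the wrapper keeps the behavior explicit either way.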

## How to Debug It

1. **Find the first async LlamaIndex call.** Search for `fromDocuments`, `query`, `chat`, `run`, or any method returning a Promise, and add explicit `await` everywhere until the error disappears.
2. **Check whether you’re using a real async boundary.** Your code should look like `async function main() { ... }` invoked with `main().catch(...)` or `await main()`. If everything is happening at module scope, move it into an async function.
3. **Inspect the actual runtime error.** If you see `UnhandledPromiseRejection`, `TypeError: ... is not a function`, or `Cannot read properties of undefined`, you likely used an unresolved Promise as if it were a concrete object.
4. **Verify Node and TypeScript module settings.** Confirm your project uses one module system consistently: check `"type"` in `package.json` and `"module"`/`"moduleResolution"` in `tsconfig.json`. Mismatches often look like async failures but are loader issues.
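You can also make the linter do step 1 for you. Assuming your project uses typescript-eslint with type-aware linting enabled, the `@typescript-eslint/no-floating-promises` rule flags every Promise that is neither awaited nor handled — a sketch of the relevant `.eslintrc.json` fragment:

```json
{
  "parser": "@typescript-eslint/parser",
  "parserOptions": { "project": "./tsconfig.json" },
  "plugins": ["@typescript-eslint"],
  "rules": {
    "@typescript-eslint/no-floating-promises": "error"
  }
}
```

With this in place, the broken examples above fail at lint time instead of at runtime.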

## Prevention

- Wrap all LlamaIndex startup code in an explicit async `main()` and call it with `.catch(console.error)`.
- Treat every indexing/querying/agent call as async unless the API docs show otherwise.
- Keep your Node module config consistent: ESM with ESM, CommonJS with CommonJS.
- In server code, make route handlers and job processors fully async end-to-end.

If you standardize on one pattern — async bootstrap, awaited LlamaIndex calls, consistent module settings — this class of error goes away fast. The bug usually isn’t in LlamaIndex itself; it’s in how the promise chain is being handled around it.

