How to Fix 'async event loop error during development' in LlamaIndex (TypeScript)
If you’re seeing async event loop error during development while using LlamaIndex in TypeScript, you’re usually hitting a Node runtime mismatch, a bad async boundary, or code that works in one execution mode but not another. The error often shows up during local dev with ts-node, tsx, Next.js API routes, Jest, or when mixing top-level imports with async initialization.
In practice, this is rarely a “LlamaIndex bug”. It’s usually your app calling await in the wrong place, creating multiple loops/contexts, or running LlamaIndex code in an environment that doesn’t like long-lived async work.
The Most Common Cause
The #1 cause is running LlamaIndex initialization at module scope or inside a sync path that gets re-entered during development reloads.
With LlamaIndex TypeScript, classes like `OpenAIEmbedding`, `VectorStoreIndex`, and `Document` are fine. The problem is usually where you instantiate them.
Broken vs fixed pattern
| Broken | Fixed |
|---|---|
| Async work runs at import time or inside a sync handler | Async work runs inside an explicit async function |
| Can trigger event loop issues on hot reload / serverless / tests | Keeps initialization deterministic and isolated |
```ts
// ❌ Broken
import { Document, VectorStoreIndex } from "llamaindex";

const docs = [new Document({ text: "Hello world" })];

// This looks innocent, but if the file is imported multiple times
// during dev reloads, you can hit async/event-loop problems.
const index = await VectorStoreIndex.fromDocuments(docs);

export function search() {
  return index.asQueryEngine().query("What is this?");
}
```
```ts
// ✅ Fixed
import { Document, VectorStoreIndex } from "llamaindex";

const docs = [new Document({ text: "Hello world" })];

let indexPromise: Promise<VectorStoreIndex> | null = null;

function getIndex() {
  if (!indexPromise) {
    indexPromise = VectorStoreIndex.fromDocuments(docs);
  }
  return indexPromise;
}

export async function search() {
  const index = await getIndex();
  const engine = index.asQueryEngine();
  return await engine.query("What is this?");
}
```
The key change is simple:
- no top-level `await` for app logic unless your runtime explicitly supports it cleanly
- cache the promise if initialization is expensive
- keep query execution inside an `async` function
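The promise-caching idea can be demonstrated without LlamaIndex at all. Here is a minimal self-contained sketch, with a hypothetical `buildIndex` standing in for an expensive initializer like `VectorStoreIndex.fromDocuments`, showing why caching the promise (not the resolved value) means concurrent callers share a single initialization:

```ts
// Hypothetical stand-in for an expensive async initializer such as
// VectorStoreIndex.fromDocuments — counts how often it actually runs.
let initCount = 0;

async function buildIndex(): Promise<{ query: (q: string) => string }> {
  initCount++;
  // Simulate slow async setup (network calls, embeddings, etc.)
  await new Promise((resolve) => setTimeout(resolve, 10));
  return { query: (q) => `answer to: ${q}` };
}

// Cache the *promise*, not the resolved value, so two callers that
// arrive in the same tick never trigger two initializations.
let indexPromise: ReturnType<typeof buildIndex> | null = null;

export function getIndex() {
  if (!indexPromise) {
    indexPromise = buildIndex();
  }
  return indexPromise;
}
```

Even if two requests call `getIndex()` before the first build finishes, both await the same promise, so `buildIndex` runs exactly once.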
If you’re using a request handler, the same rule applies:
```ts
// ❌ Broken
export function GET() {
  const result = myQueryEngine.query("find policy terms");
  return Response.json({ result });
}
```

```ts
// ✅ Fixed
export async function GET() {
  const result = await myQueryEngine.query("find policy terms");
  return Response.json({ result });
}
```
Other Possible Causes
1) Mixing CommonJS and ESM incorrectly
LlamaIndex TypeScript expects your module system to be consistent. If your project flips between `require()` and `import`, dev-time execution can get weird fast.
```json
// package.json
{
  "type": "module"
}
```

```ts
// ✅ ESM style
import { Document } from "llamaindex";

// ❌ CommonJS style in an ESM project
const { Document } = require("llamaindex");
```
If you use TypeScript, make sure these align:
```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext"
  }
}
```
2) Running LlamaIndex code in Next.js server components or edge runtime
LlamaIndex relies on Node APIs that may not behave correctly in the Edge runtime.
```ts
// next.config.ts or route config
export const runtime = "nodejs"; // ✅ not "edge"

// ❌ Problematic in edge/server component contexts if it triggers async init oddly
import { VectorStoreIndex } from "llamaindex";
```
If you’re in Next.js:
- use `runtime = "nodejs"`
- keep indexing/querying in route handlers or server actions
- avoid initializing indexes directly in React server components
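Putting those rules together, a route handler might look like the sketch below. The `getQueryEngine` helper here is a hypothetical stub standing in for a promise-cached LlamaIndex engine; `Response.json` is the standard Fetch API available in Node 18+ and Next.js route handlers:

```ts
// Force the Node.js runtime: LlamaIndex needs Node APIs that the
// Edge runtime does not reliably provide.
export const runtime = "nodejs";

// Hypothetical stand-in for a promise-cached LlamaIndex query engine.
// In a real app this would come from your getIndex() helper.
async function getQueryEngine() {
  return {
    query: async (q: string) => ({ response: `stub answer for: ${q}` }),
  };
}

// All async work stays inside the handler, never at module scope.
export async function GET() {
  const engine = await getQueryEngine();
  const result = await engine.query("find policy terms");
  return Response.json({ answer: result.response });
}
```

The point is the shape: nothing async happens at import time, and the handler itself is `async` so every `await` has a home.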
3) Jest/Vitest environment not configured for async/ESM
Test runners can surface event loop issues because they sandbox modules differently from a plain Node process.
```ts
// vitest.config.ts
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    environment: "node",
    globals: true,
    threads: false,
  },
});
```
Common failure pattern:
```ts
// ❌ importing and initializing at file scope in tests
import { VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments(docs);
```
Move setup into hooks:
```ts
let index: VectorStoreIndex;

beforeAll(async () => {
  index = await VectorStoreIndex.fromDocuments(docs);
});
```
4) Multiple concurrent initializations of the same resources
If two requests race to build the same index, some dev setups will expose timing bugs as event loop errors.
```ts
// ❌ Two callers can create two indexes at once
let index: VectorStoreIndex | null = null;

export async function getOrCreateIndex() {
  if (!index) {
    index = await VectorStoreIndex.fromDocuments(docs);
  }
  return index;
}
```
Use a promise lock instead:
```ts
let indexPromise: Promise<VectorStoreIndex> | null = null;

export function getOrCreateIndex() {
  if (!indexPromise) {
    indexPromise = VectorStoreIndex.fromDocuments(docs);
  }
  return indexPromise;
}
```
How to Debug It
- **Find the first real stack frame**
  - Don't stop at the generic message.
  - Look for frames involving `VectorStoreIndex.fromDocuments`, `RetrieverQueryEngine`, or your framework entrypoint.
- **Check whether any LlamaIndex code runs at import time**
  - Search for top-level `await`
  - Search for `fromDocuments(...)` outside functions
  - Search for query calls outside request handlers
- **Confirm your runtime**
  - In Next.js, verify you are not on Edge: `export const runtime = "nodejs"`
  - In tests, confirm the Node environment: `test: { environment: "node" }`
- **Isolate the smallest repro**
  - Create one file that only does three things: create a `Document`, build a `VectorStoreIndex`, run one query.
  - If that works, the issue is your framework boundary, not LlamaIndex itself.
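Before blaming the library, it can also help to probe whether your dev runtime handles long-lived async work at all. The sketch below mirrors the shape of "build index, then query" with plain timers instead of llamaindex imports; if even this hangs or throws under your dev tool (ts-node, tsx, Jest), the execution environment is the problem:

```ts
// Minimal async probe: same build-then-query shape as LlamaIndex,
// but self-contained, so it isolates runtime/event-loop problems.
async function fakeBuild(): Promise<string[]> {
  await new Promise((resolve) => setTimeout(resolve, 50));
  return ["Hello world"];
}

async function fakeQuery(store: string[], q: string): Promise<string> {
  await new Promise((resolve) => setTimeout(resolve, 10));
  return store.find((doc) => doc.includes(q)) ?? "no match";
}

export async function probe(): Promise<string> {
  const store = await fakeBuild();
  return fakeQuery(store, "Hello");
}
```

If the probe runs cleanly but the real repro fails, the difference is almost always where initialization happens, which points you back to the import-time checklist above.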
Prevention
- Keep all LlamaIndex initialization inside explicit `async` functions.
- Cache promises for expensive objects like indexes and retrievers.
- Align your module system early: ESM with `"type": "module"`, TypeScript with `"module": "NodeNext"`.
- In Next.js and similar frameworks, force the Node runtime for LlamaIndex code paths.
- Avoid building indexes at import time unless you fully control process startup.
If you want one rule to remember: don’t let LlamaIndex do real work while your module is loading. Put setup behind an async boundary, and most of these event loop errors disappear.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.