# How to Fix 'callback not firing' in LlamaIndex (TypeScript)

By Cyprian Aarons · Updated 2026-04-21

Tags: callback-not-firing, llamaindex, typescript

If you’re seeing a “callback not firing” problem in a LlamaIndex TypeScript app, it usually means your event handler is wired up, but the code path never actually reaches it. In practice, this shows up when using `QueryEngine`, `Retriever`, or custom tools, where you expect `CallbackManager` events like `onRetrieveStart`, `onLLMEnd`, or `onQueryEnd` to run but nothing happens.

Most of the time, the problem is not LlamaIndex itself. It’s one of a few wiring issues: wrong callback registration, using an API that bypasses callbacks, or forgetting to pass the callback manager into the object that actually executes the work.

## The Most Common Cause

The #1 cause is attaching callbacks to the wrong object.

In LlamaIndex TypeScript, callbacks only fire if the component doing the work was created with the callback manager. If you add handlers on one instance but execute queries through another, you’ll get silent failure or missing events like:

- `CallbackManager.onEvent`
- `LlamaDebugHandler`
- `onRetrieveStart`
- `onLLMEnd`

Here’s the broken pattern:

```ts
import { CallbackManager, OpenAI, VectorStoreIndex } from "llamaindex";

const callbackManager = new CallbackManager();
callbackManager.addHandler({
  onEvent(event) {
    console.log("event:", event);
  },
});

const llm = new OpenAI({ model: "gpt-4o-mini" });
const index = await VectorStoreIndex.fromDocuments(docs);

// callback manager never attached to the query engine
const queryEngine = index.asQueryEngine({ llm });

await queryEngine.query("What is in these docs?");
```

And here is the fixed version:

```ts
import { CallbackManager, OpenAI, VectorStoreIndex } from "llamaindex";

const callbackManager = new CallbackManager();
callbackManager.addHandler({
  onEvent(event) {
    console.log("event:", event);
  },
});

// the same manager is attached to every component that does work
const llm = new OpenAI({ model: "gpt-4o-mini", callbackManager });
const index = await VectorStoreIndex.fromDocuments(docs);

const queryEngine = index.asQueryEngine({ llm, callbackManager });

await queryEngine.query("What is in these docs?");
```

The fix is simple: attach the same `CallbackManager` to every object that participates in retrieval or generation.

A lot of developers miss this because they assume global registration exists. It doesn’t work that way in most LlamaIndex TypeScript paths.
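The per-instance rule can be illustrated without the real library. The sketch below uses stand-in `MiniCallbackManager` and `MiniLLM` classes (not actual llamaindex exports) to show why only the instance constructed with the manager ever emits events:

```typescript
// Stand-in types to illustrate the wiring rule; these are NOT the real
// llamaindex classes, just a sketch of the per-instance pattern.
type Handler = { onEvent: (event: string) => void };

class MiniCallbackManager {
  private handlers: Handler[] = [];
  addHandler(h: Handler) {
    this.handlers.push(h);
  }
  emit(event: string) {
    this.handlers.forEach((h) => h.onEvent(event));
  }
}

// Each component only emits through the manager it was constructed with.
class MiniLLM {
  constructor(private cb?: MiniCallbackManager) {}
  complete(prompt: string): string {
    this.cb?.emit("llm:start");
    return `echo: ${prompt}`;
  }
}

const shared = new MiniCallbackManager();
const events: string[] = [];
shared.addHandler({ onEvent: (e) => { events.push(e); } });

new MiniLLM().complete("no manager"); // emits nothing: no manager attached
new MiniLLM(shared).complete("wired up"); // emits through the shared manager

console.log(events); // only the wired instance fired
```

There is no global registry here, and there is none in the broken example above either: an instance created without the manager is simply silent.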

## Other Possible Causes

### 1) You’re using an API path that doesn’t emit callbacks

Some methods are thin wrappers and may bypass the instrumentation path you expect.

```ts
// May not trigger your expected retrieval callbacks
const response = await index.asQueryEngine().query("Summarize this");
```

Use the instrumented path explicitly:

```ts
const queryEngine = index.asQueryEngine({ callbackManager });
const response = await queryEngine.query("Summarize this");
```

### 2) Your handler implements the wrong interface shape

If your handler methods don’t match what LlamaIndex expects, nothing will fire.

```ts
// Wrong: method names don't match expected callback hooks
callbackManager.addHandler({
  start(event) {
    console.log(event);
  },
});
```

Use the actual hook names supported by your version:

```ts
callbackManager.addHandler({
  onEvent(event) {
    console.log(event);
  },
});
```

If you’re using a version with more specific hooks, verify them against your installed package types. TypeScript should help here if you let it.
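One way to let the compiler do that work is the `satisfies` operator (TypeScript 4.9+). The `CallbackHandler` interface below is a local stand-in for whatever handler type your installed llamaindex version actually exports; the point is the pattern, not the exact shape:

```typescript
// Local stand-in for the handler shape your installed version expects.
interface CallbackHandler {
  onEvent?: (event: { type: string }) => void;
}

// Compiles: the method name matches the interface.
const handler = {
  onEvent(event: { type: string }) {
    console.log("event:", event.type);
  },
} satisfies CallbackHandler;

// If you had written `start(event) { ... }` instead, `satisfies` would
// reject the object literal at compile time, because excess-property
// checking flags method names that aren't part of the declared shape.
handler.onEvent({ type: "retrieve-start" });
```

Unlike a plain type annotation, `satisfies` keeps the literal's inferred type while still checking it against the interface, so a typo'd hook name fails the build instead of failing silently at runtime.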

### 3) Async work is exiting before callbacks flush

If your process exits immediately after starting a query, logs can look like callbacks never fired.

```ts
void queryEngine.query("Hello");
// process may end before logs appear
```

Fix by awaiting the promise:

```ts
await queryEngine.query("Hello");
```

This matters in scripts, serverless functions, and test runners where execution ends fast.
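You can reproduce the symptom with a plain async function and no library at all; `fakeQuery` below just simulates the latency of a real query:

```typescript
// Simulates a query whose "callback" (the log) only runs after some latency.
async function fakeQuery(q: string): Promise<string> {
  await new Promise((resolve) => setTimeout(resolve, 50));
  console.log("callback fired for:", q);
  return "answer";
}

async function main() {
  // Fire-and-forget: in a short-lived script or serverless handler, the
  // runtime can tear down before the log above ever runs.
  void fakeQuery("hello");

  // Awaiting guarantees the callback has run before execution continues.
  const answer = await fakeQuery("hello again");
  console.log("got:", answer);
}

main();
```

In long-running servers the first form often *appears* to work, because the process stays alive long enough for the promise to settle, which is exactly why the bug surfaces only in scripts and tests.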

### 4) You initialized one LLM but queried through another

This happens when a helper function creates its own default client internally.

```ts
const llm = new OpenAI({ model: "gpt-4o-mini", callbackManager });

const queryEngine = index.asQueryEngine();
// internally uses a default LLM without your callback manager
await queryEngine.query("Explain clause 7");
```

Pass the configured model through:

```ts
const queryEngine = index.asQueryEngine({
  llm,
  callbackManager,
});
```
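A general way to avoid hidden defaults is to thread shared dependencies through one factory, so nothing downstream constructs its own client. This is a sketch with illustrative names, not a llamaindex API:

```typescript
// Illustrative dependency bundle; in a real app these would be your
// configured OpenAI instance and shared CallbackManager.
interface Deps {
  llm: string;
  callbackManager: string;
}

// Every engine built here receives the same instances explicitly,
// so there is no code path that falls back to an unconfigured default.
function makeQueryEngine(deps: Deps) {
  return {
    query: (q: string) =>
      `answered "${q}" via ${deps.llm} + ${deps.callbackManager}`,
  };
}

const engine = makeQueryEngine({
  llm: "configured-llm",
  callbackManager: "shared-manager",
});
console.log(engine.query("Explain clause 7"));
```

The design choice is simple: if a helper needs an LLM, it takes one as a parameter rather than creating it.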

## How to Debug It

1. Confirm your handler is registered.
   - Add a raw log inside `onEvent`.
   - If nothing prints, your handler is not attached to the active execution path.
2. Check which object owns execution.
   - Inspect whether `OpenAI`, `VectorStoreIndex`, `RetrieverQueryEngine`, or custom tools were created with the same `callbackManager`.
   - In TypeScript apps, one unconfigured instance is enough to break tracing.
3. Force a known-good event.
   - Use a simple query and a minimal handler.
   - If even `queryEngine.query("test")` produces no events, your wiring is wrong rather than your prompt or data.
4. Verify package version and types.
   - Run `npm ls llamaindex`.
   - Then check whether your installed version expects different callback hook names or constructor options.
   - A mismatch between docs and installed package is a common source of “it should work” bugs.

## Prevention

- Pass a single shared `CallbackManager` through every LlamaIndex component that does retrieval or generation.
- Keep a tiny smoke test that asserts at least one callback fires on a dummy query.
- Pin your llamaindex version and check its TypeScript types before copying examples from older docs or blog posts.
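The smoke test doesn't need to be elaborate. The sketch below uses a mocked manager so it runs anywhere; in your suite, `runDummyQuery` would instead build your real pipeline with the shared `CallbackManager` and issue one cheap query:

```typescript
type Events = string[];

// Stand-in for "wire up the real pipeline and run one dummy query".
function runDummyQuery(): Events {
  const seen: Events = [];
  const manager = {
    handlers: [] as Array<(e: string) => void>,
    addHandler(fn: (e: string) => void) {
      this.handlers.push(fn);
    },
    emit(e: string) {
      this.handlers.forEach((fn) => fn(e));
    },
  };
  manager.addHandler((e) => seen.push(e));

  // In the real test this would be queryEngine.query("test") with your
  // shared CallbackManager threaded through llm, index, and engine.
  manager.emit("query:start");
  manager.emit("query:end");
  return seen;
}

const observed = runDummyQuery();
if (observed.length === 0) {
  throw new Error("No callbacks fired — check CallbackManager wiring");
}
console.log("smoke test passed:", observed);
```

Because the failure mode is silence, an assertion that *something* fired catches a broken wiring change immediately, before it reaches production tracing.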

If you’re still stuck, reduce it to one file: create one `CallbackManager`, one `OpenAI`, one `VectorStoreIndex`, one `asQueryEngine()` call. If callbacks fire there, the bug is in how your app composes instances.



By Cyprian Aarons, AI Consultant at Topiax.
