How to Fix 'callback not firing during development' in LangChain (TypeScript)

By Cyprian Aarons · Updated 2026-04-21

If you see callback not firing during development in a LangChain TypeScript app, it usually means your callback handler is wired correctly in theory, but the code path you expect is never actually invoking it. In practice, this shows up during local testing with ChatOpenAI, RunnableSequence, agents, or custom tools when the callback manager isn’t attached where the model or runnable is actually executed.

The bug is almost always one of three things: the callback is attached to the wrong object, the async flow exits early, or dev tooling is changing how the code runs. Here’s how to fix it without guessing.

The Most Common Cause

The #1 cause is attaching callbacks to a wrapper object and assuming they propagate automatically. In LangChain JS/TS, callbacks need to be passed to the runnable/model invocation path that actually emits events.

Broken vs fixed

• Broken: callbacks passed to the constructor or to the outer chain only. Fixed: callbacks passed to .invoke() / .stream(), the call that actually executes.
• Broken: a wrapper is created, but the inner runnable never receives the handlers. Fixed: handlers attached at the exact execution boundary.
import { ChatOpenAI } from "@langchain/openai";
import { CallbackManager } from "@langchain/core/callbacks/manager";

const cb = CallbackManager.fromHandlers({
  handleLLMStart: async () => console.log("LLM started"),
  handleLLMEnd: async () => console.log("LLM ended"),
});

// ❌ Broken: callback may not fire as expected during execution
const modelBroken = new ChatOpenAI({
  modelName: "gpt-4o-mini",
  callbacks: [cb],
});

await modelBroken.invoke("Write a haiku");

// ✅ Fixed: pass callbacks at invocation time
const modelFixed = new ChatOpenAI({
  modelName: "gpt-4o-mini",
});

await modelFixed.invoke("Write a haiku", {
  callbacks: [cb],
});

If you’re using runnables, the same rule applies:

import { RunnableSequence } from "@langchain/core/runnables";
import { PromptTemplate } from "@langchain/core/prompts";

const chain = RunnableSequence.from([
  PromptTemplate.fromTemplate("Summarize in one line: {input}"),
  modelFixed,
]);

const chainInput = { input: "LangChain callbacks" };

// ❌ Broken if your expectation is that constructor-level callbacks always propagate
// await chain.invoke(chainInput);

// ✅ Fixed: attach callbacks where the chain actually runs
await chain.invoke(chainInput, {
  callbacks: [cb],
});
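
If you'd rather not repeat the callbacks at every call site, runnables also expose withConfig, which binds a config (including callbacks) to the runnable itself. A brief sketch, reusing chain, cb, and chainInput from above:

// Bind callbacks once; every later invoke()/stream() on this runnable inherits them.
const chainWithCallbacks = chain.withConfig({ callbacks: [cb] });

await chainWithCallbacks.invoke(chainInput);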

A lot of developers hit this with handleLLMStart and then think LangChain is broken. It isn’t; the handler just isn’t attached to the actual run.

Other Possible Causes

1) You’re using stream() but expecting invoke() behavior

Streaming uses different event timing. If your handler only logs in handleLLMEnd, it can look like nothing fired when a stream is started in dev but never fully consumed.

const stream = await model.stream("Tell me a joke", {
  callbacks: [cb],
});

// If you never consume the stream, end events may never arrive.
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content ?? ""));
}
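
If you want visible proof that events fire while the stream is still in flight, add handleLLMNewToken, which is called for each generated chunk. A minimal sketch, reusing the model from above:

const streamingCb = CallbackManager.fromHandlers({
  handleLLMStart: async () => console.log("stream started"),
  // Fires for each generated token while the stream is being consumed.
  handleLLMNewToken: async (token) => console.log("token:", token),
  handleLLMEnd: async () => console.log("stream ended"),
});

const tokenStream = await model.stream("Tell me a joke", { callbacks: [streamingCb] });

for await (const _chunk of tokenStream) {
  // Consuming the stream is what drives the token and end events.
}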

2) Your tool/function never gets called

If you’re debugging agent callbacks like handleToolStart, but the agent route never selects that tool, your callback won’t fire.

import { DynamicStructuredTool } from "@langchain/core/tools";
import { z } from "zod";

const tool = new DynamicStructuredTool({
  name: "lookup_customer",
  description: "Lookup customer by ID",
  schema: z.object({ customerId: z.string() }),
  func: async () => "customer data",
});

// If the agent doesn't choose this tool, no tool callback fires.

Check whether your prompt and tool descriptions actually make tool usage likely.
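
A quick way to see what the agent is actually doing is to attach handlers for the agent and tool events and watch which ones fire. The handler methods below are part of the standard callback interface; agentExecutor is a placeholder for whatever object runs your agent:

const agentCb = CallbackManager.fromHandlers({
  // Fires when the agent decides to call a tool.
  handleAgentAction: async (action) => console.log("agent chose tool:", action.tool),
  handleToolStart: async (_tool, input) => console.log("tool input:", input),
  handleToolEnd: async (output) => console.log("tool output:", output),
});

// Pass it where the agent actually runs, e.g.:
// await agentExecutor.invoke({ input: "Look up customer 42" }, { callbacks: [agentCb] });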

3) You’re on an old LangChain package mix

LangChain JS moved fast across langchain, @langchain/core, and provider packages like @langchain/openai. A version mismatch can produce weird runtime behavior where handlers don’t behave consistently.

{
  "dependencies": {
    "langchain": "^0.2.0",
    "@langchain/core": "^0.2.0",
    "@langchain/openai": "^0.2.0"
  }
}

Make sure these are aligned; running npm ls @langchain/core will show whether more than one copy of core ended up in your dependency tree. Mixed major/minor versions are a common source of “works in prod, not in dev” confusion.

4) Dev server hot reload is recreating state

In Next.js, Vite, or tsx watch mode, module reloads can recreate module-scoped objects between runs. If your callback manager lives in module scope and gets reinitialized mid-run, it can look like callbacks vanished.

// ❌ Avoid relying on mutable module state during HMR
let cbInstance = CallbackManager.fromHandlers({ /* ... */ });

// ✅ Create per-request / per-run instances
export function createCallbacks() {
  return CallbackManager.fromHandlers({ /* ... */ });
}

If you’re seeing logs disappear after file edits, this is often the reason.
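
With the factory in place, build the callbacks inside the request handler so each run gets its own instances. A sketch assuming a Next.js App Router route handler; the import path and handler shape are illustrative:

import { ChatOpenAI } from "@langchain/openai";
import { createCallbacks } from "./callbacks"; // wherever the factory above lives

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Fresh handlers per request, so a hot reload can't strand shared state mid-run.
  const model = new ChatOpenAI({ modelName: "gpt-4o-mini" });
  const result = await model.invoke(prompt, { callbacks: [createCallbacks()] });

  return Response.json({ output: result.content });
}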

How to Debug It

  1. Confirm which method is actually running

    • Add a plain console.log("before invoke") and console.log("after invoke").
    • If neither appears, your code path isn’t being hit.
    • If both appear but no callback logs show up, it’s a wiring issue.
  2. Attach a minimal handler

    • Use only handleLLMStart, handleLLMEnd, and handleChainError.
    • Don’t start with tracing frameworks or custom wrappers.
    • Example:
      const cb = CallbackManager.fromHandlers({
        handleLLMStart: async () => console.log("start"),
        handleLLMEnd: async () => console.log("end"),
        handleChainError: async (e) => console.log("error", e.message),
      });
      
  3. Move callbacks to the exact invocation call

    • Test both:
      await runnable.invoke(input, { callbacks: [cb] });
      
    • And:
      await runnable.stream(input, { callbacks: [cb] });
      
    • If one works and the other doesn’t, you’ve found an execution-path mismatch; a combined minimal repro sketch follows this list.
  4. Verify package versions and runtime

    • Print versions from lockfile or package manager output.
    • Confirm Node version matches what LangChain supports.
    • Check whether HMR/dev server is recreating objects between runs.
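
Steps 1-3 fit into one standalone script. If "start" and "end" print here but not in your app, the difference is in where your app attaches the callbacks. A sketch assuming OPENAI_API_KEY is set and an ESM/tsx runtime that allows top-level await:

import { ChatOpenAI } from "@langchain/openai";
import { CallbackManager } from "@langchain/core/callbacks/manager";

const cb = CallbackManager.fromHandlers({
  handleLLMStart: async () => console.log("start"),
  handleLLMEnd: async () => console.log("end"),
  handleChainError: async (e) => console.log("error", e.message),
});

const model = new ChatOpenAI({ modelName: "gpt-4o-mini" });

console.log("before invoke");
const res = await model.invoke("Say hi", { callbacks: [cb] });
console.log("after invoke:", res.content);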

Prevention

  • Pass callbacks at the point of execution (invoke, stream, agent run), not just at construction time.
  • Keep LangChain packages version-aligned across langchain, @langchain/core, and provider packages.
  • In dev environments with hot reload, create callback managers per request or per run instead of storing them globally.

If you want one rule to remember: when a LangChain callback doesn’t fire in TypeScript development, assume attachment scope first, not framework magic. Most fixes are about making sure the handler sits on the exact runnable or model call that emits the event.


By Cyprian Aarons, AI Consultant at Topiax.