How to Fix 'callback not firing' in LangGraph (TypeScript)

By Cyprian Aarons. Updated 2026-04-21.
Tags: callback-not-firing, langgraph, typescript

What “callback not firing” usually means

In LangGraph TypeScript, this usually means your graph ran, but the event handler, streaming callback, or node-level callback you expected never got invoked. In practice, it shows up when using graph.stream(), graph.invoke(), or a custom RunnableConfig callback and the code path doesn’t actually support the callback shape you passed.

Most of the time, the bug is not in LangGraph itself. It’s a mismatch between how the graph is executed and how callbacks are registered.

The Most Common Cause

The #1 cause is passing callbacks in the wrong place, or expecting invoke() to behave like stream().

invoke() returns the final result. It does not emit per-step stream events. If your logic depends on onUpdate, token events, or node execution hooks, you need stream() or a properly wired callback handler.

Broken vs fixed

Broken pattern: uses invoke() and expects callbacks to fire.
Fixed pattern: uses stream() when you need incremental events.

Broken pattern: passes callbacks in an unsupported config shape.
Fixed pattern: passes callbacks through RunnableConfig.callbacks or uses stream iteration.

// BROKEN: invoke() will not emit streaming-style events
import { START, END, StateGraph } from "@langchain/langgraph";

const graph = new StateGraph({
  channels: {
    input: { value: null },
  },
});

// A graph needs at least one node and an entry point before compile() succeeds.
graph.addNode("agent", async (state) => state);
graph.addEdge(START, "agent");
graph.addEdge("agent", END);

const app = graph.compile();

await app.invoke(
  { input: "hello" },
  {
    callbacks: [
      {
        handleLLMNewToken(token: string) {
          console.log("token:", token);
        },
      },
    ],
  }
);

// Expected callback output never appears

// FIXED: use stream() for incremental events
import { START, END, StateGraph } from "@langchain/langgraph";

const graph = new StateGraph({
  channels: {
    input: { value: null },
  },
});

// A graph needs at least one node and an entry point before compile() succeeds.
graph.addNode("agent", async (state) => state);
graph.addEdge(START, "agent");
graph.addEdge("agent", END);

const app = graph.compile();

for await (const event of app.stream(
  { input: "hello" },
  {
    callbacks: [
      {
        handleLLMNewToken(token: string) {
          console.log("token:", token);
        },
      },
    ],
  }
)) {
  console.log("event:", event);
}

If you’re using a model inside a node, also make sure the model itself supports callbacks. A plain function node won’t magically emit LLM token events unless it calls an instrumented LangChain model.
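In LangGraph JS, a node function receives the run's config as its second argument, and forwarding that config into the model call is what lets callbacks registered at invoke/stream time reach the LLM. The sketch below shows the propagation pattern with hand-rolled stand-ins — `fakeModelInvoke` and the local `Handler`/`RunnableConfig` types are illustrative; in real code the config type comes from `@langchain/core` and the model is an instrumented LangChain chat model:

```typescript
type Handler = { handleLLMNewToken?: (token: string) => void };
type RunnableConfig = { callbacks?: Handler[] };

// Stand-in "model": emits one token per word to whatever handlers it is given.
const fakeModelInvoke = async (
  prompt: string,
  config?: RunnableConfig
): Promise<string> => {
  for (const token of prompt.split(" ")) {
    config?.callbacks?.forEach((h) => h.handleLLMNewToken?.(token));
  }
  return prompt.toUpperCase();
};

// A LangGraph-style node: receives (state, config). Forwarding config
// is what lets top-level callbacks reach the model call inside the node.
const callModel = async (
  state: { input: string },
  config?: RunnableConfig
) => {
  const output = await fakeModelInvoke(state.input, config);
  return { output };
};

const tokens: string[] = [];
const result = await callModel(
  { input: "hello there" },
  { callbacks: [{ handleLLMNewToken: (t) => tokens.push(t) }] }
);

console.log(tokens.join(" ")); // hello there
console.log(result.output); // HELLO THERE
```

Drop the `config` argument from the `fakeModelInvoke` call and the token handler silently stops firing — which is exactly the symptom this section describes.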

Other Possible Causes

1) The callback is attached to the wrong object

A common mistake is attaching callbacks to the graph instead of the runnable that actually produces tokens.

// Wrong: callback attached at top level, but LLM is created elsewhere
const app = graph.compile();

await app.invoke(input, {
  callbacks: [handler],
});

// Right: attach callbacks to the model or pass them into the node's runnable config
// (`llm` here is an instantiated LangChain chat model)
const model = llm.bind({
  // model-specific config here
});

await model.invoke("hello", {
  callbacks: [handler],
});

2) Your node returns before async work completes

If a node starts async work and forgets to await it, LangGraph may finish the step before your handler runs.

// Broken
const node = async () => {
  fetch("https://api.example.com/data").then(() => {
    console.log("callback-like side effect");
  });

  return { done: true };
};

// Fixed
const node = async () => {
  await fetch("https://api.example.com/data");
  console.log("side effect after completion");

  return { done: true };
};
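The difference is easy to see outside LangGraph too. In this self-contained sketch (no LangGraph involved, the timings are arbitrary), the un-awaited promise's side effect only lands after the "step" has already finished:

```typescript
const order: string[] = [];

// Fire-and-forget: the timer-backed promise is never awaited.
const leakyNode = async () => {
  new Promise<void>((resolve) => setTimeout(resolve, 10)).then(() => {
    order.push("side effect");
  });
  return { done: true };
};

await leakyNode();
order.push("step finished");
console.log(order.join(", ")); // step finished

// Give the leaked promise time to settle.
await new Promise((resolve) => setTimeout(resolve, 20));
console.log(order.join(", ")); // step finished, side effect
```

A real graph run can end (and its callback machinery be torn down) in that same window, so the side effect appears to "never fire."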

3) You’re swallowing errors in the node

If a node catches its own exceptions and returns malformed state, the real failure is hidden; LangGraph may only surface a generic runtime error later, like:

  • Error in node "agent":
  • TypeError: Cannot read properties of undefined
  • LangGraphError: Invalid update from node

Your callback may never run because the run fails downstream of the swallowed error, not at the node that caused it.

const node = async () => {
  try {
    return await doWork();
  } catch (e) {
    // swallowed error; graph continues with bad state
    return {};
  }
};

Fix by rethrowing or returning a valid state update.

const node = async () => {
  try {
    return await doWork();
  } catch (e) {
    console.error("node failed", e);
    throw e;
  }
};

4) Version mismatch between LangGraph and LangChain packages

This one is brutal in TypeScript because the code still type-checks while runtime behavior breaks. If you mix incompatible versions of:

  • @langchain/langgraph
  • @langchain/core
  • langchain

you can get silent callback failures or weird runtime errors like:

  • Cannot read properties of undefined (reading 'callbacks')
  • CallbackManager methods never called

Check your lockfile and keep versions aligned.

{
  "dependencies": {
    "@langchain/langgraph": "^0.2.0",
    "@langchain/core": "^0.2.30",
    "langchain": "^0.2.16"
  }
}

How to Debug It

  1. Confirm whether you need streaming

    • If you expect intermediate updates, switch from invoke() to stream().
    • If nothing changes, your issue is probably not a callback bug at all.
  2. Log inside the exact runnable that should fire

    • Put a console.log() in the node function.
    • If that logs but your callback doesn’t, the problem is wiring.
    • If that doesn’t log, your graph path isn’t executing that node.
  3. Strip config down to minimum

    • Remove custom handlers.
    • Remove extra middleware.
    • Run with only:
      await app.invoke(input);
      
    • Then add pieces back until it breaks.
  4. Check package versions and runtime shape

    • Verify all LangChain packages are on compatible releases.
    • Inspect whether your handler matches the expected interface for that API version.
    • Watch for TypeScript passing types that don’t match runtime objects.
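For step 4, one cheap wiring check is to compare your handler's method names against the lifecycle methods the callback system actually looks for: under a loose object type, a one-character typo compiles fine and simply never fires. A sketch (the method list is abridged from LangChain JS's callback handler interface):

```typescript
// Abridged list of real lifecycle method names used by LangChain JS handlers.
const knownMethods = [
  "handleLLMStart",
  "handleLLMNewToken",
  "handleLLMEnd",
  "handleChainStart",
  "handleChainEnd",
  "handleToolStart",
  "handleToolEnd",
];

// Deliberate typo: lowercase "t" in "token" — this method is never called.
const handler: Record<string, unknown> = {
  handleLLMNewtoken: () => {},
};

const unrecognized = Object.keys(handler).filter(
  (name) => !knownMethods.includes(name)
);
console.log(unrecognized); // [ 'handleLLMNewtoken' ]
```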

Prevention

  • Use stream() whenever your logic depends on incremental events, token emission, or step-by-step tracing.
  • Keep LangGraph and LangChain package versions pinned together in package.json.
  • Add one integration test per graph that asserts:
    • node execution happened
    • streamed event arrived
    • final state was updated as expected
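A sketch of such a test, with a stub standing in for the compiled graph — a real test would import the app produced by graph.compile() and use your test runner's assertions instead of these inline checks:

```typescript
type GraphEvent = Record<string, { output: string }>;

// Stub compiled graph: yields one update per executed node, mirroring
// how app.stream() yields { nodeName: stateUpdate } objects.
const app = {
  async *stream(input: { input: string }): AsyncGenerator<GraphEvent> {
    yield { agent: { output: `${input.input}!` } };
  },
};

const events: GraphEvent[] = [];
for await (const event of app.stream({ input: "hi" })) {
  events.push(event);
}

// The three checks from the list above:
if (events.length === 0) throw new Error("no node executed");
if (!("agent" in events[0])) throw new Error("streamed event for node missing");
if (events[0].agent.output !== "hi!") throw new Error("final state not updated");
```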

By Cyprian Aarons, AI Consultant at Topiax.