How to Fix 'prompt template error during development' in LangGraph (TypeScript)

By Cyprian Aarons · Updated 2026-04-22

When you see prompt template error during development in LangGraph, it usually means your prompt function or template is being evaluated before the variables it expects are available. In TypeScript projects, this often shows up when a node returns the wrong shape, a prompt references missing keys, or you pass a raw string where LangChain expects a structured prompt.

The error usually appears during graph execution, not at compile time. That makes it annoying: the code looks fine, but ChatPromptTemplate, MessagesPlaceholder, or a custom Runnable blows up once the graph starts running.

The Most Common Cause

The #1 cause is a mismatch between the state your node returns and the variables your prompt template expects.

If your prompt uses {input} but your node returns { question }, LangGraph will fail when it tries to format the prompt. In practice, you’ll see errors like:

  • Error: Missing value for input variable 'input'
  • Error: PromptTemplate requires values for: input
  • TypeError: Cannot read properties of undefined
  • LangGraphError wrapping a prompt formatting failure

Broken vs fixed pattern

  Broken                                      Fixed
  Node returns question                       Node returns input
  Prompt expects {input}                      Prompt receives {input}
  Graph state and template are inconsistent   State keys match template keys
// BROKEN
import { StateGraph } from "@langchain/langgraph";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{input}"],
]);

type State = {
  question: string;
  answer?: string;
};

const graph = new StateGraph<State>({
  channels: {
    question: { value: (x, y) => y ?? x },
    answer: { value: (x, y) => y ?? x },
  },
});

graph.addNode("buildPrompt", async (state) => {
  // Wrong key: prompt expects `input`, state has `question`
  return await prompt.formatMessages({
    question: state.question,
  });
});

// FIXED
import { StateGraph } from "@langchain/langgraph";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{input}"],
]);

type State = {
  input: string;
  answer?: string;
};

const graph = new StateGraph<State>({
  channels: {
    input: { value: (x, y) => y ?? x },
    answer: { value: (x, y) => y ?? x },
  },
});

graph.addNode("buildPrompt", async (state) => {
  return await prompt.formatMessages({
    input: state.input,
  });
});

If you’re using StateGraph, keep the state keys aligned with the template variables. Don’t rename fields casually between nodes unless you also update every downstream prompt.
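One way to enforce that alignment is to fail at startup instead of mid-run. LangChain prompt templates expose the variables they expect via an `inputVariables` array, so a small check can compare them against your state keys before the graph is compiled. `assertStateCoversPrompt` is a name invented for this sketch, not a LangChain API:

```typescript
// Throws at startup if the prompt expects a variable the state never supplies.
function assertStateCoversPrompt(
  stateKeys: string[],
  inputVariables: string[],
): void {
  const missing = inputVariables.filter((v) => !stateKeys.includes(v));
  if (missing.length > 0) {
    throw new Error(
      `Prompt expects variables missing from graph state: ${missing.join(", ")}`,
    );
  }
}

// The broken example above: state has `question`, prompt wants `input`,
// so assertStateCoversPrompt(["question", "answer"], ["input"]) throws.
assertStateCoversPrompt(["input", "answer"], ["input"]); // fixed example: passes
```

In a real graph you would pass `prompt.inputVariables` as the second argument and run this once, right after building the graph, so a rename fails before any request does.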

Other Possible Causes

1. You passed a raw object instead of a string or message array

This happens when a node returns something like { text: "hello" }, but the next step expects plain text.

// Wrong
return { text: "hello" };

// Right
return "hello";

If you’re feeding a model node directly, make sure it receives either:

  • formatted messages
  • a string input
  • the exact schema your runnable expects
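If you are unsure which of those shapes an upstream node actually produced, a small normalizer can coerce the common cases into a plain string and fail loudly otherwise. `normalizeModelInput` is a name invented for this sketch, not a LangChain API:

```typescript
// Accepts a string or a { text } object and returns a plain string,
// throwing on anything it cannot safely coerce.
function normalizeModelInput(value: unknown): string {
  if (typeof value === "string") return value;
  if (
    typeof value === "object" &&
    value !== null &&
    typeof (value as { text?: unknown }).text === "string"
  ) {
    return (value as { text: string }).text;
  }
  throw new TypeError(`Cannot convert value to model input: ${JSON.stringify(value)}`);
}

normalizeModelInput("hello");           // → "hello"
normalizeModelInput({ text: "hello" }); // → "hello"
```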

2. Your MessagesPlaceholder key does not exist in state

A common LangChain/LangGraph setup uses conversation history.

import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "Answer clearly."],
  new MessagesPlaceholder("history"),
  ["human", "{input}"],
]);

If your state does not contain history, you’ll get an error like:

  • Missing value for input variable 'history'
  • Cannot find placeholder variable history

Fix by ensuring the graph state includes that key and it is an array of messages.

import { BaseMessage } from "@langchain/core/messages";

type State = {
  history: BaseMessage[];
  input: string;
};
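A cheap runtime guard before the format step turns the vague placeholder error into a precise one. The sketch below checks structure only, with a minimal stand-in message type; real code would keep `BaseMessage` from `@langchain/core/messages`:

```typescript
// Structural stand-in for a chat message; real graphs would use BaseMessage.
type MessageLike = { content: string };

// Returns the history array, or throws a precise error before formatting runs.
function ensureHistory(state: { history?: unknown; input: string }): MessageLike[] {
  if (!Array.isArray(state.history)) {
    throw new TypeError(
      `state.history must be an array of messages, got ${typeof state.history}`,
    );
  }
  return state.history as MessageLike[];
}

ensureHistory({ history: [{ content: "hi" }], input: "q" }); // ok
// ensureHistory({ input: "q" }) throws before the prompt ever runs
```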

3. You used the wrong placeholder syntax

In LangChain templates, braces matter. If you write ${input} instead of {input}, formatting fails.

// Wrong
["human", "${input}"]

// Right
["human", "{input}"]

This is easy to miss if you’re copying patterns from another templating system.
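Since the template is just a string, you can lint for the JavaScript-style syntax before ever building the prompt. `hasWrongPlaceholderSyntax` is a hypothetical helper, not part of LangChain:

```typescript
// Flags JS-style ${var} interpolation, which LangChain templates do not use.
function hasWrongPlaceholderSyntax(template: string): boolean {
  return /\$\{[^}]*\}/.test(template);
}

hasWrongPlaceholderSyntax("${input}"); // → true: needs fixing
hasWrongPlaceholderSyntax("{input}");  // → false
```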

4. A previous node returned undefined

LangGraph will happily pass bad data downstream until something tries to format a prompt.

graph.addNode("fetchUser", async () => {
  return undefined; // breaks later when prompt needs user data
});

Fix by returning an explicit object and validating before the prompt step.

graph.addNode("fetchUser", async () => {
  return { input: "What is my policy status?" };
});
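That validation can be generalized into a tiny wrapper every node output passes through, so an undefined return fails at the node that produced it instead of several hops later. `requireKeys` is a name invented for this sketch:

```typescript
// Throws if a node returned undefined/null or is missing a required key.
function requireKeys(
  value: Record<string, unknown> | undefined | null,
  keys: string[],
  nodeName: string,
): Record<string, unknown> {
  if (value == null) {
    throw new Error(`Node "${nodeName}" returned ${String(value)}`);
  }
  const missing = keys.filter((k) => value[k] === undefined);
  if (missing.length > 0) {
    throw new Error(`Node "${nodeName}" is missing keys: ${missing.join(", ")}`);
  }
  return value;
}

// The fixed fetchUser above, validated at the source:
requireKeys({ input: "What is my policy status?" }, ["input"], "fetchUser");
```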

How to Debug It

  1. Print the exact state before the failing node

    • Log the object right before calling prompt.formatMessages(...).
    • Check for missing keys, undefined values, and wrong types.
  2. Inspect the template variables

    • Look at every {variable} and every MessagesPlaceholder("key").
    • Match those names against your state type and actual runtime object.
  3. Reduce the graph to one node

    • Temporarily remove branching and tool nodes.
    • Call the prompt formatter directly with a hardcoded object.
    • If that works, the bug is in upstream state shaping.
  4. Check runtime errors inside wrapped exceptions

    • LangGraph often wraps underlying LangChain errors.
    • Search for:
      • Missing value for input variable
      • Invalid prompt schema
      • Cannot read properties of undefined
    • The root cause is usually one level below the top-level stack trace.
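Steps 1 and 4 combine naturally into a wrapper that logs the exact state whenever formatting throws, then rethrows so the original (possibly wrapped) error and its stack survive. `withStateLogging` is a hypothetical name:

```typescript
// Runs any formatter, dumping the state it received if formatting fails.
async function withStateLogging<S, R>(
  state: S,
  format: (state: S) => R | Promise<R>,
): Promise<R> {
  try {
    return await format(state);
  } catch (err) {
    console.error("Prompt formatting failed with state:", JSON.stringify(state, null, 2));
    throw err; // keep the underlying LangChain error intact
  }
}
```

Wrap the `prompt.formatMessages(...)` call in the failing node with it; the dumped state next to the rethrown error usually identifies the missing key immediately.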

Prevention

  • Keep your graph state schema and prompt variables in one place.
  • Use TypeScript types for every node output so mismatches fail earlier.
  • Validate node outputs before passing them into prompts, especially after tool calls or API fetches.
  • Prefer small adapter functions that map domain objects into prompt-ready inputs:
function toPromptInput(state: { question?: string }) {
  if (!state.question) throw new Error("question is required");
  return { input: state.question };
}

If you standardize on one naming convention across nodes and prompts, this error disappears fast. In LangGraph TypeScript apps, most “prompt template error” bugs are not LangGraph bugs — they’re contract mismatches between graph state and template variables.


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
