How to Fix 'prompt template error' in LangGraph (TypeScript)
What the error means
A "prompt template error" in LangGraph usually means a node tried to render a prompt and one of the template's variables was missing, malformed, or the wrong type. In TypeScript projects, this shows up most often when you pass state into a `ChatPromptTemplate`, a `MessagesPlaceholder`, or a custom node and the state's shape does not match what the prompt expects.
You’ll typically hit it when wiring a graph node to an LLM chain, especially after adding new state fields or refactoring message history.
The Most Common Cause
The #1 cause is a mismatch between your LangGraph state and the variables required by your prompt template.
A common failure pattern is renaming a state field (say, `input` to `userInput`) without updating the prompt, or forgetting to pass a required variable like `input` at all.
| Broken | Fixed |
|---|---|
| Uses the wrong key name | Passes exactly what the prompt expects |
| State shape does not match template vars | State shape aligned with prompt vars |
```typescript
// BROKEN
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { AIMessage } from "@langchain/core/messages";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{input}"],
]);

// LangGraph state has `userInput`, not `input`
const node = async (state: { userInput: string }) => {
  const messages = await prompt.formatMessages({
    // ❌ prompt expects `input`
    userInput: state.userInput,
  });
  return {
    messages: [...messages, new AIMessage("...")],
  };
};
```
```typescript
// FIXED
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { AIMessage } from "@langchain/core/messages";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{input}"],
]);

const node = async (state: { userInput: string }) => {
  const messages = await prompt.formatMessages({
    input: state.userInput,
  });
  return {
    messages: [...messages, new AIMessage("...")],
  };
};
```
If you’re using MessagesPlaceholder, the same rule applies. The placeholder name must match the key you pass into formatMessages().
```typescript
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "Summarize the conversation."],
  ["placeholder", "{chat_history}"],
  ["human", "{input}"],
]);

// Must pass `{ chat_history, input }`
```
If you miss one of those keys, LangChain will throw an error during template rendering, and LangGraph will surface it while executing that node.
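To see why a missing key throws, here is a minimal stand-in for what a template renderer does. This is an illustration of the behavior, not LangChain's actual implementation: it walks the `{variable}` placeholders and fails as soon as the values object lacks one.

```typescript
// Simplified stand-in for template rendering (NOT LangChain's real code):
// walks each {placeholder} and throws when a matching key is missing.
function renderTemplate(
  template: string,
  values: Record<string, unknown>,
): string {
  return template.replace(/\{(\w+)\}/g, (_match, name: string) => {
    if (!(name in values)) {
      // Mirrors errors like: Missing value for input variable 'input'
      throw new Error(`Missing value for input variable '${name}'`);
    }
    return String(values[name]);
  });
}

// Works when every placeholder has a matching key:
renderTemplate("Hello {input}", { input: "world" }); // "Hello world"

// Throws when the key name does not match the placeholder:
// renderTemplate("Hello {input}", { userInput: "world" });
```

The rename-without-updating-the-prompt bug is exactly the commented-out call: the value exists, but under the wrong key.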
Other Possible Causes
1. Missing optional fields in state
If your graph state says a field exists but it’s actually undefined, template rendering can fail depending on how you use it.
```typescript
// BROKEN
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Customer name: {customerName}"],
]);

await prompt.formatMessages({
  customerName: undefined,
});
```
Fix by validating before rendering:

```typescript
if (!state.customerName) {
  throw new Error("customerName is required");
}
```
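That guard generalizes into a small helper that checks several required fields up front. `requireFields` is a hypothetical name for illustration, not a LangChain or LangGraph API:

```typescript
// Hypothetical helper (not a LangChain/LangGraph API): throws early,
// with a clear message, if any required state field is missing.
function requireFields<T extends Record<string, unknown>>(
  state: T,
  fields: (keyof T & string)[],
): void {
  for (const field of fields) {
    if (state[field] === undefined || state[field] === null) {
      throw new Error(`Graph state is missing required field '${field}'`);
    }
  }
}

// Usage inside a node, before calling formatMessages():
// requireFields(state, ["customerName"]);
```

Failing here, with a named field in the message, is far easier to debug than a template error surfaced from deep inside a node.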
2. Wrong message type in messages
LangGraph expects valid message objects, not plain strings or ad hoc JSON.
```typescript
// BROKEN
return {
  messages: ["hello"], // ❌ not a valid message object
};
```
Use LangChain message classes:

```typescript
import { HumanMessage } from "@langchain/core/messages";

return {
  messages: [new HumanMessage("hello")],
};
```
3. Placeholder name mismatch
This is common when refactoring prompts.
```typescript
// BROKEN
const prompt = ChatPromptTemplate.fromMessages([
  ["placeholder", "{history}"],
]);

await prompt.formatMessages({
  chat_history: [],
});
```
Fix by matching names exactly:

```typescript
const prompt = ChatPromptTemplate.fromMessages([
  ["placeholder", "{chat_history}"],
]);

await prompt.formatMessages({
  chat_history: [],
});
```
4. Passing non-array values into MessagesPlaceholder
MessagesPlaceholder expects an array of messages. Passing a string or object triggers runtime issues.
```typescript
// BROKEN
await prompt.formatMessages({
  chat_history: "previous conversation",
});
```
Correct shape:

```typescript
import { HumanMessage, AIMessage } from "@langchain/core/messages";

await prompt.formatMessages({
  chat_history: [
    new HumanMessage("Hi"),
    new AIMessage("Hello"),
  ],
});
```
How to Debug It

- Read the full stack trace. Look for errors like:
  - `Error: Missing value for input variable 'input'`
  - `Error: Invalid template input`
  - `TypeError: Cannot read properties of undefined`

  If the trace points to `ChatPromptTemplate.formatMessages()`, it’s almost always a variable mismatch.
- Log the exact state entering the node:

  ```typescript
  const node = async (state) => {
    console.log("STATE:", JSON.stringify(state, null, 2));
    // ...
  };
  ```

  Compare that output to the variables used in your template.
- Print the template variables:

  ```typescript
  console.log(prompt.inputVariables);
  ```

  If your prompt expects `["input", "chat_history"]`, make sure your graph passes both keys.
- Validate before formatting. Add a guard before calling `formatMessages()`:

  ```typescript
  if (!state.input || !Array.isArray(state.chat_history)) {
    throw new Error("Invalid graph state for prompt rendering");
  }
  ```
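The last two debugging steps can be combined into one check that diffs the prompt's required variables against the state you are about to pass. `missingVariables` here is an illustrative helper, not a library function:

```typescript
// Illustrative helper (not part of LangChain): returns the prompt
// variables that the state object does not supply.
function missingVariables(
  inputVariables: string[],
  state: Record<string, unknown>,
): string[] {
  return inputVariables.filter((name) => state[name] === undefined);
}

// e.g. with prompt.inputVariables equal to ["input", "chat_history"]:
const missing = missingVariables(["input", "chat_history"], {
  input: "Summarize this",
});
// missing is ["chat_history"], so you can fail fast before formatMessages()
```

Running this at the top of a node turns a vague rendering error into an explicit list of the keys your graph forgot to pass.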
Prevention

- Keep your graph state type and your prompt variables in sync.
- Use strict TypeScript types for node inputs instead of `any`.
- Add small validation checks before every `formatMessages()` call.
- Treat message history as structured data only: use `HumanMessage`, use `AIMessage`, and use arrays for history.

If you’re building multiple nodes, define one shared interface for state and reuse it across prompts. That removes most of these errors before runtime.
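One way to set that up, sketched with a stand-in `Message` type so the example is self-contained (in a real project you would use `BaseMessage` from `@langchain/core/messages` instead):

```typescript
// Stand-in message type so this sketch runs on its own; swap in
// BaseMessage from "@langchain/core/messages" in a real project.
type Message = { role: "human" | "ai" | "system"; content: string };

// One shared state shape, reused by every node and every prompt.
// Field names are illustrative; match them to your own graph.
interface GraphState {
  input: string;
  chat_history: Message[];
}

// Every node consumes and returns (a partial of) the same shape, so
// prompt variables and state keys cannot silently drift apart.
type GraphNode = (state: GraphState) => Promise<Partial<GraphState>>;

// Example node built against the shared shape:
const recordTurn: GraphNode = async (state) => {
  return {
    chat_history: [
      ...state.chat_history,
      { role: "human", content: state.input },
    ],
  };
};
```

With this in place, renaming a state field is a compile-time error in every node that uses it, instead of a runtime template error in one of them.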
Keep learning

- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies

By Cyprian Aarons, AI Consultant at Topiax.