# How to Fix 'prompt template error when scaling' in LangGraph (TypeScript)
When LangGraph throws a prompt template error when scaling, it usually means your graph is fine, but the prompt assembly is not. In practice, this shows up when you scale from a single-node test to a multi-node or conditional graph and one branch sends the wrong shape into a prompt template.
The error is often tied to ChatPromptTemplate, MessagesPlaceholder, or a node returning state that no longer matches what the next node expects. In TypeScript, the bug is usually a type mismatch that only becomes visible at runtime.
## The Most Common Cause
The #1 cause is passing the wrong state shape into a prompt template after a node transition.
You’ll typically see something like:
- `Error: Prompt template input variables mismatch`
- `Error: Missing value for input variable 'messages'`
- `Error: Invalid prompt input: expected array of messages`
Here’s the broken pattern and the fixed pattern side by side.
| Broken | Fixed |
|---|---|
| Node returns partial state, next prompt expects full messages array | Node returns the exact state shape the prompt expects |
| Uses `state.input` in one branch, `state.messages` in another | Normalizes state before prompt rendering |
```typescript
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { HumanMessage } from "@langchain/core/messages";

// BROKEN
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a banking assistant."],
  new MessagesPlaceholder("messages"),
]);

type State = {
  input?: string;
  messages?: HumanMessage[];
};

async function badNode(state: State) {
  return {
    input: state.input,
    // messages missing or wrong shape in some branches
  };
}

async function runBad(state: State) {
  return prompt.invoke({
    messages: state.messages, // undefined at runtime
  });
}
```
```typescript
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { HumanMessage } from "@langchain/core/messages";

// FIXED
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a banking assistant."],
  new MessagesPlaceholder("messages"),
]);

type State = {
  messages: HumanMessage[];
};

async function goodNode(state: State): Promise<State> {
  return {
    messages: [...state.messages],
  };
}

async function runGood(state: State) {
  return prompt.invoke({
    messages: state.messages,
  });
}
```
The key rule: if your prompt uses `MessagesPlaceholder("messages")`, every path into that node must provide `messages` as an array of LangChain message objects. Not strings. Not optional. Not sometimes.
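You can enforce that rule with a small runtime guard at the entrance to the prompt node. This is an illustrative sketch, not a LangChain API: it uses a minimal `MessageLike` stand-in where real code would check for `BaseMessage` instances from `@langchain/core/messages`.

```typescript
// Minimal stand-in for a LangChain message object, for illustration only.
interface MessageLike {
  content: string;
}

// Every path into a prompt node using MessagesPlaceholder("messages") must
// pass this guard: `messages` present, an array, and holding message objects.
function assertMessagesShape(state: Record<string, unknown>): MessageLike[] {
  const messages = state["messages"];
  if (!Array.isArray(messages)) {
    throw new Error(`Expected 'messages' to be an array, got ${typeof messages}`);
  }
  for (const m of messages) {
    if (typeof m !== "object" || m === null || typeof m.content !== "string") {
      throw new Error(`Expected message objects, got: ${JSON.stringify(m)}`);
    }
  }
  return messages as MessageLike[];
}
```

Calling this at the top of the node turns a confusing template error into an immediate, descriptive failure pointing at the offending state.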
## Other Possible Causes

### 1. Variable name mismatch in `ChatPromptTemplate`
This happens when the template expects {customerName} but you pass {name}.
```typescript
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromTemplate(
  "Hello {customerName}, your policy status is {status}"
);

// Broken
await prompt.invoke({ name: "Ada", status: "active" });

// Fixed
await prompt.invoke({ customerName: "Ada", status: "active" });
```
If you scale graphs with multiple nodes, this often happens after renaming state fields in one place and forgetting another.
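A cheap way to catch this class of bug before invoking is to diff the template's variables against the keys you are about to pass. The helper below is hypothetical, not a LangChain API (LangChain itself exposes the real list as `prompt.inputVariables`); it simply parses the `{variable}` braces.

```typescript
// Hypothetical helper: extract {variable} names from a template string.
function templateVariables(template: string): string[] {
  return Array.from(template.matchAll(/\{(\w+)\}/g), (m) => m[1]);
}

// Report which required variables the input object fails to provide.
function missingVariables(
  template: string,
  input: Record<string, unknown>
): string[] {
  return templateVariables(template).filter((name) => !(name in input));
}
```

Running this in a test against every node's payload surfaces renamed fields immediately instead of at runtime deep inside a graph.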
### 2. Returning plain strings where LangGraph expects message objects
A common TypeScript mistake is treating chat history like text.
```typescript
import { HumanMessage, AIMessage } from "@langchain/core/messages";

// BROKEN: chat history as plain strings
async function brokenNode() {
  return {
    messages: ["Hi", "How can I help?"], // wrong type
  };
}

// FIXED: real message instances
async function fixedNode() {
  return {
    messages: [
      new HumanMessage("Hi"),
      new AIMessage("How can I help?"),
    ],
  };
}
```
If your node feeds a `MessagesPlaceholder`, it needs actual message instances, not string arrays.
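If you inherit a history that mixes strings and message objects, you can coerce it once at the boundary. This sketch uses a plain `ChatMsg` shape for illustration; in real LangChain code you would construct `HumanMessage` instances instead of object literals.

```typescript
// Simplified message shape, standing in for LangChain message classes.
type ChatMsg = { role: "user" | "assistant"; content: string };

// Upgrade any leftover strings in a history to proper message objects so a
// MessagesPlaceholder never receives a bare string.
function coerceHistory(history: Array<string | ChatMsg>): ChatMsg[] {
  return history.map((item) =>
    typeof item === "string" ? { role: "user" as const, content: item } : item
  );
}
```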
### 3. Conditional edge sends incompatible state
In LangGraph, different branches can produce different shapes. That works until both branches converge on one prompt node.
```typescript
// Broken: branch A returns { query }
// Broken: branch B returns { messages }
```

Fix by normalizing before the merge:

```typescript
import { BaseMessage, HumanMessage } from "@langchain/core/messages";

type State = {
  query?: string;
  messages?: BaseMessage[];
};

function normalizeState(state: State): { messages: BaseMessage[] } {
  return {
    messages:
      state.messages ??
      (state.query ? [new HumanMessage(state.query)] : []),
  };
}
```
If you have a join node, make sure every upstream branch satisfies its contract.
### 4. Missing `.partial()` or `.format()` assumptions in templated prompts
Sometimes the issue is not LangGraph itself but how you build the prompt.
```typescript
// Broken assumption: expecting runtime vars that never get passed
const p = ChatPromptTemplate.fromTemplate(
  "Summarize this claim note for {region}: {note}"
);
```

If `{region}` is static, bind it early:

```typescript
const p = await ChatPromptTemplate.fromTemplate(
  "Summarize this claim note for {region}: {note}"
).partial({ region: "EU" });
```

That removes one source of runtime mismatch when nodes are reused across flows.
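To see what early binding buys you, here is a plain-TypeScript analogue of `.partial()`. It is illustrative only (the real method lives on LangChain prompt templates): static variables are fixed once, and callers supply only the runtime ones.

```typescript
// Bind static variables early; return a renderer that takes runtime ones.
function makeTemplate(template: string, bound: Record<string, string>) {
  return (runtime: Record<string, string>): string =>
    template.replace(/\{(\w+)\}/g, (_match, name: string) => {
      const value = runtime[name] ?? bound[name];
      if (value === undefined) {
        throw new Error(`Missing value for input variable '${name}'`);
      }
      return value;
    });
}

const summarize = makeTemplate(
  "Summarize this claim note for {region}: {note}",
  { region: "EU" }
);
```

Now a caller that forgets `note` fails with a precise error naming the variable, which is exactly the behavior you want from a prompt node.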
## How to Debug It
- **Print the exact state right before the failing node.**
  - Log the payload entering the node that calls `prompt.invoke(...)`.
  - Check whether fields are missing, renamed, or nested differently than expected.
- **Inspect the prompt’s required variables.**
  - For `ChatPromptTemplate.fromTemplate(...)`, list every `{variable}`.
  - For `MessagesPlaceholder("messages")`, verify `messages` exists and is an array of message objects.
- **Trace all graph branches.**
  - Look at every conditional edge leading into the failing node.
  - Confirm each branch returns the same state shape, especially after reducers or merge nodes.
- **Run the node in isolation.**
  - Call the failing function directly with hardcoded valid input.
  - If it works in isolation but fails in graph execution, the bug is upstream in state mutation or routing.
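The first debug step can be packaged as a tiny wrapper, assuming nothing beyond plain TypeScript (`withStateLogging` is an illustrative name, not a LangGraph API):

```typescript
// Wrap any node function so the exact inbound state is printed right
// before the node runs, without changing the node's behavior.
function withStateLogging<S, R>(name: string, node: (state: S) => R) {
  return (state: S): R => {
    console.log(`[${name}] inbound state: ${JSON.stringify(state)}`);
    return node(state);
  };
}
```

Wrapping the node that calls `prompt.invoke(...)` this way usually reveals the missing or renamed field immediately.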
A useful mental model:
| Symptom | Likely cause |
|---|---|
| Missing variable error | Template variable name mismatch |
| Invalid message format | Plain strings instead of message objects |
| Works in one branch only | Conditional edge/state shape mismatch |
| Fails after refactor | Renamed field not updated everywhere |
## Prevention
- Define a strict shared state type for your graph and reuse it across all nodes.
- Keep one canonical message field name, usually `messages`, and never mix it with `input`, `chatHistory`, or `conversation`.
- Add a small integration test that runs each branch into every downstream prompt node with realistic data.
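That integration test can be as small as this sketch. The branch names and outputs here are made up for illustration; the point is asserting that every branch's output satisfies the downstream prompt node's contract before the graph ever runs.

```typescript
// Simplified message shape standing in for LangChain message objects.
type BranchMsg = { role: string; content: string };
type BranchOutput = { messages?: BranchMsg[] };

// Hypothetical sample outputs, one per upstream branch.
const branchOutputs: Record<string, BranchOutput> = {
  smallTalk: { messages: [{ role: "user", content: "Hi" }] },
  claimLookup: { messages: [{ role: "user", content: "Where is my claim?" }] },
};

// The contract the shared prompt node imposes on every upstream branch.
function satisfiesPromptContract(state: BranchOutput): boolean {
  return (
    Array.isArray(state.messages) &&
    state.messages.every((m) => typeof m.content === "string")
  );
}

// Collect the names of any branches that violate the contract.
const failures = Object.entries(branchOutputs)
  .filter(([, output]) => !satisfiesPromptContract(output))
  .map(([name]) => name);
```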
If you’re building production LangGraph systems in TypeScript, treat prompts like typed interfaces. Most “prompt template error when scaling” issues are just contract violations between nodes that only show up once your graph stops being linear.
## Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.