LangGraph Tutorial (Python): building prompt templates for beginners
This tutorial shows you how to build reusable prompt templates inside a LangGraph workflow in Python. You need this when you want your agent’s prompts to stay consistent, accept dynamic inputs, and remain easy to test as the graph grows.
What You'll Need

- Python 3.10+
- `langgraph`
- `langchain-core`
- `langchain-openai`
- An OpenAI API key set as `OPENAI_API_KEY`
- Basic familiarity with Python functions and dictionaries
- A terminal and a virtual environment

Install the packages first:

```bash
pip install langgraph langchain-core langchain-openai
```
Step-by-Step

- Start by defining the state your graph will pass around. For a beginner-friendly prompt template, keep the state small: one user input and one generated answer.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class State(TypedDict):
    topic: str
    prompt: str
    answer: str
```
- Next, build a prompt template with `ChatPromptTemplate`. This is the part beginners usually miss: the template should separate fixed instructions from variable input so your node stays reusable.

```python
from langchain_core.prompts import ChatPromptTemplate

prompt_template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful tutor who explains concepts simply."),
        ("human", "Explain {topic} in 3 bullet points for a beginner."),
    ]
)
```
- Now create a node that formats the prompt and sends it to an LLM. The node should accept state, fill the template, call the model, and return only the fields your graph needs next.

```python
from langchain_openai import ChatOpenAI

# ChatOpenAI reads OPENAI_API_KEY from the environment.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)


def generate_answer(state: State) -> dict:
    messages = prompt_template.format_messages(topic=state["topic"])
    response = llm.invoke(messages)
    return {
        "prompt": messages[-1].content,
        "answer": response.content,
    }
```
- Wire the node into a LangGraph workflow. This gives you a clean path from input state to output state, which is exactly what you want when teaching prompt templates to beginners.

```python
graph_builder = StateGraph(State)
graph_builder.add_node("generate_answer", generate_answer)
graph_builder.add_edge(START, "generate_answer")
graph_builder.add_edge("generate_answer", END)
graph = graph_builder.compile()
```
- Run the graph with a sample topic and inspect both the rendered prompt and the final answer. Saving the formatted prompt into state makes debugging much easier than guessing what was sent to the model.

```python
result = graph.invoke({"topic": "LangGraph"})

print("PROMPT:")
print(result["prompt"])
print("\nANSWER:")
print(result["answer"])
```
- If you want to make the template more useful for real applications, add another variable like `audience` or `tone`. The pattern stays the same; only the template and the state shape change.

```python
class RichState(TypedDict):
    topic: str
    audience: str
    prompt: str
    answer: str


rich_template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful tutor."),
        ("human", "Explain {topic} for {audience}. Use simple language."),
    ]
)
```
Testing It
Run the script in a Python file after setting `OPENAI_API_KEY` in your environment. If everything is wired correctly, you should see the formatted user prompt printed first, followed by a short explanation from the model.

Check that `state["topic"]` is actually being substituted into the template. If you get an error about missing keys, your state field names and template variables do not match.

If you want stronger validation, print `messages` before calling `llm.invoke()` so you can confirm LangChain is building the right message objects. That is usually where beginners find mistakes in prompt construction.
Next Steps
- Add branching in LangGraph so different prompts run for different topics.
- Store multiple templates in separate nodes for summarization, classification, and rewriting.
- Learn how to use `MessagesPlaceholder` when your agent needs conversation history.
Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.