How to Integrate LangGraph with Redis for Insurance AI Agents
Combining LangGraph with Redis gives you a practical agent runtime for regulated insurance workflows. LangGraph handles the stateful orchestration of claims, underwriting, or policy servicing, while Redis provides low-latency memory, session state, and a shared store for multi-step agent execution.
This setup is useful when an insurance agent needs to remember customer context across turns, resume interrupted workflows, and coordinate multiple tools without losing state.
Prerequisites
- Python 3.10+
- A LangGraph-based insurance workflow already defined
- Redis 6+ running locally or in a managed cloud service
- `langgraph`, `redis`, and `langchain-core` installed
- An LLM provider configured for your LangGraph nodes
- Basic familiarity with graph state, checkpoints, and tool calling
Install the packages:
```shell
pip install langgraph redis langchain-core
```
If you are using a managed Redis instance, make sure you have:
- Hostname
- Port
- Password
- TLS settings, if required
Integration Steps
1) Define your LangGraph insurance state
Start with a typed state object that holds claim data, customer metadata, and workflow status. This is the part LangGraph will persist and resume.
```python
from typing import TypedDict, Optional

from langgraph.graph import StateGraph, START, END


class InsuranceState(TypedDict):
    customer_id: str
    claim_id: Optional[str]
    policy_number: Optional[str]
    intent: Optional[str]
    status: str
```
Keep the state small. Store only what the graph needs to continue execution; put large documents or raw payloads in external storage.
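As a sketch of that split, the hypothetical `store_document` and `attach_document` helpers below keep only a short reference in graph state; the dict-backed `DOCUMENT_STORE` is a stand-in for real external storage such as S3 or a separate Redis database:

```python
import hashlib

# Stand-in for external storage (S3, a blob store, or a separate Redis
# database in a real deployment). A dict is used only for illustration.
DOCUMENT_STORE: dict[str, bytes] = {}


def store_document(payload: bytes) -> str:
    """Store a large payload externally and return a short reference key."""
    doc_key = "doc:" + hashlib.sha256(payload).hexdigest()[:16]
    DOCUMENT_STORE[doc_key] = payload
    return doc_key


def attach_document(state: dict, payload: bytes) -> dict:
    """Keep only the reference in graph state, never the raw payload."""
    state["document_ref"] = store_document(payload)
    return state


state = {"customer_id": "CUST-1001", "status": "triage_claim"}
state = attach_document(state, b"...scanned claim form bytes...")
print(state["document_ref"])  # a short key, not the document itself
```

The checkpoint stored by LangGraph then stays small, and any node that needs the full document resolves the reference on demand.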
2) Add graph nodes for insurance workflow steps
Here is a simple claim triage flow. In production, these nodes would call your policy admin system, claims service, or fraud rules engine.
```python
def classify_intent(state: InsuranceState) -> InsuranceState:
    text = state.get("intent", "") or ""
    if "claim" in text.lower():
        state["status"] = "triage_claim"
    else:
        state["status"] = "general_support"
    return state


def route_claim(state: InsuranceState) -> InsuranceState:
    state["status"] = "awaiting_adjuster_review"
    return state


graph = StateGraph(InsuranceState)
graph.add_node("classify_intent", classify_intent)
graph.add_node("route_claim", route_claim)
graph.add_edge(START, "classify_intent")
graph.add_conditional_edges(
    "classify_intent",
    lambda s: "route_claim" if s["status"] == "triage_claim" else END,
)
graph.add_edge("route_claim", END)
```
This is the orchestration layer. The next step is wiring persistence so the workflow can survive retries and user pauses.
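The node logic can also be unit-tested on its own before any persistence is wired in. The sketch below replays the edges of the graph above with a plain `run_triage` driver; this is a hand-rolled test harness, not a LangGraph API:

```python
# Node functions mirroring the ones defined for the graph above.
def classify_intent(state: dict) -> dict:
    text = state.get("intent", "") or ""
    state["status"] = "triage_claim" if "claim" in text.lower() else "general_support"
    return state


def route_claim(state: dict) -> dict:
    state["status"] = "awaiting_adjuster_review"
    return state


def run_triage(state: dict) -> dict:
    """Mimic the graph edges: classify first, then route only claim intents."""
    state = classify_intent(state)
    if state["status"] == "triage_claim":
        state = route_claim(state)
    return state


result = run_triage({"customer_id": "CUST-1001", "intent": "file a claim", "status": ""})
print(result["status"])  # awaiting_adjuster_review
```

Assertions against this harness catch routing regressions cheaply, without compiling the graph or touching Redis.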
3) Configure Redis as the checkpoint store
LangGraph supports checkpointing so graph execution can resume from saved state. For Redis-backed persistence, use the Redis checkpoint saver from `langgraph.checkpoint.redis`, which ships in the separate `langgraph-checkpoint-redis` package (`pip install langgraph-checkpoint-redis`).
```python
from redis import Redis
from langgraph.checkpoint.redis import RedisSaver

redis_client = Redis(
    host="localhost",
    port=6379,
    db=0,
    decode_responses=True,
)

checkpointer = RedisSaver(redis_client=redis_client)
checkpointer.setup()  # create the indices the saver needs on first run

compiled_graph = graph.compile(checkpointer=checkpointer)
```
If your Redis deployment requires auth or TLS:
```python
redis_client = Redis(
    host="your-redis-host",
    port=6380,
    username="default",
    password="your-password",
    ssl=True,
    decode_responses=True,
)
```
At this point LangGraph can persist checkpoints into Redis using its checkpoint API. That means each thread or conversation can be resumed by key.
4) Use Redis as shared memory for agent context
Checkpointing covers graph state. For reusable session memory like recent user activity or claim lookup cache, use standard Redis operations alongside LangGraph.
```python
session_key = "insurance:session:customer_123"

redis_client.hset(session_key, mapping={
    "last_policy_number": "POL-77881",
    "last_claim_id": "CLM-44521",
    "last_status": "waiting_for_documents",
})

cached_status = redis_client.hgetall(session_key)
print(cached_status)
```
This pattern works well when multiple agents need the same context. One agent can update session metadata while another resumes a graph run using the same customer identifier.
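One practical addition: give session keys an expiry so abandoned sessions do not pile up. The `write_session` helper below is illustrative, `SESSION_TTL_SECONDS` is an assumed 24-hour policy, and the `FakeRedis` stand-in exists only so the sketch runs without a live server; pass a real `redis.Redis` client in production:

```python
SESSION_TTL_SECONDS = 60 * 60 * 24  # assumption: sessions expire after 24 hours


def write_session(client, customer_id: str, data: dict) -> str:
    """Write session metadata under a namespaced key with an expiry,
    so stale sessions do not accumulate in Redis indefinitely."""
    key = f"insurance:session:{customer_id}"
    client.hset(key, mapping=data)
    client.expire(key, SESSION_TTL_SECONDS)
    return key


# Minimal in-memory stand-in mirroring the redis-py calls used above.
class FakeRedis:
    def __init__(self):
        self.store, self.ttls = {}, {}

    def hset(self, key, mapping):
        self.store.setdefault(key, {}).update(mapping)

    def expire(self, key, seconds):
        self.ttls[key] = seconds


client = FakeRedis()
key = write_session(client, "customer_123", {"last_status": "waiting_for_documents"})
print(key, client.ttls[key])
```

Checkpoint data managed by the saver is left alone; the TTL applies only to the session metadata your own code writes.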
5) Run the graph with a stable thread ID
Use a consistent thread_id so LangGraph knows which checkpoint history belongs to which conversation or claim case.
```python
config = {
    "configurable": {
        "thread_id": "claim-thread-001",
    }
}

result = compiled_graph.invoke(
    {
        "customer_id": "CUST-1001",
        "claim_id": None,
        "policy_number": None,
        "intent": "I want to file a claim for water damage",
        "status": "",
    },
    config=config,
)
print(result)
```
For a real insurance system, this thread_id should map to a case ID, claim ID, or authenticated customer session ID. Do not generate random IDs per request if you need resumability.
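A minimal way to enforce that mapping is to derive the thread ID from the business identifier. `thread_id_for_claim` below is a hypothetical helper, and the `claim-thread-` prefix is just a naming convention, not anything LangGraph requires:

```python
def thread_id_for_claim(claim_id: str) -> str:
    """Derive a stable thread_id from a business identifier so the same
    claim always resumes the same checkpoint history."""
    return f"claim-thread-{claim_id.strip().upper()}"


config = {"configurable": {"thread_id": thread_id_for_claim("clm-44521")}}
print(config["configurable"]["thread_id"])  # claim-thread-CLM-44521
```

Normalizing case and whitespace matters: two requests for the same claim must land on the same checkpoint key, however the ID was typed.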
Testing the Integration
Run this quick verification script to confirm both Redis persistence and graph execution are working.
```python
from redis import Redis
from langgraph.checkpoint.redis import RedisSaver

redis_client = Redis(host="localhost", port=6379, db=0, decode_responses=True)

# Verify Redis connectivity
assert redis_client.ping() is True

# Verify the checkpoint saver initializes cleanly
checkpointer = RedisSaver(redis_client=redis_client)
print("Redis connected")

# Inspect stored keys after running your graph once
keys = redis_client.keys("*claim-thread-001*")
print("Checkpoint keys:", keys)

# Example expected result check
session_data = redis_client.hgetall("insurance:session:customer_123")
print("Session data:", session_data)
```
Expected output:

```
Redis connected
Checkpoint keys: ['...claim-thread-001...']
Session data: {'last_policy_number': 'POL-77881', 'last_claim_id': 'CLM-44521', 'last_status': 'waiting_for_documents'}
```
If you do not see checkpoint keys, confirm that:
- The graph was compiled with `checkpointer=checkpointer`
- The same `thread_id` was used on invoke
- Your Redis instance is writable from the app container or host
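One caveat on the verification script: `KEYS` is fine for a one-off local check, but it blocks Redis while scanning the whole keyspace. For anything shared, a `SCAN`-based lookup is safer. The helper below works against any client exposing redis-py's `scan_iter`; the `FakeRedis` class and the key names are illustrative stand-ins so the sketch runs without a server:

```python
import fnmatch


def find_checkpoint_keys(client, thread_id: str) -> list:
    """Iterate matching keys incrementally instead of blocking with KEYS."""
    return sorted(client.scan_iter(match=f"*{thread_id}*"))


# Stand-in exposing redis-py's scan_iter interface; key names are made up.
class FakeRedis:
    def __init__(self, keys):
        self._keys = keys

    def scan_iter(self, match="*"):
        return (k for k in self._keys if fnmatch.fnmatch(k, match))


client = FakeRedis(["checkpoint:claim-thread-001:1", "insurance:session:customer_123"])
print(find_checkpoint_keys(client, "claim-thread-001"))
```

With a real `redis.Redis` client the same call works unchanged, since `scan_iter` is part of the standard redis-py API.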
Real-World Use Cases
- Claims intake agents: store conversation checkpoints in Redis so a claimant can upload documents later and resume exactly where they left off in LangGraph.
- Underwriting assistants: cache risk summaries, prior quote attempts, and document extraction results in Redis while LangGraph coordinates validation and approval steps.
- Policy servicing workflows: use LangGraph for multi-step service flows like address changes or beneficiary updates, with Redis holding session context across channels like chat and email.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.