How to Integrate FastAPI with LangChain for Insurance Startups
Combining FastAPI with LangChain gives you a clean way to expose insurance workflows as APIs while letting an LLM reason over policy data, claims, and customer questions. For startups, this is the practical path to building agentic systems that triage claims, answer policy questions, and route cases without turning your backend into a pile of prompt spaghetti.
Prerequisites
- Python 3.10+
- A FastAPI app already running or ready to create
- fastapi, uvicorn, langchain, langchain-openai, and pydantic installed
- An OpenAI API key set in your environment
- Access to your insurance data model:
  - policy records
  - claims records
  - customer profile data
- A clear boundary between:
  - deterministic business logic in FastAPI
  - natural-language reasoning in LangChain

Install the packages:

```bash
pip install fastapi uvicorn langchain langchain-openai pydantic httpx
```
Integration Steps
Step 1: Build the FastAPI insurance service

Start with a small insurance API that exposes structured endpoints. Keep the business logic in plain Python functions so LangChain can call it predictably.
```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Insurance API")

class ClaimRequest(BaseModel):
    policy_id: str
    claim_amount: float
    incident_type: str

POLICIES = {
    "POL123": {"status": "active", "coverage_limit": 5000},
    "POL999": {"status": "lapsed", "coverage_limit": 10000},
}

@app.get("/policy/{policy_id}")
def get_policy(policy_id: str):
    policy = POLICIES.get(policy_id)
    if not policy:
        raise HTTPException(status_code=404, detail="Policy not found")
    return {"policy_id": policy_id, **policy}

@app.post("/claims/validate")
def validate_claim(payload: ClaimRequest):
    policy = POLICIES.get(payload.policy_id)
    if not policy:
        raise HTTPException(status_code=404, detail="Policy not found")
    approved = (
        policy["status"] == "active"
        and payload.claim_amount <= policy["coverage_limit"]
    )
    return {
        "policy_id": payload.policy_id,
        "approved": approved,
        "reason": "within coverage" if approved else "outside coverage or inactive policy",
    }
```
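The eligibility rule above is just deterministic Python, so it can be unit-tested without starting a server. Here is a minimal sketch of the same rule as a standalone function; `claim_decision` is a hypothetical name, not part of the API above.

```python
# Standalone sketch of the eligibility rule, testable without FastAPI.
# POLICIES mirrors the in-memory store from the service above.
POLICIES = {
    "POL123": {"status": "active", "coverage_limit": 5000},
    "POL999": {"status": "lapsed", "coverage_limit": 10000},
}

def claim_decision(policy_id: str, claim_amount: float) -> dict:
    policy = POLICIES.get(policy_id)
    if policy is None:
        return {"policy_id": policy_id, "approved": False, "reason": "policy not found"}
    approved = policy["status"] == "active" and claim_amount <= policy["coverage_limit"]
    return {
        "policy_id": policy_id,
        "approved": approved,
        "reason": "within coverage" if approved else "outside coverage or inactive policy",
    }

print(claim_decision("POL123", 4200))  # active policy, within limit
print(claim_decision("POL999", 4200))  # lapsed policy, rejected
```

Keeping the rule callable like this makes it easy to cover edge cases (lapsed policy, over-limit amount, unknown ID) in plain pytest before any LLM is involved.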
Step 2: Wrap FastAPI endpoints as LangChain tools

LangChain works best when your agent can call explicit tools. Use StructuredTool with typed input schemas and wire it to your FastAPI endpoints through HTTP calls.
```python
import httpx
from pydantic import BaseModel, Field
from langchain_core.tools import StructuredTool

BASE_URL = "http://localhost:8000"

class PolicyLookupInput(BaseModel):
    policy_id: str = Field(..., description="Insurance policy identifier")

class ClaimValidationInput(BaseModel):
    policy_id: str
    claim_amount: float
    incident_type: str

def lookup_policy(policy_id: str) -> dict:
    response = httpx.get(f"{BASE_URL}/policy/{policy_id}", timeout=10)
    response.raise_for_status()
    return response.json()

def validate_claim(policy_id: str, claim_amount: float, incident_type: str) -> dict:
    response = httpx.post(
        f"{BASE_URL}/claims/validate",
        json={
            "policy_id": policy_id,
            "claim_amount": claim_amount,
            "incident_type": incident_type,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

policy_tool = StructuredTool.from_function(
    func=lookup_policy,
    name="lookup_policy",
    description="Fetch insurance policy details by ID",
    args_schema=PolicyLookupInput,
)
claim_tool = StructuredTool.from_function(
    func=validate_claim,
    name="validate_claim",
    description="Validate whether a claim is eligible under a policy",
    args_schema=ClaimValidationInput,
)
```
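One practical refinement: if a tool raises, the agent run can fail outright. A common pattern is to wrap tool functions so failures also come back as structured JSON the model can reason about. The sketch below uses a hypothetical `safe_tool` decorator and a stubbed lookup, not the real HTTP calls above.

```python
# Sketch: return structured JSON on failure instead of raising, so the
# agent sees an error payload it can act on. `safe_tool` is a hypothetical
# helper, not part of LangChain.
from functools import wraps

def safe_tool(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return {"ok": True, "data": func(*args, **kwargs)}
        except Exception as exc:  # in production, catch narrower error types
            return {"ok": False, "error": type(exc).__name__, "detail": str(exc)}
    return wrapper

@safe_tool
def lookup_policy(policy_id: str) -> dict:
    # Stub standing in for the httpx call to /policy/{policy_id}.
    if policy_id != "POL123":
        raise ValueError(f"Policy not found: {policy_id}")
    return {"policy_id": policy_id, "status": "active"}

print(lookup_policy("POL123"))  # ok=True with the policy payload
print(lookup_policy("POL999"))  # ok=False with error name and detail
```

The trade-off: the model gets a chance to retry or explain the failure, but you must word tool descriptions so it knows an `ok: false` payload means "stop and report", not "make something up".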
Step 3: Create the LangChain agent

Use a chat model plus tool calling so the LLM can decide when to query your FastAPI service. In startup systems, this is where you keep the model from hallucinating coverage rules.
```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an insurance operations assistant. Use tools for any policy or claim checks."),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

tools = [policy_tool, claim_tool]
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
```
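It helps to know what the executor is doing under the hood: the model emits a tool name plus JSON arguments, the executor dispatches to the matching Python function, and the result goes back into the conversation until the model produces a final answer. Here is a conceptual, stdlib-only sketch of that loop; `fake_model_turns` stands in for real LLM responses and is not how LangChain represents them internally.

```python
import json

# Conceptual sketch of the tool-calling loop that AgentExecutor runs:
# tool call -> dispatch -> feed result back -> final answer.
TOOLS = {
    "lookup_policy": lambda policy_id: {"policy_id": policy_id, "status": "active"},
}

# Scripted stand-in for model output: one tool call, then a final answer.
fake_model_turns = [
    {"tool": "lookup_policy", "args": '{"policy_id": "POL123"}'},
    {"final": "Policy POL123 is active."},
]

def run_agent(turns):
    for turn in turns:
        if "tool" in turn:
            # Dispatch the tool call with its JSON-decoded arguments.
            result = TOOLS[turn["tool"]](**json.loads(turn["args"]))
            # A real executor appends `result` to the agent scratchpad here.
        else:
            return turn["final"]

print(run_agent(fake_model_turns))
```

Understanding this loop makes debugging easier: when an agent misbehaves, it is almost always a bad tool description, a schema mismatch in the JSON arguments, or a tool result the model misreads.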
Step 4: Expose an AI endpoint through FastAPI

Now put the agent behind a FastAPI route so your frontend or internal ops system can call it like any other service.
```python
class AgentQuery(BaseModel):
    message: str

@app.post("/assistant")
async def assistant(query: AgentQuery):
    # Use the async entrypoint so the agent run doesn't block the event loop.
    result = await executor.ainvoke({"input": query.message})
    return {"answer": result["output"]}
```
Step 5: Run both services and keep the contract stable

Run your API with Uvicorn and make sure tool inputs stay strict. If you change fields in ClaimRequest, update the tool schema at the same time.

```bash
uvicorn main:app --reload --port 8000
```

A good production pattern here is:

- FastAPI owns validation and business rules
- LangChain owns orchestration and language understanding
- Tools only return structured JSON
- The model never writes directly to core insurance tables
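One cheap way to enforce the "update the tool schema at the same time" rule is a unit test that fails when the API model and the tool input schema drift apart. The sketch below uses stdlib dataclasses standing in for the Pydantic ClaimRequest and ClaimValidationInput models; with Pydantic v2 you would compare `Model.model_fields.keys()` instead.

```python
# Schema-drift guard: fail fast in CI if the API request model and the
# tool input schema stop agreeing on field names. Dataclasses stand in
# for the Pydantic models used in the service.
from dataclasses import dataclass, fields

@dataclass
class ClaimRequest:
    policy_id: str
    claim_amount: float
    incident_type: str

@dataclass
class ClaimValidationInput:
    policy_id: str
    claim_amount: float
    incident_type: str

def field_names(model) -> set:
    return {f.name for f in fields(model)}

assert field_names(ClaimRequest) == field_names(ClaimValidationInput), \
    "ClaimRequest and ClaimValidationInput have drifted apart"
print("schemas in sync")
```

Run this in CI next to your other tests; it turns a silent agent failure (the model sending stale field names) into a loud build failure.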
Testing the Integration

Send a natural-language request to the /assistant endpoint and confirm it calls the underlying insurance APIs correctly.

```python
import httpx

payload = {
    "message": "Check whether claim amount 4200 for policy POL123 is valid."
}

response = httpx.post("http://localhost:8000/assistant", json=payload, timeout=30)
print(response.status_code)
print(response.json())
```

Expected output (the exact wording varies between model runs):

```json
{
  "answer": "The claim is valid because the policy is active and the amount is within coverage."
}
```
If you want to verify the raw API path first:

```python
import httpx

r1 = httpx.get("http://localhost:8000/policy/POL123")
r2 = httpx.post(
    "http://localhost:8000/claims/validate",
    json={"policy_id": "POL123", "claim_amount": 4200, "incident_type": "collision"},
)
print(r1.json())
print(r2.json())
```

Expected output:

```json
{"policy_id": "POL123", "status": "active", "coverage_limit": 5000}
{"policy_id": "POL123", "approved": true, "reason": "within coverage"}
```
Real-World Use Cases

- Claims triage assistant: let agents summarize incoming claims, check eligibility via API tools, and route low-risk cases automatically.
- Policy Q&A chatbot: answer customer questions about coverage limits, status, deductibles, and renewal dates using structured backend lookups.
- Underwriting copilot: pull applicant data from internal services and have LangChain generate underwriting notes while FastAPI handles rule enforcement.
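For the triage case, the routing itself should stay deterministic, with the LLM only summarizing and filling in the tool arguments. A minimal sketch, with purely illustrative thresholds (not actuarial guidance):

```python
# Hypothetical triage rule: route a validated claim by eligibility and size.
# The thresholds are illustrative; real ones come from your risk team.
def route_claim(approved: bool, claim_amount: float) -> str:
    if not approved:
        return "manual_review"    # ineligible claims always get a human
    if claim_amount <= 1000:
        return "auto_approve"     # small, eligible, low-risk claims
    return "adjuster_queue"       # eligible but large enough to inspect

print(route_claim(True, 400))    # small eligible claim
print(route_claim(True, 4200))   # large eligible claim
print(route_claim(False, 400))   # ineligible claim
```

Because the routing lives in FastAPI rather than in a prompt, you can audit and regression-test it, which matters once regulators start asking how claims get auto-approved.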
This setup works because it keeps your insurance logic deterministic and your LLM layer narrow. That separation matters when you move from prototype to regulated production systems.
Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.