How to Integrate FastAPI for insurance with PostgreSQL for startups
FastAPI for insurance gives you a clean API layer for policy workflows, claims intake, and customer-facing automation. PostgreSQL gives you the durable system of record you need for quotes, policies, documents, and audit trails. Put them together and you get an AI agent backend that can accept insurance events over HTTP, persist them reliably, and serve them back to downstream services without turning your app into a pile of ad hoc JSON files.
Prerequisites
- Python 3.11+
- A running PostgreSQL instance
- pip or uv for dependency management
- A FastAPI app already scaffolded
- Access to the FastAPI for insurance SDK or package used by your startup
- Environment variables configured:
  - `DATABASE_URL=postgresql+psycopg://user:password@localhost:5432/insurance_db`
  - any FastAPI for insurance API key or service token required by your provider

Install the core packages:

```bash
pip install fastapi uvicorn "psycopg[binary]" sqlalchemy pydantic
```

If your FastAPI for insurance integration uses a vendor SDK, install that too:

```bash
pip install fastapi-insurance-sdk
```
Integration Steps
1. Create the database connection layer

Use SQLAlchemy with psycopg so your app can talk to PostgreSQL cleanly. Keep the engine and session in one place; don't scatter raw connections across route handlers.

```python
import os

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# "postgresql+psycopg://" selects the psycopg 3 driver installed above;
# a plain "postgresql://" URL would make SQLAlchemy look for psycopg2.
DATABASE_URL = os.environ.get(
    "DATABASE_URL",
    "postgresql+psycopg://user:password@localhost:5432/insurance_db",
)

engine = create_engine(DATABASE_URL, pool_pre_ping=True)
SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False)

def get_db():
    """FastAPI dependency that yields a session and always closes it."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

def test_connection():
    with engine.connect() as conn:
        result = conn.execute(text("SELECT 1"))
        return result.scalar_one()
2. Define a table for insurance events

For startups building AI agents, you usually want to store inbound events first: quote requests, claim updates, document uploads, or underwriting notes. Use a table that captures both structured fields and raw payloads.

```python
from sqlalchemy import JSON, Column, DateTime, Integer, String, func
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class InsuranceEvent(Base):
    __tablename__ = "insurance_events"

    id = Column(Integer, primary_key=True)
    event_type = Column(String(100), nullable=False)
    customer_id = Column(String(100), nullable=False)
    payload = Column(JSON, nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)

# Create the table if it does not exist (prefer Alembic migrations in production)
Base.metadata.create_all(bind=engine)
```
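The read path later in this guide filters by `customer_id` and sorts by `created_at`, so a composite index on those two columns keeps retrieval fast as events pile up. A sketch; the index name is arbitrary, and the model is repeated here only so the snippet runs standalone:

```python
from sqlalchemy import JSON, Column, DateTime, Index, Integer, String, func
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class InsuranceEvent(Base):
    __tablename__ = "insurance_events"

    id = Column(Integer, primary_key=True)
    event_type = Column(String(100), nullable=False)
    customer_id = Column(String(100), nullable=False)
    payload = Column(JSON, nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)

    # Matches the read pattern: WHERE customer_id = ? ORDER BY created_at DESC
    __table_args__ = (
        Index("ix_insurance_events_customer_created", "customer_id", "created_at"),
    )
```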
3. Wire FastAPI routes to PostgreSQL persistence

This is the core integration point. Your FastAPI endpoint receives an insurance request from an agent or client app, validates it with Pydantic, then writes it to PostgreSQL.

```python
from fastapi import Depends, FastAPI
from pydantic import BaseModel
from sqlalchemy.orm import Session

app = FastAPI()

class InsuranceEventIn(BaseModel):
    event_type: str
    customer_id: str
    payload: dict

@app.post("/insurance/events")
def create_insurance_event(event: InsuranceEventIn, db: Session = Depends(get_db)):
    record = InsuranceEvent(
        event_type=event.event_type,
        customer_id=event.customer_id,
        payload=event.payload,
    )
    db.add(record)
    db.commit()
    db.refresh(record)
    return {
        "id": record.id,
        "status": "stored",
        "event_type": record.event_type,
        "customer_id": record.customer_id,
    }
```
4. Call the FastAPI for insurance SDK from your agent workflow

If your startup uses a vendor SDK for insurance-specific operations like quote generation or policy lookup, call it before persisting results. The pattern is simple: fetch external insurance data through the SDK, then store the response in PostgreSQL.

```python
from fastapi_insurance_sdk import InsuranceClient

client = InsuranceClient(api_key="your-api-key")

def fetch_quote_and_store(db: Session, customer_id: str):
    quote = client.quotes.create(
        customer_id=customer_id,
        product="home",
        coverage_amount=250000,
        deductible=1000,
    )
    record = InsuranceEvent(
        event_type="quote_created",
        customer_id=customer_id,
        payload={
            "quote_id": quote.id,
            "premium": quote.premium,
            "coverage_amount": quote.coverage_amount,
            # Keep the full response for audit trails; handle both
            # Pydantic v2 (model_dump) and v1 (dict) SDK objects.
            "raw": quote.model_dump() if hasattr(quote, "model_dump") else quote.dict(),
        },
    )
    db.add(record)
    db.commit()
    db.refresh(record)
    return record
```
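Vendor quote APIs fail transiently, and without retries a network blip becomes a lost quote in your agent workflow. A small backoff wrapper, shown generically because the SDK call above is vendor-specific; `with_retries` is a hypothetical helper name, not an SDK feature:

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn(), retrying with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage against the (hypothetical) SDK client:
# quote = with_retries(lambda: client.quotes.create(customer_id=customer_id, product="home"))
```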
5. Expose a read endpoint for your AI agent

Your agent will need retrieval as much as ingestion. Add a query endpoint so downstream workflows can inspect recent events or policy state from PostgreSQL.

```python
from sqlalchemy import select

@app.get("/insurance/events/{customer_id}")
def list_customer_events(customer_id: str, db: Session = Depends(get_db)):
    stmt = (
        select(InsuranceEvent)
        .where(InsuranceEvent.customer_id == customer_id)
        .order_by(InsuranceEvent.created_at.desc())
        .limit(20)
    )
    rows = db.execute(stmt).scalars().all()
    return [
        {
            "id": row.id,
            "event_type": row.event_type,
            "payload": row.payload,
            "created_at": row.created_at,
        }
        for row in rows
    ]
```
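Agents often want only one event type for a customer, say just `quote_created` events when re-pricing. One way is to build the statement in a helper with an optional filter; `recent_events_query` is a hypothetical name, and the lightweight `table()` construct below mirrors the `insurance_events` columns only so the sketch runs on its own (the app would use the ORM model):

```python
from typing import Optional

from sqlalchemy import column, select, table

# Lightweight mirror of the insurance_events table for a standalone sketch
insurance_events = table(
    "insurance_events",
    column("id"),
    column("event_type"),
    column("customer_id"),
    column("created_at"),
)

def recent_events_query(customer_id: str, event_type: Optional[str] = None, limit: int = 20):
    """Build the recent-events SELECT, optionally narrowed to one event type."""
    stmt = (
        select(insurance_events)
        .where(insurance_events.c.customer_id == customer_id)
        .order_by(insurance_events.c.created_at.desc())
        .limit(limit)
    )
    if event_type is not None:
        stmt = stmt.where(insurance_events.c.event_type == event_type)
    return stmt
```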
Testing the Integration

Start the API:

```bash
uvicorn main:app --reload
```

Then test the write path:

```python
import requests

response = requests.post(
    "http://127.0.0.1:8000/insurance/events",
    json={
        "event_type": "claim_submitted",
        "customer_id": "CUST-1001",
        "payload": {
            "claim_number": "CLM-90001",
            "amount": 4200,
            "loss_type": "water_damage",
        },
    },
)
print(response.status_code)
print(response.json())
```
Expected output:

```
200
{'id': 1, 'status': 'stored', 'event_type': 'claim_submitted', 'customer_id': 'CUST-1001'}
```
Then verify PostgreSQL directly:

```python
from sqlalchemy import text

with engine.connect() as conn:
    result = conn.execute(text(
        "SELECT event_type, customer_id FROM insurance_events ORDER BY id DESC LIMIT 1"
    ))
    print(result.fetchone())
```

Expected output:

```
('claim_submitted', 'CUST-1001')
```
Real-World Use Cases

Claims intake agent

- Accept claim submissions through FastAPI.
- Store claim metadata and attachments in PostgreSQL.
- Let an AI agent triage claims by reading recent events from the database.

Quote orchestration service

- Call the FastAPI for insurance SDK to generate quotes.
- Persist quote requests and responses in PostgreSQL.
- Build auditability into every pricing decision.

Policy servicing backend

- Handle policy changes like address updates or beneficiary edits.
- Track every change as an immutable event in PostgreSQL.
- Give your AI assistant enough history to answer customer questions accurately.
Keep learning

- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap? Grab the free AI Agent Starter Kit: architecture templates, compliance checklists, and a 7-email deep-dive course.

Get the Starter Kit