How to Integrate FastAPI for banking with PostgreSQL for production AI
FastAPI gives you the API layer: secure, typed request handling. PostgreSQL gives you durable state, auditability, and queryable history for agent decisions, customer records, and transaction workflows.
If you are building production AI for banking, this combo is the baseline: FastAPI exposes controlled endpoints for your agent system, while PostgreSQL stores conversations, risk scores, approvals, and transaction metadata with proper consistency.
Prerequisites
- Python 3.10+
- A running PostgreSQL 14+ instance
- A FastAPI app scaffolded and ready to run
- uvicorn, fastapi, psycopg[binary], and sqlalchemy installed
- A .env file or secret manager configured with:
  - DATABASE_URL
  - API_KEY or service auth credentials
- Network access from your app to PostgreSQL
- A database role with least-privilege permissions for the app
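As a starting point, a minimal .env might look like the fragment below. The values are placeholders, not working credentials; in production, pull these from a secret manager rather than a file checked into the repo.

```ini
# .env (placeholder values; never commit real credentials)
DATABASE_URL=postgresql+psycopg://bank_app_user:strong_password@localhost:5432/bank_ai
API_KEY=replace-with-service-credential
```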
Integration Steps
- Install dependencies and define your database connection

Use SQLAlchemy for connection management, with psycopg doing the work under the hood. This keeps your FastAPI app clean and gives you a stable production path for pooling and transactions.

pip install fastapi uvicorn sqlalchemy "psycopg[binary]" python-dotenv

(The quotes around psycopg[binary] keep shells like zsh from expanding the brackets.)
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base

DATABASE_URL = "postgresql+psycopg://bank_app_user:strong_password@localhost:5432/bank_ai"

engine = create_engine(
    DATABASE_URL,
    pool_size=10,        # steady-state connections kept open
    max_overflow=20,     # extra connections allowed under burst load
    pool_pre_ping=True,  # test connections before use, drop dead ones
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
- Create a table for agent events and banking actions
For production AI, do not store everything in memory. Persist each decision so you can trace what the agent saw, what it decided, and what action was taken.
from sqlalchemy import Column, Integer, String, DateTime, Text, func

class AgentEvent(Base):
    __tablename__ = "agent_events"

    id = Column(Integer, primary_key=True, index=True)
    customer_id = Column(String(64), nullable=False, index=True)
    event_type = Column(String(50), nullable=False)
    payload = Column(Text, nullable=False)
    status = Column(String(20), nullable=False, default="pending")
    created_at = Column(DateTime(timezone=True), server_default=func.now())
Create the table on startup or via migrations:
Base.metadata.create_all(bind=engine)
- Build the FastAPI endpoint that writes to PostgreSQL
This is the core integration point. FastAPI handles request validation with Pydantic; PostgreSQL stores the event record immediately so downstream AI workers can process it reliably.
from fastapi import FastAPI, Depends
from pydantic import BaseModel
from sqlalchemy.orm import Session

app = FastAPI(title="Banking AI API")

class TransactionRequest(BaseModel):
    customer_id: str
    event_type: str
    payload: str

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.post("/events")
def create_event(request: TransactionRequest, db: Session = Depends(get_db)):
    event = AgentEvent(
        customer_id=request.customer_id,
        event_type=request.event_type,
        payload=request.payload,
        status="queued",
    )
    db.add(event)
    db.commit()
    db.refresh(event)
    return {
        "id": event.id,
        "customer_id": event.customer_id,
        "status": event.status,
        "created_at": event.created_at,
    }
That endpoint is production-friendly because it is explicit about input shape and uses a real database transaction.
- Read back events for agent orchestration
Your AI worker or orchestration layer needs to fetch pending work from PostgreSQL. This is where you turn stored events into actionable tasks.
from sqlalchemy import select

@app.get("/events/pending")
def list_pending_events(db: Session = Depends(get_db)):
    stmt = (
        select(AgentEvent)
        .where(AgentEvent.status == "queued")
        .order_by(AgentEvent.created_at.asc())
    )
    results = db.execute(stmt).scalars().all()
    return [
        {
            "id": row.id,
            "customer_id": row.customer_id,
            "event_type": row.event_type,
            "payload": row.payload,
            "status": row.status,
        }
        for row in results
    ]
If you later add an LLM-based decision service, this endpoint becomes the handoff point between request ingestion and model execution.
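That handoff can be sketched as a polling worker. The loop logic below is a minimal sketch with the HTTP and model layers injected as callables, so it is testable offline; the function name drain_pending and the callable shapes are my own assumptions, not part of FastAPI or SQLAlchemy.

```python
from typing import Callable

def drain_pending(
    fetch: Callable[[], list[dict]],      # e.g. GET /events/pending
    process: Callable[[dict], str],       # your model/decision service; returns new status
    report: Callable[[int, str], None],   # e.g. PATCH /events/{id}/status
) -> int:
    """Fetch pending events once, process each, report the new status.

    Returns the number of events handled in this pass.
    """
    handled = 0
    for event in fetch():
        new_status = process(event)       # e.g. "approved" or "rejected"
        report(event["id"], new_status)
        handled += 1
    return handled
```

In production you would wire fetch to requests.get(f"{base_url}/events/pending").json() and report to the PATCH endpoint from the next step, then run the loop on a schedule or behind a queue consumer.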
- Update processing status after the AI agent acts
In banking systems, state transitions matter. You need to mark work as processed only after the downstream action succeeds.
from fastapi import HTTPException

@app.patch("/events/{event_id}/status")
def update_event_status(event_id: int, status: str, db: Session = Depends(get_db)):
    event = db.get(AgentEvent, event_id)
    if not event:
        # Return a real 404 instead of a 200 response with an error body
        raise HTTPException(status_code=404, detail="event not found")
    event.status = status
    db.commit()
    return {
        "id": event.id,
        "status": event.status,
        "customer_id": event.customer_id,
    }
This pattern gives you an auditable lifecycle:

- queued
- processing
- approved
- rejected
- completed
- failed
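The PATCH endpoint above accepts any string as a status, so it is worth enforcing the lifecycle server-side. A minimal sketch of a transition check you could call before assigning event.status; the transition table here is an assumption about the workflow, so adjust it to your own rules:

```python
# Hypothetical transition table: which statuses may follow which.
# Terminal states (rejected, completed, failed) allow no further moves.
ALLOWED_TRANSITIONS: dict[str, set[str]] = {
    "queued": {"processing"},
    "processing": {"approved", "rejected", "failed"},
    "approved": {"completed", "failed"},
    "rejected": set(),
    "completed": set(),
    "failed": set(),
}

def is_valid_transition(current: str, new: str) -> bool:
    """Return True if moving from `current` to `new` is permitted."""
    return new in ALLOWED_TRANSITIONS.get(current, set())
```

In the endpoint, an invalid transition would raise an HTTPException with status 409 or 422, which keeps the audit trail honest: an agent cannot jump an event straight from queued to completed.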
Testing the Integration
Run your API:
uvicorn main:app --reload
Then verify both write and read paths:
import requests

base_url = "http://127.0.0.1:8000"

payload = {
    "customer_id": "CUST-10021",
    "event_type": "loan_review",
    "payload": '{"amount": 25000, "risk_score": 0.72}',
}

create_resp = requests.post(f"{base_url}/events", json=payload)
print("CREATE:", create_resp.json())

list_resp = requests.get(f"{base_url}/events/pending")
print("PENDING:", list_resp.json())
Expected output:
CREATE: {'id': 1, 'customer_id': 'CUST-10021', 'status': 'queued', 'created_at': '2026-04-21T...'}
PENDING: [{'id': 1, 'customer_id': 'CUST-10021', 'event_type': 'loan_review', 'payload': '{"amount": 25000, "risk_score": 0.72}', 'status': 'queued'}]
If that works end-to-end, your API layer is writing correctly and PostgreSQL is persisting state as expected.
Real-World Use Cases
- Loan underwriting assistants
  - Store application data in PostgreSQL.
  - Use FastAPI endpoints to trigger AI scoring and manual review workflows.
- Fraud triage pipelines
  - Ingest suspicious activity events through FastAPI.
  - Persist evidence trails in PostgreSQL for audit and investigator review.
- Customer support copilots
  - Save conversation context and case state in PostgreSQL.
  - Let FastAPI expose controlled endpoints for retrieval, escalation, and resolution updates.
For production AI in banking, this setup is not optional plumbing. It is the control plane that keeps your agent system observable, auditable, and safe enough to run against real financial workflows.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit