How to Integrate FastAPI with PostgreSQL for Production AI in Investment Banking
FastAPI gives you the API surface for investment banking workflows: trade capture, client onboarding, approval queues, document retrieval, and agent orchestration. PostgreSQL gives you the durable state layer you need for production AI: audit trails, prompt history, portfolio snapshots, and tool-call logs.
Put them together and you get an agent system that can answer banker queries, persist decisions, and survive restarts without losing context.
Prerequisites
- Python 3.11+
- A running PostgreSQL 15+ instance
- `pip` or `uv`
- FastAPI installed
- `psycopg` or `psycopg2-binary` installed
- `uvicorn` installed for local API execution
- An environment file or secret manager for:
  - `DATABASE_URL`
  - `APP_ENV`
  - any bank-specific credentials or service tokens
Install the core packages:
```shell
pip install fastapi uvicorn "psycopg[binary,pool]" pydantic
```

The `pool` extra pulls in `psycopg_pool`, which the connection layer below depends on.
If your AI agent also calls external model APIs, keep those credentials separate from your database config.
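One way to enforce that separation is to load database and model credentials through distinct config objects, so a leaked model key never exposes the database and vice versa. A minimal sketch; the `MODEL_API_KEY` and `MODEL_ENDPOINT` names are illustrative, not a fixed convention:

```python
# config.py (hypothetical layout): database and model credentials
# are read separately, never bundled into one settings object.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class DatabaseConfig:
    url: str

@dataclass(frozen=True)
class ModelConfig:
    api_key: str
    endpoint: str

def load_db_config() -> DatabaseConfig:
    # Fails fast if DATABASE_URL is missing.
    return DatabaseConfig(url=os.environ["DATABASE_URL"])

def load_model_config() -> ModelConfig:
    # MODEL_API_KEY / MODEL_ENDPOINT are example names.
    return ModelConfig(
        api_key=os.environ["MODEL_API_KEY"],
        endpoint=os.environ.get("MODEL_ENDPOINT", "https://api.example.com/v1"),
    )
```

Rotating one credential then never requires touching the other, which matters when model keys and database passwords live in different vaults.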
Integration Steps
1) Define your PostgreSQL connection layer
Start with a small database module that creates a reusable connection pool. For production AI systems, avoid opening a new connection per request.
```python
# db.py
import os
from contextlib import contextmanager

from psycopg_pool import ConnectionPool

# Expects DATABASE_URL to be set in the environment.
DATABASE_URL = os.getenv("DATABASE_URL")

# One pool for the whole process; avoids per-request connection overhead.
pool = ConnectionPool(conninfo=DATABASE_URL, min_size=1, max_size=10)

@contextmanager
def get_conn():
    with pool.connection() as conn:
        yield conn
```
Create tables for requests and agent outputs. Keep the schema simple and auditable.
```python
# init_db.py
from db import get_conn

DDL = """
CREATE TABLE IF NOT EXISTS banking_requests (
    id BIGSERIAL PRIMARY KEY,
    client_id TEXT NOT NULL,
    request_type TEXT NOT NULL,
    payload JSONB NOT NULL,
    status TEXT NOT NULL DEFAULT 'received',
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE TABLE IF NOT EXISTS agent_audit_log (
    id BIGSERIAL PRIMARY KEY,
    request_id BIGINT REFERENCES banking_requests(id),
    action TEXT NOT NULL,
    result JSONB NOT NULL,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
"""

with get_conn() as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
    conn.commit()
```
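The read paths later in this guide filter `agent_audit_log` by `request_id` and sort by `created_at`, so an index on those columns keeps audit lookups fast as the log grows. A sketch; the index name and column order are one reasonable choice, not a requirement:

```python
# add_indexes.py (optional follow-up to init_db.py).
INDEX_DDL = """
CREATE INDEX IF NOT EXISTS idx_agent_audit_request
    ON agent_audit_log (request_id, created_at DESC);
"""

def apply_indexes(conn) -> None:
    # Run with a connection from the same pool as init_db.py.
    with conn.cursor() as cur:
        cur.execute(INDEX_DDL)
    conn.commit()
```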
2) Build the FastAPI app and write request records to PostgreSQL
This is the entry point for your investment banking workflow. The API receives a request, stores it in PostgreSQL, then returns a stable ID for downstream agent processing.
```python
# main.py
from fastapi import FastAPI
from psycopg.types.json import Json
from pydantic import BaseModel

from db import get_conn

app = FastAPI(title="Investment Banking AI API")

class BankingRequest(BaseModel):
    client_id: str
    request_type: str
    payload: dict

@app.post("/requests")
def create_request(req: BankingRequest):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO banking_requests (client_id, request_type, payload)
                VALUES (%s, %s, %s)
                RETURNING id
                """,
                # Json() adapts the dict for the JSONB column;
                # psycopg cannot bind a plain dict directly.
                (req.client_id, req.request_type, Json(req.payload)),
            )
            request_id = cur.fetchone()[0]
        conn.commit()
    return {"request_id": request_id, "status": "received"}
```
This pattern is what you want in production AI: accept input quickly, persist first, process second.
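One way to make "persist first, process second" concrete is FastAPI's `BackgroundTasks`: the endpoint commits the row, returns the ID, and the agent runs after the response is sent (e.g. `background_tasks.add_task(process_request, request_id)`). The sketch below simulates that control flow with the standard library only, using an in-memory dict in place of PostgreSQL, so the ordering is easy to see:

```python
# Stdlib-only simulation of "persist first, process second".
# `store` stands in for the banking_requests table.
from itertools import count

store: dict[int, dict] = {}
_ids = count(1)

def accept_request(payload: dict) -> int:
    # Step 1: persist immediately and hand back a stable ID.
    request_id = next(_ids)
    store[request_id] = {"payload": payload, "status": "received"}
    return request_id

def process_later(request_id: int) -> None:
    # Step 2: runs after the caller already has its response.
    store[request_id]["status"] = "processed"
```

The key property is that a crash between the two steps loses no input: the request is already durable with status `received`, and processing can be retried.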
3) Add an agent processing function that reads from PostgreSQL and writes results back
Your agent can pull the stored request, run policy checks or model inference, then persist its decision trail. That audit record matters in banking.
```python
# agent.py
from psycopg.types.json import Json

from db import get_conn

def process_request(request_id: int):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT client_id, request_type, payload FROM banking_requests WHERE id = %s",
                (request_id,),
            )
            row = cur.fetchone()
            if not row:
                raise ValueError(f"Request {request_id} not found")
            client_id, request_type, payload = row

            # Replace this with real model logic or policy engine output.
            result = {
                "client_id": client_id,
                "request_type": request_type,
                "decision": "approved",
                "confidence": 0.94,
                "summary": f"Processed {request_type} for {client_id}",
            }

            cur.execute(
                """
                INSERT INTO agent_audit_log (request_id, action, result)
                VALUES (%s, %s, %s)
                """,
                # Json() adapts the dict for the JSONB result column.
                (request_id, "processed", Json(result)),
            )
            cur.execute(
                "UPDATE banking_requests SET status = %s WHERE id = %s",
                ("processed", request_id),
            )
        conn.commit()
    return result
```
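As written, `process_request` can run twice for the same ID, for example when two operators trigger processing at once. A common guard is to claim the row atomically before doing any work, along the lines of `UPDATE banking_requests SET status = 'processing' WHERE id = %s AND status = 'received' RETURNING id`, and skip if nothing comes back. The sketch below shows just the claim logic with an in-memory stand-in so it is testable without a database:

```python
# Hypothetical claim step: only the first caller to flip the status wins.
# In PostgreSQL the equivalent conditional UPDATE ... RETURNING is atomic
# per row, so two concurrent workers cannot both claim the same request.
statuses: dict[int, str] = {}

def claim_request(request_id: int) -> bool:
    if statuses.get(request_id) == "received":
        statuses[request_id] = "processing"
        return True   # this caller should run the agent
    return False      # already claimed or unknown: skip
```

In a regulated workflow this matters beyond correctness: a double-processed trade review would also produce two audit entries for one decision.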
4) Expose a processing endpoint in FastAPI
Now wire the agent into an API route so downstream systems can trigger processing on demand. This is common when you have human-in-the-loop approval flows.
```python
# main.py continued
from agent import process_request

@app.post("/requests/{request_id}/process")
def run_agent(request_id: int):
    result = process_request(request_id)
    return {"request_id": request_id, "result": result}
```
At this point you have a clean contract:
- FastAPI handles transport and validation.
- PostgreSQL handles persistence and auditability.
- The agent function handles business logic.
5) Add a read endpoint for traceability and ops support
Banking teams need to inspect what happened after the fact. Give them a way to retrieve both the original request and the latest audit entry.
```python
# main.py continued
@app.get("/requests/{request_id}")
def get_request(request_id: int):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, client_id, request_type, payload::text, status::text
                FROM banking_requests
                WHERE id = %s
                """,
                (request_id,),
            )
            request_row = cur.fetchone()
            cur.execute(
                """
                SELECT action, result::text, created_at::text
                FROM agent_audit_log
                WHERE request_id = %s
                ORDER BY created_at DESC
                LIMIT 1
                """,
                (request_id,),
            )
            audit_row = cur.fetchone()
    return {
        "request": request_row,
        "latest_audit": audit_row,
    }
```
Testing the Integration
Run the app:
```shell
uvicorn main:app --reload --port 8000
```
Then create a request and process it:
```python
import requests

base_url = "http://127.0.0.1:8000"

create_resp = requests.post(
    f"{base_url}/requests",
    json={
        "client_id": "CUST-1042",
        "request_type": "trade_review",
        "payload": {"symbol": "AAPL", "notional": 2500000},
    },
)
request_id = create_resp.json()["request_id"]

process_resp = requests.post(f"{base_url}/requests/{request_id}/process")
get_resp = requests.get(f"{base_url}/requests/{request_id}")

print("Created:", create_resp.json())
print("Processed:", process_resp.json())
print("Fetched:", get_resp.json())
```
Expected output:
```
Created: {'request_id': 1, 'status': 'received'}
Processed: {'request_id': 1, 'result': {'client_id': 'CUST-1042', 'request_type': 'trade_review', 'decision': 'approved', 'confidence': 0.94, 'summary': 'Processed trade_review for CUST-1042'}}
Fetched: {'request': [1, 'CUST-1042', 'trade_review', '{"symbol": "AAPL", "notional": 2500000}', 'processed'], 'latest_audit': ['processed', '{"client_id": "CUST-1042", ...}', '2026-04-21T...']}
```
Real-World Use Cases
- Trade exception handling: Route failed trade validations through FastAPI endpoints and store exception states in PostgreSQL for reconciliation teams.
- Client onboarding agents: Capture KYC/AML document metadata in PostgreSQL while FastAPI exposes review endpoints for compliance workflows.
- Investment research assistants: Persist analyst questions, retrieved documents, model answers, and approval outcomes so every response has traceability.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.