How to Integrate FastAPI with PostgreSQL for Retail Banking Multi-Agent Systems
Combining FastAPI for retail banking with PostgreSQL gives you a clean way to expose banking workflows as APIs while keeping account state, agent memory, and audit trails in a durable relational store. In a multi-agent system, that means one agent can handle customer intent, another can validate balances or limits, and PostgreSQL becomes the shared source of truth for transactions and coordination.
Prerequisites
- Python 3.10+
- FastAPI installed and running
- Uvicorn for local API serving
- PostgreSQL 14+
- `psycopg` or `asyncpg` for database access
- `sqlalchemy` if you want ORM-based models
- A PostgreSQL database with credentials ready
- Basic retail banking domain tables:
  - `customers`
  - `accounts`
  - `transactions`
  - `agent_tasks`
Install the core packages:
```shell
pip install fastapi uvicorn "psycopg[binary]" sqlalchemy pydantic
```
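The prerequisites above assume four domain tables. A minimal schema sketch that the later snippets can run against; the column names and types are assumptions, so adapt them to your own ledger design:

```sql
-- Minimal sketch of the domain tables; names and types are assumptions.
CREATE TABLE customers (
    id          BIGSERIAL PRIMARY KEY,
    full_name   TEXT NOT NULL
);

CREATE TABLE accounts (
    id          TEXT PRIMARY KEY,               -- e.g. 'ACC123'
    customer_id BIGINT REFERENCES customers(id),
    balance     NUMERIC(14, 2) NOT NULL DEFAULT 0
);

CREATE TABLE transactions (
    id           BIGSERIAL PRIMARY KEY,
    from_account TEXT REFERENCES accounts(id),
    to_account   TEXT REFERENCES accounts(id),
    amount       NUMERIC(14, 2) NOT NULL,
    status       TEXT NOT NULL,
    created_at   TIMESTAMPTZ NOT NULL DEFAULT now()
);

CREATE TABLE agent_tasks (
    id         BIGSERIAL PRIMARY KEY,
    task_type  TEXT NOT NULL,
    payload    JSONB NOT NULL,
    status     TEXT NOT NULL DEFAULT 'pending',
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
```

NUMERIC is used for money rather than a float type so amounts stay exact in cents.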
Integration Steps
- Create your FastAPI app and define banking endpoints.
For retail banking, keep the API surface narrow: balance lookup, transfer initiation, and transaction history. That gives your agents predictable tools to call.
```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Retail Banking API")

class TransferRequest(BaseModel):
    from_account: str
    to_account: str
    amount: float

@app.get("/accounts/{account_id}/balance")
def get_balance(account_id: str):
    # Stub response; replace with a real balance lookup in step 2.
    return {"account_id": account_id, "balance": 1250.75}

@app.post("/transfers")
def create_transfer(payload: TransferRequest):
    if payload.amount <= 0:
        raise HTTPException(status_code=400, detail="Transfer amount must be positive")
    return {
        "status": "queued",
        "from_account": payload.from_account,
        "to_account": payload.to_account,
        "amount": payload.amount,
    }
```
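One caveat with the model above: `amount: float` invites binary rounding surprises in a banking context. A minimal sketch of `Decimal`-based amount handling; the `parse_amount` helper is an illustration, not part of FastAPI or Pydantic:

```python
from decimal import Decimal, InvalidOperation

def parse_amount(raw: str) -> Decimal:
    """Parse a monetary amount, rejecting malformed or non-positive values."""
    try:
        amount = Decimal(raw)
    except InvalidOperation:
        raise ValueError(f"not a number: {raw!r}")
    if amount <= 0:
        raise ValueError("amount must be positive")
    # Normalize to two decimal places for ledger storage.
    return amount.quantize(Decimal("0.01"))
```

In the Pydantic model you could instead declare `amount` as `Decimal` and let FastAPI coerce the JSON value, keeping cents exact end to end.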
- Connect FastAPI to PostgreSQL through a shared connection pool.

Use a proper pool so multiple agents and API requests do not fight over connections. For production banking systems, this is not optional.

```python
import os

from psycopg.rows import dict_row
from psycopg_pool import ConnectionPool  # pip install "psycopg[pool]"

DATABASE_URL = os.getenv(
    "DATABASE_URL",
    "postgresql://bank_user:bank_pass@localhost:5432/retail_bank",
)

# One pool shared by every request handler and agent in the process.
pool = ConnectionPool(
    DATABASE_URL,
    min_size=1,
    max_size=10,
    kwargs={"row_factory": dict_row},
)

def get_conn():
    # Returns a context manager that checks a connection out of the
    # pool and hands it back on exit.
    return pool.connection()
```
- Persist account state and transaction records in PostgreSQL.
This is where the integration becomes useful for multi-agent orchestration. The API writes durable records that other agents can read later for reconciliation, fraud checks, or customer support workflows.
```python
from datetime import datetime, timezone

def record_transfer(from_account: str, to_account: str, amount: float):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO transactions (
                    from_account,
                    to_account,
                    amount,
                    status,
                    created_at
                )
                VALUES (%s, %s, %s, %s, %s)
                RETURNING id
                """,
                (from_account, to_account, amount, "queued",
                 datetime.now(timezone.utc)),
            )
            tx_id = cur.fetchone()["id"]
        conn.commit()
    return tx_id

@app.post("/transfers/persisted")
def create_persisted_transfer(payload: TransferRequest):
    tx_id = record_transfer(
        payload.from_account,
        payload.to_account,
        payload.amount,
    )
    return {"transaction_id": tx_id, "status": "queued"}
```
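The endpoint above records the transfer but does not move money. In a real ledger flow, the debit, the credit, and the transaction row should commit or fail together. A sketch of that single transaction in SQL, assuming an `accounts.balance` column exists:

```sql
BEGIN;

-- Debit the sender only if funds are sufficient.
UPDATE accounts
SET balance = balance - 50.00
WHERE id = 'ACC123' AND balance >= 50.00;

-- Credit the receiver.
UPDATE accounts
SET balance = balance + 50.00
WHERE id = 'ACC999';

INSERT INTO transactions (from_account, to_account, amount, status, created_at)
VALUES ('ACC123', 'ACC999', 50.00, 'completed', now());

COMMIT;
```

An application driver should inspect the debit's affected row count (`cur.rowcount` in psycopg) and issue a ROLLBACK when it is zero, otherwise the credit would still apply.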
- Add an agent-facing task table for multi-agent coordination.
In a multi-agent system, one agent may create tasks while another executes them. PostgreSQL works well as the coordination layer because it supports locking, filtering by status, and auditability.
```python
import json

def create_agent_task(task_type: str, payload: dict):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO agent_tasks (task_type, payload, status)
                VALUES (%s, %s::jsonb, %s)
                RETURNING id
                """,
                (task_type, json.dumps(payload), "pending"),
            )
            task_id = cur.fetchone()["id"]
        conn.commit()
    return task_id

def fetch_pending_tasks(limit: int = 10):
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, task_type, payload
                FROM agent_tasks
                WHERE status = 'pending'
                ORDER BY id ASC
                LIMIT %s
                """,
                (limit,),
            )
            return cur.fetchall()
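The locking support mentioned above is what lets several executor agents share one queue safely. A sketch of claiming tasks with `FOR UPDATE SKIP LOCKED`, so two agents never grab the same row; the `'in_progress'` status value is an assumption:

```sql
-- Claim up to 10 pending tasks atomically; concurrent agents skip
-- rows that another transaction has already locked.
UPDATE agent_tasks
SET status = 'in_progress'
WHERE id IN (
    SELECT id
    FROM agent_tasks
    WHERE status = 'pending'
    ORDER BY id
    LIMIT 10
    FOR UPDATE SKIP LOCKED
)
RETURNING id, task_type, payload;
```

Run inside one transaction, this replaces the plain `SELECT` in `fetch_pending_tasks` when more than one worker polls the queue.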
- Run the FastAPI service and wire agents to call the endpoints.

Your orchestration layer can call `/accounts/{id}/balance` before creating transfer tasks, or use `/transfers/persisted` to write directly into the ledger flow.
```python
import requests

base_url = "http://localhost:8000"

balance_resp = requests.get(f"{base_url}/accounts/ACC123/balance")
print(balance_resp.json())

transfer_resp = requests.post(
    f"{base_url}/transfers/persisted",
    json={
        "from_account": "ACC123",
        "to_account": "ACC999",
        "amount": 50.0,
    },
)
print(transfer_resp.json())
```
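To close the loop, an executor agent typically polls for work. A minimal, dependency-free sketch of that loop; `fetch` and `handle` stand in for `fetch_pending_tasks` and the agent's own logic, and `max_cycles` is an assumption added so the loop can be bounded in tests:

```python
import time

def run_worker(fetch, handle, poll_interval=2.0, max_cycles=None):
    """Poll for tasks and process each one; back off when the queue is empty."""
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        tasks = fetch()
        for task in tasks:
            handle(task)
        if not tasks:
            # Nothing pending; sleep before polling again.
            time.sleep(poll_interval)
        cycles += 1
    return cycles
```

Something like `run_worker(fetch_pending_tasks, score_risk)` would drive the fraud-screening agent described below.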
Testing the Integration
Use a simple end-to-end check that hits the API and confirms PostgreSQL persistence through the returned transaction ID.
```python
import requests

response = requests.post(
    "http://localhost:8000/transfers/persisted",
    json={
        "from_account": "ACC1001",
        "to_account": "ACC2002",
        "amount": 25.5,
    },
)
print(response.status_code)
print(response.json())
```
Expected output (the transaction ID will vary):

```
200
{'transaction_id': 1842, 'status': 'queued'}
```
If you want deeper verification, query PostgreSQL directly:
```sql
SELECT id, from_account, to_account, amount, status
FROM transactions
ORDER BY id DESC
LIMIT 1;
```
Real-World Use Cases
- Customer support agent + ledger agent
  - One agent answers balance questions through FastAPI.
  - Another reads PostgreSQL transaction history to explain pending transfers or failed debits.
- Fraud screening pipeline
  - A transfer request lands in FastAPI.
  - PostgreSQL stores the event.
  - A second agent pulls pending tasks from `agent_tasks`, scores risk, then updates status.
- Branch operations assistant
  - Agents create follow-up tasks for KYC review, limit increases, or chargeback investigation.
  - PostgreSQL keeps every step auditable for compliance teams.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.