How to Integrate FastAPI with PostgreSQL for Fintech Startups
FastAPI gives you the HTTP layer for fintech workflows, while PostgreSQL gives you durable state for customers, transactions, and audit trails. Put them together and you get a backend that can receive payment events, validate them in real time, persist them safely, and feed an AI agent with structured financial context.
For startups building AI agent systems, this combo is the default architecture: FastAPI handles requests from agents or external services, and PostgreSQL stores the source of truth for balances, ledgers, risk flags, and conversation state.
Prerequisites
- Python 3.10+
- PostgreSQL 14+
- A running FastAPI app
- pip installed
- A PostgreSQL database created for your app
- Basic knowledge of async Python
- Environment variables set for:
  - DATABASE_URL
  - APP_ENV
  - SECRET_KEY if you sign requests or tokens
Install the core packages:
pip install fastapi "uvicorn[standard]" sqlalchemy psycopg2-binary pydantic
If you want async database access, use asyncpg:
pip install asyncpg
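The rest of this guide uses the sync driver. If you go the asyncpg route instead, the equivalent wiring looks roughly like this, a sketch assuming SQLAlchemy 2.x; names like `AsyncSessionLocal` and `get_async_db` are placeholders mirroring the sync versions defined in steps 1 and 3 below:

```python
# Async variant of the engine + per-request session setup (sketch).
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker

ASYNC_DATABASE_URL = "postgresql+asyncpg://fintech_user:fintech_pass@localhost:5432/fintech_db"

async_engine = create_async_engine(ASYNC_DATABASE_URL, pool_pre_ping=True)
AsyncSessionLocal = async_sessionmaker(async_engine, expire_on_commit=False)

async def get_async_db():
    # FastAPI dependency: yields one session per request, closed afterwards.
    async with AsyncSessionLocal() as session:
        yield session
```

Async routes would then take `db: AsyncSession = Depends(get_async_db)` and use `await db.execute(...)` instead of the query API shown below.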
Integration Steps
1) Define your database connection
Start with a clean PostgreSQL connection string and SQLAlchemy engine. For production fintech systems, keep connection pooling explicit and avoid opening raw connections per request.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base
DATABASE_URL = "postgresql+psycopg2://fintech_user:fintech_pass@localhost:5432/fintech_db"
engine = create_engine(
    DATABASE_URL,
    pool_size=10,
    max_overflow=20,
    pool_pre_ping=True,
)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
This gives you a reusable engine and session factory. In an AI agent system, every tool call that touches financial state should go through this layer.
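The hardcoded URL above is fine for local development, but the prerequisites assume `DATABASE_URL` comes from the environment. A small helper keeps the fallback explicit (`get_database_url` is a hypothetical name; `DATABASE_URL` and `APP_ENV` are the variables listed in the prerequisites):

```python
import os

def get_database_url() -> str:
    # Prefer the environment; fall back to a local dev URL, but never
    # silently fall back in production.
    url = os.environ.get("DATABASE_URL")
    if url:
        return url
    if os.environ.get("APP_ENV") == "production":
        raise RuntimeError("DATABASE_URL must be set when APP_ENV=production")
    return "postgresql+psycopg2://fintech_user:fintech_pass@localhost:5432/fintech_db"
```

Pass `get_database_url()` to `create_engine` instead of the literal string so staging and production point at their own databases.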
2) Create a transaction model
Model your financial records explicitly. Don’t store “payment” as a blob unless you enjoy debugging broken audits later.
from sqlalchemy import Column, Integer, String, Numeric, DateTime, func
class Transaction(Base):
    __tablename__ = "transactions"

    id = Column(Integer, primary_key=True, index=True)
    account_id = Column(String(64), index=True, nullable=False)
    reference = Column(String(128), unique=True, index=True, nullable=False)
    amount = Column(Numeric(12, 2), nullable=False)
    currency = Column(String(3), nullable=False, default="USD")
    status = Column(String(32), nullable=False, default="pending")
    created_at = Column(DateTime(timezone=True), server_default=func.now())
Create the table at startup (fine for a prototype; once the schema matters, manage it with Alembic migrations instead):
Base.metadata.create_all(bind=engine)
For fintech workflows:
- reference should be unique for idempotency
- status should track lifecycle states like pending, settled, and failed
- amount should use fixed-precision numeric types
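The lifecycle states above are easy to enforce with a small transition table. A sketch, where the state names match the list and everything else is hypothetical:

```python
# Allowed status transitions for a transaction.
# Terminal states (settled, failed) have no exits.
ALLOWED_TRANSITIONS = {
    "pending": {"settled", "failed"},
    "settled": set(),
    "failed": set(),
}

def can_transition(current: str, target: str) -> bool:
    # Reject unknown states and any move out of a terminal state.
    return target in ALLOWED_TRANSITIONS.get(current, set())
```

Checking `can_transition` before any status update keeps a buggy caller (or an over-eager agent) from reviving a settled transaction.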
3) Wire FastAPI to PostgreSQL with dependency injection
FastAPI’s dependency system is the cleanest way to manage DB sessions per request.
from fastapi import FastAPI, Depends
from sqlalchemy.orm import Session
app = FastAPI()
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
Now add a route that writes a transaction to PostgreSQL:
from fastapi import HTTPException
from pydantic import BaseModel
from decimal import Decimal

class TransactionCreate(BaseModel):
    account_id: str
    reference: str
    amount: Decimal  # Decimal, not float: binary floats cannot represent most cent values exactly
    currency: str = "USD"
from sqlalchemy.exc import IntegrityError

@app.post("/transactions")
def create_transaction(payload: TransactionCreate, db: Session = Depends(get_db)):
    existing = db.query(Transaction).filter(Transaction.reference == payload.reference).first()
    if existing:
        raise HTTPException(status_code=409, detail="Duplicate transaction reference")
    tx = Transaction(
        account_id=payload.account_id,
        reference=payload.reference,
        amount=payload.amount,
        currency=payload.currency,
        status="pending",
    )
    db.add(tx)
    try:
        db.commit()
    except IntegrityError:
        # Two concurrent requests can both pass the pre-check; the unique
        # constraint on reference is the real idempotency guarantee.
        db.rollback()
        raise HTTPException(status_code=409, detail="Duplicate transaction reference")
    db.refresh(tx)
    return {
        "id": tx.id,
        "reference": tx.reference,
        "status": tx.status,
        "amount": str(tx.amount),
        "currency": tx.currency,
    }
This is the core pattern your AI agent will call when it needs to record a payment intent or ledger event.
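Idempotency only works if callers derive reference deterministically from the upstream event. One sketch of that idea (`make_reference` is a hypothetical helper, not part of FastAPI or the code above): hash a stable event ID so retries of the same event always hit the 409 path instead of recording a second payment.

```python
import hashlib

def make_reference(event_id: str) -> str:
    # Same upstream event -> same reference -> duplicate POSTs get a 409
    # from the unique constraint instead of double-recording.
    digest = hashlib.sha256(event_id.encode("utf-8")).hexdigest()[:24]
    return f"txn_{digest}"
```

An agent retrying after a timeout then produces the same reference as its first attempt, and the server safely rejects the replay.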
4) Add an endpoint for retrieval and agent context
AI agents need read access too. Build a lookup endpoint so an orchestration layer can fetch transaction state before deciding the next action.
@app.get("/transactions/{reference}")
def get_transaction(reference: str, db: Session = Depends(get_db)):
    tx = db.query(Transaction).filter(Transaction.reference == reference).first()
    if not tx:
        raise HTTPException(status_code=404, detail="Transaction not found")
    return {
        "id": tx.id,
        "account_id": tx.account_id,
        "reference": tx.reference,
        "amount": str(tx.amount),
        "currency": tx.currency,
        "status": tx.status,
        "created_at": tx.created_at.isoformat() if tx.created_at else None,
    }
In practice:
- The agent creates a transaction record first
- A downstream service settles it later
- The agent polls or receives a webhook update and reads the final status from PostgreSQL
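The polling step can be sketched without committing to a transport: `poll_until_final` below is a hypothetical orchestration-side helper that takes any fetch callable, for example a function wrapping GET /transactions/{reference}, and waits for a terminal status.

```python
import time

TERMINAL_STATUSES = {"settled", "failed"}

def poll_until_final(fetch, reference, interval=1.0, timeout=30.0):
    # fetch(reference) must return a dict with a "status" key, e.g. the
    # JSON body of GET /transactions/{reference}.
    deadline = time.monotonic() + timeout
    while True:
        tx = fetch(reference)
        if tx["status"] in TERMINAL_STATUSES:
            return tx
        if time.monotonic() >= deadline:
            raise TimeoutError(f"{reference} still {tx['status']} after {timeout}s")
        time.sleep(interval)
```

In production you would likely prefer webhooks and use polling only as a fallback, with a longer interval and jitter to avoid hammering the API.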
5) Run the app and connect it to your startup workflow
Start FastAPI with Uvicorn:
uvicorn main:app --reload --host 0.0.0.0 --port 8000
A startup AI agent might call this API after extracting payment details from a user message or internal workflow event. The DB becomes the durable memory layer for all financial actions.
Testing the Integration
Use FastAPI’s built-in test client to verify both API behavior and PostgreSQL persistence.
from fastapi.testclient import TestClient
client = TestClient(app)
def test_create_and_fetch_transaction():
    payload = {
        "account_id": "acct_001",
        "reference": "txn_abc123",
        "amount": 1250.50,
        "currency": "USD",
    }
    create_resp = client.post("/transactions", json=payload)
    assert create_resp.status_code == 200
    fetch_resp = client.get("/transactions/txn_abc123")
    assert fetch_resp.status_code == 200
    print(fetch_resp.json())
Expected output:
{
  "id": 1,
  "account_id": "acct_001",
  "reference": "txn_abc123",
  "amount": "1250.50",
  "currency": "USD",
  "status": "pending",
  "created_at": "2026-04-21T12:00:00+00:00"
}
If this passes, your API can write to PostgreSQL and read back consistent financial state.
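Note that amount comes back as the string "1250.50" rather than a JSON number, and that is deliberate: JSON numbers decode to binary floats, which cannot represent most decimal cent values exactly. A quick standard-library check of the difference, nothing specific to this app:

```python
from decimal import Decimal

# Binary floats accumulate rounding error on decimal fractions...
assert 0.1 + 0.2 != 0.3

# ...while Decimal arithmetic stays exact.
assert Decimal("0.1") + Decimal("0.2") == Decimal("0.3")

# Round-tripping money through a string, as the API response does,
# preserves the value and the trailing zero.
amount = Decimal("1250.50")
assert str(amount) == "1250.50"
assert Decimal(str(amount)) == amount
```

Clients should parse that string back into a decimal type on their side rather than into a float.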
Real-World Use Cases
- Payment orchestration for startups: use FastAPI as the control plane for payment initiation and PostgreSQL as the ledger of record for transaction lifecycle tracking.
- AI-assisted reconciliation: an agent can compare incoming bank events against stored transactions in PostgreSQL and flag mismatches through FastAPI endpoints.
- Risk and fraud review workflows: store risk scores, review decisions, and audit history in PostgreSQL while exposing review actions through FastAPI APIs for internal tools or agents.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist + starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.