How to Integrate FastAPI with PostgreSQL for Fintech Multi-Agent Systems
Combining FastAPI with PostgreSQL gives you a clean way to expose fintech workflows as APIs while keeping agent state, transaction history, and audit trails in a durable database. In multi-agent systems that matters: agents need shared memory, deterministic persistence, and a reliable way to coordinate actions like payment checks, fraud scoring, and customer risk profiling.
Prerequisites
- Python 3.10+
- FastAPI installed and running in your fintech service
- PostgreSQL 14+ running locally or in your cloud environment
- A PostgreSQL database and user with read/write permissions
- `psycopg` or `asyncpg` for PostgreSQL connectivity
- `uvicorn` for serving the FastAPI app
- Environment variables configured for:
  - `DATABASE_URL`
  - `API_KEY` or other auth secrets used by your fintech endpoints
- Basic familiarity with async Python
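The environment variables can be supplied however your deployment manages secrets. For local development, a minimal shell setup might look like this (the connection string and key are placeholders, not real credentials):

```shell
# Placeholder values for local development only; use a secrets manager in production.
export DATABASE_URL="postgresql://agent_user:change-me@localhost:5432/fintech_agents"
export API_KEY="change-me"
```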
Integration Steps
Step 1: Install the dependencies

Use FastAPI for the API layer and PostgreSQL as the persistent store for agent messages, decisions, and workflow state.

```shell
pip install fastapi uvicorn "psycopg[binary]" pydantic
```
Step 2: Create the PostgreSQL connection layer

For multi-agent systems, keep a single database access module so every agent writes to the same schema. This example uses `psycopg.connect()` with a connection string from `DATABASE_URL`.

```python
import os

import psycopg
from psycopg.rows import dict_row

DATABASE_URL = os.getenv("DATABASE_URL")


def get_connection():
    return psycopg.connect(DATABASE_URL, row_factory=dict_row)


def init_db():
    with get_connection() as conn:
        with conn.cursor() as cur:
            cur.execute("""
                CREATE TABLE IF NOT EXISTS agent_events (
                    id SERIAL PRIMARY KEY,
                    agent_name TEXT NOT NULL,
                    event_type TEXT NOT NULL,
                    payload JSONB NOT NULL,
                    created_at TIMESTAMPTZ DEFAULT NOW()
                )
            """)
        conn.commit()
```
Step 3: Build the FastAPI endpoint that writes agent events to PostgreSQL

This is the core integration point. Your FastAPI route receives an agent action, validates it with Pydantic, then persists it with a standard SQL insert.

```python
import json

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class AgentEvent(BaseModel):
    agent_name: str
    event_type: str
    payload: dict


@app.on_event("startup")
def startup():
    init_db()


@app.post("/agent-events")
def create_agent_event(event: AgentEvent):
    with get_connection() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO agent_events (agent_name, event_type, payload)
                VALUES (%s, %s, %s::jsonb)
                RETURNING id, created_at
                """,
                (event.agent_name, event.event_type, json.dumps(event.payload)),
            )
            row = cur.fetchone()
        conn.commit()
    return {"id": row["id"], "created_at": row["created_at"], "status": "stored"}
```
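One detail worth calling out: the `%s::jsonb` placeholder expects a JSON string, which is why the route serializes `event.payload` with `json.dumps` before binding. A small stdlib-only sketch of that round trip (the payload values are illustrative):

```python
import json

# A payload dict as an agent might submit it.
payload = {"customer_id": "cust_123", "score": 82, "decision": "review"}

# What actually gets bound to the %s::jsonb placeholder: a JSON string.
encoded = json.dumps(payload)
print(encoded)

# PostgreSQL parses the string back into structured JSONB, so the
# round trip is lossless for JSON-serializable payloads.
assert json.loads(encoded) == payload
```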
Step 4: Add a read endpoint for coordination between agents

Multi-agent systems need shared memory retrieval. One agent can write a fraud signal while another reads it before approving a transaction.

```python
from typing import List


@app.get("/agent-events/{agent_name}")
def list_agent_events(agent_name: str) -> List[dict]:
    with get_connection() as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, agent_name, event_type, payload, created_at
                FROM agent_events
                WHERE agent_name = %s
                ORDER BY created_at DESC
                LIMIT 20
                """,
                (agent_name,),
            )
            rows = cur.fetchall()
    return rows
```
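Once an agent has fetched recent events, the coordination logic itself is plain Python over the returned rows. A minimal sketch, assuming events shaped like the read endpoint's response (`should_escalate` and the threshold of 80 are illustrative, not part of the API):

```python
def should_escalate(events: list[dict], threshold: int = 80) -> bool:
    """Return True when any recent risk_score event meets the threshold."""
    for event in events:
        if event.get("event_type") != "risk_score":
            continue
        if event.get("payload", {}).get("score", 0) >= threshold:
            return True
    return False


# Rows as returned by GET /agent-events/{agent_name} (ids/timestamps omitted).
recent = [
    {"event_type": "risk_score", "payload": {"score": 82}},
    {"event_type": "flag_transaction", "payload": {"reason": "velocity"}},
]
print(should_escalate(recent))  # True: the score of 82 meets the threshold
```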
Step 5: Wire the service into your multi-agent workflow

In production, one agent may call this API after scoring risk, while another uses it to fetch context before making a decision. Keep HTTP calls separate from DB logic so you can scale them independently.

```python
import requests

BASE_URL = "http://localhost:8000"


def publish_agent_decision():
    response = requests.post(
        f"{BASE_URL}/agent-events",
        json={
            "agent_name": "risk_agent",
            "event_type": "risk_score",
            "payload": {
                "customer_id": "cust_123",
                "score": 82,
                "decision": "review",
            },
        },
        timeout=5,
    )
    response.raise_for_status()
    return response.json()


def fetch_recent_context():
    response = requests.get(f"{BASE_URL}/agent-events/risk_agent", timeout=5)
    response.raise_for_status()
    return response.json()
```
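Network calls between agents should also tolerate transient failures. A small retry helper, sketched with the transport injected as a callable so the flow can be exercised without a running server (`attempts` and the linear backoff are illustrative choices, not part of the API above):

```python
import time


def with_retries(call, attempts: int = 3, backoff: float = 0.1):
    """Invoke `call`, retrying on exception with linear backoff."""
    last_error = None
    for attempt in range(attempts):
        try:
            return call()
        except Exception as exc:  # in real code, catch requests.RequestException
            last_error = exc
            time.sleep(backoff * (attempt + 1))
    raise last_error


# Demo transport that fails twice, then succeeds.
calls = {"n": 0}

def flaky_publish():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return {"status": "stored"}

print(with_retries(flaky_publish))  # {'status': 'stored'} after two retries
```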
Testing the Integration
Run the API:

```shell
uvicorn main:app --reload
```
Then test it with a simple Python script:
```python
import requests

payload = {
    "agent_name": "fraud_agent",
    "event_type": "flag_transaction",
    "payload": {
        "transaction_id": "tx_987",
        "reason": "velocity_threshold_exceeded"
    }
}

create_resp = requests.post("http://localhost:8000/agent-events", json=payload, timeout=5)
print(create_resp.json())

read_resp = requests.get("http://localhost:8000/agent-events/fraud_agent", timeout=5)
print(read_resp.json())
```
Expected output:
```json
{
  "id": 1,
  "created_at": "2026-04-21T12:34:56.789012+00:00",
  "status": "stored"
}
```
And for the read call:
```json
[
  {
    "id": 1,
    "agent_name": "fraud_agent",
    "event_type": "flag_transaction",
    "payload": {
      "transaction_id": "tx_987",
      "reason": "velocity_threshold_exceeded"
    },
    "created_at": "2026-04-21T12:34:56.789012+00:00"
  }
]
```
Real-World Use Cases
Fraud triage pipelines

- One agent scores transactions.
- Another agent checks historical events in PostgreSQL before escalating.
- FastAPI exposes both steps as internal services.

Customer support copilots

- Agents pull account context from PostgreSQL.
- FastAPI routes orchestrate ticket creation, balance checks, and case updates.

Credit decision workflows

- A risk agent stores underwriting signals.
- A policy agent reads those signals and applies approval rules through API endpoints.
The pattern is simple: use FastAPI as the control plane and PostgreSQL as the system of record. That gives you predictable behavior when multiple agents are reading and writing financial state at the same time.
Keep learning
- The complete AI Agents Roadmap, my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist plus starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit