How to Integrate FastAPI for lending with PostgreSQL for multi-agent systems

By Cyprian Aarons · Updated 2026-04-21

Combining FastAPI for lending with PostgreSQL gives you a clean way to expose lending workflows as APIs while persisting agent state, loan decisions, repayment events, and audit trails in a durable database. In a multi-agent system, that means one agent can score risk, another can fetch customer context, and a third can write the final decision without losing consistency.

This setup is useful when you need lending agents that coordinate through HTTP, but still share transactional data safely. FastAPI handles the orchestration layer; PostgreSQL becomes the source of truth for application state, decision logs, and downstream reporting.

Prerequisites

  • Python 3.10+
  • FastAPI installed
  • Uvicorn installed for local API serving
  • psycopg2-binary or asyncpg for PostgreSQL access
  • A running PostgreSQL instance
  • A database and user created for your lending app
  • Basic familiarity with REST endpoints and SQL
  • Optional: SQLAlchemy if you want ORM support in production

Integration Steps

  1. Set up your FastAPI lending service.

Start by defining the API surface that your agents will call. For lending systems, keep endpoints explicit: create application, fetch application status, and record decisions.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Lending Agent API")

class LoanApplication(BaseModel):
    applicant_id: str
    amount: float
    term_months: int
    income: float

@app.post("/loan/applications")
async def create_application(payload: LoanApplication):
    # Placeholder for agent orchestration + DB write
    return {"status": "received", "application": payload.model_dump()}

@app.get("/loan/applications/{application_id}")
async def get_application(application_id: str):
    return {"application_id": application_id, "status": "pending"}

This gives your multi-agent system a stable contract. One agent can POST applications, another can poll status, and a decision agent can update outcomes later.
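Agents running in other processes do not need Pydantic to speak this contract. As a sketch, a stdlib dataclass can mirror the payload shape (the field names match the LoanApplication model above; the applicant ID is an invented example):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class LoanApplicationPayload:
    """Client-side mirror of the API's LoanApplication model."""
    applicant_id: str
    amount: float
    term_months: int
    income: float

# Serialize to the JSON body an agent would POST to /loan/applications.
payload = LoanApplicationPayload("cust_0001", 5000.0, 24, 4200.0)
body = json.dumps(asdict(payload))
print(body)
```

Keeping the field names in one shared module (or generated from the API's OpenAPI schema) prevents the agents and the service from drifting apart.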

  2. Configure PostgreSQL connectivity.

Use a connection pool so multiple agents or concurrent requests do not each open a raw connection per call. To start, psycopg2.pool.SimpleConnectionPool is enough; swap in asyncpg if you need fully asynchronous database access later.

import os
from psycopg2.pool import SimpleConnectionPool

DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://lending_user:lending_pass@localhost:5432/lending_db")

pool = SimpleConnectionPool(
    minconn=1,
    maxconn=10,
    dsn=DATABASE_URL,
)

def get_conn():
    return pool.getconn()

def put_conn(conn):
    pool.putconn(conn)

Create a table for applications and decisions before wiring the endpoint logic.

CREATE TABLE IF NOT EXISTS loan_applications (
    id SERIAL PRIMARY KEY,
    applicant_id TEXT NOT NULL,
    amount NUMERIC(12, 2) NOT NULL,
    term_months INT NOT NULL,
    income NUMERIC(12, 2) NOT NULL,
    status TEXT NOT NULL DEFAULT 'pending',
    created_at TIMESTAMP NOT NULL DEFAULT NOW()
);

  3. Persist incoming lending requests from FastAPI into PostgreSQL.

Now connect the endpoint to the database. The key pattern is: validate input in FastAPI, write to PostgreSQL in a transaction, return the generated ID.

# Reuses app, LoanApplication, get_conn, and put_conn from the snippets above.

@app.post("/loan/applications")
def create_application(payload: LoanApplication):
    # psycopg2 is blocking, so declare the handler with plain `def`;
    # FastAPI then runs it in a worker thread instead of the event loop.
    conn = get_conn()
    try:
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO loan_applications (applicant_id, amount, term_months, income)
                VALUES (%s, %s, %s, %s)
                RETURNING id;
                """,
                (payload.applicant_id, payload.amount, payload.term_months, payload.income),
            )
            application_id = cur.fetchone()[0]
        conn.commit()
        return {"application_id": application_id, "status": "stored"}
    except Exception:
        conn.rollback()
        raise
    finally:
        put_conn(conn)

This is the core integration point. Your first agent can submit an application over HTTP; PostgreSQL stores it durably so other agents can continue processing without shared memory.

  4. Add an agent decision step that reads from PostgreSQL and updates status.

In multi-agent systems, one common flow is: intake agent writes the request, risk agent reads it back, underwriting agent updates it. That keeps each step auditable.

@app.post("/loan/applications/{application_id}/decision")
def decide_application(application_id: int):
    # Plain `def` again: psycopg2 calls block, so keep them off the event loop.
    conn = get_conn()
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT amount, income FROM loan_applications WHERE id = %s;",
                (application_id,),
            )
            row = cur.fetchone()
            if not row:
                raise HTTPException(status_code=404, detail="Application not found")

            amount, income = row
            # Proxy ratio: requested amount relative to annualized income.
            debt_to_income_proxy = amount / max(income * 12, 1)
            status = "approved" if debt_to_income_proxy < 0.35 else "rejected"

            cur.execute(
                """
                UPDATE loan_applications
                SET status = %s
                WHERE id = %s;
                """,
                (status, application_id),
            )
        conn.commit()
        return {"application_id": application_id, "status": status}
    except Exception:
        conn.rollback()
        raise
    finally:
        put_conn(conn)
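The threshold rule inside the endpoint is easiest to unit-test as a pure function, separate from HTTP and database concerns. A sketch (the 0.35 cutoff and annualized-income proxy mirror the handler above):

```python
def decide(amount: float, monthly_income: float, cutoff: float = 0.35) -> str:
    """Approve when the requested amount is a small share of annual income."""
    ratio = amount / max(monthly_income * 12, 1)
    return "approved" if ratio < cutoff else "rejected"

print(decide(5000, 4200))   # approved: 5000 / 50400 is roughly 0.10
print(decide(40000, 4200))  # rejected: 40000 / 50400 is roughly 0.79
```

With this split, the risk agent's logic can be tested against edge cases (zero income, tiny loans) without standing up FastAPI or PostgreSQL.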

This pattern works well when each agent has one job:

  • Intake agent creates records
  • Risk agent evaluates them
  • Policy agent applies business rules
  • Audit agent writes immutable events
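The audit-agent role above can be sketched as an append-only event log. Here an in-memory list stands in for a hypothetical loan_events table (application_id / event_type / detail columns are assumptions, not part of the schema defined earlier):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class LoanEvent:
    application_id: int
    event_type: str  # e.g. "created", "scored", "decided"
    detail: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AuditTrail:
    """Append-only log: events can be added and read, never mutated."""
    def __init__(self):
        self._events: list[LoanEvent] = []

    def record(self, event: LoanEvent) -> None:
        self._events.append(event)

    def for_application(self, application_id: int) -> list[LoanEvent]:
        return [e for e in self._events if e.application_id == application_id]

trail = AuditTrail()
trail.record(LoanEvent(1, "created", "intake agent stored application"))
trail.record(LoanEvent(1, "decided", "status=approved"))
print([e.event_type for e in trail.for_application(1)])
```

In production the same record/read interface would write INSERT-only rows to PostgreSQL, which is what makes each agent's step auditable after the fact.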

  5. Run the service and wire it into your multi-agent workflow.

Start FastAPI with Uvicorn and let other agents call it over HTTP.

uvicorn main:app --reload --host 0.0.0.0 --port 8000

If another Python-based agent needs to submit an application programmatically:

import requests

response = requests.post(
    "http://localhost:8000/loan/applications",
    json={
        "applicant_id": "cust_1001",
        "amount": 5000,
        "term_months": 24,
        "income": 4200,
    },
)

print(response.json())

That gives you a simple service boundary between agents and storage. Each agent can stay stateless while PostgreSQL preserves workflow state.

Testing the Integration

Use a quick end-to-end check: insert an application through FastAPI and confirm PostgreSQL stored it correctly.

import requests

create_resp = requests.post(
    "http://localhost:8000/loan/applications",
    json={
        "applicant_id": "cust_2002",
        "amount": 12000,
        "term_months": 36,
        "income": 6500,
    },
)

data = create_resp.json()
app_id = data["application_id"]

decision_resp = requests.post(f"http://localhost:8000/loan/applications/{app_id}/decision")
print(create_resp.json())
print(decision_resp.json())

Expected output:

{"application_id": 1, "status": "stored"}
{"application_id": 1, "status": "approved"}
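The "approved" outcome follows directly from the decision rule: the test payload requests 12,000 against a monthly income of 6,500, and the proxy ratio stays under the 0.35 cutoff:

```python
amount, monthly_income = 12_000, 6_500
ratio = amount / (monthly_income * 12)  # 12000 / 78000
print(round(ratio, 4))                  # 0.1538, below 0.35, hence "approved"
```

Note that application_id in the expected output assumes a fresh table; on an existing database the SERIAL column will return the next available ID instead of 1.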

If you want to verify directly in PostgreSQL:

SELECT id, applicant_id, amount, term_months, income, status
FROM loan_applications
ORDER BY id DESC;

Real-World Use Cases

  • Loan intake orchestration: One agent collects borrower data through FastAPI while another validates eligibility rules and persists every step in PostgreSQL.
  • Underwriting pipelines: Multiple agents can read the same loan record from PostgreSQL, apply different scoring models or policy checks through API calls, and write back final decisions.
  • Audit-ready decisioning: Store all request payloads, model outputs, approvals, rejections, and timestamps in PostgreSQL for compliance reporting and post-decision review.

By Cyprian Aarons, AI Consultant at Topiax.