How to Integrate FastAPI with LangChain for Retail Banking Startups

By Cyprian Aarons. Updated 2026-04-21.

Why this integration matters

If you’re building retail banking workflows for a startup, FastAPI gives you the HTTP layer for account lookup, KYC checks, transaction status, and internal banking APIs. LangChain adds the orchestration layer so an agent can interpret user intent, call those APIs, and return a structured answer instead of a brittle keyword match.

The useful pattern here is simple: FastAPI exposes banking capabilities as typed endpoints, and LangChain wraps those endpoints as tools inside an agent. That gives you a controlled way to let users ask things like “What’s my available balance?” or “Did my card payment settle?” without wiring every intent manually.

Prerequisites

  • Python 3.10+
  • A FastAPI app already running or ready to create
  • langchain, langchain-openai, fastapi, uvicorn, httpx, and pydantic installed
  • An OpenAI API key set in your environment if you use ChatOpenAI
  • A banking backend or mock service with endpoints for:
    • account balance
    • recent transactions
    • payment status
  • Basic familiarity with async Python

Install the packages:

pip install fastapi uvicorn httpx pydantic langchain langchain-openai

Integration Steps

  1. Build the banking API in FastAPI

Start by exposing the banking operations as explicit endpoints. Keep them narrow and typed; agents work better when each tool does one thing.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Retail Banking API")

class BalanceResponse(BaseModel):
    account_id: str
    available_balance: float
    currency: str

class PaymentStatusResponse(BaseModel):
    payment_id: str
    status: str

@app.get("/accounts/{account_id}/balance", response_model=BalanceResponse)
async def get_balance(account_id: str):
    if account_id != "acc_123":
        raise HTTPException(status_code=404, detail="Account not found")
    return BalanceResponse(
        account_id=account_id,
        available_balance=2450.75,
        currency="USD",
    )

@app.get("/payments/{payment_id}/status", response_model=PaymentStatusResponse)
async def get_payment_status(payment_id: str):
    return PaymentStatusResponse(payment_id=payment_id, status="settled")

Run it:

uvicorn app:app --reload --port 8000

  2. Wrap FastAPI endpoints as LangChain tools

LangChain agents need tools they can call. The cleanest production pattern is to wrap your REST calls in functions and decorate them with @tool.

import httpx
from langchain_core.tools import tool

BASE_URL = "http://localhost:8000"

@tool
def get_account_balance(account_id: str) -> str:
    """Get the available balance for a retail banking account."""
    response = httpx.get(f"{BASE_URL}/accounts/{account_id}/balance", timeout=10)
    response.raise_for_status()
    data = response.json()
    return f"Account {data['account_id']} has {data['available_balance']} {data['currency']} available."

@tool
def get_payment_status(payment_id: str) -> str:
    """Get the current status of a payment."""
    response = httpx.get(f"{BASE_URL}/payments/{payment_id}/status", timeout=10)
    response.raise_for_status()
    data = response.json()
    return f"Payment {data['payment_id']} is {data['status']}."

This keeps your agent decoupled from internal banking logic. If the API changes later, you update the tool wrapper instead of rewriting the agent.

  3. Create a LangChain agent that uses those tools

Use a chat model plus a tool-calling agent. In LangChain, create_tool_calling_agent and AgentExecutor are the standard path for this.

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.agents import create_tool_calling_agent, AgentExecutor

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a retail banking assistant. Use tools for balances and payment status."),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

tools = [get_account_balance, get_payment_status]
agent = create_tool_calling_agent(llm, tools, prompt)

executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

Now the model can decide whether to call /accounts/{account_id}/balance or /payments/{payment_id}/status based on user input.

  4. Expose an AI endpoint in FastAPI

Your startup usually wants one public endpoint that frontends or internal apps can hit. This endpoint receives user text and returns the agent’s answer.

from fastapi import Body

@app.post("/assistant")
async def assistant(message: str = Body(embed=True)):
    result = await executor.ainvoke({"input": message})
    return {"answer": result["output"]}

That’s the bridge: FastAPI handles transport and auth boundaries; LangChain handles orchestration.

  5. Add guardrails for banking workflows

For retail banking, don’t let the agent freestyle sensitive actions. Restrict it to read-only operations unless you’ve implemented explicit approval flows.

def validate_request(message: str) -> None:
    lowered = message.lower()
    if "transfer" in lowered or "withdraw" in lowered:
        raise HTTPException(status_code=403, detail="Action not allowed in this assistant")

@app.post("/assistant-safe")
async def assistant_safe(message: str = Body(embed=True)):
    validate_request(message)
    result = await executor.ainvoke({"input": message})
    return {"answer": result["output"]}

For startups in regulated environments, this kind of intent filtering is not optional.
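Because the blocklist is plain Python, you can unit-test it without the agent or the API in the loop. A minimal predicate along those lines (the keyword list is illustrative, not a compliance-grade filter):

```python
# Illustrative blocklist; a production filter would be reviewed with compliance.
BLOCKED_KEYWORDS = {"transfer", "withdraw", "wire", "payee"}

def is_read_only_request(message: str) -> bool:
    """Return True if the message looks like a read-only query."""
    lowered = message.lower()
    return not any(keyword in lowered for keyword in BLOCKED_KEYWORDS)
```

validate_request can then raise its 403 whenever this returns False, and your test suite covers the filter directly instead of through HTTP.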

Testing the Integration

You can test end-to-end with a simple request against your FastAPI assistant endpoint.

import httpx

payload = {"message": "What is the balance for account acc_123?"}

response = httpx.post("http://localhost:8000/assistant", json=payload, timeout=30)
print(response.status_code)
print(response.json())

Expected output:

200
{
  "answer": "Account acc_123 has 2450.75 USD available."
}

If you want to test directly from Python without hitting HTTP twice, call the executor:

import asyncio

async def main():
    result = await executor.ainvoke({"input": "Check payment status for payment_789"})
    print(result["output"])

asyncio.run(main())

Expected output:

Payment payment_789 is settled.

Real-World Use Cases

  • Balance and transaction support bot

    • Let customers ask for balances, last payments, or settlement status through a chat UI backed by your FastAPI service.
  • Internal ops assistant

    • Give support teams an agent that checks account state before escalating issues or opening tickets.
  • KYC and onboarding helper

    • Use FastAPI endpoints for identity verification status while LangChain summarizes next steps for applicants.

The practical win here is control. FastAPI keeps your banking surface area explicit and auditable, while LangChain lets you build an agent that speaks natural language without turning your backend into an unstructured mess.


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.
