How to Integrate FastAPI with LangChain for Pension Fund Startups

By Cyprian Aarons · Updated 2026-04-21
Tags: fastapi-for-pension-funds, langchain, startups

Combining a FastAPI pension-fund service with LangChain gives you a clean way to expose pension-domain workflows as HTTP APIs while letting an LLM reason over policy data, member queries, and document retrieval. For startups building AI agents, this is the practical pattern: FastAPI handles the service boundary and validation, LangChain handles orchestration, tool use, and retrieval.

Prerequisites

  • Python 3.10+
  • fastapi
  • uvicorn
  • langchain
  • langchain-openai or another LangChain model provider
  • pydantic
  • Access to your pension fund backend API or database
  • An OpenAI API key if you use ChatOpenAI
  • A working FastAPI app with CORS enabled if your agent runs in a separate frontend
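If your frontend runs on a separate origin, enable CORS on the FastAPI app. A minimal sketch (the http://localhost:3000 origin is an assumed dev frontend; replace it with yours):

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow the browser-based frontend to call this API across origins.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],
    allow_methods=["*"],
    allow_headers=["*"],
)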

Install the core packages:

pip install fastapi uvicorn langchain langchain-openai pydantic httpx

Integration Steps

  1. Build a FastAPI service for pension fund operations.

Start by exposing the pension-specific functions as normal REST endpoints. Keep them narrow: member lookup, contribution status, retirement estimate, and document retrieval.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Pension Fund API")

class RetirementEstimateRequest(BaseModel):
    member_id: str
    current_age: int
    retirement_age: int
    monthly_contribution: float

@app.get("/members/{member_id}")
def get_member(member_id: str):
    # Replace with real DB/API call
    if member_id != "M123":
        raise HTTPException(status_code=404, detail="Member not found")
    return {
        "member_id": member_id,
        "name": "Amina Patel",
        "fund_balance": 128400.50,
        "status": "active",
    }

@app.post("/retirement/estimate")
def estimate_retirement(payload: RetirementEstimateRequest):
    years = max(payload.retirement_age - payload.current_age, 0)
    # Simplified placeholder: flat 5% uplift on total contributions.
    # Swap in a real compound-growth/actuarial model in production.
    projected = payload.monthly_contribution * 12 * years * 1.05
    return {
        "member_id": payload.member_id,
        "projected_value": round(projected, 2),
        "years_to_retirement": years,
    }
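Before wiring LangChain on top, confirm the raw endpoints respond as expected. A quick sanity check with httpx (this assumes the service is already running on port 8000, as in the run step below):

import httpx

# Verify the pension API directly before adding the agent layer.
member = httpx.get("http://localhost:8000/members/M123", timeout=10)
print(member.json())

estimate = httpx.post(
    "http://localhost:8000/retirement/estimate",
    json={
        "member_id": "M123",
        "current_age": 34,
        "retirement_age": 60,
        "monthly_contribution": 1200,
    },
    timeout=10,
)
print(estimate.json())  # projected_value should be 393120.0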
  2. Wrap those endpoints as LangChain tools.

LangChain works best when your agent can call external systems through tools. Use StructuredTool or the @tool decorator so the model can invoke your FastAPI-backed operations safely.

import httpx
from langchain_core.tools import StructuredTool
from pydantic import BaseModel, Field

FASTAPI_BASE_URL = "http://localhost:8000"

class MemberArgs(BaseModel):
    member_id: str = Field(..., description="Pension fund member ID")

class EstimateArgs(BaseModel):
    member_id: str
    current_age: int
    retirement_age: int
    monthly_contribution: float

def fetch_member(member_id: str) -> dict:
    response = httpx.get(f"{FASTAPI_BASE_URL}/members/{member_id}", timeout=10)
    response.raise_for_status()
    return response.json()

def fetch_estimate(member_id: str, current_age: int, retirement_age: int, monthly_contribution: float) -> dict:
    response = httpx.post(
        f"{FASTAPI_BASE_URL}/retirement/estimate",
        json={
            "member_id": member_id,
            "current_age": current_age,
            "retirement_age": retirement_age,
            "monthly_contribution": monthly_contribution,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

get_member_tool = StructuredTool.from_function(
    func=fetch_member,
    name="get_member",
    description="Fetch pension fund member details from the FastAPI pension service.",
    args_schema=MemberArgs,
)

estimate_tool = StructuredTool.from_function(
    func=fetch_estimate,
    name="estimate_retirement",
    description="Calculate a retirement projection using the pension service.",
    args_schema=EstimateArgs,
)
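Before handing these tools to an agent, you can invoke them directly; HTTP or schema failures are much easier to localize here than inside an agent run:

# Exercise the tools standalone; both return the parsed JSON payloads.
print(get_member_tool.invoke({"member_id": "M123"}))
print(estimate_tool.invoke({
    "member_id": "M123",
    "current_age": 34,
    "retirement_age": 60,
    "monthly_contribution": 1200,
}))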
  3. Create the LangChain agent that uses those tools.

Use a chat model plus an agent executor. The agent decides when to call your FastAPI endpoints and how to present the result to the user.

from langchain_openai import ChatOpenAI
from langchain.agents import create_tool_calling_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a pension assistant for startup customer support. Use tools for any account or projection lookup."),
    ("human", "{input}"),
    # create_tool_calling_agent requires an agent_scratchpad placeholder
    # to carry intermediate tool calls and their results.
    ("placeholder", "{agent_scratchpad}"),
])

tools = [get_member_tool, estimate_tool]
agent = create_tool_calling_agent(llm=llm, tools=tools, prompt=prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = executor.invoke({
    "input": "Get details for member M123 and estimate retirement with age 34 to 60 and monthly contribution 1200."
})

print(result["output"])
  4. Expose the agent through FastAPI so startups can call one endpoint.

This is the pattern you want in production: your app receives a request, forwards it to the LangChain agent, and returns a structured answer. That keeps your frontend and downstream systems simple.

from fastapi import FastAPI
from pydantic import BaseModel

agent_app = FastAPI(title="Pension AI Agent")

class ChatRequest(BaseModel):
    message: str

@agent_app.post("/agent/chat")
def chat(req: ChatRequest):
    response = executor.invoke({"input": req.message})
    return {"answer": response["output"]}
  5. Run both services and keep them separated.

Run the pension API on one port and the agent API on another during development. In production, put them behind your gateway and secure both with auth.

uvicorn pension_api:app --reload --port 8000
uvicorn agent_api:agent_app --reload --port 8001
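For the auth mentioned above, a minimal sketch of a shared-secret header check on the agent service (the X-API-Key header and AGENT_API_KEY environment variable are assumptions; substitute your gateway's real scheme):

import os
from fastapi import Depends, Header, HTTPException

def require_api_key(x_api_key: str = Header(...)):
    # FastAPI maps the x_api_key parameter to the X-API-Key header.
    # Compare against a key from the environment; replace with OAuth2
    # or gateway-issued JWTs in production.
    if x_api_key != os.environ.get("AGENT_API_KEY"):
        raise HTTPException(status_code=401, detail="Invalid API key")

# Shown here as a separate route for clarity; in practice, add the
# dependency to the existing /agent/chat handler.
@agent_app.post("/agent/chat", dependencies=[Depends(require_api_key)])
def chat_secured(req: ChatRequest):
    response = executor.invoke({"input": req.message})
    return {"answer": response["output"]}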

Testing the Integration

Hit the agent endpoint with a real request and confirm it triggers both the LangChain tool call and the underlying FastAPI endpoint.

import httpx

response = httpx.post(
    "http://localhost:8001/agent/chat",
    json={
        "message": "Look up member M123 and tell me their estimated retirement value using age 34 to 60 with monthly contribution 1200."
    },
    timeout=60,  # agent runs involve an LLM call and can exceed httpx's 5s default
)

print(response.status_code)
print(response.json())

Expected output (exact wording varies because the model composes the answer; the figures follow from the placeholder formula: 1200 × 12 × 26 × 1.05 = 393,120):

{
  "answer": "Member M123 is Amina Patel with a fund balance of 128400.5 and active status. The estimated retirement value is 393120.0 over 26 years."
}

Real-World Use Cases

  • Pension support agents that answer member questions about balances, eligibility, contribution history, and projected benefits.
  • Internal advisor copilots that summarize policy data before a human review call.
  • Document-aware workflows that combine policy PDFs, benefit rules, and live API calls for faster case handling.

By Cyprian Aarons, AI Consultant at Topiax.