How to Integrate FastAPI for Pension Funds with LangChain for AI Agents

By Cyprian Aarons
Updated 2026-04-21

Why this integration matters

Pension fund workflows are full of repetitive, policy-heavy queries: contribution status, benefit estimates, eligibility checks, document retrieval, and case triage. FastAPI gives you a clean service layer for those operations, while LangChain lets an AI agent decide when to call those services and how to turn the results into a usable answer.

The useful pattern here is simple: FastAPI exposes pension fund data and actions as stable HTTP endpoints, and LangChain wraps those endpoints as tools an agent can invoke. That gives you a controlled AI layer without handing the model direct database access.

Prerequisites

  • Python 3.10+
  • A running FastAPI service for your pension fund domain
  • pip install fastapi uvicorn langchain langchain-openai httpx pydantic
  • An OpenAI API key set in your environment:
    • export OPENAI_API_KEY=...
  • A pension fund API with endpoints like:
    • GET /members/{member_id}
    • GET /benefits/{member_id}/estimate
    • POST /cases/triage
  • Basic familiarity with:
    • FastAPI(), path operations like @app.get()
    • LangChain tools via @tool
    • Agent creation with create_openai_tools_agent() and AgentExecutor

Integration Steps

1) Expose pension fund capabilities through FastAPI

Start by wrapping the pension fund operations behind explicit HTTP endpoints. Keep the contract narrow so the agent only sees what it needs.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Pension Fund API")

class TriageRequest(BaseModel):
    member_id: str
    issue_type: str
    description: str

@app.get("/members/{member_id}")
async def get_member(member_id: str):
    # Replace with real DB/service lookup
    if member_id != "M1001":
        raise HTTPException(status_code=404, detail="Member not found")

    return {
        "member_id": member_id,
        "name": "A. Moyo",
        "status": "active",
        "fund": "Retirement Plus"
    }

@app.get("/benefits/{member_id}/estimate")
async def estimate_benefit(member_id: str, salary: float, years_of_service: int):
    # Simplified 1.5%-per-year accrual for demo purposes; replace with the
    # fund's actual benefit rules
    monthly_estimate = salary * 0.015 * years_of_service
    return {
        "member_id": member_id,
        "estimated_monthly_benefit": round(monthly_estimate, 2)
    }

@app.post("/cases/triage")
async def triage_case(payload: TriageRequest):
    # Replace with real classification/routing logic
    return {
        "case_id": "CASE-90021",
        "priority": "medium",
        "routing_team": "member-services",
        "summary": payload.description
    }

Run it:

uvicorn app:app --reload --port 8000
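
Before adding the agent layer, it's worth confirming the endpoints respond. A quick sanity check with httpx (the same client the tools will use; the URL and member ID match the mock data above):

import httpx

print(httpx.get("http://localhost:8000/members/M1001").json())
print(httpx.get(
    "http://localhost:8000/benefits/M1001/estimate",
    params={"salary": 42000, "years_of_service": 18},
).json())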

2) Wrap FastAPI endpoints as LangChain tools

LangChain agents work best when each external action is a tool with a clear input/output boundary. Use @tool and call your FastAPI service with httpx.

import os
import httpx
from langchain_core.tools import tool

BASE_URL = os.getenv("PENSION_API_URL", "http://localhost:8000")

@tool
def get_member_tool(member_id: str) -> dict:
    """Fetch pension member details by member ID."""
    response = httpx.get(f"{BASE_URL}/members/{member_id}", timeout=10)
    response.raise_for_status()
    return response.json()

@tool
def estimate_benefit_tool(member_id: str, salary: float, years_of_service: int) -> dict:
    """Estimate monthly pension benefit for a member."""
    params = {"salary": salary, "years_of_service": years_of_service}
    response = httpx.get(
        f"{BASE_URL}/benefits/{member_id}/estimate",
        params=params,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

@tool
def triage_case_tool(member_id: str, issue_type: str, description: str) -> dict:
    """Create a triaged support case for a pension fund member."""
    payload = {
        "member_id": member_id,
        "issue_type": issue_type,
        "description": description,
    }
    response = httpx.post(f"{BASE_URL}/cases/triage", json=payload, timeout=10)
    response.raise_for_status()
    return response.json()

This is the main integration boundary. The model never touches your backend directly; it only calls tools that you control.
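
In production you will likely want tools that fail gracefully rather than raise, so the agent can explain the problem instead of aborting the run. A minimal sketch of that pattern for the member lookup (the error shape here is an assumption, not a LangChain convention):

import os
import httpx
from langchain_core.tools import tool

BASE_URL = os.getenv("PENSION_API_URL", "http://localhost:8000")

@tool
def get_member_safe_tool(member_id: str) -> dict:
    """Fetch pension member details; returns a structured error instead of raising."""
    try:
        response = httpx.get(f"{BASE_URL}/members/{member_id}", timeout=10)
        response.raise_for_status()
        return response.json()
    except httpx.HTTPStatusError as exc:
        # Surface the failure as data so the agent can explain it to the user
        return {"error": f"Lookup failed with HTTP {exc.response.status_code}"}
    except httpx.RequestError as exc:
        return {"error": f"Pension API unreachable: {exc}"}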

3) Build the LangChain agent around those tools

Now wire the tools into an agent using LangChain’s OpenAI tool-calling stack. This gives you natural-language routing with deterministic tool execution.

from langchain_openai import ChatOpenAI
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", 
     "You are a pension fund assistant. "
     "Use tools for factual data lookup and case creation. "
     "Do not guess numbers."),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

tools = [get_member_tool, estimate_benefit_tool, triage_case_tool]

agent = create_openai_tools_agent(llm=llm, tools=tools, prompt=prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

At this point the agent can answer questions like:

  • “Look up member M1001”
  • “Estimate retirement benefit for M1001 using salary 42000 and 18 years”
  • “Create a support case for contribution mismatch”

4) Call the agent from your application layer

Keep your application code thin. Your API or worker should pass user text to the executor and return the result.

def handle_user_query(query: str) -> str:
    result = executor.invoke({"input": query})
    return result["output"]

if __name__ == "__main__":
    print(handle_user_query(
        "Check member M1001 and estimate their monthly benefit using salary 42000 and 18 years of service."
    ))

This is where LangChain adds value: one user request can trigger multiple backend calls in sequence without you hand-coding every branch.
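
If your application layer is itself a FastAPI service, you can put the agent behind one more endpoint. A minimal sketch, assuming the executor from step 3 is importable; the /assistant/query route and QueryRequest model are illustrative names, not part of the pension API above:

from fastapi import FastAPI
from pydantic import BaseModel

assistant_app = FastAPI(title="Pension Assistant API")

class QueryRequest(BaseModel):
    query: str

@assistant_app.post("/assistant/query")
async def assistant_query(payload: QueryRequest):
    # ainvoke keeps the event loop free while the agent calls tools
    result = await executor.ainvoke({"input": payload.query})
    return {"answer": result["output"]}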

Testing the Integration

Use a simple smoke test against the running FastAPI service and agent.

if __name__ == "__main__":
    query = (
        "Get details for member M1001, then estimate their monthly benefit "
        "using salary 42000 and 18 years of service."
    )
    output = handle_user_query(query)
    print(output)

The agent's final answer is model-generated, so wording varies, but it should look something like this:

Member M1001 belongs to A. Moyo and is active under Retirement Plus.
Estimated monthly benefit is 11340.0.

If you want to test just the tool layer before involving the agent:

print(get_member_tool.invoke({"member_id": "M1001"}))
print(estimate_benefit_tool.invoke({
    "member_id": "M1001",
    "salary": 42000,
    "years_of_service": 18
}))

That should return JSON dictionaries from your FastAPI service.
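
You can also exercise the FastAPI layer on its own, without a running server, using FastAPI's TestClient. A minimal sketch, assuming the service code from step 1 lives in app.py:

from fastapi.testclient import TestClient
from app import app

client = TestClient(app)

def test_get_member():
    response = client.get("/members/M1001")
    assert response.status_code == 200
    assert response.json()["status"] == "active"

def test_unknown_member_returns_404():
    response = client.get("/members/M9999")
    assert response.status_code == 404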

Real-World Use Cases

  • Member self-service assistant
    • Answer questions about account status, contribution history, and estimated retirement benefits.
  • Case triage automation
    • Classify incoming complaints or requests and route them to the correct internal team.
  • Advisor support copilot
    • Help call-center or back-office staff retrieve records faster while keeping all actions behind audited API endpoints.

The production pattern is straightforward: FastAPI owns business operations, LangChain owns orchestration. Keep each endpoint explicit, keep tool inputs typed, and keep anything sensitive behind server-side authorization checks.
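
For the authorization point specifically, FastAPI dependencies are the natural hook. A minimal sketch using a static API-key header; the header name and the PENSION_API_KEY environment variable are assumptions, and a real deployment would use OAuth2/JWT plus per-member entitlements:

import os
from fastapi import Depends, HTTPException, Security
from fastapi.security import APIKeyHeader

api_key_header = APIKeyHeader(name="X-API-Key")

def require_api_key(api_key: str = Security(api_key_header)) -> None:
    # Server-side secret; the model and the agent never see this value
    if api_key != os.environ["PENSION_API_KEY"]:
        raise HTTPException(status_code=403, detail="Invalid API key")

# Attach the check to sensitive routes, e.g.:
# @app.get("/members/{member_id}", dependencies=[Depends(require_api_key)])

The tool layer would then need to send the same header with each request, which keeps the credential on the server side rather than in the model's context.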


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.

