How to Integrate FastAPI for wealth management with LangChain for startups
Combining FastAPI for wealth management with LangChain gives you a clean way to expose financial workflows as APIs while letting an LLM reason over them. For startups, that usually means one thing: you can turn portfolio lookup, client onboarding, and investment Q&A into agent-driven endpoints without building a separate orchestration layer.
Prerequisites
- Python 3.10+
- `fastapi`
- `uvicorn`
- `langchain`
- `langchain-openai` (or another LangChain chat model provider)
- A running FastAPI service for your wealth management backend
- API credentials for your model provider
- Basic familiarity with async Python and REST APIs
Install the core packages:
```bash
pip install fastapi uvicorn langchain langchain-openai httpx pydantic
```
Integration Steps
1. Define the wealth management API client in FastAPI
Start by exposing the core wealth operations as typed endpoints. In practice, this is where you wrap portfolio data, risk profiles, and account summaries behind clean routes.
```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from typing import List

app = FastAPI(title="Wealth Management API")

class PortfolioRequest(BaseModel):
    client_id: str

class Asset(BaseModel):
    symbol: str
    quantity: float
    market_value: float

class PortfolioResponse(BaseModel):
    client_id: str
    total_value: float
    assets: List[Asset]

@app.post("/portfolio", response_model=PortfolioResponse)
async def get_portfolio(req: PortfolioRequest):
    # Demo stub; replace with a real data-layer lookup.
    if req.client_id != "client_123":
        raise HTTPException(status_code=404, detail="Client not found")
    return PortfolioResponse(
        client_id=req.client_id,
        total_value=55000.0,  # sum of the asset market values below
        assets=[
            Asset(symbol="AAPL", quantity=50, market_value=10000.0),
            Asset(symbol="VOO", quantity=100, market_value=45000.0),
        ],
    )
```
This is the contract LangChain will call. Keep the schema strict; agents do better when the API returns predictable shapes.
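To make "predictable shapes" concrete, here is a stdlib-only sketch of the defensive parsing you might do on the consumer side before handing API output to an agent; the dataclasses simply mirror the Pydantic models above and are illustrative, not part of FastAPI:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Asset:
    symbol: str
    quantity: float
    market_value: float

@dataclass
class Portfolio:
    client_id: str
    total_value: float
    assets: List[Asset]

def parse_portfolio(payload: dict) -> Portfolio:
    # Fail fast on contract drift instead of letting the agent
    # reason over a malformed payload.
    return Portfolio(
        client_id=payload["client_id"],
        total_value=float(payload["total_value"]),
        assets=[Asset(**a) for a in payload["assets"]],
    )
```

If the backend ever renames or drops a field, this raises immediately instead of producing a silently wrong summary downstream.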
2. Wrap the FastAPI endpoint in a LangChain tool
LangChain tools let the agent call external systems using a function-like interface. For startup use cases, I prefer wrapping HTTP calls rather than binding directly to internal code, because it keeps deployment boundaries clean.
```python
import httpx
from langchain_core.tools import tool

WEALTH_API_URL = "http://localhost:8000"

@tool
async def get_client_portfolio(client_id: str) -> dict:
    """Fetch a client's portfolio from the wealth management API."""
    async with httpx.AsyncClient() as client:
        response = await client.post(
            f"{WEALTH_API_URL}/portfolio",
            json={"client_id": client_id},
            timeout=10.0,
        )
        response.raise_for_status()
        return response.json()
```
That `@tool` decorator is the key LangChain integration point: it turns your FastAPI-backed operation into a callable the model can request during reasoning.
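For intuition, `@tool`-style decorators essentially read the function's signature and docstring to build the JSON-schema description the model sees. Here is a rough stdlib approximation (LangChain's real schema generation is much richer; this is purely illustrative, applied to an undecorated copy of the tool function):

```python
import inspect

def rough_tool_schema(fn) -> dict:
    # Map Python annotations to JSON-schema-ish types; real libraries
    # also handle defaults, Optionals, nested models, and Field metadata.
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    sig = inspect.signature(fn)
    props = {
        name: {"type": type_map.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": list(props)},
    }

# Undecorated copy of the tool function, just to inspect its derived schema.
def get_client_portfolio(client_id: str) -> dict:
    """Fetch a client's portfolio from the wealth management API."""
    ...

print(rough_tool_schema(get_client_portfolio))
```

Seeing the derived schema explains why clear parameter names and docstrings matter: they are the only interface the model gets.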
3. Create the LangChain agent that uses the tool
Now wire the tool into a chat model and let it decide when to call the API. Use a model that supports tool calling; OpenAI-compatible models are the most common path.
```python
import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key=os.environ["OPENAI_API_KEY"],
)

tools = [get_client_portfolio]
llm_with_tools = llm.bind_tools(tools)

async def answer_portfolio_question(question: str):
    messages = [HumanMessage(content=question)]
    response = await llm_with_tools.ainvoke(messages)
    return response
```
At this stage, the model can decide whether to answer directly or request a call to `get_client_portfolio`. Note that `bind_tools` only lets the model emit tool calls; your code is still responsible for executing them and feeding the results back before the model can produce a final answer.
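The code above stops at the model's first reply, so here is a minimal sketch of the execution loop you would typically add. The model call and tools are injected as plain callables so the control flow is clear; the names and message shapes are illustrative, not LangChain's exact API:

```python
import asyncio

async def run_agent_loop(llm_call, tools_by_name, messages, max_turns=5):
    """llm_call: async fn(messages) -> reply with .content and .tool_calls.
    tools_by_name: mapping of tool name -> async callable taking an args dict."""
    reply = await llm_call(messages)
    for _ in range(max_turns):
        calls = getattr(reply, "tool_calls", None) or []
        if not calls:
            break  # the model answered directly
        messages = messages + [reply]
        for call in calls:
            result = await tools_by_name[call["name"]](call["args"])
            # Feed the tool result back so the model can compose an answer.
            messages = messages + [{
                "role": "tool",
                "tool_call_id": call["id"],
                "content": str(result),
            }]
        reply = await llm_call(messages)
    return reply
```

With LangChain specifically, you would append the `AIMessage` plus one `ToolMessage` per call and re-invoke `llm_with_tools`; the shape of the loop is the same.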
4. Add an orchestration endpoint in FastAPI for agent requests
Expose an AI endpoint that accepts natural language and routes it through LangChain. This gives your startup one API surface for both structured finance data and conversational intelligence.
```python
from fastapi import Body

@app.post("/agent/portfolio-summary")
async def portfolio_summary(prompt: str = Body(..., embed=True)):
    result = await answer_portfolio_question(prompt)
    return {
        "prompt": prompt,
        "response": result.content,
        "tool_calls": [
            {
                "name": tc["name"],
                "args": tc["args"],
                "id": tc["id"],
            }
            for tc in getattr(result, "tool_calls", [])
        ],
    }
```
This pattern works well when you want product teams to consume one endpoint instead of stitching together separate AI and finance services.
5. Make the workflow production-safe
For real deployments, add auth, request tracing, retries, and timeouts around the HTTP boundary. Also validate what comes back from tools before passing it into downstream prompts.
```python
from pydantic import BaseModel, Field

class AgentRequest(BaseModel):
    prompt: str = Field(min_length=1)

@app.post("/agent/secure-summary")
async def secure_summary(req: AgentRequest):
    result = await answer_portfolio_question(req.prompt)
    if not result.content:
        raise HTTPException(status_code=500, detail="Empty agent response")
    return {"answer": result.content}
```
The main rule here is simple: treat every tool call like an untrusted network dependency, even if it lives in your own stack.
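For the retries mentioned above, a small jittered-backoff wrapper around the tool's HTTP call is often enough at startup scale. A stdlib-only sketch, where the attempt count and delays are arbitrary defaults:

```python
import asyncio
import random

async def with_retries(make_call, attempts=3, base_delay=0.2):
    """Await make_call(); on failure, retry with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return await make_call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the real error
            delay = base_delay * (2 ** attempt) * (1 + random.random())
            await asyncio.sleep(delay)
```

Inside `get_client_portfolio` you would wrap the httpx call, e.g. `await with_retries(lambda: client.post(...))`, and keep the per-request timeout so a hung backend cannot stall the agent.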
Testing the Integration
Run the service locally (the portfolio API and the agent endpoint live in the same app):

```bash
uvicorn main:app --reload --port 8000
```
Then test the agent endpoint:
```python
import asyncio
import httpx

async def test():
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "http://localhost:8000/agent/portfolio-summary",
            json={"prompt": "Summarize client_123's portfolio in one paragraph."},
        )
        print(resp.status_code)
        print(resp.json())

asyncio.run(test())
```
Expected output:
```json
{
  "prompt": "Summarize client_123's portfolio in one paragraph.",
  "response": "...",
  "tool_calls": [
    {
      "name": "get_client_portfolio",
      "args": {"client_id": "client_123"},
      "id": "call_..."
    }
  ]
}
```
If everything is wired correctly, you should see a tool call to get_client_portfolio in the payload. Note that when the model opts to call the tool, `response` may be empty: the single-shot agent above returns the model's first message, so the natural-language summary only appears once you execute the requested tool call and send its result back for a follow-up completion.
Real-World Use Cases
- **Client portfolio assistant**: Let relationship managers ask questions like "What changed in this account this quarter?" and get structured answers backed by live FastAPI data.
- **Onboarding copilot**: Combine KYC endpoints with LangChain to guide users through missing documents, risk profiling, and account setup.
- **Advisor support bot**: Build an internal assistant that pulls holdings, performance metrics, and policy constraints before drafting recommendations.
The practical win here is speed with control. FastAPI gives you typed financial APIs; LangChain gives you agent behavior on top of those APIs without turning your backend into prompt spaghetti.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.