How to Integrate CrewAI for wealth management with FastAPI for startups
CrewAI for wealth management gives you the agent layer for portfolio analysis, client profiling, and recommendation workflows. FastAPI gives you the HTTP surface to expose those workflows to your product, internal tools, or partner systems.
Put them together and you get a clean pattern for startup-grade financial assistants: an API that accepts client context, routes it through a CrewAI workflow, and returns structured advice your app can use immediately.
Prerequisites
- Python 3.10+
- fastapi
- uvicorn
- crewai
- A configured LLM provider for CrewAI, such as OpenAI or Anthropic
- Basic knowledge of REST APIs and Python async/sync boundaries
- A project structure like:
  - app/main.py
  - app/agents.py
  - app/schemas.py
Install the dependencies:
pip install fastapi uvicorn crewai pydantic
If your CrewAI wealth management setup uses tools like market data or portfolio sources, configure those credentials first. Keep secrets in environment variables, not hardcoded in the app.
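For example, in a local shell (the variable names here are assumptions; use whatever your provider and data tools expect):

```shell
# Hypothetical variable names; substitute your provider's expected keys.
export OPENAI_API_KEY="..."        # never commit real keys to source control
export MODEL_NAME="gpt-4o-mini"    # optional model override
```

In deployment, set these through your container environment or secret manager instead of a shell profile.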
Integration Steps
1) Define the request and response schemas
Start with strict input/output contracts. Wealth workflows need typed payloads because you do not want free-form JSON drifting into production logic.
# app/schemas.py
from pydantic import BaseModel, Field
from typing import List, Optional

class ClientProfile(BaseModel):
    age: int = Field(..., ge=18)
    risk_tolerance: str
    investment_horizon_years: int = Field(..., ge=1)
    assets_under_management: float = Field(..., ge=0)

class WealthRequest(BaseModel):
    client_id: str
    profile: ClientProfile
    goals: List[str]
    constraints: Optional[List[str]] = []

class WealthResponse(BaseModel):
    summary: str
    allocation_recommendation: str
    next_actions: List[str]
This gives FastAPI automatic validation and makes your agent output easier to test.
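As a quick sanity check, the Field constraints reject out-of-range values before any agent logic runs. A minimal standalone sketch (ClientProfile is redefined locally, and try_profile is a hypothetical helper, so the snippet runs on its own):

```python
from pydantic import BaseModel, Field, ValidationError

class ClientProfile(BaseModel):
    age: int = Field(..., ge=18)
    risk_tolerance: str
    investment_horizon_years: int = Field(..., ge=1)
    assets_under_management: float = Field(..., ge=0)

def try_profile(data: dict):
    """Return (profile, errors); exactly one of the two is None."""
    try:
        return ClientProfile(**data), None
    except ValidationError as exc:
        return None, exc.errors()

bad, errs = try_profile({
    "age": 16,  # violates ge=18, so validation fails before any agent runs
    "risk_tolerance": "moderate",
    "investment_horizon_years": 10,
    "assets_under_management": 250000.0,
})
assert bad is None and len(errs) == 1
```

FastAPI performs the same check automatically and returns a 422 response, so invalid payloads never reach the crew.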
2) Build the CrewAI wealth management workflow
Create agents and a task that turns client context into an advisory output. In CrewAI, the core primitives are Agent, Task, and Crew.
# app/agents.py
from crewai import Agent, Task, Crew, Process

wealth_advisor = Agent(
    role="Wealth Advisor",
    goal="Analyze client profile and produce suitable investment guidance",
    backstory=(
        "You are a wealth management specialist focused on startup clients "
        "with clear risk profiles and practical recommendations."
    ),
    verbose=True,
)

portfolio_strategist = Agent(
    role="Portfolio Strategist",
    goal="Translate client goals into a concise allocation strategy",
    backstory="You design allocations based on horizon, risk tolerance, and constraints.",
    verbose=True,
)

def build_wealth_crew(client_payload: str) -> Crew:
    task = Task(
        description=(
            "Review this client data and produce a summary, allocation recommendation, "
            f"and next actions:\n\n{client_payload}"
        ),
        expected_output="A structured wealth management recommendation.",
        agent=wealth_advisor,
    )
    strategy_task = Task(
        description="Refine the recommendation into an actionable portfolio strategy.",
        expected_output="A concise allocation plan with next steps.",
        agent=portfolio_strategist,
        context=[task],
    )
    return Crew(
        agents=[wealth_advisor, portfolio_strategist],
        tasks=[task, strategy_task],
        process=Process.sequential,
        verbose=True,
    )
The key point here is that the FastAPI layer serializes the validated request into a string and passes it into this crew as task context.
3) Expose the crew through a FastAPI endpoint
Keep the API layer thin. The endpoint should validate input, invoke the crew synchronously or in a worker thread, then return typed results.
# app/main.py
from fastapi import FastAPI, HTTPException
from app.schemas import WealthRequest, WealthResponse
from app.agents import build_wealth_crew

app = FastAPI(title="Wealth Management AI API")

@app.post("/wealth/advice", response_model=WealthResponse)
def get_wealth_advice(payload: WealthRequest):
    try:
        client_text = payload.model_dump_json(indent=2)
        crew = build_wealth_crew(client_text)
        result = crew.kickoff()
        return WealthResponse(
            summary=str(result),
            allocation_recommendation="Use the agent output above as the base recommendation.",
            next_actions=[
                "Review compliance constraints",
                "Validate against internal model portfolio rules",
                "Send to advisor dashboard",
            ],
        )
    except Exception as exc:
        raise HTTPException(status_code=500, detail=str(exc)) from exc
crew.kickoff() is the main execution call you want here. It runs the sequential tasks and returns the final output from the crew.
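Because kickoff() blocks, an async endpoint should push it into a worker thread so the event loop stays responsive. A sketch using asyncio.to_thread (Python 3.9+), with the crew call stubbed out by a placeholder run_crew_sync so it runs without CrewAI or API keys:

```python
import asyncio
import json

def run_crew_sync(client_text: str) -> str:
    # Stand-in for build_wealth_crew(client_text).kickoff();
    # stubbed here so the sketch runs without an LLM provider.
    client = json.loads(client_text)
    return f"advice for {client['client_id']}"

async def get_wealth_advice_async(payload: dict) -> str:
    client_text = json.dumps(payload)
    # asyncio.to_thread keeps the blocking crew call off the event loop
    return await asyncio.to_thread(run_crew_sync, client_text)

result = asyncio.run(get_wealth_advice_async({"client_id": "cli_001"}))
```

Note that FastAPI already runs plain def endpoints in a threadpool, so the synchronous handler above is also safe; the async variant matters when the rest of your endpoint awaits other I/O.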
4) Add startup-safe configuration for secrets
Do not bake model keys into code. Use environment variables so your deployment pipeline can rotate them without touching application logic.
# app/config.py
import os

class Settings:
    OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
    MODEL_NAME = os.getenv("MODEL_NAME", "gpt-4o-mini")

settings = Settings()
Then load these settings before creating your agents if your CrewAI setup depends on explicit LLM configuration. In production, wire this through your container environment or secret manager.
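One pattern worth adding here is a fail-fast check at startup, so a missing secret surfaces immediately rather than mid-request. A minimal stdlib sketch (check_required_env and the variable name are assumptions, not CrewAI APIs):

```python
import os

def check_required_env(names, env=None):
    """Raise at startup if any required secret is missing or empty."""
    env = os.environ if env is None else env
    missing = [name for name in names if not env.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing environment variables: {', '.join(missing)}"
        )

# Call before building agents, e.g. at module import in app/agents.py:
# check_required_env(["OPENAI_API_KEY"])
```

This turns a confusing mid-request LLM auth error into a clear crash at boot, which your orchestrator can catch and report.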
5) Run FastAPI and connect it to your frontend or internal service
Start the server with Uvicorn:
uvicorn app.main:app --reload --port 8000
Then call it from any internal service using requests or from your frontend through fetch/Axios.
import requests

payload = {
    "client_id": "cli_001",
    "profile": {
        "age": 34,
        "risk_tolerance": "moderate",
        "investment_horizon_years": 10,
        "assets_under_management": 250000.0,
    },
    "goals": ["retirement planning", "capital preservation"],
    "constraints": ["avoid high-volatility assets"],
}

response = requests.post("http://localhost:8000/wealth/advice", json=payload)
print(response.status_code)
print(response.json())
That is enough to wire an agent workflow behind an API endpoint without overengineering it.
Testing the Integration
Use FastAPI’s built-in test client to verify request validation and response shape. This catches broken schemas early.
# tests/test_main.py
from fastapi.testclient import TestClient
from app.main import app

client = TestClient(app)

def test_wealth_advice_endpoint():
    payload = {
        "client_id": "cli_001",
        "profile": {
            "age": 34,
            "risk_tolerance": "moderate",
            "investment_horizon_years": 10,
            "assets_under_management": 250000.0,
        },
        "goals": ["retirement planning"],
        "constraints": ["avoid high-volatility assets"],
    }
    response = client.post("/wealth/advice", json=payload)
    assert response.status_code == 200
    body = response.json()
    assert "summary" in body
    assert "allocation_recommendation" in body
Expected output:
200 OK
{
"summary": "...",
"allocation_recommendation": "...",
"next_actions": [
"Review compliance constraints",
...
]
}
If you want deterministic tests around agent output, mock crew.kickoff() and assert only on contract fields. Do not snapshot raw LLM text unless you enjoy flaky CI.
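A sketch of that mocking approach with unittest.mock, using a hypothetical handle_request helper that mirrors the endpoint body so the example runs without the full FastAPI stack:

```python
from unittest.mock import MagicMock

def handle_request(build_crew, payload_text: str) -> dict:
    # Mirrors the endpoint: build the crew, run it, wrap the result.
    crew = build_crew(payload_text)
    result = crew.kickoff()
    return {"summary": str(result), "next_actions": []}

# Replace the crew with a mock so the test is fully deterministic.
fake_crew = MagicMock()
fake_crew.kickoff.return_value = "stub advice"
build_crew = MagicMock(return_value=fake_crew)

body = handle_request(build_crew, "{}")
assert body["summary"] == "stub advice"  # assert on contract fields only
fake_crew.kickoff.assert_called_once()
```

In the real test suite you would patch build_wealth_crew where app.main imports it (for example with unittest.mock.patch) and keep the TestClient assertions exactly as above.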
Real-World Use Cases
- Advisor copilot API: let relationship managers submit client profiles and receive draft recommendations before human review.
- Onboarding risk assessment: build an intake service that classifies new clients by risk tolerance and suggests suitable model portfolios.
- Portfolio explanation engine: expose a backend endpoint that translates allocation changes into plain-English explanations for clients or advisors.
The production pattern is simple: FastAPI handles transport and validation, CrewAI handles reasoning and task orchestration. That separation keeps your system maintainable when you add more agents for compliance checks, tax-aware planning, or market commentary later on.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit