# How to Integrate CrewAI with FastAPI for Pension Fund AI Agents
Combining CrewAI with FastAPI gives you a clean way to expose multi-agent pension fund workflows as production HTTP services. That matters when your pension ops team needs automated document triage, member query handling, or compliance checks behind a stable API that can be called from portals, back offices, or internal systems.

FastAPI handles the request/response layer; CrewAI handles the agent orchestration, task planning, and tool execution. Put them together and you get an AI service that is easier to secure, test, and integrate than a one-off script.
## Prerequisites
- Python 3.10 or later
- A FastAPI project set up with `uvicorn`
- CrewAI installed and configured
- An API key or model access configured in environment variables
- Basic familiarity with async Python and Pydantic
- A pension-fund-specific CrewAI setup with agents and tasks defined
- Optional: Redis or a queue if you plan to run long-running jobs asynchronously
Install the core packages:

```bash
pip install fastapi uvicorn crewai pydantic python-dotenv
```
## Integration Steps
**Step 1: Create your FastAPI app and load configuration**

Keep secrets out of code. Use environment variables for model keys and any pension-fund-specific configuration.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from dotenv import load_dotenv

load_dotenv()

app = FastAPI(title="Pension Fund AI Agent API")

class MemberQueryRequest(BaseModel):
    member_id: str
    query: str
```
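If you would rather have Pydantic reject empty fields at parse time (instead of checking them by hand in the handler), constraints can live on the model itself. A minimal sketch using Pydantic's `Field`:

```python
from pydantic import BaseModel, Field, ValidationError

class MemberQueryRequest(BaseModel):
    # min_length=1 makes Pydantic reject empty strings before
    # the request ever reaches your handler
    member_id: str = Field(min_length=1)
    query: str = Field(min_length=1)

try:
    MemberQueryRequest(member_id="", query="test")
except ValidationError:
    print("rejected empty member_id")
```

With constraints on the model, FastAPI returns a 422 with a structured error body automatically, so the manual `HTTPException` check in the endpoint becomes optional.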
**Step 2: Define CrewAI agents and tasks for pension fund workflows**

A common pattern is one agent for policy/compliance review and another for member support or document analysis.

```python
from crewai import Agent, Task, Crew, Process

compliance_agent = Agent(
    role="Pension Compliance Analyst",
    goal="Review member requests against pension fund rules",
    backstory="You validate requests against fund policies, regulations, and internal controls.",
    verbose=True,
)

support_agent = Agent(
    role="Member Support Specialist",
    goal="Draft clear responses to pension members",
    backstory="You explain pension outcomes in plain language without changing policy meaning.",
    verbose=True,
)

review_task = Task(
    description=(
        "Review this member query: {query}. "
        "Check if it appears eligible under pension fund rules and summarize next steps."
    ),
    expected_output="A concise compliance review with recommendation and rationale.",
    agent=compliance_agent,
)

response_task = Task(
    description=(
        "Write a member-facing response based on the compliance review. "
        "Keep it clear, professional, and accurate."
    ),
    expected_output="A member-ready response message.",
    agent=support_agent,
)
```
**Step 3: Wrap the CrewAI execution in a service function**

This is the bridge between your HTTP layer and your agent workflow. If your version of CrewAI uses different constructor names or execution methods, keep the same shape: build the crew in one place, run it, and return a serializable result that the endpoint can pass through.

```python
def run_pension_workflow(member_id: str, query: str) -> dict:
    crew = Crew(
        agents=[compliance_agent, support_agent],
        tasks=[review_task, response_task],
        process=Process.sequential,
        verbose=True,
    )
    result = crew.kickoff(inputs={
        "member_id": member_id,
        "query": query,
    })
    return {
        "member_id": member_id,
        "result": str(result),
    }
```
**Step 4: Expose the workflow through a FastAPI endpoint**

This gives you a stable API contract for frontends and downstream systems.

```python
from fastapi import HTTPException

@app.post("/pension/query")
async def handle_member_query(payload: MemberQueryRequest):
    if not payload.member_id or not payload.query:
        raise HTTPException(status_code=400, detail="member_id and query are required")
    output = run_pension_workflow(payload.member_id, payload.query)
    return {
        "status": "success",
        "data": output,
    }
```
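One caveat: `crew.kickoff` is a blocking call, and calling it directly inside an `async def` handler freezes the event loop for the duration of the workflow. The simple fix is to push it onto a worker thread. This sketch uses the stdlib's `asyncio.to_thread` with a stand-in for `run_pension_workflow` (an assumption, so the example runs without CrewAI installed):

```python
import asyncio

# Stand-in for run_pension_workflow; the real one calls crew.kickoff().
def run_pension_workflow(member_id: str, query: str) -> dict:
    return {"member_id": member_id, "result": "..."}

async def handle_member_query(member_id: str, query: str) -> dict:
    # to_thread runs the blocking workflow on a worker thread,
    # keeping the event loop free to serve other requests
    output = await asyncio.to_thread(run_pension_workflow, member_id, query)
    return {"status": "success", "data": output}

result = asyncio.run(handle_member_query("M12345", "Can I withdraw early?"))
```

Alternatively, declare the endpoint with plain `def` instead of `async def`; FastAPI then runs it in its own threadpool for you.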
**Step 5: Add a health check and run the server**

Health checks matter in production because orchestration layers need to know whether your service is alive before sending traffic.

```python
@app.get("/health")
async def health():
    return {"status": "ok"}
```
Run it:

```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
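If your workflows run longer than a typical HTTP timeout (the Redis/queue option from the prerequisites), a submit-then-poll pattern works well: the POST enqueues a job and returns an ID, and a GET reports its status. Here is a stdlib-only sketch of that pattern; the `submit_job`/`job_status` names are illustrative, the workflow function is a stand-in, and in production the job store would be Redis or a real task queue rather than an in-process dict:

```python
import uuid
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)
jobs: dict = {}  # job_id -> Future; use Redis or a queue in production

def run_pension_workflow(member_id: str, query: str) -> dict:
    # Stand-in for the CrewAI workflow.
    return {"member_id": member_id, "result": "..."}

def submit_job(member_id: str, query: str) -> str:
    # Called from the POST endpoint: enqueue and return immediately.
    job_id = str(uuid.uuid4())
    jobs[job_id] = executor.submit(run_pension_workflow, member_id, query)
    return job_id

def job_status(job_id: str) -> dict:
    # Called from a GET /pension/jobs/{job_id} endpoint.
    future = jobs[job_id]
    if not future.done():
        return {"status": "running"}
    return {"status": "done", "data": future.result()}

job_id = submit_job("M12345", "Can I withdraw early?")
print(job_status(job_id))  # "running" until the worker finishes
```

The client polls the status endpoint with the returned ID instead of holding an open connection while the crew runs.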
## Testing the Integration
Use curl or any HTTP client to verify the endpoint returns an agent-generated result.
```bash
curl -X POST "http://localhost:8000/pension/query" \
  -H "Content-Type: application/json" \
  -d '{
    "member_id": "M12345",
    "query": "Can I withdraw part of my pension before retirement?"
  }'
```
Expected output:

```json
{
  "status": "success",
  "data": {
    "member_id": "M12345",
    "result": "..."
  }
}
```
If you want to test inside Python:

```python
from fastapi.testclient import TestClient
from main import app

client = TestClient(app)

response = client.post("/pension/query", json={
    "member_id": "M12345",
    "query": "Can I withdraw part of my pension before retirement?"
})

print(response.status_code)
print(response.json())
```
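The TestClient call above runs the real crew, which means every test run hits the model. For fast, deterministic tests, stub the workflow function out with `unittest.mock`. In this sketch a `SimpleNamespace` stands in for the imported `main` module so the example is self-contained (an assumption); in a real suite you would patch `"main.run_pension_workflow"` directly:

```python
import types
from unittest import mock

# Stand-in for the main module, so this sketch runs without
# FastAPI or CrewAI installed.
main = types.SimpleNamespace(
    run_pension_workflow=lambda member_id, query: {"member_id": member_id, "result": "real"},
)

def fake_workflow(member_id: str, query: str) -> dict:
    # Deterministic stub: tests never call the LLM-backed crew.
    return {"member_id": member_id, "result": "stubbed review"}

with mock.patch.object(main, "run_pension_workflow", fake_workflow):
    # Inside the patch, any caller of the workflow gets the stub.
    out = main.run_pension_workflow("M12345", "Can I withdraw early?")

print(out["result"])  # stubbed review
```

`mock.patch.object` restores the original function when the `with` block exits, so other tests still exercise the real path if they need to.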
## Real-World Use Cases
- Member query triage: route questions about withdrawals, beneficiaries, contributions, or retirement eligibility to specialized agents.
- Compliance-first document review: analyze forms, letters, or claim requests before they reach human reviewers.
- Advisor support APIs: expose an internal endpoint that drafts policy-aligned explanations for call center staff or financial advisors.
The useful pattern here is simple: FastAPI owns transport and validation; CrewAI owns reasoning and task coordination. Keep that boundary clean, and you get an AI agent service that is easier to maintain than embedding orchestration logic directly in route handlers.
## Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.