How to Integrate CrewAI for pension funds with FastAPI for RAG

By Cyprian Aarons · Updated 2026-04-22

Combining CrewAI with FastAPI gives you a clean way to expose pension-specific AI workflows behind an API. The pattern is useful when you need an agent that can answer member questions, summarize policy documents, or retrieve plan rules from a RAG pipeline without wiring the whole system into a monolith.

The practical win is this: FastAPI handles request validation, auth, and HTTP delivery, while CrewAI orchestrates the retrieval and reasoning steps. That split keeps your pension assistant maintainable when compliance, document sources, and agent behavior change.

Prerequisites

  • Python 3.10+
  • A working FastAPI project
  • crewai installed
  • uvicorn installed
  • A vector store or retriever for your pension documents
  • Access to an LLM provider configured in your environment
  • Pension fund policy PDFs, FAQs, contribution rules, or benefit guides already indexed for RAG

Install the core dependencies:

pip install fastapi uvicorn crewai crewai-tools pydantic

If you are using OpenAI-compatible models for the agent and embeddings, set environment variables like:

export OPENAI_API_KEY="your-key"

Integration Steps

1) Define the RAG retriever for pension documents

Your retriever should return relevant chunks from pension plan documents. Keep this layer separate from the agent so you can swap vector stores later.

from typing import List
from dataclasses import dataclass

@dataclass
class DocumentChunk:
    text: str
    source: str

class PensionRetriever:
    def search(self, query: str, top_k: int = 4) -> List[DocumentChunk]:
        # Replace this with Pinecone, FAISS, Chroma, pgvector, etc.
        return [
            DocumentChunk(
                text="Members can retire at age 60 with 15 years of service under Plan A.",
                source="plan_a_benefits.pdf"
            ),
            DocumentChunk(
                text="Voluntary contributions are capped at 10% of monthly salary.",
                source="contributions_policy.pdf"
            ),
        ]
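Before committing to a vector database, you can swap in an in-memory retriever that ranks chunks by keyword overlap. That is enough to exercise the rest of the pipeline end to end (a sketch only: overlap scoring is a stand-in for real semantic search, and the sample chunks are illustrative):

```python
from typing import List
from dataclasses import dataclass

@dataclass
class DocumentChunk:
    text: str
    source: str

class KeywordPensionRetriever:
    """Stand-in retriever that ranks chunks by word overlap with the query."""

    def __init__(self, chunks: List[DocumentChunk]):
        self.chunks = chunks

    def search(self, query: str, top_k: int = 4) -> List[DocumentChunk]:
        query_words = set(query.lower().split())
        scored = [
            (len(query_words & set(chunk.text.lower().split())), chunk)
            for chunk in self.chunks
        ]
        # Highest overlap first; drop chunks sharing no words with the query.
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [chunk for score, chunk in scored[:top_k] if score > 0]

retriever = KeywordPensionRetriever([
    DocumentChunk(
        text="Members can retire at age 60 with 15 years of service under Plan A.",
        source="plan_a_benefits.pdf",
    ),
    DocumentChunk(
        text="Voluntary contributions are capped at 10% of monthly salary.",
        source="contributions_policy.pdf",
    ),
])

print(retriever.search("What is the voluntary contribution cap?")[0].source)
# → contributions_policy.pdf
```

Because it exposes the same `search(query, top_k)` signature as `PensionRetriever`, you can drop it in behind the CrewAI tool and replace it with a real vector store later without touching the agent.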

2) Wrap retrieval as a CrewAI tool

CrewAI works best when retrieval is exposed as a tool. That lets the agent call it only when needed instead of stuffing every document into the prompt.

from crewai.tools import tool

retriever = PensionRetriever()

@tool("search_pension_knowledge_base")
def search_pension_knowledge_base(query: str) -> str:
    """Search pension fund documents and return relevant context."""
    chunks = retriever.search(query=query, top_k=4)
    return "\n\n".join(
        f"Source: {chunk.source}\nText: {chunk.text}"
        for chunk in chunks
    )

3) Build the CrewAI agent and task

This is where you define the behavior. For pension use cases, keep the instructions tight: answer only from retrieved context, cite sources, and avoid inventing policy details.

from crewai import Agent, Task, Crew, Process

pension_agent = Agent(
    role="Pension Fund Assistant",
    goal="Answer member questions using pension policy documents and retrieved context.",
    backstory=(
        "You assist pension fund members and administrators. "
        "You must ground answers in retrieved plan documents."
    ),
    tools=[search_pension_knowledge_base],
    verbose=True,
)

def run_pension_rag(question: str) -> str:
    task = Task(
        description=(
            f"Answer this pension question using only retrieved context: {question}. "
            "Include source references."
        ),
        expected_output="A concise answer with citations to source documents.",
        agent=pension_agent,
    )

    crew = Crew(
        agents=[pension_agent],
        tasks=[task],
        process=Process.sequential,
        verbose=True,
    )

    result = crew.kickoff()
    return str(result)

4) Expose the agent through FastAPI

FastAPI gives you a stable contract for frontend apps, internal tools, or other services. Use Pydantic models so your request and response shapes stay explicit.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Pension RAG API")

class QueryRequest(BaseModel):
    question: str

class QueryResponse(BaseModel):
    answer: str

@app.post("/rag/query", response_model=QueryResponse)
def query_pension_rag(payload: QueryRequest):
    answer = run_pension_rag(payload.question)
    return QueryResponse(answer=answer)

5) Run the service and wire it into your app stack

Start the API with Uvicorn. In production, put this behind a reverse proxy and add auth before exposing it to users.

uvicorn main:app --reload --host 0.0.0.0 --port 8000

If you want to call it from another Python service:

import requests

response = requests.post(
    "http://localhost:8000/rag/query",
    json={"question": "At what age can I retire under Plan A?"}
)

print(response.json())

Testing the Integration

Use FastAPI’s TestClient to verify that the endpoint returns an answer and that your CrewAI pipeline is reachable end-to-end.

from fastapi.testclient import TestClient

client = TestClient(app)

def test_rag_query():
    response = client.post("/rag/query", json={
        "question": "What is the voluntary contribution limit?"
    })

    assert response.status_code == 200
    body = response.json()
    assert "answer" in body
    print(body["answer"])

test_rag_query()

Expected output (the raw context the tool retrieves, which the agent grounds its answer in):

Source: contributions_policy.pdf
Text: Voluntary contributions are capped at 10% of monthly salary.

In practice you’ll see a fuller natural-language answer from the agent plus citations pulled from your indexed pension docs.

Real-World Use Cases

  • Member self-service assistant
    Answer questions about retirement age, contribution limits, vesting rules, withdrawal conditions, and benefit eligibility from approved plan documents.

  • HR and pensions operations copilot
    Help internal teams summarize policy changes, compare plan versions, or draft responses to common member queries with source-backed answers.

  • Document-grounded compliance support
    Build an audit-friendly RAG interface that returns cited excerpts from policy manuals when regulators or administrators need traceable answers.


By Cyprian Aarons, AI Consultant at Topiax.