How to Integrate LlamaIndex with Supabase for Pension Fund Multi-Agent Systems
Pension fund workflows need more than a vector store and a chat UI. You need retrieval over policy documents, contribution rules, and member records, plus a shared state layer that multiple agents can read and write without stepping on each other.
That is where LlamaIndex and Supabase fit together well. LlamaIndex handles document ingestion, indexing, and retrieval; Supabase gives you Postgres-backed persistence, auth, and a clean way to coordinate multi-agent state.
Prerequisites
- Python 3.10+
- A Supabase project with SUPABASE_URL and SUPABASE_ANON_KEY (or a service role key for backend jobs)
- Access to your pension fund documents: PDFs, DOCX, HTML, or plain text
- Installed packages: llama-index, llama-index-vector-stores-supabase, supabase, and an embedding model provider such as OpenAI or local embeddings
- A Supabase table for agent state, for example agent_messages
- A Supabase vector table created for LlamaIndex storage
- Basic familiarity with Python async I/O if you plan to run multiple agents concurrently
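The variable names below are the ones this guide reads later (including the database credentials used in step 4); every value shown is a placeholder. A minimal .env sketch:

```shell
# .env file - never commit real keys to version control
SUPABASE_URL=https://your-project-ref.supabase.co
# Service role key: trusted backend jobs only
SUPABASE_SERVICE_ROLE_KEY=replace-with-service-role-key
SUPABASE_ANON_KEY=replace-with-anon-key
# Used in step 4 to build the Postgres connection string
SUPABASE_DB_PASSWORD=replace-with-db-password
SUPABASE_PROJECT_REF=your-project-ref
OPENAI_API_KEY=replace-with-openai-key
```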
Integration Steps
1) Install the libraries and set up environment variables
Start with the core packages. For production agent systems, keep secrets in environment variables and use a service role key only from trusted backend code.
pip install llama-index llama-index-vector-stores-supabase supabase python-dotenv
import os
from dotenv import load_dotenv
load_dotenv()
SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_KEY = os.getenv("SUPABASE_SERVICE_ROLE_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
2) Create the Supabase client for shared agent state
Use Supabase as the coordination layer between agents. One agent can ingest documents, another can answer member questions, and a third can validate policy compliance using the same database.
from supabase import create_client, Client
supabase: Client = create_client(SUPABASE_URL, SUPABASE_KEY)
# Example table schema assumption:
# agent_messages(id uuid default gen_random_uuid(), agent_name text, session_id text,
# role text, content text, created_at timestamptz default now())
def log_agent_message(agent_name: str, session_id: str, role: str, content: str):
    data = {
        "agent_name": agent_name,
        "session_id": session_id,
        "role": role,
        "content": content,
    }
    return supabase.table("agent_messages").insert(data).execute()
For multi-agent systems, this table becomes your shared memory. Keep it simple: one row per message or event.
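The shared-memory contract is easiest to see without the network. Below is a minimal in-memory sketch of the same pattern, where a Python list stands in for the agent_messages table (MessageLog is an illustrative name, not a Supabase or LlamaIndex class):

```python
from dataclasses import dataclass, field

@dataclass
class MessageLog:
    """In-memory stand-in for the agent_messages table."""
    rows: list = field(default_factory=list)

    def insert(self, agent_name: str, session_id: str, role: str, content: str):
        # One row per message or event, same shape as the Supabase table.
        self.rows.append({
            "agent_name": agent_name,
            "session_id": session_id,
            "role": role,
            "content": content,
        })

    def history(self, session_id: str) -> list:
        # What a second agent would read back for follow-up actions.
        return [r for r in self.rows if r["session_id"] == session_id]

log = MessageLog()
log.insert("ingest_agent", "sess_001", "system", "Loaded 42 policy PDFs")
log.insert("member_support_agent", "sess_001", "user", "When do I vest?")
print(len(log.history("sess_001")))  # → 2
```

In production, `insert` and `history` become the Supabase calls shown above; the row shape stays the same.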
3) Build a LlamaIndex pipeline over pension fund documents
LlamaIndex indexes your pension fund corpus so agents can retrieve policy-specific answers instead of hallucinating from general model knowledge. This is where member handbook PDFs, benefit rules, retirement calculators, and trustee minutes become queryable context.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.settings import Settings
from llama_index.embeddings.openai import OpenAIEmbedding
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-small")
documents = SimpleDirectoryReader(
    input_dir="./pension_docs",
    recursive=True,
).load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(similarity_top_k=5)
response = query_engine.query(
    "What is the early retirement rule for members aged 55?"
)
print(response)
If your pension fund documents are updated often, rebuild or incrementally update the index on ingestion events. In regulated environments, version your source files so answers can be traced back to the exact policy revision.
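One lightweight way to make answers traceable to an exact policy revision is to stamp each source file with a content hash at ingestion time and carry it along as document metadata. A sketch using SHA-256 (the revision_id and source_file field names are assumptions for illustration, not a LlamaIndex convention):

```python
import hashlib
from pathlib import Path

def revision_id(path: Path) -> str:
    """Stable ID for this exact file content: same bytes, same ID."""
    return hashlib.sha256(path.read_bytes()).hexdigest()[:12]

def document_metadata(path: Path) -> dict:
    # Attach this dict as metadata when loading documents, so every
    # retrieved chunk can be traced back to the policy revision it came from.
    return {"source_file": path.name, "revision_id": revision_id(path)}

# Demo with a throwaway file:
p = Path("handbook_demo.txt")
p.write_text("Early retirement is permitted from age 55.")
print(document_metadata(p))
p.unlink()
```

When a file changes, its revision_id changes, which is also a cheap trigger for the incremental re-indexing mentioned above.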
4) Persist the vector index in Supabase
For production multi-agent systems, store embeddings in Supabase rather than keeping everything in memory. LlamaIndex supports Supabase as a vector store through its integration package.
from llama_index.vector_stores.supabase import SupabaseVectorStore
from llama_index.core import StorageContext
vector_store = SupabaseVectorStore(
    postgres_connection_string=(
        f"postgresql://postgres:{os.getenv('SUPABASE_DB_PASSWORD')}"
        f"@db.{os.getenv('SUPABASE_PROJECT_REF')}.supabase.co:5432/postgres"
    ),
    collection_name="pension_fund_docs",
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
supabase_index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
)
supabase_query_engine = supabase_index.as_query_engine()
answer = supabase_query_engine.query("Summarize vesting requirements.")
print(answer)
This gives you durable retrieval backed by Postgres. It also makes it easier to share one indexed knowledge base across several agents without duplicating storage.
5) Wire both sides into a simple multi-agent flow
A practical pattern is:
- Agent A retrieves relevant pension policy passages via LlamaIndex
- Agent B stores the user question and answer in Supabase
- Agent C reads conversation history from Supabase for follow-up actions
def answer_member_question(session_id: str, question: str):
    log_agent_message("member_support_agent", session_id, "user", question)
    result = query_engine.query(question)
    answer_text = str(result)
    log_agent_message("member_support_agent", session_id, "assistant", answer_text)
    return answer_text
session_id = "sess_001"
question = "Can I transfer my pension if I leave before vesting?"
answer = answer_member_question(session_id, question)
print(answer)
In a real deployment you would add routing logic here:
- A compliance agent checks if the response includes regulated language
- An escalation agent hands off ambiguous cases to a human reviewer
- An audit agent writes every step to an immutable log table
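That routing can be sketched as a simple dispatch over checks, with each check returning a verdict. The REGULATED_TERMS list and the function names are illustrative assumptions, not a complete compliance rule set:

```python
REGULATED_TERMS = {"guaranteed return", "risk-free"}  # illustrative list only

def compliance_check(answer: str) -> str:
    # Compliance agent: flag responses containing regulated language.
    lowered = answer.lower()
    if any(term in lowered for term in REGULATED_TERMS):
        return "escalate"
    return "approve"

def route_answer(session_id: str, answer: str, audit_log: list) -> str:
    verdict = compliance_check(answer)
    # Audit agent: every step goes to an append-only log.
    audit_log.append({"session_id": session_id, "verdict": verdict})
    if verdict == "escalate":
        # Escalation agent: hand off to a human reviewer here.
        return "Escalated to a human reviewer."
    return answer

audit = []
print(route_answer("sess_001", "Your plan offers a guaranteed return.", audit))
print(route_answer("sess_001", "Vesting occurs after five years of service.", audit))
```

In production the audit list would be another Supabase table, written with the same client as agent_messages.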
Testing the Integration
Run a direct retrieval test and confirm both indexing and persistence work.
test_question = "What happens to contributions if a member exits before vesting?"
test_answer = answer_member_question("test_session", test_question)
print("ANSWER:", test_answer)
history = (
    supabase.table("agent_messages")
    .select("*")
    .eq("session_id", "test_session")
    .order("created_at")
    .execute()
)
print(history.data)
Expected output:
ANSWER: Contributions may be refunded or handled according to the plan's exit rules...
[
  {
    "agent_name": "member_support_agent",
    "session_id": "test_session",
    "role": "user",
    "content": "What happens to contributions if a member exits before vesting?"
  },
  {
    "agent_name": "member_support_agent",
    "session_id": "test_session",
    "role": "assistant",
    "content": "Contributions may be refunded or handled according to the plan's exit rules..."
  }
]
If retrieval returns generic answers instead of policy-specific ones:
- Check that documents were actually loaded from ./pension_docs
- Verify your embedding model is configured correctly
- Confirm your Supabase vector table exists and matches the expected schema
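A quick sanity check for the first point: count what the reader would actually pick up before rebuilding the index. The extension list is an assumption matching the formats named in the prerequisites:

```python
from pathlib import Path

SUPPORTED = {".pdf", ".docx", ".html", ".txt"}  # assumed ingestible formats

def ingestible_files(root: str) -> list:
    """Files under root that the ingestion step should load."""
    base = Path(root)
    if not base.exists():
        return []
    return [p for p in base.rglob("*") if p.suffix.lower() in SUPPORTED]

files = ingestible_files("./pension_docs")
print(f"{len(files)} ingestible files found")
```

If this prints 0, fix the directory path or file extensions before debugging embeddings or the vector table.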
Real-World Use Cases
- Member support agent: answers benefit eligibility questions using indexed pension policy docs, and stores each interaction in Supabase for audit and handoff.
- Compliance review agent: retrieves trustee minutes and policy updates, and flags responses that conflict with current plan rules before they reach members.
- Operations orchestration: one agent ingests new plan documents, another updates embeddings, and a third monitors unresolved cases in Supabase and escalates them to staff.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist plus starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit