How to Integrate LlamaIndex with Supabase for Pension Fund Startups
Combining LlamaIndex with Supabase gives you a clean pattern for building AI agents that can answer policy, contribution, and retirement-plan questions from both structured and unstructured data. The useful part is not just retrieval; it's getting persistence, access control, and auditability in one stack, which matters when you're serving finance-adjacent workflows at a startup.
Prerequisites
- Python 3.10+
- A Supabase project with:
  - `SUPABASE_URL`
  - `SUPABASE_ANON_KEY` or a service role key
- A PostgreSQL table for storing pension fund documents or metadata
- LlamaIndex installed with the packages you need:
  - `llama-index`
  - `llama-index-vector-stores-supabase`
  - `llama-index-embeddings-openai` or another embedding provider
- An OpenAI API key if you use OpenAI embeddings/LLM
- Basic familiarity with:
  - the Supabase SQL editor
  - Python environment variables
Integration Steps
Step 1: Install dependencies and configure environment variables

Start by installing the libraries your agent will use to index pension fund content and persist it in Supabase.

```shell
pip install llama-index llama-index-vector-stores-supabase llama-index-embeddings-openai supabase python-dotenv
```

Set your environment variables:

```python
import os
from dotenv import load_dotenv

load_dotenv()

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_SERVICE_ROLE_KEY = os.getenv("SUPABASE_SERVICE_ROLE_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
```
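A fail-fast check on these variables saves debugging time later, since a missing key otherwise surfaces only as an opaque connection error. A minimal sketch (the helper name is mine, not part of any library; the variable names match the snippet above):

```python
import os

def require_env(*names: str) -> dict:
    """Return the requested environment variables, raising if any is unset or empty."""
    values = {name: os.getenv(name) for name in names}
    missing = [name for name, value in values.items() if not value]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return values

# config = require_env("SUPABASE_URL", "SUPABASE_SERVICE_ROLE_KEY", "OPENAI_API_KEY")
```

Calling this once at startup turns a mid-request failure into an immediate, readable error.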
Step 2: Connect to Supabase and prepare your storage layer

Use the Supabase Python client for app-side reads and writes, and let LlamaIndex talk to the same backend through its vector store integration.

```python
from supabase import create_client, Client

supabase: Client = create_client(SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY)

# Optional: store raw pension plan documents or metadata
result = supabase.table("pension_documents").insert({
    "doc_id": "plan_001",
    "title": "Employer Match Policy",
    "source": "internal_policy.pdf"
}).execute()
print(result.data)
```

If you want vector search, create a table compatible with the LlamaIndex Supabase vector store setup. In practice, you'll usually provision this once in SQL, then let LlamaIndex write embeddings into it.
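If several parts of your app write these rows, a small validation helper keeps the payload shape consistent before it hits Supabase. A sketch under the assumption that `pension_documents` has exactly the three columns shown above (adjust to your real schema):

```python
REQUIRED_FIELDS = ("doc_id", "title", "source")

def build_document_row(doc_id: str, title: str, source: str) -> dict:
    """Build an insert payload for the pension_documents table, rejecting blank fields."""
    row = {"doc_id": doc_id, "title": title, "source": source}
    for field in REQUIRED_FIELDS:
        if not row[field].strip():
            raise ValueError(f"pension_documents.{field} must be non-empty")
    return row

row = build_document_row("plan_001", "Employer Match Policy", "internal_policy.pdf")
print(row)
# Then: supabase.table("pension_documents").insert(row).execute()
```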
Step 3: Build a LlamaIndex pipeline that writes embeddings into Supabase

This is where the actual retrieval layer comes together. You load pension fund content, chunk it, embed it, and persist the vectors in Supabase.

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, StorageContext
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.vector_stores.supabase import SupabaseVectorStore

embed_model = OpenAIEmbedding(model="text-embedding-3-small")

# Use your project's Postgres connection string from the Supabase dashboard
# (Settings -> Database). Note the database password is separate from your
# API keys; the service role key will not authenticate a Postgres session.
vector_store = SupabaseVectorStore(
    postgres_connection_string=os.getenv("SUPABASE_DB_CONNECTION"),
    collection_name="pension_fund_vectors",
    dimension=1536,
    query_name="match_pension_fund_documents",
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
documents = SimpleDirectoryReader("./pension_docs").load_data()

index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
    embed_model=embed_model,
)
```

In production, your documents are usually things like:
- pension plan rules
- contribution schedules
- vesting policies
- employer match terms
- member FAQ PDFs
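Before embedding, the pipeline splits each document into overlapping chunks so that retrieval returns focused passages rather than whole PDFs. A deliberately naive, character-based sketch of that idea (LlamaIndex's real splitters are sentence-aware; the sizes here are illustrative, not recommendations):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 40) -> list[str]:
    """Naive fixed-size chunking with overlap; illustrates what a splitter does."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

policy = "Employer match vests after 12 months of continuous service. " * 15
pieces = chunk_text(policy)
print(len(pieces), "chunks, each sharing 40 characters with its neighbor")
```

The overlap is what keeps a sentence that straddles a chunk boundary retrievable from at least one chunk.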
Step 4: Query the indexed pension data through an agent

Once indexed, your startup agent can answer questions using semantic retrieval over the Supabase-backed vectors.

```python
from llama_index.core import Settings

Settings.embed_model = embed_model

query_engine = index.as_query_engine(similarity_top_k=3)
response = query_engine.query(
    "What is the employer match policy for employees under the standard plan?"
)
print(response)
```

If you want to combine this with app data in Supabase — for example, a user profile or plan enrollment status — fetch that first, then pass it into your prompt or agent context.
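`similarity_top_k=3` means the engine keeps the three chunks whose embeddings score highest against the query embedding. A toy sketch of that ranking with hand-made 3-dimensional vectors (real embeddings have 1536 dimensions, and the vector store does this ranking in Postgres, not in Python):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec: list[float], chunk_vecs: list[list[float]], k: int = 3) -> list[int]:
    """Return the indices of the k chunk vectors most similar to the query."""
    scored = sorted(enumerate(chunk_vecs), key=lambda iv: cosine(query_vec, iv[1]), reverse=True)
    return [i for i, _ in scored[:k]]

query = [1.0, 0.0, 0.0]
chunks = [[0.9, 0.1, 0.0], [0.0, 1.0, 0.0], [0.8, 0.0, 0.2], [0.1, 0.9, 0.1]]
print(top_k(query, chunks, k=2))  # -> [0, 2]
```

Raising `similarity_top_k` widens the context the LLM sees at the cost of more tokens and more chance of off-topic chunks.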
Step 5: Combine structured Supabase data with LlamaIndex retrieval

This pattern is what makes the integration useful for startups: structured state in Supabase, unstructured knowledge in LlamaIndex.

```python
user_profile = supabase.table("members").select("*").eq("member_id", "user_123").execute()
membership_data = user_profile.data[0]

question = f"""
Member age: {membership_data['age']}
Contribution rate: {membership_data['contribution_rate']}

Query: Can this member increase contributions without violating plan rules?
"""

answer = query_engine.query(question)
print(answer)
```
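If you assemble prompts like this in more than one place, a small helper keeps the structured fields explicit and easy to extend. The field names below mirror the hypothetical `members` table from the snippet above; swap in your actual columns:

```python
def build_member_question(member: dict, query: str) -> str:
    """Combine structured member fields with a free-text query for the query engine."""
    lines = [
        f"Member age: {member['age']}",
        f"Contribution rate: {member['contribution_rate']}",
        "",
        f"Query: {query}",
    ]
    return "\n".join(lines)

prompt = build_member_question(
    {"age": 42, "contribution_rate": 0.06},
    "Can this member increase contributions without violating plan rules?",
)
print(prompt)
```

Centralizing this also gives you one place to redact fields you don't want leaving your database, which matters for finance-adjacent data.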
Testing the Integration
Run a simple end-to-end test: write a document to disk, index it into the Supabase-backed vector store, then ask a question that should hit that content.

```python
# Write a small test document, then re-run the indexing step from Step 3.
with open("./pension_docs/test_vesting.txt", "w") as f:
    f.write("Vesting starts after 12 months of service under the standard employer contribution policy.")

test_query = "When does vesting start for employer contributions?"
response = query_engine.query(test_query)
print("Answer:", response)
```
Expected output:

```
Answer: Vesting starts after 12 months of service under the standard employer contribution policy.
```
If you get irrelevant answers:

- check your chunk size and document quality
- verify embeddings were written to the right Supabase collection/table
- confirm your query function name matches the SQL function expected by the vector store
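A cheap smoke test for these failure modes is checking that expected keywords show up in answers. It is no substitute for real retrieval evaluation, but it catches a silently empty index or a wrong collection name early (the helper name is mine, not a library function):

```python
def answer_mentions(answer: str, keywords: list[str]) -> bool:
    """True if every expected keyword appears in the answer, case-insensitively."""
    text = answer.lower()
    return all(kw.lower() in text for kw in keywords)

sample_answer = "Vesting starts after 12 months of service under the standard policy."
print(answer_mentions(sample_answer, ["vesting", "12 months"]))  # -> True
```

Run it over a handful of known question/keyword pairs after every re-index, and fail the deploy if any pair misses.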
Real-World Use Cases
- Pension policy assistant: answer employee questions about vesting, matching rules, contribution caps, and eligibility using indexed plan documents.
- Advisor support workflow: pull member records from Supabase and combine them with retrieved policy context to generate compliant guidance drafts.
- Internal ops copilot: let operations teams search pension fund PDFs, FAQs, and regulatory notes while keeping user state and permissions in Supabase.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit