How to Integrate LlamaIndex with Supabase for Wealth Management AI Agents
Combining LlamaIndex with Supabase gives you a clean pattern for building wealth management AI agents that can retrieve portfolio data, market research, client notes, and compliance context from one place. LlamaIndex handles retrieval and reasoning over your knowledge base, while Supabase gives you a Postgres-backed system of record for agent state, chat history, documents, and structured client data.
This is the setup you want when an advisor assistant needs to answer questions like “What changed in this client’s allocation since last quarter?” or “Summarize the risk notes tied to this household.” You get retrieval over unstructured content plus durable storage for agent memory and workflow state.
Prerequisites
- Python 3.10+
- A Supabase project with:
  - a project URL
  - an API key
  - a Postgres database enabled
- A LlamaIndex installation with the relevant packages for document loading and retrieval
- Access to your wealth management data sources: PDFs, CSVs, advisor notes, policy docs
- Environment variables configured:
  - `SUPABASE_URL`
  - `SUPABASE_SERVICE_ROLE_KEY` (or the anon key for local testing)
  - any LLM provider key used by LlamaIndex
- A table in Supabase for agent state or chat logs
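For local testing, a minimal `.env` might look like this. The values are placeholders, and `OPENAI_API_KEY` is just one example of an LLM provider key — use whichever provider LlamaIndex is configured for:

```
# .env — placeholder values for local development only
SUPABASE_URL=https://your-project-ref.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
OPENAI_API_KEY=sk-your-llm-provider-key
```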
Install the core dependencies:
```shell
pip install llama-index supabase python-dotenv sqlalchemy psycopg2-binary
```
Integration Steps
1. Set up Supabase as the persistence layer.
Use Supabase to store agent conversations, retrieved snippets, or client metadata. For production systems, I prefer keeping raw documents in object storage and using Postgres tables for metadata and agent state.
```python
import os

from dotenv import load_dotenv
from supabase import create_client, Client

load_dotenv()

supabase_url = os.getenv("SUPABASE_URL")
supabase_key = os.getenv("SUPABASE_SERVICE_ROLE_KEY")
supabase: Client = create_client(supabase_url, supabase_key)

# Record a new agent session for this client
response = supabase.table("agent_sessions").insert({
    "session_id": "wm-001",
    "client_id": "client_123",
    "status": "active",
}).execute()
print(response.data)
```
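The article never shows the table definitions behind these snippets, so here is one possible set of schemas. These are assumptions that match the column names used in the Python code; adjust types and constraints to your own model:

```sql
-- Hypothetical schemas for the tables used in this article.
create table agent_sessions (
  session_id text primary key,
  client_id  text not null,
  status     text not null default 'active',
  created_at timestamptz not null default now()
);

create table client_summaries (
  client_id  text primary key,   -- primary key makes upsert-by-client work
  summary    text not null,
  source     text,
  updated_at timestamptz not null default now()
);

create table agent_logs (
  id         bigint generated always as identity primary key,
  client_id  text not null,
  question   text not null,
  answer     text not null,
  created_at timestamptz not null default now()
);
```

Making `client_id` the primary key of `client_summaries` is what lets the later `upsert` calls replace a client's summary in place.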
2. Load wealth management documents into LlamaIndex.
LlamaIndex works well when you convert client-facing PDFs, policy docs, and investment memos into indexed nodes. In a real deployment, keep sensitive data access-controlled before indexing.
```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Index every document under ./wealth_docs
documents = SimpleDirectoryReader("./wealth_docs").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
result = query_engine.query("Summarize the client's risk profile and investment constraints.")
print(result)
```
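The warning about sensitive data is worth making concrete. One lightweight approach is to redact obvious identifiers from document text before it is indexed. The `redact_pii` function and its patterns below are a sketch of mine, not part of LlamaIndex, and regexes are no substitute for a vetted PII/DLP tool:

```python
import re

# Hypothetical redaction pass applied to document text before indexing.
# Patterns cover US-style SSNs, account-number-like digit runs, and emails only.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "ACCOUNT": re.compile(r"\b\d{8,12}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact_pii(text: str) -> str:
    """Replace matched identifiers with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

print(redact_pii("Client SSN 123-45-6789, account 123456789, mail jane@example.com"))
```

You would run each loaded `Document`'s text through a pass like this before calling `VectorStoreIndex.from_documents`.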
3. Store indexed metadata or extracted summaries in Supabase.
A practical pattern is to let LlamaIndex generate summaries or structured outputs, then persist them in Supabase so your agent can resume context across sessions.
```python
summary = str(result)

# Persist the generated summary so the agent can resume context later
supabase.table("client_summaries").upsert({
    "client_id": "client_123",
    "summary": summary,
    "source": "llamaindex"
}).execute()
```
4. Retrieve Supabase records and feed them back into the agent.
This is where the integration becomes useful. Pull structured data from Supabase, combine it with retrieved document context from LlamaIndex, then generate an answer grounded in both sources.
```python
from llama_index.core import Document

# Pull the stored summary for this client back out of Supabase
client_row = supabase.table("client_summaries").select("*").eq("client_id", "client_123").execute()
stored_summary = client_row.data[0]["summary"]

# Wrap it as a Document so LlamaIndex can retrieve over it
context_doc = Document(text=stored_summary)
context_index = VectorStoreIndex.from_documents([context_doc])
context_query_engine = context_index.as_query_engine()

answer = context_query_engine.query("What should an advisor review before rebalancing this portfolio?")
print(answer)
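An alternative to re-indexing the stored summary is to splice the structured record and retrieved document snippets directly into one grounding prompt. This is a sketch of that pattern; the template and field names are mine, not from either library:

```python
def build_grounded_prompt(question: str, client_record: dict, doc_snippets: list[str]) -> str:
    """Combine structured client data and retrieved text into one prompt."""
    structured = "\n".join(f"- {k}: {v}" for k, v in client_record.items())
    snippets = "\n\n".join(doc_snippets)
    return (
        "Answer using only the context below.\n\n"
        f"Client record:\n{structured}\n\n"
        f"Document excerpts:\n{snippets}\n\n"
        f"Question: {question}"
    )

prompt = build_grounded_prompt(
    "What should an advisor review before rebalancing?",
    {"client_id": "client_123", "risk": "moderate"},
    ["Client avoids high-volatility positions."],
)
print(prompt)
```

For a single short summary this avoids the cost of building a throwaway vector index per question.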
5. Connect both layers inside one agent workflow.
For a production AI agent, wrap retrieval and persistence in a single function. That gives you one entry point for advisor questions and makes audit logging straightforward.
```python
def answer_wealth_question(client_id: str, question: str):
    # Pull structured context from Supabase
    row = supabase.table("client_summaries").select("*").eq("client_id", client_id).execute()
    summary_text = row.data[0]["summary"]

    # Build a retrieval layer over the stored summary
    docs = [Document(text=summary_text)]
    idx = VectorStoreIndex.from_documents(docs)
    engine = idx.as_query_engine()

    response = engine.query(question)

    # Log the exchange for auditability
    supabase.table("agent_logs").insert({
        "client_id": client_id,
        "question": question,
        "answer": str(response),
    }).execute()
    return response

print(answer_wealth_question(
    "client_123",
    "What are the main suitability risks before changing allocations?"
))
```
Testing the Integration
Run a simple end-to-end check: write a summary into Supabase, retrieve it through LlamaIndex, and confirm the response includes the expected content.
```python
test_client_id = "client_test_001"

# Seed a known summary for the test client
supabase.table("client_summaries").upsert({
    "client_id": test_client_id,
    "summary": "Client has moderate risk tolerance, prefers income-generating assets, and avoids high-volatility positions."
}).execute()

response = answer_wealth_question(
    test_client_id,
    "What portfolio preferences should an advisor respect?"
)
print(str(response))
```
Expected output (approximately — the exact wording depends on your LLM):

```
Client prefers income-generating assets and has moderate risk tolerance. Avoid high-volatility positions.
```
If that comes back correctly, your retrieval path is working and your persistence layer is wired up.
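The end-to-end check above needs live Supabase and LLM credentials. In CI you can still verify the wiring with an in-memory stand-in; `FakeTable` and `FakeResult` below are hypothetical test doubles of mine, not part of the supabase client:

```python
class FakeResult:
    """Mimics the .data attribute of a supabase response."""
    def __init__(self, data):
        self.data = data

class FakeTable:
    """Minimal stand-in for supabase.table(...) covering select/eq/execute."""
    def __init__(self, rows):
        self.rows = rows

    def select(self, *_columns):
        return self

    def eq(self, key, value):
        self.rows = [r for r in self.rows if r.get(key) == value]
        return self

    def execute(self):
        return FakeResult(self.rows)

# Exercise the same call chain the real code uses
table = FakeTable([{"client_id": "client_test_001", "summary": "moderate risk"}])
row = table.select("*").eq("client_id", "client_test_001").execute()
print(row.data[0]["summary"])
```

Swap a fake like this in for `supabase` and you can assert on `answer_wealth_question`'s behavior without touching the network.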
Real-World Use Cases
- Advisor copilot that answers questions from client notes, IPS documents, product sheets, and prior meeting summaries.
- Compliance assistant that checks proposed trades against stored suitability rules and logs every decision in Supabase.
- Client service bot that keeps conversation state in Supabase while using LlamaIndex to ground responses in approved wealth management content.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit