How to Integrate LlamaIndex with Supabase for Wealth Management Startups

By Cyprian Aarons · Updated 2026-04-22

Wealth management agents need two things: structured financial knowledge and a persistent system of record. LlamaIndex gives you retrieval and reasoning over portfolio docs, policy notes, and client profiles; Supabase gives you Postgres-backed storage, auth, and a clean API layer for startup-grade apps.

Put them together and you get an AI agent that can answer client questions, store interaction history, track suitability notes, and keep investment context synced across sessions.

Prerequisites

  • Python 3.10+
  • A Supabase project with:
    • SUPABASE_URL
    • SUPABASE_ANON_KEY for client-side use, or the service role key for server-side use (keep it secret)
  • A LlamaIndex setup with:
    • llama-index
    • an embedding model provider configured
  • Access to your wealth management data sources:
    • PDFs, CSVs, policy docs, product sheets, client notes
  • Optional but recommended:
    • psycopg2-binary if you want direct Postgres access
    • python-dotenv for local environment variables

Install the packages:

pip install llama-index supabase python-dotenv psycopg2-binary

Integration Steps

1) Initialize Supabase as your persistence layer

Use Supabase for session state, client metadata, and audit logs. For startup systems, this is the part that keeps your agent from being stateless.

import os
from supabase import create_client, Client
from dotenv import load_dotenv

load_dotenv()

SUPABASE_URL = os.environ["SUPABASE_URL"]
SUPABASE_KEY = os.environ["SUPABASE_SERVICE_ROLE_KEY"]

supabase: Client = create_client(SUPABASE_URL, SUPABASE_KEY)

# Example: store a client profile used by the agent
result = supabase.table("client_profiles").insert({
    "client_id": "c_1001",
    "name": "Amina Patel",
    "risk_profile": "moderate",
    "portfolio_value": 250000,
    "goals": ["retirement", "tax efficiency"]
}).execute()

print(result.data)

Create a table like this in Supabase:

create table if not exists client_profiles (
  id bigint generated always as identity primary key,
  client_id text unique not null,
  name text not null,
  risk_profile text not null,
  portfolio_value numeric not null,
  goals jsonb not null default '[]'::jsonb,
  created_at timestamptz default now()
);
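
If client-side code ever touches this table with the anon key, enable row-level security. A sketch of a policy, assuming you add an advisor_id uuid column that stores the owning advisor's auth user ID (both the column and the policy name are illustrative):

```sql
alter table client_profiles enable row level security;

-- Advisors can read only the profiles they own.
create policy "advisors read own clients"
  on client_profiles for select
  using (auth.uid() = advisor_id);
```

The service role key bypasses RLS, so server-side code keeps working unchanged.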

2) Load wealth management documents into LlamaIndex

LlamaIndex needs source documents it can index. For wealth management, this usually means fund factsheets, product disclosures, internal playbooks, and advisor notes.

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("./wealth_docs").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()

response = query_engine.query(
    "What is the recommended portfolio allocation for a moderate-risk client nearing retirement?"
)

print(response)

If your data is sensitive or regulated, keep document ingestion server-side and restrict access to indexed content by tenant or advisor role.
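
One lightweight way to enforce that, sketched here in plain Python, is to gate documents on metadata before they reach the index. The `tenant` and `min_role` keys and the role ranking are illustrative conventions, not part of LlamaIndex or Supabase:

```python
# Sketch: filter documents by tenant and role metadata before indexing.
# The metadata keys ("tenant", "min_role") are illustrative assumptions.

ROLE_RANK = {"assistant": 0, "advisor": 1, "compliance": 2}

def allowed_docs(docs, tenant, role):
    """Keep only documents the caller's tenant and role may see."""
    out = []
    for doc in docs:
        meta = doc.get("metadata", {})
        if meta.get("tenant") != tenant:
            continue  # never leak another tenant's documents
        required = ROLE_RANK.get(meta.get("min_role", "assistant"), 0)
        if ROLE_RANK.get(role, -1) < required:
            continue  # caller's role is too low for this document
        out.append(doc)
    return out

docs = [
    {"text": "Fund factsheet", "metadata": {"tenant": "t1", "min_role": "assistant"}},
    {"text": "Internal playbook", "metadata": {"tenant": "t1", "min_role": "compliance"}},
    {"text": "Other tenant doc", "metadata": {"tenant": "t2", "min_role": "assistant"}},
]

print([d["text"] for d in allowed_docs(docs, "t1", "advisor")])  # ['Fund factsheet']
```

Run this filter server-side before `VectorStoreIndex.from_documents`, so restricted content is never embedded into an index a lower-privilege caller can query.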

3) Persist agent memory in Supabase

LlamaIndex handles retrieval. Supabase handles durable memory. Store each interaction so your agent can reconstruct context across sessions.

from datetime import datetime, timezone

def save_agent_message(client_id: str, role: str, message: str):
    return supabase.table("agent_messages").insert({
        "client_id": client_id,
        "role": role,
        "message": message,
        # timezone-aware timestamp; datetime.utcnow() is deprecated in Python 3.12+
        "created_at": datetime.now(timezone.utc).isoformat()
    }).execute()

save_agent_message(
    "c_1001",
    "user",
    "Should I rebalance after the recent market drop?"
)

save_agent_message(
    "c_1001",
    "assistant",
    "Based on your risk profile and time horizon, we should review drift before making changes."
)

Schema:

create table if not exists agent_messages (
  id bigint generated always as identity primary key,
  client_id text not null,
  role text not null,
  message text not null,
  created_at timestamptz default now()
);

4) Build a retrieval + storage workflow

The pattern is simple: retrieve from LlamaIndex, persist the response in Supabase, then reuse both on the next turn.

def answer_client_question(client_id: str, question: str):
    # Pull recent context from Supabase
    history = supabase.table("agent_messages") \
        .select("role,message") \
        .eq("client_id", client_id) \
        .order("created_at", desc=True) \
        .limit(5) \
        .execute()

    recent_context = "\n".join(
        f"{row['role']}: {row['message']}" for row in reversed(history.data)
    )

    prompt = f"""
Recent conversation:
{recent_context}

Client question:
{question}
"""

    answer = query_engine.query(prompt)
    save_agent_message(client_id, "assistant", str(answer))
    return answer

result = answer_client_question(
    "c_1001",
    "Can you summarize the risks of increasing bond exposure?"
)

print(result)

This is where the integration becomes useful. LlamaIndex does the semantic heavy lifting; Supabase keeps continuity and auditability.
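
Context windows are finite, so it also pays to cap how much stored history you splice into the prompt. A minimal sketch (the helper name and character budget are illustrative; rows are assumed newest-first, as the query above returns them):

```python
def trim_history(rows, max_chars=2000):
    """Keep the most recent messages that fit within max_chars,
    returned oldest-first for the prompt."""
    kept, total = [], 0
    for row in rows:  # newest-first input
        line = f"{row['role']}: {row['message']}"
        if total + len(line) > max_chars:
            break  # budget exhausted; drop everything older
        kept.append(line)
        total += len(line)
    return "\n".join(reversed(kept))  # back to chronological order

rows = [
    {"role": "assistant", "message": "Review drift before rebalancing."},
    {"role": "user", "message": "Should I rebalance?"},
]
print(trim_history(rows, max_chars=60))
```

Swap this in for the plain `"\n".join(...)` in `answer_client_question` once conversations grow past a handful of turns.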

5) Add structured portfolio data for better answers

Wealth management agents are better when they can combine unstructured docs with structured portfolio data. Store positions in Supabase and feed them into the prompt.

def get_portfolio_positions(client_id: str):
    result = supabase.table("portfolio_positions") \
        .select("*") \
        .eq("client_id", client_id) \
        .execute()
    return result.data

positions = get_portfolio_positions("c_1001")

portfolio_summary = "\n".join(
    f"{p['ticker']}: {p['weight']}% ({p['asset_class']})"
    for p in positions
)

final_prompt = f"""
Client portfolio:
{portfolio_summary}

Question:
Should this client rebalance now?
"""

print(query_engine.query(final_prompt))

Schema:

create table if not exists portfolio_positions (
  id bigint generated always as identity primary key,
  client_id text not null,
  ticker text not null,
  asset_class text not null,
  weight numeric not null
);
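
Structured positions also let you compute allocation drift in code instead of asking the model to do arithmetic. A sketch, assuming a hypothetical target_weights mapping per asset class and a 5-point rebalance threshold:

```python
def compute_drift(positions, target_weights):
    """Per-asset-class drift: actual weight minus target, in percentage points."""
    actual = {}
    for p in positions:
        actual[p["asset_class"]] = actual.get(p["asset_class"], 0) + p["weight"]
    return {ac: actual.get(ac, 0) - tgt for ac, tgt in target_weights.items()}

def needs_rebalance(drift, threshold=5):
    """True if any asset class drifts more than the threshold either way."""
    return any(abs(d) > threshold for d in drift.values())

positions = [
    {"ticker": "VTI", "asset_class": "equity", "weight": 48},
    {"ticker": "VXUS", "asset_class": "equity", "weight": 20},
    {"ticker": "BND", "asset_class": "bond", "weight": 32},
]
target = {"equity": 60, "bond": 40}

drift = compute_drift(positions, target)
print(drift)                  # {'equity': 8, 'bond': -8}
print(needs_rebalance(drift)) # True
```

Feeding the computed drift numbers into the prompt, rather than raw weights, gives the agent a precise fact to reason about instead of a calculation to attempt.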

Testing the Integration

Run a basic end-to-end test that writes to Supabase and queries LlamaIndex using stored context.

test_client_id = "c_test_01"

supabase.table("client_profiles").upsert({
    "client_id": test_client_id,
    "name": "Test Client",
    "risk_profile": "conservative",
    "portfolio_value": 100000,
    "goals": ["capital preservation"]
}, on_conflict="client_id").execute()

save_agent_message(test_client_id, "user", "What should I know about downside protection?")

answer = answer_client_question(
    test_client_id,
    "Explain downside protection in simple terms."
)

print(answer)

Expected output:

Downside protection refers to strategies or instruments that reduce losses during market declines...

If this fails, check:

  • Supabase credentials and row-level security policies
  • Table names and column types
  • Whether your LlamaIndex documents were actually loaded
  • Whether your embedding provider is configured correctly
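
Before digging into those, a quick preflight on the environment variables often saves time. A sketch (variable names match this guide; adjust to your setup):

```python
import os

REQUIRED_ENV = ["SUPABASE_URL", "SUPABASE_SERVICE_ROLE_KEY"]

def missing_env(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_ENV if not env.get(name)]

# Simulated check against a partial environment
print(missing_env({"SUPABASE_URL": "https://example.supabase.co"}))
# ['SUPABASE_SERVICE_ROLE_KEY']
```

Call `missing_env()` at startup and fail fast with a clear message instead of letting `create_client` raise deep inside the request path.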

Real-World Use Cases

  • Advisor copilot

    • Answers portfolio questions from internal research docs.
    • Stores every interaction in Supabase for compliance review.
  • Client onboarding assistant

    • Collects risk tolerance, goals, and investment horizon.
    • Persists onboarding state in Supabase while using LlamaIndex to explain products.
  • Portfolio review bot

    • Pulls holdings from Supabase.
    • Uses LlamaIndex to compare holdings against policy documents and generate rebalancing notes.

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
