How to Integrate LlamaIndex with Supabase for Production Insurance AI
Combining LlamaIndex with Supabase gives you a clean pattern for production insurance AI agents: LlamaIndex handles retrieval over policy docs, claims notes, and underwriting rules, while Supabase stores tenant data, chat history, and structured outputs. That split matters in insurance because you need traceability, row-level access control, and a system that can answer questions from both unstructured documents and relational records.
Prerequisites
- Python 3.10+
- A Supabase project with:
  - `SUPABASE_URL`
  - `SUPABASE_SERVICE_ROLE_KEY` for server-side jobs
- A Postgres database in Supabase
- Your LlamaIndex packages installed
- Access to an embedding model and LLM provider
- A document set for insurance use cases:
  - policy PDFs
  - claims summaries
  - underwriting guidelines
  - broker correspondence
Install the core dependencies:
```bash
pip install llama-index supabase python-dotenv sqlalchemy psycopg2-binary
pip install llama-index-embeddings-openai llama-index-llms-openai
```
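Both clients read their secrets from environment variables. A minimal `.env` for local development might look like this (all values below are placeholders; `OPENAI_API_KEY` is the variable the OpenAI-backed LlamaIndex classes expect):

```bash
# .env — never commit this file
SUPABASE_URL=https://your-project-ref.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
OPENAI_API_KEY=sk-your-openai-key
```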
Integration Steps
1. Set up environment variables and clients.

You want Supabase as your system of record and LlamaIndex as your retrieval layer. Keep secrets out of code and initialize both clients once.

```python
import os
from dotenv import load_dotenv
from supabase import create_client, Client

# Load secrets from a local .env file (never hard-code keys)
load_dotenv()

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_SERVICE_ROLE_KEY = os.getenv("SUPABASE_SERVICE_ROLE_KEY")

# The service-role key bypasses row-level security, so use it only
# in trusted server-side code, never in a browser or mobile client.
supabase: Client = create_client(SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY)
```
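`create_client` will not complain until the first request if either variable is `None`, so it can help to fail fast at startup. A minimal sketch (the `require_env` helper is ours, not part of the Supabase SDK):

```python
import os


def require_env(name: str) -> str:
    """Return the value of a required environment variable or fail fast."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Then load config with `SUPABASE_URL = require_env("SUPABASE_URL")` so a missing key surfaces as one clear error at boot rather than a confusing failure on the first query.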
2. Load insurance documents into LlamaIndex.

Use LlamaIndex’s document loaders to ingest policy files or claim notes. For production, store raw files in object storage and index only the text chunks you need for retrieval.

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

# Ingest every readable file in the directory as Document objects
documents = SimpleDirectoryReader("./insurance_docs").load_data()

embed_model = OpenAIEmbedding(model="text-embedding-3-small")
llm = OpenAI(model="gpt-4o-mini")

# Build a vector index over the document chunks
index = VectorStoreIndex.from_documents(
    documents,
    embed_model=embed_model,
)

query_engine = index.as_query_engine(llm=llm)
```
3. Persist structured agent data in Supabase.

LlamaIndex is not your transactional store. Use Supabase tables for users, conversations, claims metadata, and audit logs.

Example schema:

```sql
create table if not exists insurance_agent_logs (
  id bigserial primary key,
  user_id uuid not null,
  question text not null,
  answer text not null,
  source_count int not null default 0,
  created_at timestamptz not null default now()
);
```

Insert a record from Python after each agent response:

```python
def save_interaction(user_id: str, question: str, answer: str, source_count: int):
    supabase.table("insurance_agent_logs").insert({
        "user_id": user_id,
        "question": question,
        "answer": answer,
        "source_count": source_count,
    }).execute()
```
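Postgres will reject a malformed `user_id` at insert time, but validating earlier gives clearer errors and keeps oversized answers out of the log. A sketch with assumed limits (the 10,000-character cap and the `build_log_row` helper are ours, not Supabase or Postgres constraints):

```python
import uuid

# Assumed application-level cap on stored answers, not a database limit
MAX_ANSWER_CHARS = 10_000


def build_log_row(user_id: str, question: str, answer: str, source_count: int) -> dict:
    """Validate and shape a row for insurance_agent_logs before inserting."""
    uuid.UUID(user_id)  # raises ValueError if user_id is not a valid UUID
    if source_count < 0:
        raise ValueError("source_count must be non-negative")
    return {
        "user_id": user_id,
        "question": question.strip(),
        "answer": answer[:MAX_ANSWER_CHARS],
        "source_count": source_count,
    }
```

`save_interaction` can then pass `build_log_row(...)` straight to `.insert(...)`, so bad input fails in your code with a readable exception instead of a database error.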
4. Build the agent flow: retrieve from LlamaIndex, write metadata to Supabase.

This is the production pattern. Ask the retriever for context, generate an answer, then persist the interaction for audit and analytics.

```python
def answer_insurance_question(user_id: str, question: str):
    # Retrieve relevant chunks and generate an answer in one call
    response = query_engine.query(question)
    answer_text = str(response)
    source_count = len(getattr(response, "source_nodes", []))

    # Persist the interaction for audit and analytics
    save_interaction(
        user_id=user_id,
        question=question,
        answer=answer_text,
        source_count=source_count,
    )
    return {
        "answer": answer_text,
        "source_count": source_count,
    }
```
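For audit trails you often want more than a count. If your retriever's source nodes carry metadata such as `file_name` (a common LlamaIndex loader convention, but verify against your own ingestion setup), you can extract citations with plain Python:

```python
def summarize_sources(source_nodes) -> list[str]:
    """Collect distinct source file names from retrieved nodes, in order."""
    seen = []
    for node in source_nodes:
        metadata = getattr(node, "metadata", {}) or {}
        name = metadata.get("file_name", "unknown")
        if name not in seen:
            seen.append(name)
    return seen
```

Storing this list alongside the answer (e.g. in a `jsonb` column) lets reviewers trace every response back to the policy documents it cited.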
5. Query Supabase for operational context before answering.

In insurance workflows, the agent usually needs customer or claim context before it answers. Pull that from Supabase first, then use it to shape the retrieval query.

```python
def get_claim_context(claim_id: str):
    # .single() raises if the query does not return exactly one row
    result = (
        supabase.table("claims")
        .select("claim_id,status,line_of_business,loss_date")
        .eq("claim_id", claim_id)
        .single()
        .execute()
    )
    return result.data


def answer_claim_question(user_id: str, claim_id: str):
    context = get_claim_context(claim_id)
    question = f"Based on this claim context {context}, what policy exclusions may apply?"
    response = query_engine.query(question)
    save_interaction(
        user_id=user_id,
        question=question,
        answer=str(response),
        source_count=len(getattr(response, "source_nodes", [])),
    )
    return response
```
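Interpolating a raw dict into the prompt works, but a small formatter keeps prompts stable across runs and tolerates missing fields. A sketch (the `build_claim_question` helper is ours; the field names match the `claims` select above):

```python
def build_claim_question(context: dict) -> str:
    """Format claim context into a stable retrieval prompt."""
    fields = ["claim_id", "status", "line_of_business", "loss_date"]
    lines = [f"- {field}: {context.get(field, 'unknown')}" for field in fields]
    return (
        "Given this claim context:\n"
        + "\n".join(lines)
        + "\nWhat policy exclusions may apply?"
    )
```

A fixed field order also means identical claims produce identical prompts, which makes agent logs easier to diff and cache.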
Testing the Integration
Run a simple end-to-end check against one known policy document and one test row in Supabase.
```python
test_user_id = "11111111-1111-1111-1111-111111111111"
test_question = "Does this homeowner policy cover water damage from a burst pipe?"

result = answer_insurance_question(test_user_id, test_question)
print("Answer:", result["answer"])
print("Sources:", result["source_count"])

# Confirm the interaction was logged in Supabase
log_check = (
    supabase.table("insurance_agent_logs")
    .select("*")
    .eq("user_id", test_user_id)
    .order("created_at", desc=True)
    .limit(1)
    .execute()
)
print("Latest log:", log_check.data[0])
```
Expected output:

```text
Answer: The policy covers sudden accidental water damage from a burst pipe...
Sources: 2
Latest log: {
  'id': 42,
  'user_id': '11111111-1111-1111-1111-111111111111',
  'question': 'Does this homeowner policy cover water damage from a burst pipe?',
  'answer': 'The policy covers sudden accidental water damage...',
  'source_count': 2,
  'created_at': '2026-04-22T10:15:00Z'
}
```
Real-World Use Cases
- Claims triage assistant
  - Pull claim metadata from Supabase.
  - Retrieve policy clauses with LlamaIndex.
  - Return likely coverage issues with citations.
- Underwriting copilot
  - Store applicant records in Supabase.
  - Query underwriting guidelines indexed by LlamaIndex.
  - Flag missing disclosures or risky attributes.
- Broker servicing agent
  - Keep customer conversation history in Supabase.
  - Use LlamaIndex to search endorsements and product docs.
  - Answer coverage questions with traceable sources.
The production pattern is simple: let LlamaIndex handle semantic retrieval over messy insurance content, and let Supabase handle durable state. That separation keeps your agent auditable, scalable, and easier to secure across multiple lines of business.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist plus starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.