How to Integrate LangChain with Twilio for Pension Fund RAG
Combining LangChain with Twilio gives you a practical RAG channel for pension fund member support, claims intake, and policy Q&A over SMS or WhatsApp. Instead of forcing members into a portal, you let them ask questions in the channel they already use, while your agent retrieves grounded answers from pension documents and sends them back through Twilio.
Prerequisites
- Python 3.10+
- A Twilio account with:
  - `TWILIO_ACCOUNT_SID`
  - `TWILIO_AUTH_TOKEN`
  - A Twilio phone number or WhatsApp sender
- Access to your pension fund knowledge base: policy PDFs, benefit guides, contribution rules, FAQ docs
- LangChain installed with your preferred vector store integration
- An embedding model provider configured, such as OpenAI or Azure OpenAI
- A webhook endpoint exposed publicly for Twilio callbacks (ngrok works fine for local development)
- Environment variables set in `.env`
```shell
pip install langchain langchain-community langchain-openai twilio flask python-dotenv pypdf faiss-cpu
```
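A minimal `.env` might look like this (the values below are placeholders; `TWILIO_PHONE_NUMBER` is the sender number used by the outbound helper in step 4):

```shell
OPENAI_API_KEY=sk-...
TWILIO_ACCOUNT_SID=ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
TWILIO_AUTH_TOKEN=your_auth_token
TWILIO_PHONE_NUMBER=+15551234567
```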
Integration Steps
Step 1: Load pension fund documents and build the retrieval index
Start by turning your pension documentation into retrievable chunks. For production, keep this index separate from your app runtime so you can refresh it when policies change.
```python
import os

from dotenv import load_dotenv
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

load_dotenv()

# Load and chunk the pension fund rules document
pdf_path = "pension_fund_rules.pdf"
loader = PyPDFLoader(pdf_path)
documents = loader.load()

splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=150)
chunks = splitter.split_documents(documents)

# Embed the chunks and persist the FAISS index to disk
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
vectorstore = FAISS.from_documents(chunks, embeddings)
vectorstore.save_local("pension_fund_index")
```
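If you're wondering what `chunk_size` and `chunk_overlap` actually control, here is a minimal pure-Python sketch of fixed-size overlapping splitting. (`RecursiveCharacterTextSplitter` is smarter than this: it prefers to break on paragraph and sentence boundaries, but the size/overlap mechanics are the same idea.)

```python
def split_with_overlap(text: str, chunk_size: int = 1000, chunk_overlap: int = 150) -> list[str]:
    """Illustrative fixed-size splitter: each chunk starts
    (chunk_size - chunk_overlap) characters after the previous one,
    so consecutive chunks share chunk_overlap characters of context."""
    assert chunk_overlap < chunk_size
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks
```

The overlap is what keeps a policy clause that straddles a chunk boundary retrievable from at least one chunk.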
Step 2: Create a LangChain retriever and RAG chain
Use a retriever to fetch relevant context, then feed that context into a chat model. Keep the prompt strict so the assistant only answers from retrieved pension content.
```python
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.chains.retrieval import create_retrieval_chain
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
vectorstore = FAISS.load_local(
    "pension_fund_index",
    embeddings,
    allow_dangerous_deserialization=True,  # acceptable here: we built this index ourselves
)
retriever = vectorstore.as_retriever(search_kwargs={"k": 4})

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a pension fund support assistant. Answer only using the provided context. If the answer is not in the context, say you don't have enough information."),
    ("human", "Question: {input}\n\nContext: {context}"),
])

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
document_chain = create_stuff_documents_chain(llm, prompt)
rag_chain = create_retrieval_chain(retriever, document_chain)
```
Step 3: Expose a Twilio webhook to receive inbound SMS or WhatsApp messages
Twilio will POST incoming messages to your webhook. Your handler should extract the user text, run it through the RAG chain, and return a TwiML response.
```python
from xml.sax.saxutils import escape

from flask import Flask, Response, request

app = Flask(__name__)

@app.post("/twilio/inbound")
def twilio_inbound():
    incoming_text = request.form.get("Body", "").strip()
    from_number = request.form.get("From", "")

    result = rag_chain.invoke({"input": incoming_text})
    answer = result["answer"]

    # Escape the answer so characters like & or < don't break the TwiML XML
    twiml = f"""<?xml version="1.0" encoding="UTF-8"?>
<Response>
    <Message>{escape(answer)}</Message>
</Response>"""
    return Response(twiml, mimetype="application/xml")

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```
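Before trusting inbound webhooks in production, validate the `X-Twilio-Signature` header so attackers can't POST fake messages at your endpoint. The `twilio` package ships `twilio.request_validator.RequestValidator` for this; the sketch below reimplements Twilio's documented scheme (HMAC-SHA1 of the full webhook URL plus the POST parameters concatenated in sorted key order, base64-encoded) with only the standard library, so you can see what the SDK is doing:

```python
import base64
import hashlib
import hmac

def compute_twilio_signature(auth_token: str, url: str, params: dict) -> str:
    """Twilio's documented scheme: append each POST key+value to the full
    webhook URL in sorted key order, HMAC-SHA1 with the auth token,
    then base64-encode the digest."""
    payload = url + "".join(k + params[k] for k in sorted(params))
    digest = hmac.new(auth_token.encode(), payload.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def is_valid_twilio_request(auth_token: str, url: str, params: dict, signature: str) -> bool:
    expected = compute_twilio_signature(auth_token, url, params)
    # Constant-time comparison avoids leaking signature bytes via timing
    return hmac.compare_digest(expected, signature)
```

In the Flask handler you would compare the computed signature against `request.headers.get("X-Twilio-Signature")` and return a 403 on mismatch; in practice, prefer the SDK's `RequestValidator` over rolling your own.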
Step 4: Send outbound replies with the Twilio REST API
If you want asynchronous handling or escalation flows, use the Twilio Python SDK directly instead of only returning TwiML.
```python
import os

from twilio.rest import Client

twilio_client = Client(
    os.environ["TWILIO_ACCOUNT_SID"],
    os.environ["TWILIO_AUTH_TOKEN"],
)

def send_sms_reply(to_number: str, body: str) -> str:
    message = twilio_client.messages.create(
        from_=os.environ["TWILIO_PHONE_NUMBER"],
        to=to_number,
        body=body,
    )
    return message.sid

# Example usage after a RAG lookup:
result = rag_chain.invoke({"input": "When can I retire early?"})
send_sms_reply("+15551234567", result["answer"])
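One practical wrinkle: Twilio rejects message bodies longer than 1,600 characters, and verbose RAG answers can exceed that. A small illustrative helper (the 1,600-character limit is Twilio's; the whitespace-preferring split is an assumption you may want to tune) that breaks a long answer into sendable parts:

```python
def split_for_sms(body: str, limit: int = 1600) -> list[str]:
    """Split a long message into parts no longer than `limit`,
    preferring to break on whitespace so words stay intact."""
    parts = []
    while len(body) > limit:
        cut = body.rfind(" ", 0, limit)
        if cut <= 0:
            cut = limit  # no space found: hard-cut at the limit
        parts.append(body[:cut].rstrip())
        body = body[cut:].lstrip()
    if body:
        parts.append(body)
    return parts
```

You would then loop `send_sms_reply(to_number, part)` over the parts instead of sending one oversized body.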
Step 5: Add basic guardrails before replying
Pension data is sensitive. Add checks for identity-sensitive questions and route those to a human or authenticated flow instead of answering blindly.
```python
from xml.sax.saxutils import escape

def is_sensitive_query(text: str) -> bool:
    keywords = ["my balance", "my payout", "my id number", "my contributions", "beneficiary"]
    lowered = text.lower()
    return any(k in lowered for k in keywords)

# Replaces the earlier /twilio/inbound handler
@app.post("/twilio/inbound")
def twilio_inbound():
    incoming_text = request.form.get("Body", "").strip()

    if is_sensitive_query(incoming_text):
        reply = "I need to verify your identity before sharing account-specific pension details."
    else:
        result = rag_chain.invoke({"input": incoming_text})
        reply = result["answer"]

    twiml = f"""<?xml version="1.0" encoding="UTF-8"?>
<Response><Message>{escape(reply)}</Message></Response>"""
    return Response(twiml, mimetype="application/xml")
```
Testing the Integration
Run the Flask app locally, expose it with ngrok, then point your Twilio number's webhook URL at `/twilio/inbound`.
```python
# test_rag_sms.py
import requests

payload = {
    "Body": "What happens to my pension if I leave before retirement?",
    "From": "+15550001111",
}

response = requests.post("http://localhost:5000/twilio/inbound", data=payload)
print(response.status_code)
print(response.text)
```
Expected output:
```
200
<?xml version="1.0" encoding="UTF-8"?>
<Response>
    <Message>You may preserve your benefit in the fund until retirement age...</Message>
</Response>
```
If you want to test the outbound path too:
```python
sid = send_sms_reply("+15550001111", "Your pension question was received.")
print(sid)
```
Expected output:
```
SMXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```
Real-World Use Cases
- Member self-service over SMS: answer questions about retirement age, vesting rules, contribution timing, and document requirements without sending users to a portal.
- Claims and case intake: let members text supporting documents or questions, then route them into a case workflow after RAG-based triage.
- Policy change notifications: push updates about rule changes or annual statements through WhatsApp/SMS, with answers grounded in current pension documents.
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit