How to Integrate LangChain for insurance with Slack for RAG

By Cyprian Aarons · Updated 2026-04-21
Tags: langchain-for-insurance · slack · rag

Insurance teams live in Slack. Claims questions, policy clarifications, underwriting checks, and customer escalations all end up there.

If you connect LangChain for insurance to Slack, you can turn those conversations into a retrieval-augmented generation workflow: users ask in Slack, the agent pulls the right policy or claims context from your knowledge base, and returns grounded answers with traceability.

Prerequisites

  • Python 3.10+
  • A Slack workspace with admin access
  • A Slack app with:
    • chat:write
    • channels:history
    • groups:history
    • im:history
    • app_mentions:read
  • A Slack Bot Token (xoxb-...)
  • A LangChain insurance setup with:
    • document loaders for policy/claims docs
    • embeddings model
    • vector store
    • retriever
  • Environment variables configured:
    • SLACK_BOT_TOKEN
    • SLACK_APP_TOKEN if using Socket Mode
    • OPENAI_API_KEY or your embedding/LLM provider key

Install the core packages:

pip install slack-bolt langchain langchain-community langchain-openai faiss-cpu python-dotenv
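If you use python-dotenv, a minimal .env file might look like the following. The values are placeholders, and SLACK_APP_TOKEN is only needed if you run in Socket Mode:

```
SLACK_BOT_TOKEN=xoxb-your-bot-token
SLACK_APP_TOKEN=xapp-your-app-token
OPENAI_API_KEY=sk-your-key
```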

Integration Steps

  1. Build the insurance knowledge base first.

Your Slack bot is only useful if retrieval is solid. Load policy docs, claims playbooks, and underwriting notes into a vector store so LangChain can retrieve grounded context.

from langchain_community.document_loaders import DirectoryLoader, TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

loader = DirectoryLoader(
    "insurance_docs",
    glob="**/*.txt",
    loader_cls=TextLoader,
)
docs = loader.load()

splitter = RecursiveCharacterTextSplitter(chunk_size=800, chunk_overlap=120)
chunks = splitter.split_documents(docs)

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
vectorstore = FAISS.from_documents(chunks, embeddings)

retriever = vectorstore.as_retriever(search_kwargs={"k": 4})
  2. Create the LangChain RAG chain for insurance answers.

Use a chat model and a retrieval chain that answers only from retrieved context. In insurance, that matters because hallucinated coverage details create real risk.

from langchain_openai import ChatOpenAI
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.chains.retrieval import create_retrieval_chain
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an insurance assistant. Answer only from the provided context. If the answer is missing, say you don't know."),
    ("human", "Question: {input}\n\nContext:\n{context}")
])

document_chain = create_stuff_documents_chain(llm, prompt)
rag_chain = create_retrieval_chain(retriever, document_chain)
  3. Wire Slack events to the RAG chain.

Use Slack Bolt to listen for mentions or direct messages. When a user asks a question in Slack, pass it into the LangChain retrieval chain and post the answer back.

import os
from slack_bolt import App

app = App(token=os.environ["SLACK_BOT_TOKEN"])

@app.event("app_mention")
def handle_mention(body, say):
    text = body["event"]["text"]
    user_question = text.split(">", 1)[-1].strip() if ">" in text else text

    result = rag_chain.invoke({"input": user_question})
    answer = result["answer"]

    say(answer)

If you also want direct-message support, subscribe to the message.im event in your Slack app configuration, then register a generic message handler in Bolt that filters for DMs.

@app.event("message")
def handle_dm(body, say):
    event = body.get("event", {})
    # Only answer direct messages, and ignore bot messages (including our own)
    # so the bot can't reply to itself in a loop.
    if event.get("channel_type") != "im" or event.get("bot_id"):
        return

    user_question = event.get("text", "")
    if not user_question:
        return

    result = rag_chain.invoke({"input": user_question})
    say(result["answer"])
  4. Run Slack in Socket Mode or via the Events API.

Socket Mode is simpler for internal tools because you avoid public webhook setup during development.

from slack_bolt.adapter.socket_mode import SocketModeHandler

if __name__ == "__main__":
    handler = SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"])
    handler.start()
  5. Add traceability so support teams can verify answers.

In insurance workflows, users need citations or source hints. Return doc metadata alongside the answer so people can inspect which policy section was used.

@app.event("app_mention")
def handle_mention(body, say):
    text = body["event"]["text"]
    user_question = text.split(">", 1)[-1].strip() if ">" in text else text

    result = rag_chain.invoke({"input": user_question})
    answer = result["answer"]
    context_docs = result.get("context", [])

    sources = []
    for doc in context_docs[:3]:
        source = doc.metadata.get("source", "unknown")
        sources.append(f"- {source}")

    response_text = (f"{answer}\n\nSources:\n" + "\n".join(sources)) if sources else answer
    say(response_text)
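The source-formatting logic above can be factored into a small helper shared by the mention and DM handlers. This is a sketch: `format_with_sources` is a name introduced here, not part of LangChain or Bolt, and `SimpleNamespace` stands in for LangChain Document objects in the demo.

```python
from types import SimpleNamespace  # stand-in for LangChain Document objects in the demo

def format_with_sources(answer, context_docs, max_sources=3):
    """Append a deduplicated source list to the answer, preserving retrieval order."""
    seen = []
    for doc in context_docs:
        source = doc.metadata.get("source", "unknown")
        if source not in seen:
            seen.append(source)
        if len(seen) >= max_sources:
            break
    if not seen:
        return answer
    return answer + "\n\nSources:\n" + "\n".join(f"- {s}" for s in seen)

# Demo with stand-in docs; in the handlers you would pass result.get("context", []).
demo_docs = [
    SimpleNamespace(metadata={"source": "insurance_docs/policy_manual.txt"}),
    SimpleNamespace(metadata={"source": "insurance_docs/policy_manual.txt"}),
]
print(format_with_sources("The waiting period is 30 days.", demo_docs))
```

Deduplicating keeps the Slack message short when several retrieved chunks come from the same file.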

Testing the Integration

Start by asking a controlled question that exists in your docs. Use Slack mention syntax or DM the bot directly.

test_question = "What is the waiting period for accidental death coverage?"
result = rag_chain.invoke({"input": test_question})

print("ANSWER:")
print(result["answer"])
print("\nSOURCES:")
for doc in result.get("context", []):
    print(doc.metadata.get("source"))

Example output (the exact answer depends on your documents):

ANSWER:
The waiting period for accidental death coverage is 30 days under policy section 4.2.

SOURCES:
insurance_docs/policy_manual.txt
insurance_docs/coverage_terms.txt

If you get an empty or vague answer:

  • check whether the document was actually indexed
  • verify your retriever returns relevant chunks with k=4
  • confirm the Slack bot token has chat:write
  • make sure your mention parsing is stripping <@BOTID>
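On the last point: splitting on ">" works for a single leading mention, but a regex handles mention tokens anywhere in the message. A minimal sketch (`strip_mention` is a helper name introduced here):

```python
import re

def strip_mention(text):
    """Remove Slack mention tokens like <@U12345> and trim surrounding whitespace."""
    return re.sub(r"<@[A-Z0-9]+>", "", text).strip()

print(strip_mention("<@U123ABC> what is the waiting period?"))
```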

Real-World Use Cases

  • Claims ops assistant
    Agents ask in Slack whether a claim meets documentation requirements, and the bot returns policy-backed answers with source references.

  • Underwriting policy checker
    Underwriters paste scenario details into a channel and get instant guidance pulled from underwriting manuals and risk rules.

  • Customer support escalation helper
    Support reps use Slack to ask coverage questions during live cases without switching systems or guessing at policy language.


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.
