How to Integrate LangChain with Slack for Payment-Aware RAG
Combining LangChain's payment tooling with Slack gives you a clean pattern for building an agent that answers policy, invoice, and billing questions from a RAG index while keeping the conversation in Slack. The practical win is simple: users ask in Slack, the agent retrieves the right docs, checks payment context, and responds with grounded answers instead of guessing.
Prerequisites
- Python 3.10+
- A Slack app with:
  - Bot token
  - Signing secret
  - Event subscriptions enabled
- Access to your payment provider API through LangChain-compatible tooling
- A vector store for RAG, such as FAISS, Pinecone, or Chroma
- `langchain`, `slack-bolt`, `slack-sdk`, and your LLM provider package installed
- Environment variables set:
  - `SLACK_BOT_TOKEN`
  - `SLACK_SIGNING_SECRET`
  - `OPENAI_API_KEY` or equivalent
  - Payment API credentials for your billing system

Install the core packages:

```bash
pip install langchain langchain-openai slack-bolt slack-sdk faiss-cpu python-dotenv
```
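The install list includes `python-dotenv`, which the steps below never show in use. A minimal sketch of loading credentials with it before anything else runs (the `require_env` helper is a name invented here, not part of any library):

```python
import os

try:
    from dotenv import load_dotenv  # provided by python-dotenv
    load_dotenv()  # reads a .env file in the current directory, if present
except ImportError:
    pass  # fall back to variables exported in the shell


def require_env(name: str) -> str:
    """Fail fast with a clear message instead of a KeyError deep inside a request."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Calling `require_env("SLACK_BOT_TOKEN")` at startup surfaces configuration mistakes immediately rather than on the first Slack event.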
Integration Steps
- Set up the payment-aware LangChain tools

If your payment system exposes APIs for invoice lookup, payment status, or transaction history, wrap those endpoints as LangChain tools. That gives your agent structured access to payment data before it answers in Slack.

```python
import os

import requests
from langchain_core.tools import tool

PAYMENTS_BASE_URL = os.environ["PAYMENTS_BASE_URL"]
PAYMENTS_API_KEY = os.environ["PAYMENTS_API_KEY"]


@tool
def get_invoice_status(invoice_id: str) -> str:
    """Fetch invoice status from the payments system."""
    resp = requests.get(
        f"{PAYMENTS_BASE_URL}/invoices/{invoice_id}",
        headers={"Authorization": f"Bearer {PAYMENTS_API_KEY}"},
        timeout=15,
    )
    resp.raise_for_status()
    data = resp.json()
    return f"Invoice {invoice_id}: {data['status']} | amount={data['amount']} | due_date={data['due_date']}"
```
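One hardening worth considering: the tool above assumes `status`, `amount`, and `due_date` are always present in the payload. A defensive formatter (a sketch; the field names are assumptions about your billing API's schema) keeps the agent's output readable when a field is missing:

```python
def format_invoice(invoice_id: str, data: dict) -> str:
    # Fall back to placeholders instead of raising KeyError on a sparse payload.
    status = data.get("status", "unknown")
    amount = data.get("amount", "n/a")
    due = data.get("due_date", "n/a")
    return f"Invoice {invoice_id}: {status} | amount={amount} | due_date={due}"
```

Swapping this into `get_invoice_status` means an upstream schema change degrades the answer instead of crashing the tool call.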
- Build the RAG retriever

Your RAG layer should retrieve policy docs, billing FAQs, and operational runbooks. Keep it narrow; payment agents work better when the corpus is specific.

```python
from langchain_community.document_loaders import TextLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

loader = TextLoader("./docs/billing_policy.txt")
docs = loader.load()

splitter = RecursiveCharacterTextSplitter(chunk_size=800, chunk_overlap=120)
chunks = splitter.split_documents(docs)

embeddings = OpenAIEmbeddings()
vectorstore = FAISS.from_documents(chunks, embeddings)
retriever = vectorstore.as_retriever(search_kwargs={"k": 4})
```
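To make the `chunk_size=800` / `chunk_overlap=120` settings concrete, here is what that windowing means, shown with a naive character splitter (`RecursiveCharacterTextSplitter` is smarter, preferring paragraph and sentence boundaries, but the sliding-window idea is the same):

```python
def naive_split(text: str, chunk_size: int, chunk_overlap: int) -> list:
    """Slide a fixed window over the text; consecutive chunks share
    `chunk_overlap` characters so boundary content appears twice."""
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]
```

With these settings each chunk repeats the last 120 characters of the previous one, so a policy clause split across a boundary still appears whole in at least one retrieved chunk.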
- Create the LangChain agent that uses both retrieval and payment tools

The agent should first retrieve relevant context, then call payment tools only when needed. For production systems, keep tool scope tight so the model doesn't wander into unrelated actions.

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)


def format_docs(docs):
    return "\n\n".join(d.page_content for d in docs)


rag_chain = (
    RunnableLambda(lambda x: retriever.invoke(x["question"]))
    | RunnableLambda(format_docs)
)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a billing assistant. Use retrieved policy context and payment tools when needed."),
    ("human", "Question: {question}\n\nContext:\n{context}"),
])


def answer_question(question: str) -> str:
    context = rag_chain.invoke({"question": question})
    messages = prompt.format_messages(question=question, context=context)
    return llm.invoke(messages).content
```
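Note that the code above is a retrieval chain rather than a full tool-calling agent; the payment tool gets invoked by the routing code in the later steps. If you later let the model drive tool use directly (for example via `llm.bind_tools`, an assumption about your eventual setup), the control flow reduces to a loop like this stub, where a plain function stands in for the LLM:

```python
def run_agent(question: str, model, tools: dict) -> str:
    """One round of a tool-calling loop: ask the model, execute any
    requested tool, then ask again with the tool result attached."""
    decision = model(question)  # {"tool": name, "args": {...}} or {"answer": "..."}
    if "tool" in decision:
        result = tools[decision["tool"]](**decision["args"])
        # Second pass: the model answers with the tool output in hand.
        return model(question, tool_result=result)["answer"]
    return decision["answer"]
```

Keeping `tools` to a small dict of billing-only functions is the "tight tool scope" the paragraph above recommends.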
- Wire the agent into Slack using Bolt

Slack Bolt handles events and message posting. Listen for mentions or direct messages, run your retrieval/payment logic, then post the answer back into the channel.

```python
import os
import re

from slack_bolt import App

app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)


@app.event("app_mention")
def handle_mention(event, say):
    text = event.get("text", "")
    # Strip the <@U...> mention markup so only the question remains.
    user_question = re.sub(r"<@[^>]+>", "", text).strip()
    # Example: if the user asks about an invoice, route to the payment tool.
    if "invoice" in user_question.lower():
        invoice_status = get_invoice_status.invoke({"invoice_id": "inv_12345"})  # demo ID
        response = f"{invoice_status}\n\n{answer_question(user_question)}"
    else:
        response = answer_question(user_question)
    say(response)


if __name__ == "__main__":
    app.start(port=int(os.environ.get("PORT", 3000)))
```
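The signing secret passed to the `App` constructor is what Bolt uses to verify that incoming events really come from Slack. The check is an HMAC-SHA256 over `v0:{timestamp}:{body}`; Bolt computes this for you, and the sketch below only shows what the secret protects (it mirrors Slack's documented request-signing scheme):

```python
import hashlib
import hmac


def slack_signature(signing_secret: str, timestamp: str, body: str) -> str:
    """Compute Slack's v0 request signature for a raw request body."""
    basestring = f"v0:{timestamp}:{body}".encode()
    digest = hmac.new(signing_secret.encode(), basestring, hashlib.sha256).hexdigest()
    return f"v0={digest}"
```

If the computed value does not match the `X-Slack-Signature` header, the request should be rejected; this is why the raw request body must reach the verifier unmodified.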
- Add a small routing layer for better control

In real systems, don't send every request through one generic chain. Route between RAG-only answers and payment-tool calls based on intent.

```python
def route_request(question: str) -> str:
    q = question.lower()
    if any(term in q for term in ["invoice", "payment status", "chargeback", "refund"]):
        invoice_id = "inv_12345"  # replace with entity extraction in production
        payment_context = get_invoice_status.invoke({"invoice_id": invoice_id})
        return f"{payment_context}\n\n{answer_question(question)}"
    return answer_question(question)
```
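The hardcoded `inv_12345` stands in for real entity extraction. A minimal regex version (the `inv_` prefix is an assumption about your billing system's ID format) could look like:

```python
import re


def extract_invoice_id(question: str):
    """Return the first invoice-like token in the question, or None."""
    match = re.search(r"\binv_[A-Za-z0-9]+\b", question)
    return match.group(0) if match else None
```

When no ID is found, the router can fall back to the RAG-only path instead of querying the payments API with a placeholder.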
Testing the Integration
Use a simple local test before wiring it to Slack events. This verifies both retrieval and payment lookup paths.
```python
if __name__ == "__main__":
    test_questions = [
        "What is our refund policy?",
        "Check invoice inv_12345 status",
    ]
    for q in test_questions:
        print("Q:", q)
        print("A:", route_request(q))
        print("-" * 60)
```
Expected output:

```
Q: What is our refund policy?
A: Based on the billing policy document...
------------------------------------------------------------
Q: Check invoice inv_12345 status
A: Invoice inv_12345: paid | amount=1200.00 | due_date=2026-05-01
Based on the retrieved policy context...
------------------------------------------------------------
```
Real-World Use Cases
- Billing support bot in Slack
  - Employees ask about failed payments, overdue invoices, or refund rules.
  - The agent retrieves policy docs and checks live payment state before responding.
- Finance ops assistant
  - Teams query settlement timelines or transaction anomalies directly from Slack.
  - RAG provides internal runbook context; payment APIs provide live operational data.
- Customer-facing escalation workflow
  - Support agents paste a case into Slack.
  - The bot summarizes relevant billing policy and pulls account/payment metadata so humans can resolve faster.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit