How to Integrate LangChain with PostgreSQL for Retail Banking AI Agents
Combining LangChain with PostgreSQL gives you a practical way to build retail banking agents that can answer customer questions, retrieve account context, and persist conversation or workflow state in a durable store. That matters in retail banking because agents need more than chat memory: they need controlled access to customer data, auditability, and reliable retrieval across sessions.
Prerequisites
- Python 3.10+
- A PostgreSQL instance running locally or in your cloud environment, with the pgvector extension available
- A PostgreSQL database and user with read/write access
- pip installed
- Access to your model provider for the banking use case (OpenAI in this example)
- Environment variables configured:
  - DATABASE_URL
  - OPENAI_API_KEY (or your model provider key)
- Installed packages:
  - langchain
  - langchain-openai
  - langchain-community
  - psycopg2-binary
  - sqlalchemy
  - pgvector
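Before running anything, it helps to fail fast when required configuration is missing. A minimal check, assuming the variable names listed above:

```python
import os

REQUIRED_ENV_VARS = ["DATABASE_URL", "OPENAI_API_KEY"]

def missing_env_vars(required=REQUIRED_ENV_VARS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]
```

Call missing_env_vars() at startup and abort with a clear message if it returns anything, rather than letting a KeyError surface mid-request.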
Integration Steps
1. Install the dependencies.

```shell
pip install langchain langchain-openai langchain-community psycopg2-binary sqlalchemy pgvector
```
2. Set up your PostgreSQL connection string and create a simple schema for agent memory or customer notes.

```python
import os
from sqlalchemy import create_engine, text

DATABASE_URL = os.environ["DATABASE_URL"]
engine = create_engine(DATABASE_URL)

with engine.begin() as conn:
    # PGVector (used in the next step) needs the pgvector extension enabled in this database.
    conn.execute(text("CREATE EXTENSION IF NOT EXISTS vector"))
    conn.execute(text("""
        CREATE TABLE IF NOT EXISTS agent_sessions (
            session_id TEXT PRIMARY KEY,
            customer_id TEXT NOT NULL,
            summary TEXT NOT NULL,
            updated_at TIMESTAMP DEFAULT NOW()
        )
    """))
```
3. Create a PostgreSQL-backed LangChain vector store for persistent retrieval.

For banking agents, a common pattern is to store session summaries, KYC notes, or support case metadata in PostgreSQL and retrieve them when the agent needs context.

```python
import os
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import PGVector

DATABASE_URL = os.environ["DATABASE_URL"]
embeddings = OpenAIEmbeddings()

# Connects to the named collection, creating it if it does not exist yet.
vectorstore = PGVector(
    connection_string=DATABASE_URL,
    embedding_function=embeddings,
    collection_name="retail_banking_agent_docs",
)
```
If you are starting from scratch, add your policy documents to the collection first.

```python
from langchain_core.documents import Document

docs = [
    Document(
        page_content="Savings account overdraft policy: fee applies after grace period.",
        metadata={"product": "savings", "topic": "overdraft"},
    ),
    Document(
        page_content="Credit card disputes must be filed within 60 days of statement date.",
        metadata={"product": "credit_card", "topic": "disputes"},
    ),
]
vectorstore.add_documents(docs)
```
4. Build the LangChain agent and connect it to PostgreSQL-backed retrieval.

This example uses a retriever tool so the agent can fetch bank policy context before answering.

```python
from langchain_openai import ChatOpenAI
from langchain.agents import initialize_agent, AgentType
from langchain.tools.retriever import create_retriever_tool

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
retriever = vectorstore.as_retriever(search_kwargs={"k": 3})

bank_policy_tool = create_retriever_tool(
    retriever=retriever,
    name="retail_banking_policy_search",
    description="Searches retail banking product policies stored in PostgreSQL-backed vector storage.",
)

agent = initialize_agent(
    tools=[bank_policy_tool],
    llm=llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
```
5. Persist agent session state back into PostgreSQL after each interaction.

That gives you auditability and lets the next turn resume with real context instead of a blank slate.

```python
from sqlalchemy import text

def save_session(session_id: str, customer_id: str, summary: str) -> None:
    with engine.begin() as conn:
        conn.execute(
            text("""
                INSERT INTO agent_sessions (session_id, customer_id, summary, updated_at)
                VALUES (:session_id, :customer_id, :summary, NOW())
                ON CONFLICT (session_id)
                DO UPDATE SET summary = EXCLUDED.summary, updated_at = NOW()
            """),
            {"session_id": session_id, "customer_id": customer_id, "summary": summary},
        )

response = agent.invoke({
    "input": "What is the dispute window for credit card transactions?"
})
save_session(
    session_id="sess_001",
    customer_id="cust_123",
    summary=response["output"],
)
```
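Summaries can echo sensitive details the customer typed. Before calling save_session, a banking deployment would typically mask obvious card or account numbers. A minimal sketch; the digit-run pattern and masking rule here are assumptions to tune against your real data:

```python
import re

# Assumption: bare runs of 8-16 digits are likely account or card numbers.
ACCOUNT_NUMBER_RE = re.compile(r"\b\d{8,16}\b")

def redact_account_numbers(summary: str) -> str:
    """Mask likely account/card numbers, keeping the last four digits for traceability."""
    return ACCOUNT_NUMBER_RE.sub(lambda m: "****" + m.group()[-4:], summary)
```

Then persist the redacted text instead: save_session("sess_001", "cust_123", redact_account_numbers(response["output"])).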
Testing the Integration
Run a retrieval query through the agent and confirm it can pull policy data from PostgreSQL-backed storage.
```python
result = agent.invoke({
    "input": "Tell me the overdraft fee policy for savings accounts."
})
print(result["output"])
```
Expected output (exact wording will vary by model):
The savings account overdraft policy states that a fee applies after the grace period.
You should also verify the row was written to PostgreSQL:

```python
with engine.begin() as conn:
    rows = conn.execute(
        text("SELECT session_id, customer_id, summary FROM agent_sessions WHERE session_id = :sid"),
        {"sid": "sess_001"},
    ).fetchall()
print(rows)
```
Expected output (the summary column contains whatever the agent answered):
[('sess_001', 'cust_123', 'The savings account overdraft policy states that a fee applies after the grace period.')]
Real-World Use Cases

Customer support agents
- Answer product questions using approved banking policies stored in PostgreSQL.
- Keep every interaction traceable for compliance review.

Relationship manager assistants
- Pull recent notes, account summaries, and next-best actions from persistent storage.
- Resume conversations across sessions without losing context.

Back-office workflow agents
- Route disputes, loan follow-ups, and KYC tasks based on database state.
- Update PostgreSQL records as tasks move through approval stages.
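The routing idea in the last use case can be sketched as a pure function over task rows read from PostgreSQL. The field names and queue names here are illustrative assumptions, not a fixed schema:

```python
def route_task(task: dict) -> str:
    """Map a task record (e.g. a row from a tasks table) to a work queue."""
    task_type = task.get("type")
    if task_type == "dispute":
        return "disputes_queue"
    if task_type == "kyc" and task.get("status") == "pending_review":
        return "kyc_review_queue"
    if task_type == "loan_followup":
        return "lending_queue"
    return "general_queue"
```

Keeping routing rules in plain code like this makes them easy to unit-test and audit, while the task state itself stays in PostgreSQL.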
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit