How to Integrate LangChain with PostgreSQL for Fintech Startups
Combining LangChain with PostgreSQL gives you a clean pattern for building finance-aware AI agents that can retrieve, store, and reason over structured business data. For a startup, this is the difference between a chat demo and an agent that can answer customer questions, summarize transactions, and persist conversation state with auditability.
Prerequisites
- Python 3.10+
- A running PostgreSQL instance
- A database user with `CREATE`, `INSERT`, `SELECT`, and `UPDATE` permissions
- A LangChain-compatible fintech SDK or package installed in your project
- `psycopg2-binary` or `psycopg` for PostgreSQL connectivity
- Environment variables configured:
  - `DATABASE_URL`
  - `OPENAI_API_KEY` or your model provider key
  - Any fintech API credentials required by your LangChain integration
Install the core packages (`langchain-openai` is needed for the `ChatOpenAI` model used below):

```bash
pip install langchain langchain-community langchain-openai psycopg2-binary sqlalchemy python-dotenv
```
Integration Steps
1. Set up your PostgreSQL connection
Use SQLAlchemy for stable connection management and let LangChain reuse the same database URI. For production systems, keep credentials in environment variables.
```python
import os

from sqlalchemy import create_engine, text

DATABASE_URL = os.getenv("DATABASE_URL")
engine = create_engine(DATABASE_URL, pool_pre_ping=True)

with engine.connect() as conn:
    result = conn.execute(text("SELECT version();"))
    print(result.fetchone())
```
2. Create tables for agent memory and fintech records
A startup-grade setup needs persistence for both conversation state and finance objects like accounts or transactions. Keep schemas explicit so you can audit every write.
```python
from sqlalchemy import text

schema_sql = """
CREATE TABLE IF NOT EXISTS agent_messages (
    id SERIAL PRIMARY KEY,
    session_id TEXT NOT NULL,
    role TEXT NOT NULL,
    content TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT NOW()
);

CREATE TABLE IF NOT EXISTS fintech_transactions (
    id SERIAL PRIMARY KEY,
    account_id TEXT NOT NULL,
    amount NUMERIC(12,2) NOT NULL,
    currency TEXT NOT NULL,
    description TEXT,
    txn_time TIMESTAMP DEFAULT NOW()
);
"""

with engine.begin() as conn:
    conn.execute(text(schema_sql))
```
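The read path later in this guide filters on `account_id` and sorts by `txn_time`, so a composite index on those columns keeps that lookup fast as the table grows. A sketch you could run alongside the schema creation (the index name and helper function are my own choices):

```python
from sqlalchemy import text
from sqlalchemy.engine import Engine

# Covers both the WHERE account_id = ... clause and the ORDER BY txn_time DESC.
INDEX_SQL = """
CREATE INDEX IF NOT EXISTS idx_fintech_txn_account_time
    ON fintech_transactions (account_id, txn_time DESC);
"""


def ensure_transaction_index(engine: Engine) -> None:
    """Create the transaction lookup index if it does not already exist."""
    with engine.begin() as conn:
        conn.execute(text(INDEX_SQL))
```

Call `ensure_transaction_index(engine)` once at startup, next to the schema creation; `IF NOT EXISTS` makes repeated calls harmless.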
3. Wire LangChain to PostgreSQL-backed memory
For conversational agents, store messages in Postgres so the system can resume sessions and keep a durable trail. In LangChain, use a message history implementation backed by SQL storage.
```python
from langchain_community.chat_message_histories import SQLChatMessageHistory

session_id = "customer_123"
history = SQLChatMessageHistory(
    session_id=session_id,
    connection_string=DATABASE_URL,
)

history.add_user_message("Show me my last three card transactions.")
history.add_ai_message("I found three recent card transactions.")
print(history.messages)
```
4. Build a fintech-aware chain that reads from Postgres
A common pattern is: fetch structured financial data from PostgreSQL, then pass it into a LangChain prompt for summarization or decision support. This keeps the model grounded in real records.
```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a fintech assistant. Use only the provided transaction data."),
    ("user", "Summarize these transactions for the customer:\n{transactions}"),
])

with engine.connect() as conn:
    rows = conn.execute(text("""
        SELECT amount, currency, description, txn_time
        FROM fintech_transactions
        WHERE account_id = :account_id
        ORDER BY txn_time DESC
        LIMIT 3
    """), {"account_id": "acct_001"}).fetchall()

transactions_text = "\n".join(str(row) for row in rows)

chain = prompt | llm
response = chain.invoke({"transactions": transactions_text})
print(response.content)
```
5. Persist agent outputs back into PostgreSQL
If the agent generates a customer-facing answer or risk note, store it. That gives you traceability and makes later review possible.
```python
from sqlalchemy import text

agent_reply = response.content

with engine.begin() as conn:
    conn.execute(
        text("""
            INSERT INTO agent_messages (session_id, role, content)
            VALUES (:session_id, :role, :content)
        """),
        {
            "session_id": session_id,
            "role": "assistant",
            "content": agent_reply,
        },
    )
```
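As more call sites start writing to `agent_messages`, a small helper keeps the inserts consistent and each write in its own transaction. A sketch (the function name is my own):

```python
from sqlalchemy import text
from sqlalchemy.engine import Engine


def persist_message(engine: Engine, session_id: str, role: str, content: str) -> None:
    """Append one message row to agent_messages inside its own transaction."""
    with engine.begin() as conn:
        conn.execute(
            text("""
                INSERT INTO agent_messages (session_id, role, content)
                VALUES (:session_id, :role, :content)
            """),
            {"session_id": session_id, "role": role, "content": content},
        )
```

The same helper then serves user messages, assistant replies, and risk notes, so every write goes through one audited path.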
Testing the Integration
Run a simple end-to-end check: insert sample data, query it through LangChain, and confirm the output is stored in Postgres.
```python
from sqlalchemy import text

with engine.begin() as conn:
    conn.execute(text("""
        INSERT INTO fintech_transactions (account_id, amount, currency, description)
        VALUES
            ('acct_001', 120.50, 'USD', 'Amazon purchase'),
            ('acct_001', 42.00, 'USD', 'Uber ride'),
            ('acct_001', 15.75, 'USD', 'Coffee shop')
    """))

with engine.connect() as conn:
    rows = conn.execute(text("""
        SELECT amount, currency, description
        FROM fintech_transactions
        WHERE account_id = 'acct_001'
        ORDER BY id DESC
        LIMIT 3
    """)).fetchall()

print("Fetched rows:", rows)
print("Stored messages:", history.messages[-1].content)
```
Expected output (Postgres `NUMERIC` columns come back as Python `Decimal` values):

```text
Fetched rows: [(Decimal('15.75'), 'USD', 'Coffee shop'), (Decimal('42.00'), 'USD', 'Uber ride'), (Decimal('120.50'), 'USD', 'Amazon purchase')]
Stored messages: I found three recent card transactions.
```
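If you want a fast check that the SQL round trip is sound before pointing at a live Postgres instance, the same SQLAlchemy code can run against in-memory SQLite in a unit test. A sketch under that assumption (`SERIAL` and `NOW()` are Postgres-specific, so the test schema uses portable equivalents, and SQLite returns plain floats rather than `Decimal`):

```python
from sqlalchemy import create_engine, text

# In-memory database: nothing to provision, nothing to clean up.
engine = create_engine("sqlite://")

with engine.begin() as conn:
    conn.execute(text("""
        CREATE TABLE fintech_transactions (
            id INTEGER PRIMARY KEY,
            account_id TEXT NOT NULL,
            amount NUMERIC(12,2) NOT NULL,
            currency TEXT NOT NULL,
            description TEXT,
            txn_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        )
    """))
    conn.execute(text("""
        INSERT INTO fintech_transactions (account_id, amount, currency, description)
        VALUES ('acct_001', 120.50, 'USD', 'Amazon purchase'),
               ('acct_001', 42.00, 'USD', 'Uber ride')
    """))

with engine.connect() as conn:
    rows = conn.execute(text("""
        SELECT amount, currency, description
        FROM fintech_transactions
        WHERE account_id = :account_id
        ORDER BY id DESC
    """), {"account_id": "acct_001"}).fetchall()

print(rows)
```

This catches typos in table names, bind parameters, and ordering logic in milliseconds; dialect-specific behavior still needs a test against real Postgres.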
Real-World Use Cases
- Customer support agents that answer balance or transaction questions using Postgres as the system of record.
- Fraud triage assistants that summarize suspicious activity and persist analyst notes for audit trails.
- Finance ops copilots that generate monthly spend summaries from transaction tables and save results for reporting workflows.
The production pattern is simple: PostgreSQL stores truth, LangChain handles reasoning and orchestration. Keep those responsibilities separate and your agent system stays maintainable when the startup grows past prototype stage.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.