How to Integrate LangChain with PostgreSQL for Startup Lending
Combining LangChain with PostgreSQL gives you a practical pattern for startup lending agents: the model can reason over borrower data, loan products, and policy rules while PostgreSQL keeps the system auditable and persistent. That matters when every recommendation, score, and decision must be traceable for underwriting, collections, or compliance.
Prerequisites
- Python 3.10+
- A running PostgreSQL instance
- A database user with read/write access
- pip installed
- API credentials for your LangChain-powered lending stack
- The following Python packages: langchain, langchain-openai, langchain-community, psycopg2-binary, sqlalchemy
- A PostgreSQL schema ready for borrowers, loan applications, and decision logs
Integration Steps
- Install dependencies
pip install langchain langchain-openai langchain-community psycopg2-binary sqlalchemy
- Create a PostgreSQL connection
Use SQLAlchemy for connection management and let LangChain query through that engine.
from sqlalchemy import create_engine, text

POSTGRES_URL = "postgresql+psycopg2://loan_user:loan_pass@localhost:5432/lending_db"
engine = create_engine(POSTGRES_URL)

with engine.connect() as conn:
    result = conn.execute(text("SELECT version();"))
    print(result.fetchone())
- Set up your lending data model
This gives the agent structured data to reason over instead of free-form notes.
from sqlalchemy import MetaData, Table, Column, Integer, String, Numeric, Boolean, DateTime, ForeignKey
from sqlalchemy.sql import func

metadata = MetaData()

borrowers = Table(
    "borrowers",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("full_name", String(200), nullable=False),
    Column("email", String(200), unique=True, nullable=False),
    Column("monthly_income", Numeric(12, 2), nullable=False),
    Column("credit_score", Integer, nullable=False),
)

loan_applications = Table(
    "loan_applications",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("borrower_id", Integer, ForeignKey("borrowers.id"), nullable=False),
    Column("amount_requested", Numeric(12, 2), nullable=False),
    Column("approved", Boolean, default=False),
    Column("decision_reason", String(500)),
    Column("created_at", DateTime(timezone=True), server_default=func.now()),
)

metadata.create_all(engine)
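The prerequisites call for a decision-log table, which the schema above doesn't yet define. A hedged sketch with assumed column names; in the article's code you would attach this table to the existing `metadata` object before calling `metadata.create_all(engine)` (a fresh `MetaData` is used here only to keep the snippet self-contained):

```python
from sqlalchemy import Column, DateTime, ForeignKey, Integer, MetaData, String, Table
from sqlalchemy.sql import func

# Self-contained stand-in for the article's shared `metadata` object.
log_metadata = MetaData()

decision_logs = Table(
    "decision_logs",
    log_metadata,
    Column("id", Integer, primary_key=True),
    # Links each log row to the application it explains.
    Column("application_id", Integer, ForeignKey("loan_applications.id"), nullable=False),
    Column("model_name", String(100), nullable=False),
    Column("prompt_summary", String(1000)),
    Column("raw_response", String(4000)),
    Column("created_at", DateTime(timezone=True), server_default=func.now()),
)
```

Storing the model name and raw response alongside each decision is what makes the trail auditable later.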
- Connect LangChain to PostgreSQL and build a lending agent
For SQL-backed retrieval and generation, use SQLDatabase plus a chat model. This is the common pattern when the agent needs to inspect borrower records before answering.
from langchain_community.utilities import SQLDatabase
from langchain_openai import ChatOpenAI
from langchain.chains import create_sql_query_chain
db = SQLDatabase(engine=engine)
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
query_chain = create_sql_query_chain(llm, db)
question = "Show borrowers with credit_score above 700 and monthly_income above 5000."
sql_query = query_chain.invoke({"question": question})
print(sql_query)
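Model-generated SQL should not be executed blindly against a lending database. A minimal, assumed allow-list guard (our own sketch, not part of LangChain; the real control is running generated queries through a read-only database role):

```python
def is_safe_select(sql: str) -> bool:
    """Rough guard for model-generated SQL: accept a single SELECT statement.

    This is a sketch, not a security boundary; pair it with a
    read-only database role in production.
    """
    body = sql.strip().rstrip(";").strip().lower()
    if not body.startswith("select"):
        return False
    # Reject stacked statements and obvious write keywords.
    forbidden = (";", "insert ", "update ", "delete ", "drop ", "alter ", "truncate ")
    return not any(token in body for token in forbidden)
```

Only run the generated query (for example via `db.run(sql_query)`) when the guard passes.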
If you want the agent to produce a lending decision summary from retrieved rows:
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a lending assistant. Use only the provided borrower data."),
    ("user", "Borrower data: {rows}\nQuestion: {question}"),
])
decision_chain = prompt | llm

with engine.connect() as conn:
    rows = conn.execute(text("""
        SELECT full_name, monthly_income::text, credit_score
        FROM borrowers
        WHERE credit_score >= 700 AND monthly_income >= 5000;
    """)).fetchall()

response = decision_chain.invoke({
    "rows": str(rows),
    "question": "Which borrowers look eligible for a small unsecured loan?"
})
print(response.content)
- Write decisions back to PostgreSQL
This is where the integration becomes production-useful: the agent reads data, reasons over it, then persists an auditable decision trail.
from sqlalchemy import insert

decision_payload = {
    "borrower_id": 1,
    "amount_requested": 2500,
    "approved": True,
    "decision_reason": "Credit score above threshold and income supports repayment.",
}

with engine.begin() as conn:
    conn.execute(insert(loan_applications).values(**decision_payload))
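Wrapping the write-back in a small helper makes each decision commit atomically and hands back the new row id for logging. `record_decision` is a name of our invention, a sketch assuming the `loan_applications` table defined earlier:

```python
from sqlalchemy import Table, insert
from sqlalchemy.engine import Engine

def record_decision(engine: Engine, loan_applications: Table, payload: dict) -> int:
    """Insert one decision row atomically and return its primary key.

    engine.begin() opens a transaction that commits on success and rolls
    back if the insert raises, so no half-written audit rows remain.
    """
    with engine.begin() as conn:
        result = conn.execute(insert(loan_applications).values(**payload))
        return result.inserted_primary_key[0]
```

The returned id is what you would store in a decision-log row tying the model's response to the persisted application.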
Testing the Integration
Run a quick end-to-end check: fetch borrower data with SQLAlchemy, ask LangChain to summarize eligibility, then verify the decision row was stored.
from sqlalchemy import text

with engine.begin() as conn:
    conn.execute(text("""
        INSERT INTO borrowers (full_name, email, monthly_income, credit_score)
        VALUES ('Amina Patel', 'amina@example.com', 6500, 742)
        ON CONFLICT (email) DO NOTHING;
    """))

with engine.connect() as conn:
    borrower_rows = conn.execute(text("""
        SELECT full_name, monthly_income::text AS income, credit_score
        FROM borrowers
        WHERE email = 'amina@example.com';
    """)).fetchall()

print(borrower_rows)

result = decision_chain.invoke({
    "rows": str(borrower_rows),
    "question": "Should this borrower be pre-approved for a $2k loan?"
})
print(result.content)
Expected output (the model's wording will vary):
[('Amina Patel', '6500.00', 742)]
The borrower appears eligible for pre-approval based on income and credit score.
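The last step of the check is verifying that the decision row was actually stored. A hedged helper for that audit query (`latest_decision` is our name; it orders by `id` rather than `created_at` so it stays portable across backends):

```python
from sqlalchemy import text
from sqlalchemy.engine import Engine

def latest_decision(engine: Engine):
    """Fetch the most recently inserted decision row for a quick audit check."""
    with engine.connect() as conn:
        return conn.execute(text(
            "SELECT borrower_id, approved, decision_reason "
            "FROM loan_applications ORDER BY id DESC LIMIT 1"
        )).fetchone()
```

If this returns the row you just wrote, the read-reason-persist loop is working end to end.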
Real-World Use Cases
- Loan pre-screening agents that pull applicant records from PostgreSQL and generate approval recommendations with explanation text.
- Collections assistants that summarize delinquent accounts from PostgreSQL and draft next-step actions using LangChain prompts.
- Policy Q&A copilots that answer underwriting questions by querying product rules stored in PostgreSQL tables.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.