How to Integrate Next.js for lending with Vercel AI SDK for AI agents
Why this integration matters
If you’re building lending workflows with AI agents, the hard part is not generating text. It’s connecting the agent to real lending actions: eligibility checks, offer retrieval, document collection, and status updates.
Next.js for lending gives you the application surface and workflow endpoints. Vercel AI SDK gives you the agent layer: tool calling, structured outputs, and conversation orchestration. Put them together and you get an agent that can guide a borrower through a lending journey and call backend lending APIs when it needs to act.
Prerequisites
- Python 3.10+
- A Next.js for lending app with API routes or server actions exposed for:
  - `POST /api/lending/eligibility`
  - `POST /api/lending/offer`
  - `POST /api/lending/documents`
- A Vercel project with AI SDK enabled
- An OpenAI API key or compatible model provider key
- `httpx` installed for API calls from Python
- `pydantic` for request/response validation
- Environment variables configured:
  - `NEXTJS_LENDING_BASE_URL`
  - `VERCEL_AI_API_KEY`
  - `OPENAI_API_KEY`

Install the Python dependencies:

```bash
pip install httpx pydantic openai
```
Integration Steps
1) Define the lending API contract
Start by treating Next.js for lending as your system of record for workflow actions. Your agent should never invent loan decisions; it should call a real endpoint and return the result.
```python
from pydantic import BaseModel, Field
from typing import Literal, Optional


class EligibilityRequest(BaseModel):
    customer_id: str
    annual_income: float = Field(gt=0)
    employment_status: Literal["employed", "self_employed", "unemployed"]
    requested_amount: float = Field(gt=0)
    term_months: int = Field(ge=6, le=360)


class EligibilityResponse(BaseModel):
    eligible: bool
    risk_band: Literal["low", "medium", "high"]
    max_amount: float
    reason: Optional[str] = None
```
Use strict models early. Lending flows fail in production when payloads drift between frontend, agent, and backend.
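To see why strict models pay off, here is a minimal sketch (assuming pydantic v2, with the `EligibilityRequest` model repeated so the snippet is self-contained) showing how an out-of-range payload fails fast, with every violation reported at once rather than surfacing one at a time downstream:

```python
from pydantic import BaseModel, Field, ValidationError
from typing import Literal


class EligibilityRequest(BaseModel):
    customer_id: str
    annual_income: float = Field(gt=0)
    employment_status: Literal["employed", "self_employed", "unemployed"]
    requested_amount: float = Field(gt=0)
    term_months: int = Field(ge=6, le=360)


# A malformed payload (negative income, term too short) raises immediately.
try:
    EligibilityRequest(
        customer_id="cus_123",
        annual_income=-1,
        employment_status="employed",
        requested_amount=25000,
        term_months=3,
    )
except ValidationError as exc:
    # Collect the offending field names from the structured error report.
    errors = {e["loc"][0] for e in exc.errors()}
    print(sorted(errors))
```

Both bad fields are rejected in one pass, which is exactly the behavior you want at an agent boundary.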
2) Call Next.js for lending from your agent service
Your Python agent can call the Next.js route directly. This keeps business logic in the lending app while the agent focuses on orchestration.
```python
import os

import httpx

BASE_URL = os.environ["NEXTJS_LENDING_BASE_URL"]


async def check_eligibility(payload: dict) -> dict:
    async with httpx.AsyncClient(timeout=20) as client:
        response = await client.post(
            f"{BASE_URL}/api/lending/eligibility",
            json=payload,
            headers={"Content-Type": "application/json"},
        )
        response.raise_for_status()
        return response.json()


# Example usage
# result = await check_eligibility({
#     "customer_id": "cus_123",
#     "annual_income": 85000,
#     "employment_status": "employed",
#     "requested_amount": 25000,
#     "term_months": 48,
# })
```
This assumes your Next.js route implements a standard JSON contract. In practice, your route can validate input with Zod on the Node side and return a normalized response for the Python agent.
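On the Python side, you can mirror that normalization by parsing the route's JSON through `EligibilityResponse` before the agent touches it. A sketch, assuming pydantic v2; `parse_eligibility` is a hypothetical helper name, and the model is repeated here for self-containment:

```python
from pydantic import BaseModel, ValidationError
from typing import Literal, Optional


class EligibilityResponse(BaseModel):
    eligible: bool
    risk_band: Literal["low", "medium", "high"]
    max_amount: float
    reason: Optional[str] = None


def parse_eligibility(raw: dict) -> EligibilityResponse:
    """Validate the Next.js response before it reaches the agent."""
    try:
        return EligibilityResponse.model_validate(raw)
    except ValidationError as exc:
        # Surface contract drift loudly instead of passing bad data onward.
        raise ValueError(f"eligibility response violated contract: {exc}") from exc


result = parse_eligibility(
    {"eligible": True, "risk_band": "medium", "max_amount": 30000, "reason": None}
)
```

This makes contract drift a hard failure at the boundary rather than a silent bug in the conversation flow.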
3) Expose Next.js actions as Vercel AI SDK tools
Vercel AI SDK works best when tools are explicit. Define tool schemas that map cleanly to your Next.js endpoints, then let the model choose when to call them.
```python
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "check_lending_eligibility",
            "description": "Check borrower eligibility using Next.js for lending",
            "parameters": {
                "type": "object",
                "properties": {
                    "customer_id": {"type": "string"},
                    "annual_income": {"type": "number"},
                    "employment_status": {
                        "type": "string",
                        "enum": ["employed", "self_employed", "unemployed"],
                    },
                    "requested_amount": {"type": "number"},
                    "term_months": {"type": "integer"},
                },
                "required": [
                    "customer_id",
                    "annual_income",
                    "employment_status",
                    "requested_amount",
                    "term_months",
                ],
            },
        },
    }
]
```
If you’re using Vercel AI SDK in a Node runtime, this is the same conceptual mapping: define tools once, bind them to backend actions, then let the model route requests through them.
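That "define tools once, bind them to backend actions" mapping can be sketched in Python as a dispatch table keyed by tool name, so each new endpoint registers in a single place. `TOOL_HANDLERS` and `dispatch_tool` are hypothetical names, and the handler below is a stub standing in for the real httpx call from step 2:

```python
import asyncio
import json


# Stub standing in for the httpx call from step 2; in the real service
# this would POST to the Next.js eligibility route.
async def check_lending_eligibility(**kwargs) -> dict:
    return {"eligible": True, "max_amount": 30000}


# One registry entry per Next.js endpoint the agent may call.
TOOL_HANDLERS = {
    "check_lending_eligibility": check_lending_eligibility,
}


async def dispatch_tool(name: str, arguments: str) -> dict:
    """Route a model tool call to its bound backend handler."""
    handler = TOOL_HANDLERS.get(name)
    if handler is None:
        raise KeyError(f"unknown tool: {name}")
    # The model emits arguments as a JSON string; decode before calling.
    return await handler(**json.loads(arguments))


result = asyncio.run(
    dispatch_tool("check_lending_eligibility", '{"customer_id": "cus_123"}')
)
```

Keeping the registry separate from the model loop means adding a tool never touches the conversation code.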
4) Build the agent loop around tool calls
The agent receives borrower context, asks clarifying questions if needed, then calls your Next.js endpoint via the tool. Keep retries and validation outside the model.
```python
import json


async def run_agent(user_message: str):
    messages = [
        {
            "role": "system",
            "content": (
                "You are a lending assistant. "
                "Never approve loans without calling the eligibility tool. "
                "Return concise borrower-facing answers."
            ),
        },
        {"role": "user", "content": user_message},
    ]
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=TOOLS,
        tool_choice="auto",
    )
    message = response.choices[0].message
    if not message.tool_calls:
        return message.content
    tool_call = message.tool_calls[0]
    args = json.loads(tool_call.function.arguments)
    eligibility = await check_eligibility(args)
    return {
        "tool_used": tool_call.function.name,
        "eligibility_result": eligibility,
    }
```
This pattern is production-friendly because it keeps decisioning deterministic at the boundary. The model can assist with conversation flow, but only your backend decides eligibility.
5) Persist outcomes back into Next.js for lending
Once eligibility is checked, push results back into your lending workflow so case managers and borrowers see consistent state across systems.
```python
async def submit_offer(customer_id: str, offer_payload: dict) -> dict:
    async with httpx.AsyncClient(timeout=20) as client:
        response = await client.post(
            f"{BASE_URL}/api/lending/offer",
            json={
                "customer_id": customer_id,
                **offer_payload,
            },
        )
        response.raise_for_status()
        return response.json()


# Example:
# offer_result = await submit_offer("cus_123", {
#     "approved_amount": 20000,
#     "apr": 12.9,
#     "term_months": 48,
# })
```
At this point, Next.js owns workflow persistence and UI state. The agent owns interaction and routing.
Testing the Integration
Use a simple end-to-end test that checks eligibility through the agent path and verifies a structured result comes back.
```python
import asyncio


async def main():
    result = await run_agent(
        "Check whether customer cus_123 qualifies for a $25,000 loan over 48 months "
        "with annual income of $85,000 and employment status employed."
    )
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```
Expected output:

```python
{
    'tool_used': 'check_lending_eligibility',
    'eligibility_result': {
        'eligible': True,
        'risk_band': 'medium',
        'max_amount': 30000,
        'reason': None
    }
}
```
If you get an HTTP error here, inspect three things first:

- The `NEXTJS_LENDING_BASE_URL` value
- The request schema accepted by `/api/lending/eligibility`
- Whether your Vercel/OpenAI key has access to the model you selected
Real-World Use Cases
- Pre-qualification assistant: let borrowers ask “Can I qualify?” and have the agent call Next.js eligibility endpoints before returning an answer.
- Document collection workflow: use Vercel AI SDK tool calls to detect missing documents, then trigger Next.js routes that update application status.
- Loan officer copilot: summarize borrower profiles, fetch offers from Next.js services, and draft next-step recommendations without exposing raw backend complexity to staff.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit