How to Integrate Next.js for lending with Vercel AI SDK for multi-agent systems

By Cyprian Aarons. Updated 2026-04-21.
Tags: next-js-for-lending, vercel-ai-sdk, multi-agent-systems

Combining Next.js for lending with Vercel AI SDK gives you a clean split between product workflow and agent orchestration. You use Next.js to handle lending-specific UI, routing, and server actions, while Vercel AI SDK coordinates multiple agents that can score applications, explain decisions, and escalate edge cases.

That setup matters when your lending flow needs more than a single model call. You can route borrower data through specialized agents for document extraction, fraud checks, affordability analysis, and customer messaging without turning your app into a monolith.

Prerequisites

  • Python 3.11+ for the orchestration service
  • Node.js 18+ for the Next.js app
  • A Next.js lending project with API routes or server actions already running
  • Vercel AI SDK installed in the frontend/backend layer
  • Access to your lending backend endpoints or service layer
  • Environment variables configured:
    • NEXT_PUBLIC_APP_URL
    • VERCEL_AI_GATEWAY_API_KEY or your provider key
    • LENDING_API_BASE_URL
  • Familiarity with:
    • fetch() in Next.js route handlers
    • streamText() and generateText() from Vercel AI SDK
    • JSON request/response contracts between services

Integration Steps

  1. Map the lending workflow into agent responsibilities.

    Don’t let one agent do everything. Split the flow into discrete jobs: intake, verification, risk review, and response generation.

    from dataclasses import dataclass
    from typing import Dict, Any
    
    @dataclass
    class LendingTask:
        application_id: str
        applicant_data: Dict[str, Any]
        documents: list[str]
        channel: str = "web"
    
    def route_task(task: LendingTask) -> str:
        income = task.applicant_data.get("monthly_income", 0)
        debt = task.applicant_data.get("monthly_debt", 0)
    
        if income <= 0:
            return "document_agent"
        if debt / max(income, 1) > 0.45:
            return "risk_agent"
        return "approval_agent"
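A quick sanity check of the router makes the thresholds concrete. The definitions are restated so the snippet runs standalone:

```python
# Sanity-check the router. LendingTask and route_task are restated from
# the step above so this snippet runs on its own.
from dataclasses import dataclass
from typing import Any, Dict

@dataclass
class LendingTask:
    application_id: str
    applicant_data: Dict[str, Any]
    documents: list[str]
    channel: str = "web"

def route_task(task: LendingTask) -> str:
    income = task.applicant_data.get("monthly_income", 0)
    debt = task.applicant_data.get("monthly_debt", 0)
    if income <= 0:
        return "document_agent"
    if debt / max(income, 1) > 0.45:
        return "risk_agent"
    return "approval_agent"

high_dti = LendingTask("app_1", {"monthly_income": 4000, "monthly_debt": 2200}, [])
no_income = LendingTask("app_2", {}, [])

print(route_task(high_dti))   # risk_agent: 2200 / 4000 = 0.55 > 0.45
print(route_task(no_income))  # document_agent: no income on file
```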
    
  2. Expose a Next.js lending endpoint that forwards work to the agent system.

In Next.js, create an API route that accepts loan applications and forwards them to your Python orchestration service. The Python helpers below sit on the service side and show the two calls that route ultimately triggers: submit the application, then kick off orchestration. The important part is keeping the contract stable: one payload in, one decision object out.

    import os
    import requests
    
    LENDING_API_BASE_URL = os.environ["LENDING_API_BASE_URL"]
    
    def submit_application(payload: dict) -> dict:
        response = requests.post(
            f"{LENDING_API_BASE_URL}/applications",
            json=payload,
            timeout=30,
        )
        response.raise_for_status()
        return response.json()
    
    def forward_to_agents(application_id: str) -> dict:
        response = requests.post(
            f"{LENDING_API_BASE_URL}/agents/orchestrate",
            json={"application_id": application_id},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()
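Before forwarding, it can help to reject malformed payloads at the edge instead of burning a network round trip. This validator is a hypothetical sketch; the required field names are assumptions taken from the sample payload used later in this guide, not a documented contract:

```python
# Hypothetical pre-flight validator -- field names are assumptions taken from
# the sample payload in this guide, not a documented service contract.
REQUIRED_FIELDS = ("application_id", "monthly_income", "monthly_debt", "credit_score")

def validate_application_payload(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload can be forwarded."""
    problems = [f"missing field: {field}" for field in REQUIRED_FIELDS if field not in payload]
    income = payload.get("monthly_income")
    if isinstance(income, (int, float)) and income < 0:
        problems.append("monthly_income must be non-negative")
    return problems

print(validate_application_payload({
    "application_id": "app_1",
    "monthly_income": 5000,
    "monthly_debt": 1200,
}))  # ['missing field: credit_score']
```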
    
  3. Use Vercel AI SDK to orchestrate multi-agent reasoning.

The Vercel AI SDK gives you structured generation and streaming primitives on the Next.js side. In a multi-agent setup, call different models or prompts for each role, then merge their outputs into a final lending decision. The Python orchestration service below mirrors that role-per-agent split, calling the model directly via the OpenAI Responses API.

    import os
    from openai import OpenAI
    
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    
    def run_underwriting_agent(application: dict) -> str:
        prompt = f"""
        You are an underwriting agent.
        Review this lending application and return a concise risk summary.
    
        Application:
        {application}
        """
        result = client.responses.create(
            model="gpt-4.1-mini",
            input=prompt,
        )
        return result.output_text
    
    def run_customer_agent(application: dict) -> str:
        prompt = f"""
        You are a customer communication agent.
        Explain the status of this loan application in plain language.
        
        Application:
        {application}
        """
        result = client.responses.create(
            model="gpt-4.1-mini",
            input=prompt,
        )
        return result.output_text
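The two agents are independent, so they can run concurrently rather than back to back. A minimal sketch using only the standard library; agent callables are passed in, so nothing here depends on a particular SDK, and the stubs below stand in for the real agents:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable

def run_agents_parallel(
    agents: dict[str, Callable[[dict], str]],
    application: dict,
) -> dict[str, str]:
    """Run each agent callable against the same application; collect outputs by name."""
    with ThreadPoolExecutor(max_workers=max(len(agents), 1)) as pool:
        futures = {name: pool.submit(fn, application) for name, fn in agents.items()}
        return {name: future.result() for name, future in futures.items()}

# Stub agents stand in for run_underwriting_agent / run_customer_agent,
# so the sketch runs without API access.
results = run_agents_parallel(
    {
        "underwriting": lambda app: f"risk summary for {app['application_id']}",
        "customer": lambda app: f"status update for {app['application_id']}",
    },
    {"application_id": "app_10021"},
)
print(results["underwriting"])  # risk summary for app_10021
```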
    
  4. Build the multi-agent coordinator that merges all outputs.

    This is where the integration becomes useful. One agent extracts facts, another scores risk, another drafts customer-facing language, and your coordinator decides what gets written back into the lending system.

    from typing import Any
    
    def orchestrate_application(application: dict) -> dict[str, Any]:
        underwriting_summary = run_underwriting_agent(application)
        customer_message = run_customer_agent(application)
    
        monthly_income = application.get("monthly_income", 0)
        monthly_debt = application.get("monthly_debt", 0)
        dti = monthly_debt / max(monthly_income, 1)
    
        if dti > 0.45:
            decision = "manual_review"
        elif application.get("credit_score", 0) >= 700:
            decision = "approved"
        else:
            decision = "pending"
    
        return {
            "decision": decision,
            "underwriting_summary": underwriting_summary,
            "customer_message": customer_message,
            "metrics": {
                "dti": round(dti, 2),
                "credit_score": application.get("credit_score", None),
            },
        }
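One design choice worth noting: the decision logic inside orchestrate_application is deterministic, so it can be pulled out into a pure function and unit-tested without any model calls. A sketch of that refactor, using the same thresholds as above:

```python
# Pure decision policy extracted from orchestrate_application: same thresholds,
# no model calls, so it can be tested deterministically.
def decide(application: dict) -> str:
    monthly_income = application.get("monthly_income", 0)
    monthly_debt = application.get("monthly_debt", 0)
    dti = monthly_debt / max(monthly_income, 1)

    if dti > 0.45:
        return "manual_review"
    if application.get("credit_score", 0) >= 700:
        return "approved"
    return "pending"

print(decide({"monthly_income": 8500, "monthly_debt": 2100, "credit_score": 732}))  # approved
print(decide({"monthly_income": 4000, "monthly_debt": 2000}))                       # manual_review
```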
    
  5. Write results back to Next.js for rendering and workflow continuation.

    Once the orchestration service returns a result, persist it in your lending backend and let Next.js render the status page or trigger server-side updates.

    import os

    import requests

    LENDING_API_BASE_URL = os.environ["LENDING_API_BASE_URL"]

    def persist_decision(application_id: str, result: dict) -> dict:
        response = requests.patch(
            f"{LENDING_API_BASE_URL}/applications/{application_id}",
            json={
                "status": result["decision"],
                "agent_notes": result["underwriting_summary"],
                "customer_message": result["customer_message"],
            },
            timeout=30,
        )
        response.raise_for_status()
        return response.json()
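Write-backs can fail transiently, so a small retry wrapper is often worth adding around the PATCH call. This sketch retries any callable with exponential backoff; it is generic rather than tied to persist_decision, and the attempt count and delays are arbitrary assumptions:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 3, base_delay: float = 0.5) -> T:
    """Call fn, retrying with exponential backoff; re-raise after the final attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("unreachable")  # attempts is always >= 1

# Usage (hypothetical): with_retries(lambda: persist_decision("app_10021", result))
```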
    

Testing the Integration

Run a local smoke test against a sample application payload. This verifies the coordinator can classify the case and produce both internal and customer-facing outputs.

sample_application = {
    "application_id": "app_10021",
    "monthly_income": 8500,
    "monthly_debt": 2100,
    "credit_score": 732,
    "employment_status": "full_time",
}

result = orchestrate_application(sample_application)
print(result)

Expected output:

{
  'decision': 'approved',
  'underwriting_summary': '...',
  'customer_message': '...',
  'metrics': {'dti': 0.25, 'credit_score': 732}
}

Real-World Use Cases

  • Loan prequalification assistant

    • Let one agent collect borrower details in Next.js while another evaluates eligibility and returns instant prequalification feedback.
  • Document review pipeline

    • Use one agent for OCR/extraction of pay stubs and bank statements, another for anomaly detection, and another for summarizing missing items.
  • Adverse action explanation generation

    • When an application is declined or sent to manual review, generate compliant customer explanations from structured underwriting findings instead of writing them by hand.
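A template-driven sketch of the adverse action idea: the explanation is assembled from structured underwriting findings rather than free-form generation, so every stated reason maps back to a recorded factor. The finding codes and wording here are illustrative assumptions, not a compliance-reviewed template:

```python
# Illustrative mapping from underwriting finding codes to borrower-facing
# reasons. Codes and wording are assumptions, not compliance-reviewed text.
REASON_TEXT = {
    "high_dti": "Your monthly debt is high relative to your income.",
    "low_credit_score": "Your credit score is below our current threshold.",
    "unverified_income": "We could not verify your stated income.",
}

def adverse_action_notice(findings: list[str]) -> str:
    """Assemble a plain-language notice from structured finding codes."""
    reasons = [REASON_TEXT[code] for code in findings if code in REASON_TEXT]
    if not reasons:
        return "Your application requires further review."
    bullets = "\n".join(f"- {reason}" for reason in reasons)
    return "Your application was not approved for the following reasons:\n" + bullets

print(adverse_action_notice(["high_dti", "unverified_income"]))
```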

By Cyprian Aarons, AI Consultant at Topiax.