How to Integrate Next.js for fintech with Vercel AI SDK for production AI

By Cyprian Aarons · Updated 2026-04-21

Combining Next.js for fintech with Vercel AI SDK gives you a clean path to ship AI features into regulated product surfaces without turning your frontend into a science project. The practical use case is straightforward: your Next.js app handles the customer experience and policy controls, while Vercel AI SDK orchestrates model calls, streaming, and tool execution for things like KYC triage, transaction summaries, and support copilots.

Prerequisites

  • Python 3.11+ for the agent service
  • Node.js 18+ for the Next.js app
  • A Next.js fintech codebase already set up with your auth, audit logging, and API routes
  • Vercel AI SDK installed in the Next.js app:
    • npm install ai @ai-sdk/openai
  • A backend service in Python that can call your fintech APIs and expose agent tools
  • Access to your model provider API key
  • A local .env file or secret manager configured with:
    • OPENAI_API_KEY
    • FINTECH_API_BASE_URL
    • FINTECH_API_KEY

Integration Steps

  1. Define the contract between Next.js and the Python agent service

    Keep Next.js responsible for request/response handling and let Python handle orchestration. In production, this avoids leaking business logic into UI code.

    from pydantic import BaseModel, Field
    from typing import Literal
    
    class RiskReviewRequest(BaseModel):
        customer_id: str = Field(..., min_length=1)
        action: Literal["approve", "hold", "escalate"]
        reason: str
    
    class RiskReviewResponse(BaseModel):
        status: str
        audit_id: str
        next_step: str
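
A quick way to confirm the contract holds is to exercise the request model directly with pydantic (v2 shown); `refund` here is a deliberately out-of-contract action used for illustration:

```python
from pydantic import BaseModel, Field, ValidationError
from typing import Literal

class RiskReviewRequest(BaseModel):
    customer_id: str = Field(..., min_length=1)
    action: Literal["approve", "hold", "escalate"]
    reason: str

# A well-formed request parses cleanly.
ok = RiskReviewRequest(customer_id="cust_42", action="hold", reason="velocity spike")

# An action outside the Literal contract is rejected before it reaches the agent.
try:
    RiskReviewRequest(customer_id="cust_42", action="refund", reason="n/a")
except ValidationError as exc:
    print(f"rejected: {exc.error_count()} error(s)")
```

Because the same model validates both the Next.js payload and the Python handler, a contract drift fails loudly at the boundary instead of deep inside the agent flow.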
    
  2. Build a Python tool wrapper around your fintech API

    This is the layer your AI agent will call. Use real HTTP calls, timeouts, and structured responses. For fintech systems, don’t pass raw text around when you can pass typed objects.

    import os
    import httpx
    from typing import Any
    
    FINTECH_API_BASE_URL = os.environ["FINTECH_API_BASE_URL"]
    FINTECH_API_KEY = os.environ["FINTECH_API_KEY"]
    
    async def fetch_customer_profile(customer_id: str) -> dict[str, Any]:
        headers = {"Authorization": f"Bearer {FINTECH_API_KEY}"}
        async with httpx.AsyncClient(timeout=10.0) as client:
            resp = await client.get(
                f"{FINTECH_API_BASE_URL}/customers/{customer_id}",
                headers=headers,
            )
            resp.raise_for_status()
            return resp.json()
    
    async def create_audit_event(payload: dict[str, Any]) -> dict[str, Any]:
        headers = {"Authorization": f"Bearer {FINTECH_API_KEY}"}
        async with httpx.AsyncClient(timeout=10.0) as client:
            resp = await client.post(
                f"{FINTECH_API_BASE_URL}/audit-events",
                json=payload,
                headers=headers,
            )
            resp.raise_for_status()
            return resp.json()
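
Fintech APIs rate-limit and occasionally time out, so the wrappers above usually need retries. One minimal sketch is a capped exponential backoff schedule; `backoff_delays` is a hypothetical helper, not part of the service above, and you would sleep for each delay between retry attempts of `fetch_customer_profile`:

```python
def backoff_delays(attempts: int, base: float = 0.5, cap: float = 8.0) -> list[float]:
    """Capped exponential backoff schedule (in seconds) for retrying httpx calls."""
    return [min(cap, base * (2 ** i)) for i in range(attempts)]

# Delays to wait between up to five retries: 0.5s, 1s, 2s, 4s, then capped at 8s.
print(backoff_delays(5))  # [0.5, 1.0, 2.0, 4.0, 8.0]
```

In production you would typically add jitter and retry only on transient failures (timeouts, 429s, 5xx), never on 4xx validation errors.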
    
  3. Expose an AI endpoint in Python that the Next.js app can call

    Your Next.js route will hit this service. If you’re using Vercel AI SDK on the frontend, this backend can still power tool execution and policy checks before returning a final response.

    from fastapi import FastAPI
    from pydantic import BaseModel
    from openai import AsyncOpenAI
    
    app = FastAPI()
    client = AsyncOpenAI()
    
    class ChatRequest(BaseModel):
        customer_id: str
        message: str
    
    @app.post("/agent/chat")
    async def chat(req: ChatRequest):
        profile = await fetch_customer_profile(req.customer_id)
    
        prompt = f"""
        You are a fintech assistant.
        Customer profile: {profile}
        User message: {req.message}
        Return a concise answer and mention if escalation is needed.
        """
    
        # Use the async client so the model call doesn't block the event loop.
        completion = await client.responses.create(
            model="gpt-4.1-mini",
            input=prompt,
        )
    
        audit = await create_audit_event({
            "customer_id": req.customer_id,
            "message": req.message,
            "model_output": completion.output_text,
        })
    
        return {
            "answer": completion.output_text,
            "audit_id": audit["id"],
        }
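
Note that the endpoint above interpolates the raw customer profile into the prompt. For regulated data you generally want to mask identifiers before they reach the model; `redact_profile` below is a hypothetical sketch that masks long digit runs (card or account numbers), not a complete PII solution:

```python
import re
from typing import Any

# Long digit runs are a rough proxy for card/account numbers; tune per your data.
ACCOUNT_RE = re.compile(r"\b\d{8,}\b")

def redact_profile(profile: dict[str, Any]) -> dict[str, Any]:
    """Mask long numeric identifiers in string fields before prompt interpolation."""
    redacted: dict[str, Any] = {}
    for key, value in profile.items():
        if isinstance(value, str):
            redacted[key] = ACCOUNT_RE.sub("[REDACTED]", value)
        else:
            redacted[key] = value
    return redacted

print(redact_profile({"name": "Ada", "note": "card 4111111111111111"}))
```

You would call `redact_profile(profile)` before building the prompt, and keep the unredacted profile only in the audit trail where access is controlled.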
    
  4. Call the Python agent from a Next.js route using Vercel AI SDK

    This is where Vercel AI SDK earns its keep in production UI flows. On the Next.js side, use streamText from the AI SDK to stream responses to the client and keep the route thin; the snippet below documents the backend contract that route consumes.

    # This snippet shows the backend contract that your Next.js route will consume.
    # In the Next.js app, you'd typically call this endpoint from an API route or server action.
    
    import requests
    
    def call_agent_service(customer_id: str, message: str) -> dict:
        response = requests.post(
            "http://localhost:8000/agent/chat",
            json={"customer_id": customer_id, "message": message},
            timeout=15,
        )
        response.raise_for_status()
        return response.json()
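
In production the route should never surface a raw stack trace to the customer. A hedged variant of the call above, with a parameterized base URL, maps transport failures into a stable payload the UI can render:

```python
import requests

def safe_call_agent(
    customer_id: str,
    message: str,
    base_url: str = "http://localhost:8000",
    timeout: float = 15.0,
) -> dict:
    """Call the agent service; always return a payload the frontend can render."""
    try:
        resp = requests.post(
            f"{base_url}/agent/chat",
            json={"customer_id": customer_id, "message": message},
            timeout=timeout,
        )
        resp.raise_for_status()
        return {"ok": True, **resp.json()}
    except requests.RequestException as exc:
        # Degrade gracefully instead of leaking an exception to the UI.
        return {"ok": False, "error": type(exc).__name__}
```

The Next.js route can then branch on `ok` and show a friendly fallback message when the agent service is down.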
    
  5. Wire tool-like behavior into the agent flow

    In production fintech workflows, you usually need deterministic actions after model output. Use structured parsing to decide whether to escalate, hold a transaction, or create a case.

    from pydantic import BaseModel
    
    class AgentDecision(BaseModel):
        answer: str
        escalate: bool = False
        risk_level: str | None = None
    
    def parse_decision(raw_text: str) -> AgentDecision:
        # Replace this with strict JSON output parsing in production.
        return AgentDecision(answer=raw_text)
    
    async def handle_case(customer_id: str, message: str) -> dict:
        result = await chat(ChatRequest(customer_id=customer_id, message=message))
        decision = parse_decision(result["answer"])
    
        if decision.escalate:
            await create_audit_event({
                "customer_id": customer_id,
                "event_type": "escalation",
                "reason": decision.risk_level or "unknown",
            })
    
        return {
            "answer": decision.answer,
            "escalate": decision.escalate,
        }
    

Testing the Integration

Use a direct Python call against your agent service first. That validates network access, auth headers, and response shape before you touch UI streaming.

import requests

resp = requests.post(
    "http://localhost:8000/agent/chat",
    json={
        "customer_id": "cust_12345",
        "message": "Summarize my last three card transactions and flag anything unusual.",
    },
    timeout=20,
)

print(resp.status_code)
print(resp.json())

Expected output:

200
{
  "answer": "Your last three card transactions were ... No unusual activity detected.",
  "audit_id": "aud_9f2c1b"
}

Real-World Use Cases

  • Fraud review copilots
    Let analysts ask natural-language questions about suspicious activity while your Python service pulls transaction history and writes immutable audit events.

  • Customer support assistants
    Embed an AI helper in Next.js that explains failed payments, chargebacks, or account holds using policy-aware responses from your backend.

  • Ops automation for compliance teams
    Route KYC exceptions, sanctions hits, or document review tasks through an agent that can summarize cases and trigger downstream workflows.


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.
