How to Integrate Next.js for insurance with Vercel AI SDK for AI agents

By Cyprian Aarons · Updated 2026-04-21
Tags: next-js-for-insurance · vercel-ai-sdk · ai-agents · nextjs-for-insurance

Overview

If you’re building insurance workflows with AI agents, the useful pattern is not “chatbot on top of a policy PDF.” It’s an agent that can read policy data from your Next.js insurance app, reason over it with Vercel AI SDK, and then take controlled actions like quoting, triage, or document collection.

This integration gives you a clean split: Next.js for insurance handles the product surface and business workflow, while Vercel AI SDK handles model orchestration, tool calling, and streaming responses.

Prerequisites

  • Node.js 18+ and Python 3.10+
  • A Next.js insurance app with:
    • API routes or server actions
    • authentication in place
    • access to policy/claims/customer records
  • A Vercel project with AI SDK installed
  • An OpenAI-compatible model provider configured for Vercel AI SDK
  • Python packages:
    • requests
    • pydantic
    • python-dotenv
  • Environment variables ready:
    • NEXT_PUBLIC_APP_URL
    • NEXT_API_TOKEN
    • OPENAI_API_KEY or your model provider key

Integration Steps

  1. Set up the Next.js insurance API surface

Your Next.js app should expose narrow endpoints for policy lookup, claim status, and document intake. Keep the endpoints deterministic; let the agent decide when to call them, but keep the business logic in the app.

import os

import requests
from dotenv import load_dotenv
from pydantic import BaseModel

# Pull NEXT_PUBLIC_APP_URL and NEXT_API_TOKEN from .env (see Prerequisites).
load_dotenv()
BASE_URL = os.environ["NEXT_PUBLIC_APP_URL"]
TOKEN = os.environ["NEXT_API_TOKEN"]

class PolicyLookup(BaseModel):
    policy_number: str

def get_policy(policy_number: str) -> dict:
    # Validate the input before it ever reaches the Next.js API surface.
    lookup = PolicyLookup(policy_number=policy_number)
    resp = requests.get(
        f"{BASE_URL}/api/policies/{lookup.policy_number}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

policy = get_policy("POL-104882")
print(policy)

In practice, this endpoint is backed by your Next.js route handler:

// app/api/policies/[policyNumber]/route.ts
export async function GET(
  request: Request,
  { params }: { params: { policyNumber: string } }
) {
  // fetch from DB or internal service
  return Response.json({
    policyNumber: params.policyNumber,
    status: "active",
    premium: 124.5,
    coverage: ["collision", "comprehensive"]
  })
}

  2. Expose a claims action that the agent can call safely

For insurance workflows, agents should not directly mutate core systems. Put write operations behind a single-purpose endpoint with validation and audit logging.

import requests
from pydantic import BaseModel, Field

class ClaimIntake(BaseModel):
    policy_number: str = Field(..., min_length=5)
    incident_type: str
    summary: str
    severity: str

def create_claim(payload: ClaimIntake) -> dict:
    resp = requests.post(
        f"{BASE_URL}/api/claims",
        json=payload.model_dump(),
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

claim = create_claim(
    ClaimIntake(
        policy_number="POL-104882",
        incident_type="auto accident",
        summary="Rear-end collision at low speed",
        severity="medium",
    )
)
print(claim)

Your Next.js route should validate input before touching downstream systems.
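The exact checks live in your TypeScript route handler, but the rules are easy to sketch. Here is a minimal stdlib-only version of that validation, assuming an illustrative three-value severity taxonomy (low/medium/high) and the same five-character policy-number floor as the ClaimIntake model above:

```python
ALLOWED_SEVERITIES = {"low", "medium", "high"}  # assumed taxonomy

def validate_claim_payload(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is safe."""
    errors = []
    if len(payload.get("policy_number", "")) < 5:
        errors.append("policy_number must be at least 5 characters")
    if not payload.get("incident_type"):
        errors.append("incident_type is required")
    if not payload.get("summary"):
        errors.append("summary is required")
    if payload.get("severity") not in ALLOWED_SEVERITIES:
        errors.append("severity must be one of low/medium/high")
    return errors

good = {
    "policy_number": "POL-104882",
    "incident_type": "auto accident",
    "summary": "Rear-end collision at low speed",
    "severity": "medium",
}
bad = {"policy_number": "POL", "severity": "catastrophic"}

print(validate_claim_payload(good))  # []
print(validate_claim_payload(bad))
```

In the route itself, a non-empty error list should produce a 400 response with the errors attached, and a valid payload should be written to the audit log before any downstream call.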

  3. Wire Vercel AI SDK into an agent endpoint

Vercel AI SDK gives you tool calling and streaming. The common pattern is to host the agent in Next.js, then let it call your insurance APIs as tools.

import os

from openai import OpenAI

# Reads OPENAI_API_KEY from the environment (see Prerequisites).
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def agent_reply(user_message: str) -> str:
    response = client.responses.create(
        model="gpt-4.1-mini",
        input=user_message,
        tools=[
            {
                "type": "function",
                "name": "get_policy",
                "description": "Fetch policy details from the insurance system",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "policy_number": {"type": "string"}
                    },
                    "required": ["policy_number"],
                },
            }
        ],
    )
    # If the model chooses to call get_policy, you must execute the tool and
    # send the result back in a follow-up request; output_text is only
    # populated once the model produces a final answer.
    return response.output_text

print(agent_reply("Check policy POL-104882 and summarize coverage"))

If you’re using Vercel AI SDK in a Next.js route, the equivalent server-side pattern is streamText() plus tool() definitions. The Python side here is useful when your orchestration service sits outside the web app.
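Whichever side hosts the agent, the snippet above only declares the get_policy tool; something still has to execute it when the model asks. A minimal dispatch sketch, where the stubbed get_policy and the JSON-arguments shape are assumptions standing in for the real HTTP helper and your provider's function-call payload:

```python
import json

# Stub standing in for the HTTP helper defined in step 1.
def get_policy(policy_number: str) -> dict:
    return {"policyNumber": policy_number, "status": "active"}

# Map tool names the model may request to local implementations.
TOOL_REGISTRY = {"get_policy": get_policy}

def dispatch_tool_call(name: str, arguments_json: str) -> str:
    """Execute one model-requested tool call and return a JSON string result."""
    if name not in TOOL_REGISTRY:
        return json.dumps({"error": f"unknown tool: {name}"})
    args = json.loads(arguments_json)
    return json.dumps(TOOL_REGISTRY[name](**args))

# Simulate the function-call payload a model response might carry.
result = dispatch_tool_call("get_policy", '{"policy_number": "POL-104882"}')
print(result)
```

In a full agent loop, the result string goes back to the model as a tool output, and the loop repeats until the model returns a final text answer.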

  4. Build an orchestration layer in Python for insurance-specific routing

Use Python for pre-processing, policy normalization, PII redaction, and routing to the right tool. This keeps the model from seeing more than it needs.

import re
import requests

def redact_pii(text: str) -> str:
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN_REDACTED]", text)
    text = re.sub(r"\b\d{16}\b", "[CARD_REDACTED]", text)
    return text

def route_insurance_request(message: str) -> dict:
    clean_message = redact_pii(message)

    resp = requests.post(
        f"{BASE_URL}/api/agent/route",
        json={"message": clean_message},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

result = route_insurance_request("Need claim help for POL-104882")
print(result)

On the Next.js side, your /api/agent/route endpoint can decide whether to call quote lookup, claims intake, or document upload based on the message intent.
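As a sketch of what that routing decision can look like before you hand it to a model (the keyword lists and intent names here are illustrative assumptions, not the endpoint's real contract):

```python
def classify_intent(message: str) -> str:
    """Naive keyword routing; a production router would typically ask the model itself."""
    text = message.lower()
    if "claim" in text:
        return "claims_intake"
    if "quote" in text or "premium" in text:
        return "quote_lookup"
    if "upload" in text or "document" in text:
        return "document_upload"
    return "general_qa"

print(classify_intent("Need claim help for POL-104882"))  # claims_intake
print(classify_intent("How much would a quote be?"))      # quote_lookup
```

Even when a model makes the final call, a cheap deterministic pre-router like this is useful as a guardrail and as a fallback when the model is unavailable.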

  5. Connect streaming responses back into the insurance UI

The last step is user experience. Stream the agent response into your Next.js interface so adjusters or customers see progress while tools execute.

import requests

def stream_agent_response(prompt: str):
    with requests.post(
        f"{BASE_URL}/api/agent/chat",
        json={"prompt": prompt},
        headers={"Authorization": f"Bearer {TOKEN}"},
        stream=True,
        timeout=30,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines(decode_unicode=True):
            if line:
                print(line)

stream_agent_response("Summarize this customer’s claim status and next steps.")

In a production setup, that endpoint typically wraps Vercel AI SDK streaming primitives such as streamText() so tokens flow back immediately instead of waiting for full completion.
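On the Python side, the raw lines from iter_lines still need to be assembled into display text. A parsing sketch, assuming an SSE-style stream where each event is a `data:` line carrying a `delta` field — the exact wire format depends on which stream protocol your endpoint exposes, so treat the field names as assumptions:

```python
import json

def parse_sse_lines(lines) -> str:
    """Collect text deltas from a stream of 'data: {...}' SSE lines."""
    chunks = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":  # common end-of-stream sentinel
            break
        event = json.loads(payload)
        if "delta" in event:
            chunks.append(event["delta"])
    return "".join(chunks)

sample = [
    'data: {"delta": "Your claim "}',
    'data: {"delta": "is in review."}',
    "data: [DONE]",
]
print(parse_sse_lines(sample))  # Your claim is in review.
```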

Testing the Integration

Use one end-to-end check that touches both systems: fetch a policy through your Next.js API and ask the agent to summarize it.

def test_end_to_end():
    policy = get_policy("POL-104882")
    assert policy["status"] == "active"

    prompt = f"""
You are an insurance assistant.
Policy number: {policy['policyNumber']}
Coverage: {', '.join(policy['coverage'])}
Premium: {policy['premium']}
Summarize this for a customer.
"""
    reply = agent_reply(prompt)
    print(reply)

test_end_to_end()

Expected output (the model's exact wording will vary):

This policy is active and includes collision and comprehensive coverage.
The current premium is $124.50.

If this fails, check these first:

  • auth header on the Next.js API routes
  • model provider key for Vercel AI SDK
  • request timeouts between orchestration and app endpoints
  • schema mismatches between tool inputs and API payloads
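The last bullet is the easiest one to automate. A quick sketch that diffs a function-tool JSON schema against the fields your API accepts (both example schemas here are assumptions for illustration):

```python
def schema_mismatches(tool_schema: dict, api_fields: set) -> dict:
    """Compare a function-tool JSON schema against the API's accepted fields."""
    tool_fields = set(tool_schema.get("properties", {}))
    return {
        "missing_from_api": sorted(tool_fields - api_fields),  # tool sends, API rejects
        "unused_by_tool": sorted(api_fields - tool_fields),    # API accepts, tool never sends
    }

tool_schema = {
    "type": "object",
    "properties": {"policy_number": {"type": "string"}, "severity": {"type": "string"}},
}
api_fields = {"policy_number", "incident_type", "summary", "severity"}
print(schema_mismatches(tool_schema, api_fields))
```

Running a check like this in CI whenever either the tool definitions or the API payloads change catches drift before the agent does.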

Real-World Use Cases

  • Claims triage assistant

    • The agent reads incoming claim details from your Next.js app.
    • It classifies severity with Vercel AI SDK and routes low-risk cases automatically.
  • Policy Q&A copilot

    • Customers ask questions like “Am I covered for rental cars?”
    • The agent pulls exact coverage data from your insurance backend instead of guessing.
  • Document intake workflow

    • Uploads from FNOL forms or repair estimates land in Next.js.
    • The agent extracts fields, validates missing data, and asks for only what’s needed next.
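The "asks for only what's needed next" step in that document intake workflow reduces to a set difference over the extracted fields. A sketch, assuming a hypothetical required-field list for FNOL submissions:

```python
# Assumed required fields for a first notice of loss (FNOL) submission.
REQUIRED_FNOL_FIELDS = {"policy_number", "incident_date", "incident_type", "description"}

def missing_fields(extracted: dict) -> list[str]:
    """Return which required FNOL fields the agent still needs to ask for."""
    present = {k for k, v in extracted.items() if v not in (None, "")}
    return sorted(REQUIRED_FNOL_FIELDS - present)

extracted = {
    "policy_number": "POL-104882",
    "incident_type": "auto accident",
    "description": "",  # extracted but empty, so still missing
}
print(missing_fields(extracted))  # ['description', 'incident_date']
```

The agent can then turn that list into one targeted follow-up question instead of re-prompting the customer for the whole form.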

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.

