How to Integrate Next.js for insurance with Vercel AI SDK for startups
Why this integration matters
If you’re building an insurance AI agent, you need two things working together: a system that understands policies, claims, and customer context, and a model layer that can reason over that data. Next.js for insurance gives you the app surface for quoting, claims intake, and agent workflows; Vercel AI SDK gives you the orchestration layer for streaming responses, tool calls, and structured outputs.
The useful pattern is simple: Next.js handles the insurance workflow UI and API routes, while Vercel AI SDK drives the assistant logic that can summarize a claim, extract fields from documents, or route a case to the right handler.
Prerequisites
- Python 3.11+
- Node.js 18+ installed locally
- A Next.js app already created for your insurance product
- A Vercel project connected to that app
- @vercel/ai installed in your frontend/backend workspace
- requests installed in your Python environment
- API keys configured:
  - VERCEL_AI_GATEWAY_API_KEY or your provider key
  - Any insurance backend API key if your Next.js app exposes protected endpoints
- A running Next.js API route for insurance operations like:
  - quote creation
  - claim submission
  - policy lookup
Integration Steps
1) Define the insurance workflow contract
Start by making the insurance side explicit. Your Next.js app should expose stable endpoints for quote lookup, claim intake, and policy retrieval.
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InsuranceRequest:
    customer_id: str
    policy_id: Optional[str] = None
    claim_id: Optional[str] = None
    intent: str = "claim_status"
```
Keep this contract boring. The AI layer should not guess what a “claim” means; it should pass a typed payload into your Next.js API.
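To make the contract concrete, here is one way to serialize that dataclass into the JSON payload your Next.js route would receive. to_payload is a hypothetical helper, and dropping unset optional fields is a design choice, not a requirement:

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class InsuranceRequest:
    customer_id: str
    policy_id: Optional[str] = None
    claim_id: Optional[str] = None
    intent: str = "claim_status"

def to_payload(req: InsuranceRequest) -> str:
    # Drop unset optional fields so the wire format stays minimal.
    data = {k: v for k, v in asdict(req).items() if v is not None}
    return json.dumps(data)

payload = to_payload(InsuranceRequest(customer_id="CUS-001", claim_id="CLM-88321"))
print(payload)
# {"customer_id": "CUS-001", "claim_id": "CLM-88321", "intent": "claim_status"}
```

Because the payload is built from a typed object, a missing or misspelled field fails in Python before it ever reaches your insurance API.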
2) Call your Next.js insurance API from Python
Use Python as the integration glue if you’re orchestrating agents outside the web app. In production, this is often a worker service or backend-for-frontend talking to your Next.js routes.
```python
import os
import requests

NEXTJS_BASE_URL = os.getenv("NEXTJS_BASE_URL", "http://localhost:3000")
NEXTJS_API_KEY = os.getenv("NEXTJS_API_KEY", "")

def fetch_policy(policy_id: str) -> dict:
    url = f"{NEXTJS_BASE_URL}/api/insurance/policies/{policy_id}"
    headers = {
        "Authorization": f"Bearer {NEXTJS_API_KEY}",
        "Content-Type": "application/json",
    }
    response = requests.get(url, headers=headers, timeout=20)
    response.raise_for_status()
    return response.json()

policy = fetch_policy("POL-10482")
print(policy["status"])
```
This is the first boundary: Next.js owns the source of truth for product data. The agent never reads directly from random databases.
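Because the agent treats this response as ground truth, it is worth validating the shape before handing it to the model. A minimal sketch; the required field names are assumptions based on the expected test output later in this guide:

```python
# Fields the agent relies on downstream (assumed contract, adjust to your API).
REQUIRED_POLICY_FIELDS = {"policy_number", "status"}

def validate_policy(policy: dict) -> dict:
    # Fail fast if the Next.js response is missing fields the agent needs.
    missing = REQUIRED_POLICY_FIELDS - policy.keys()
    if missing:
        raise ValueError(f"Policy response missing fields: {sorted(missing)}")
    return policy

print(validate_policy({"policy_number": "POL-10482", "status": "active"})["status"])
# active
```

Failing loudly here is cheaper than letting the model reason over a half-formed record.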
3) Use Vercel AI SDK to generate structured agent output
On the agent side, use Vercel AI SDK’s generateText with a structured prompt so the model returns a usable decision object. In a real setup, this runs in your Node route or server action; here’s the pattern you want to mirror.
```python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def classify_insurance_intent(message: str) -> dict:
    prompt = f"""
You are an insurance operations assistant.
Classify the user message into one of:
- claim_status
- quote_request
- policy_lookup
- document_upload
Return JSON with keys: intent, confidence, reason.
Message: {message}
"""
    result = client.responses.create(
        model="gpt-4.1-mini",
        input=prompt,
    )
    return {"text": result.output_text}

print(classify_insurance_intent("I need to check my home claim status"))
```
If you are using Vercel AI SDK in Node, this maps directly to generateText() or streamText() with tool calling. The important part is that your Python orchestration expects structured output, not free-form chat text.
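The classifier above returns raw model text, so the orchestration side needs a defensive parse before acting on it. A minimal sketch, assuming the model was prompted (as in step 3) to return JSON with intent, confidence, and reason:

```python
import json

# The same intents the prompt in step 3 allows.
VALID_INTENTS = {"claim_status", "quote_request", "policy_lookup", "document_upload"}

def parse_classification(raw_text: str) -> dict:
    # Fall back to a safe default if the model returns malformed JSON,
    # and never accept an intent outside the approved set.
    try:
        data = json.loads(raw_text)
    except json.JSONDecodeError:
        return {"intent": "unknown", "confidence": 0.0, "reason": "unparseable output"}
    if data.get("intent") not in VALID_INTENTS:
        data["intent"] = "unknown"
    return data

print(parse_classification('{"intent": "claim_status", "confidence": 0.94, "reason": "asked about a claim"}'))
# {'intent': 'claim_status', 'confidence': 0.94, 'reason': 'asked about a claim'}
```

The "unknown" fallback gives your workflow a single branch for every malformed or off-script answer instead of scattering error handling across callers.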
4) Bridge tool calls back into your Next.js insurance endpoints
This is where the agent becomes useful. The model decides what to do; Python executes the action against your Next.js API.
```python
import json

def handle_agent_action(action: dict) -> dict:
    intent = action.get("intent")
    if intent == "policy_lookup":
        return fetch_policy(action["policy_id"])
    if intent == "claim_status":
        url = f"{NEXTJS_BASE_URL}/api/insurance/claims/{action['claim_id']}"
        headers = {"Authorization": f"Bearer {NEXTJS_API_KEY}"}
        resp = requests.get(url, headers=headers, timeout=20)
        resp.raise_for_status()
        return resp.json()
    raise ValueError(f"Unsupported intent: {intent}")

agent_action = {
    "intent": "claim_status",
    "claim_id": "CLM-88321",
}

print(json.dumps(handle_agent_action(agent_action), indent=2))
```
For startup teams, this keeps blast radius low. Your AI layer can only call approved endpoints exposed by Next.js.
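One way to make that blast radius explicit is an allow-list mapping approved intents to route templates, so any new action has to be registered before the agent can call it. A sketch using the route shapes from the steps above; the registry itself is an assumption, not part of either framework:

```python
# Map each approved intent to the only Next.js route it may call.
APPROVED_ROUTES = {
    "policy_lookup": "/api/insurance/policies/{policy_id}",
    "claim_status": "/api/insurance/claims/{claim_id}",
}

def resolve_route(action: dict) -> str:
    # Reject anything unregistered before a request is ever built.
    intent = action.get("intent")
    template = APPROVED_ROUTES.get(intent)
    if template is None:
        raise PermissionError(f"Intent not approved: {intent}")
    return template.format(**action)

print(resolve_route({"intent": "claim_status", "claim_id": "CLM-88321"}))
# /api/insurance/claims/CLM-88321
```

Adding a new agent capability then becomes a one-line, reviewable change to the registry rather than a new branch of request-building code.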
5) Stream results back into your UI path
Vercel AI SDK is strongest when you stream responses into the UI. Your Python service can prepare the payload, then your Next.js frontend consumes it via an API route or server action.
```python
def build_agent_response(user_message: str) -> dict:
    classification = classify_insurance_intent(user_message)
    # In production, parse JSON from the model output here.
    return {
        "message": user_message,
        "classification": classification,
        "next_action": "fetch_claim_or_policy",
    }

payload = build_agent_response("Show me my auto claim update")
print(payload)
```
Your Next.js layer then takes this payload and uses streamText() on the server side to render partial results while tools execute. That gives users fast feedback instead of waiting on a full round trip.
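The streaming itself happens in the Node layer via streamText(), but you can prototype the incremental hand-off on the Python side with a simple generator before wiring up the real stream. A sketch; the chunk size and text are purely illustrative:

```python
from typing import Iterator

def stream_tokens(text: str, chunk_size: int = 8) -> Iterator[str]:
    # Yield the response in small chunks, mimicking how streamText()
    # delivers partial output to the client.
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

partial = ""
for token in stream_tokens("Your auto claim CLM-88321 is under review."):
    partial += token  # in the real app, push each chunk to the UI instead
print(partial)
```

Keeping the consumer loop chunk-oriented from day one means swapping the mock generator for the real stream later changes no downstream code.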
Testing the Integration
Use a single smoke test that hits both sides: classify intent with the model, then fetch a policy record from Next.js.
```python
def test_end_to_end():
    user_message = "I want an update on policy POL-10482"
    classification = classify_insurance_intent(user_message)
    print("classification:", classification)

    policy = fetch_policy("POL-10482")
    print("policy_number:", policy["policy_number"])
    print("status:", policy["status"])

test_end_to_end()
```
Expected output:
classification: {'text': '{"intent":"policy_lookup","confidence":0.94,"reason":"User asked about a policy update"}'}
policy_number: POL-10482
status: active
If classification works but policy lookup fails, your issue is usually auth or route shape. If both fail, check environment variables and whether your Next.js route is deployed with the expected path.
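A quick preflight helper catches the environment-variable failure mode before any request is made. The variable names match the ones used earlier in this guide (OPENAI_API_KEY comes from the classification example):

```python
import os

REQUIRED_ENV = ["NEXTJS_BASE_URL", "NEXTJS_API_KEY", "OPENAI_API_KEY"]

def preflight(env: dict) -> list:
    # Return the names of required variables that are missing or empty.
    return [name for name in REQUIRED_ENV if not env.get(name)]

missing = preflight(dict(os.environ))
if missing:
    print("Missing env vars:", ", ".join(missing))
```

Run this at service start-up so a misconfigured deploy fails immediately instead of mid-conversation.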
Real-World Use Cases
- Claims triage assistant
  - Classifies incoming messages
  - Pulls claim status from Next.js APIs
  - Drafts next-step instructions for adjusters
- Quote intake copilot
  - Extracts applicant details from chat or uploaded forms
  - Calls quote endpoints in your insurance app
  - Returns structured quote summaries to sales teams
- Policy servicing bot
  - Handles address changes, beneficiary updates, and coverage questions
  - Uses Vercel AI SDK tool calls for routing decisions
  - Writes only approved actions back through Next.js APIs
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit