How to Integrate Next.js for payments with Vercel AI SDK for production AI

By Cyprian Aarons · Updated 2026-04-21

Combining Next.js for payments with Vercel AI SDK gives you a clean path to monetize AI workflows without bolting billing on as an afterthought. The pattern is simple: use Next.js as the payment and app shell, then route paid users into Vercel AI SDK-backed agent flows once entitlement checks pass.

For production AI systems, this matters because you need three things working together: payment state, user identity, and model execution. If those are not wired tightly, you end up with free-riders, broken subscriptions, or agents that keep generating after a plan expires.

Prerequisites

  • Node.js 18+ and Python 3.11+
  • A Next.js app with App Router enabled
  • Stripe account configured for payments
  • Vercel project connected to your repository
  • Vercel AI SDK installed in your Next.js app:
    • ai
    • @ai-sdk/openai or another provider package
  • Python environment for integration scripts and verification
  • Environment variables set:
    • STRIPE_SECRET_KEY
    • STRIPE_WEBHOOK_SECRET
    • NEXT_PUBLIC_APP_URL
    • OPENAI_API_KEY or equivalent model key
    • VERCEL_PROJECT_ID
    • VERCEL_TOKEN
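
A quick way to catch misconfiguration early is a startup check over that list. A minimal sketch in Python (the helper name is ours, not part of any SDK):

```python
import os

REQUIRED_ENV_VARS = [
    "STRIPE_SECRET_KEY",
    "STRIPE_WEBHOOK_SECRET",
    "NEXT_PUBLIC_APP_URL",
    "OPENAI_API_KEY",
    "VERCEL_PROJECT_ID",
    "VERCEL_TOKEN",
]

def missing_env_vars(env: dict) -> list:
    # Return required variables that are unset or empty, in declaration order.
    return [name for name in REQUIRED_ENV_VARS if not env.get(name)]

# At startup: raise if missing_env_vars(dict(os.environ)) is non-empty.
print(missing_env_vars({"STRIPE_SECRET_KEY": "sk_test_123"}))
```

Failing fast here is cheaper than debugging a 401 from Stripe or OpenAI mid-request.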

Integration Steps

  1. Create the payment session in Next.js

Use a server action or route handler to create a Stripe Checkout Session. This is the entry point for paid access.

import os
import requests

STRIPE_SECRET_KEY = os.environ["STRIPE_SECRET_KEY"]
APP_URL = os.environ["NEXT_PUBLIC_APP_URL"]

def create_checkout_session(user_id: str, price_id: str) -> str:
    url = "https://api.stripe.com/v1/checkout/sessions"
    headers = {
        "Authorization": f"Bearer {STRIPE_SECRET_KEY}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    data = {
        "mode": "subscription",
        "success_url": f"{APP_URL}/billing/success?session_id={{CHECKOUT_SESSION_ID}}",
        "cancel_url": f"{APP_URL}/billing/cancel",
        "line_items[0][price]": price_id,
        "line_items[0][quantity]": 1,
        "client_reference_id": user_id,
    }

    resp = requests.post(url, headers=headers, data=data, timeout=30)
    resp.raise_for_status()
    return resp.json()["url"]

checkout_url = create_checkout_session("user_123", "price_abc123")
print(checkout_url)

  2. Verify payment webhooks and persist entitlement

Do not trust the redirect back from Stripe. Use webhooks to mark the user as paid, then store entitlement in your database.

import os
import json
import hmac
import hashlib
from fastapi import FastAPI, Request, HTTPException

app = FastAPI()
WEBHOOK_SECRET = os.environ["STRIPE_WEBHOOK_SECRET"]

def verify_stripe_signature(payload: bytes, sig_header: str) -> bool:
    # Minimal verification pattern; use stripe.Webhook.construct_event in production.
    # The header has the form "t=<timestamp>,v1=<signature>", and Stripe signs
    # "<timestamp>.<payload>" with the webhook secret.
    parts = dict(
        item.split("=", 1) for item in sig_header.split(",") if "=" in item
    )
    signed_payload = parts.get("t", "").encode() + b"." + payload
    expected = hmac.new(
        WEBHOOK_SECRET.encode(),
        signed_payload,
        hashlib.sha256,
    ).hexdigest()
    return hmac.compare_digest(expected, parts.get("v1", ""))

@app.post("/webhooks/stripe")
async def stripe_webhook(request: Request):
    payload = await request.body()
    signature = request.headers.get("stripe-signature", "")

    if not verify_stripe_signature(payload, signature):
        raise HTTPException(status_code=400, detail="Invalid signature")

    event = json.loads(payload.decode())

    if event["type"] == "checkout.session.completed":
        session = event["data"]["object"]
        user_id = session.get("client_reference_id")

        # Persist entitlement in your DB here.
        # Example: db.users.update_one({"id": user_id}, {"$set": {"paid": True}})
        print(f"Entitlement granted for {user_id}")

    return {"ok": True}
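
Granting access is only half the loop: the expired-plan case from the intro means you also need to revoke entitlement when Stripe reports a cancellation via customer.subscription.deleted. A sketch of the event dispatch, with an in-memory map standing in for your database (mapping the Stripe customer ID back to your user ID is left as an assumption):

```python
def apply_stripe_event(event: dict, entitlements: dict) -> dict:
    """Update an entitlement map from a parsed Stripe event.

    `entitlements` maps user/customer IDs to a paid flag; swap in your DB writes.
    """
    obj = event["data"]["object"]
    if event["type"] == "checkout.session.completed":
        # Grant access: client_reference_id is the user ID we set at checkout.
        entitlements[obj["client_reference_id"]] = True
    elif event["type"] == "customer.subscription.deleted":
        # Revoke access: `customer` is the Stripe customer ID; in a real system,
        # look up the matching user ID before writing.
        entitlements[obj["customer"]] = False
    return entitlements

state = {}
apply_stripe_event(
    {"type": "checkout.session.completed",
     "data": {"object": {"client_reference_id": "user_123"}}},
    state,
)
apply_stripe_event(
    {"type": "customer.subscription.deleted",
     "data": {"object": {"customer": "user_123"}}},
    state,
)
print(state)
```

Wiring this into the webhook handler above keeps grant and revoke logic in one place.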

  3. Expose an AI endpoint using Vercel AI SDK

Your paid users should hit a protected route that streams model output through the Vercel AI SDK; in Next.js, this is typically a route handler built on streamText. The Python snippet below exercises the equivalent model call from a verification script.

import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def generate_agent_reply(user_message: str) -> str:
    response = client.responses.create(
        model="gpt-4o-mini",
        input=[
            {
                "role": "system",
                "content": "You are a compliant banking assistant. Keep responses concise and factual.",
            },
            {
                "role": "user",
                "content": user_message,
            },
        ],
    )
    return response.output_text

print(generate_agent_reply("Explain my premium account benefits"))

In your Next.js app, the equivalent production route uses Vercel AI SDK’s streamText from ai and your provider package.


  4. Gate model access by payment status

Before calling the model, check whether the user has an active entitlement. This is where billing and AI meet.

from dataclasses import dataclass

@dataclass
class UserEntitlement:
    user_id: str
    active: bool

def get_entitlement(user_id: str) -> UserEntitlement:
    # Replace with DB lookup.
    return UserEntitlement(user_id=user_id, active=True)

def handle_paid_ai_request(user_id: str, prompt: str) -> str:
    entitlement = get_entitlement(user_id)
    if not entitlement.active:
        return "Payment required"

    return generate_agent_reply(prompt)

result = handle_paid_ai_request("user_123", "Summarize my policy coverage")
print(result)

  5. Sync usage and billing events back into your system

For production AI, track token usage or request counts so you can enforce plan limits later. You can push usage events into your own ledger or analytics store.

import os
import time
import requests

APP_URL = os.environ["NEXT_PUBLIC_APP_URL"]

def record_usage(user_id: str, model: str, input_tokens: int, output_tokens: int):
    payload = {
        "user_id": user_id,
        "model": model,
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
        "timestamp": int(time.time()),
    }

    # Replace with your internal usage API.
    resp = requests.post(
        f"{APP_URL}/api/usage",
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()

record_usage("user_123", "gpt-4o-mini", 120, 340)
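
Once usage events land in your ledger, you can enforce plan limits before each model call. A minimal sketch assuming a monthly token quota per plan (the plan names, caps, and helper are illustrative, not part of Stripe or the Vercel AI SDK):

```python
# Illustrative monthly token caps per plan tier.
PLAN_TOKEN_LIMITS = {"starter": 50_000, "pro": 500_000}

def within_quota(plan: str, tokens_used_this_month: int, requested_tokens: int) -> bool:
    # Deny when this request would push the month's total past the plan cap;
    # unknown plans get no quota at all.
    limit = PLAN_TOKEN_LIMITS.get(plan, 0)
    return tokens_used_this_month + requested_tokens <= limit

print(within_quota("starter", 49_000, 500))    # within the cap
print(within_quota("starter", 49_000, 2_000))  # would exceed the cap
```

Calling within_quota inside handle_paid_ai_request, before generate_agent_reply, turns the usage ledger into an enforcement point rather than just analytics.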

Testing the Integration

Run a simple end-to-end check:

def test_paid_ai_flow():
    user_id = "user_123"
    prompt = "What does my premium plan include?"

    entitlement_before = get_entitlement(user_id)
    assert entitlement_before.active is True

    reply = handle_paid_ai_request(user_id, prompt)
    assert isinstance(reply, str)
    assert len(reply) > 0

if __name__ == "__main__":
    test_paid_ai_flow()
    print("Integration test passed")

Expected output:

Integration test passed

If you want a stronger check in staging:

  • Create a real Stripe Checkout Session
  • Complete payment with a test card
  • Confirm webhook updates entitlement state
  • Call the protected AI endpoint and verify streamed output returns only for paid users
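
That checklist can also be scripted. A sketch of a staging smoke test, assuming a protected /api/agent route that returns 402 for unpaid sessions and a bearer token per user (the endpoint path, status code, and header are assumptions about your app, not SDK behavior):

```python
import requests

def gating_ok(unpaid_status: int, paid_status: int, paid_body: str) -> bool:
    # Gating holds when unpaid calls are rejected (402) and paid calls return output.
    return unpaid_status == 402 and paid_status == 200 and bool(paid_body.strip())

def check_paid_gating(base_url: str, paid_token: str, unpaid_token: str) -> bool:
    def call(token: str) -> requests.Response:
        return requests.post(
            f"{base_url}/api/agent",
            headers={"Authorization": f"Bearer {token}"},
            json={"message": "ping"},
            timeout=60,
        )

    unpaid, paid = call(unpaid_token), call(paid_token)
    return gating_ok(unpaid.status_code, paid.status_code, paid.text)
```

Running this against staging after every deploy catches the worst failure mode: an AI endpoint that silently stops checking entitlement.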

Real-World Use Cases

  • Paid financial copilots
    Let users subscribe to an agent that explains statements, drafts dispute letters, or summarizes portfolio changes.

  • Insurance claim assistants
    Gate claim triage workflows behind subscription or enterprise entitlements while using Vercel AI SDK for structured responses.

  • Usage-based AI services
    Charge per seat or per request while tracking model usage separately for cost control and margin protection.


Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
