How to Integrate OpenAI for Insurance with AWS Lambda for Startups

By Cyprian Aarons · Updated 2026-04-21
Tags: openai-for-insurance, aws-lambda, startups

OpenAI and AWS Lambda make a practical combo for insurance work when you need an AI agent that can triage claims, extract policy data, or answer coverage questions without running a full server. Lambda gives you event-driven execution with low ops overhead; OpenAI handles the language-heavy work like summarization, classification, and document extraction.

For startups, this means you can build insurance workflows that react to uploads, emails, or API calls in near real time. You keep infrastructure small, but still ship an agent that feels much bigger than your team.

Prerequisites

  • An AWS account with permission to create:
    • Lambda functions
    • IAM roles
    • CloudWatch logs
  • AWS CLI configured locally:
    • aws configure
  • Python 3.11 or later
  • boto3 installed for AWS SDK access
  • OpenAI Python SDK installed
  • An OpenAI API key stored as an environment variable:
    • OPENAI_API_KEY
  • A Lambda execution role with basic logging permissions:
    • AWSLambdaBasicExecutionRole
  • If your Lambda needs outbound internet access:
    • Ensure it is not locked inside a private VPC without NAT

Integration Steps

1) Install dependencies and set up local config

Use the OpenAI Python SDK inside your Lambda handler and boto3 for any AWS calls you want to make from the function.

pip install openai boto3

For local development, export your API key:

export OPENAI_API_KEY="your-openai-key"

If you want to invoke Lambda from another service during testing, also configure AWS credentials:

aws configure

2) Create the Lambda handler that calls OpenAI

This example takes an insurance claim note, sends it to OpenAI for structured extraction, and returns a JSON payload your downstream system can use.

import json
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def lambda_handler(event, context):
    claim_text = event.get("claim_text", "")

    response = client.responses.create(
        model="gpt-4.1-mini",
        input=(
            "Extract structured insurance claim data from the text below. "
            "Return JSON with keys: claimant_name, incident_type, urgency, summary.\n\n"
            f"TEXT:\n{claim_text}"
        ),
    )

    result_text = response.output_text

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "raw_model_output": result_text,
        }),
    }

A few notes:

  • client.responses.create(...) uses the Responses API, the current pattern in the OpenAI Python SDK.
  • Keep prompts strict if you need machine-readable output.
  • For production systems, add validation before trusting model output.
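As a minimal sketch of that validation step: the key names mirror the prompt above, and `parse_claim` is a hypothetical helper (not part of the OpenAI SDK) that you would call on `result_text` before returning it.

```python
import json

# Keys the prompt asks the model to return.
REQUIRED_KEYS = {"claimant_name", "incident_type", "urgency", "summary"}

def parse_claim(raw: str) -> dict:
    """Parse model output and verify the expected keys before trusting it."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"Model did not return valid JSON: {exc}") from exc
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        raise ValueError(f"Missing keys in model output: {sorted(missing)}")
    return data
```

Calling `parse_claim(result_text)` inside the handler turns a malformed model response into a clear error instead of a silent bad payload downstream.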

3) Add AWS-side orchestration with boto3

If this Lambda is part of a larger agent system, it often needs to read from S3 or write results somewhere else. Here’s a pattern where the function reads an input document from S3, processes it with OpenAI, then writes the result back.

import json
import os
import boto3
from openai import OpenAI

s3 = boto3.client("s3")
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def lambda_handler(event, context):
    bucket = event["bucket"]
    key = event["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    claim_text = obj["Body"].read().decode("utf-8")

    response = client.responses.create(
        model="gpt-4.1-mini",
        input=f"Summarize this insurance claim in 5 bullet points:\n\n{claim_text}",
    )

    summary = response.output_text

    # Assumes the input key ends in ".txt"; adjust for other formats.
    output_key = key.replace(".txt", "-summary.json")
    s3.put_object(
        Bucket=bucket,
        Key=output_key,
        Body=json.dumps({"summary": summary}).encode("utf-8"),
        ContentType="application/json",
    )

    return {
        "statusCode": 200,
        "body": json.dumps({"output_key": output_key}),
    }

This is a solid startup pattern because it keeps the Lambda stateless and lets S3 act as your document bus.
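If you later wire this Lambda to S3 event notifications instead of direct invokes, the bucket and key arrive nested under `Records` and the object key is URL-encoded. A small helper (a sketch, assuming the standard S3 notification shape) can normalize both event styles:

```python
from urllib.parse import unquote_plus

def extract_s3_location(event: dict) -> tuple[str, str]:
    """Return (bucket, key) from a direct payload or an S3 event notification."""
    if "Records" in event:
        record = event["Records"][0]["s3"]
        # S3 URL-encodes object keys in notifications (spaces arrive as '+').
        return record["bucket"]["name"], unquote_plus(record["object"]["key"])
    # Direct-invoke shape used by the handler above.
    return event["bucket"], event["key"]
```

With this in place the same handler body works whether a teammate invokes it manually or an upload triggers it.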

4) Package and deploy the Lambda function

Create a deployment package or use a zip-based workflow. Note that boto3 already ships with the Lambda Python runtime, so bundling it is optional; pinning your own copy just avoids version surprises. For a simple setup:

mkdir package
pip install openai boto3 -t package/
cp lambda_function.py package/
cd package && zip -r ../lambda.zip .

Then create or update the function:

aws lambda create-function \
  --function-name insurance-ai-agent \
  --runtime python3.11 \
  --handler lambda_function.lambda_handler \
  --role arn:aws:iam::123456789012:role/service-role/your-lambda-role \
  --zip-file fileb://lambda.zip \
  --environment Variables="{OPENAI_API_KEY=your-openai-key}"

If the function already exists:

aws lambda update-function-code \
  --function-name insurance-ai-agent \
  --zip-file fileb://lambda.zip

For production, store OPENAI_API_KEY in AWS Secrets Manager or Parameter Store instead of plain environment variables.
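One hedged sketch of that Secrets Manager pattern: the secret name `openai/api-key` is an assumption, and the fetcher is injectable so the caching logic can be exercised without AWS credentials. `get_secret_value` is the standard boto3 Secrets Manager call.

```python
from functools import lru_cache

def fetch_from_secrets_manager(secret_id: str) -> str:
    """Real fetcher: needs boto3 and AWS credentials at runtime."""
    import boto3  # imported lazily so the module loads without AWS access
    client = boto3.client("secretsmanager")
    return client.get_secret_value(SecretId=secret_id)["SecretString"]

@lru_cache(maxsize=1)
def get_openai_key(secret_id: str = "openai/api-key",
                   fetch=fetch_from_secrets_manager) -> str:
    # Cached per warm Lambda container: the secret is fetched once, not per invoke.
    return fetch(secret_id)
```

You would then build the client with `OpenAI(api_key=get_openai_key())` instead of reading `os.environ`.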

5) Invoke Lambda from another service in your agent flow

A common startup architecture is: API Gateway → Lambda → OpenAI → DynamoDB/S3. If another service needs to trigger processing directly, use boto3’s Lambda client.

import json
import boto3

lambda_client = boto3.client("lambda")

payload = {
    "claim_text": (
        "Customer reports hail damage to roof after storm on Tuesday. "
        "Photos attached. Policy number ends in 4821."
    )
}

response = lambda_client.invoke(
    FunctionName="insurance-ai-agent",
    InvocationType="RequestResponse",
    Payload=json.dumps(payload).encode("utf-8"),
)

result = json.loads(response["Payload"].read().decode("utf-8"))
print(result)

That gives you a clean way to chain workflows across services without building a separate app server.
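Note that `result["body"]` is itself a JSON string in the proxy-style response the handler returns, so callers need one more decode. A small helper (hypothetical, matching the handler above) keeps that unwrapping in one place:

```python
import json

def unwrap_lambda_response(result: dict) -> dict:
    """Decode the proxy-style response returned by the handler above."""
    if result.get("statusCode") != 200:
        raise RuntimeError(f"Lambda returned status {result.get('statusCode')}")
    # The handler json.dumps its payload into "body", so decode it here.
    return json.loads(result["body"])
```

Calling `unwrap_lambda_response(result)` after the `invoke` above hands the rest of your pipeline a plain dict.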

Testing the Integration

Run a local test first by calling the handler directly.

from lambda_function import lambda_handler

event = {
    "claim_text": (
        "John Carter filed a claim after water damage in his apartment. "
        "The leak started from the upstairs unit on Monday morning."
    )
}

result = lambda_handler(event, None)
print(result)

Expected output:

{
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/json"
  },
  "body": "{\"raw_model_output\":\"{...}\"}"
}

If you deployed to AWS, test with the CLI:

aws lambda invoke \
  --function-name insurance-ai-agent \
  --cli-binary-format raw-in-base64-out \
  --payload '{"claim_text":"Fire damage reported in kitchen after electrical fault."}' \
  response.json

cat response.json

Real-World Use Cases

  • Claims triage
    • Classify incoming claims by severity and route urgent cases to human adjusters.
  • Policy Q&A agent
    • Answer coverage questions from customer emails or chat messages using policy documents stored in S3.
  • Document extraction pipeline
    • Pull structured fields from FNOL forms, repair estimates, and loss reports into DynamoDB or your CRM.

The main pattern here is simple: let Lambda handle orchestration and let OpenAI handle language understanding. That split keeps your system maintainable while giving startups enough flexibility to ship insurance agents fast.
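The claims-triage case above can be sketched in a few lines: route on the `urgency` field the extraction step produces. The queue names here are placeholders, not real resources.

```python
# Map the model-assigned urgency to a downstream destination.
# Queue names are illustrative placeholders.
ROUTES = {
    "high": "adjuster-escalation-queue",
    "medium": "standard-claims-queue",
    "low": "batch-review-queue",
}

def route_claim(claim: dict) -> str:
    """Pick a destination queue; unknown urgency falls back to human review."""
    urgency = str(claim.get("urgency", "")).lower()
    return ROUTES.get(urgency, "human-review-queue")
```

The fallback matters: model output is probabilistic, so anything the router does not recognize should land in front of a human rather than a default automated path.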


By Cyprian Aarons, AI Consultant at Topiax.