How to Integrate OpenAI for Insurance with AWS Lambda for Production AI
OpenAI for insurance plus AWS Lambda gives you a clean production pattern for agentic workflows: Lambda handles event-driven execution, retries, and IAM boundaries, while OpenAI handles language understanding, extraction, summarization, and policy reasoning. In insurance systems, that usually means claims triage, FNOL intake, document classification, and customer support automation without running a long-lived service.
Prerequisites
- AWS account with permission to create:
  - Lambda functions
  - IAM roles/policies
  - CloudWatch log groups
- AWS CLI configured locally with aws configure
- Python 3.11 installed
- An OpenAI API key set as an environment variable:
  export OPENAI_API_KEY="your_key"
- boto3 installed: pip install boto3
- openai Python SDK installed: pip install openai
- A Lambda deployment package or container image ready for Python
- Basic familiarity with:
  - AWS Lambda handler signatures
  - JSON event payloads
Integration Steps
- Set up the OpenAI client inside your Lambda code.
For insurance workloads, keep the prompt narrow and the output structured. You want strict, machine-parseable JSON back from OpenAI so downstream systems can route claims, detect missing fields, or trigger human review.
import os
import json

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


def extract_claim_data(text: str) -> dict:
    response = client.responses.create(
        model="gpt-4.1-mini",
        input=[
            {
                "role": "system",
                "content": (
                    "You extract insurance claim data from customer messages. "
                    "Return only valid JSON with keys: claimant_name, incident_date, "
                    "loss_type, summary, urgency."
                ),
            },
            {"role": "user", "content": text},
        ],
    )
    return json.loads(response.output_text)
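Model output should be validated before anything routes on it. Here is a minimal guard sketch; the key set mirrors the system prompt above, and the helper name is my own, not part of the OpenAI SDK:

```python
# Required keys mirror the system prompt above; this helper is a sketch,
# not an SDK feature.
REQUIRED_KEYS = {"claimant_name", "incident_date", "loss_type", "summary", "urgency"}


def validate_claim(extracted: dict) -> dict:
    """Annotate an extraction with any missing fields so incomplete
    claims can be routed to human review instead of auto-processing."""
    missing = sorted(REQUIRED_KEYS - extracted.keys())
    return {
        "claim": extracted,
        "missing_fields": missing,
        "needs_review": bool(missing),
    }
```

Calling validate_claim on a partial extraction such as {"claimant_name": "Jordan Lee"} flags needs_review, so the routing layer never has to trust the model blindly.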
- Create a Lambda handler that receives the event payload.
This is the boundary between AWS and your agent logic. In production, your event usually comes from API Gateway, SQS, EventBridge, or Step Functions.
import json


def lambda_handler(event, context):
    message = event.get("message", "")
    if not message:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "message is required"}),
        }

    extracted = extract_claim_data(message)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "source": "openai-for-insurance",
            "result": extracted,
        }),
    }
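The handler above assumes a direct-invoke event. API Gateway proxy integrations instead deliver the payload as a JSON string under "body", so a small normalizer keeps one handler working for both paths. This is a sketch with a hypothetical helper name:

```python
import json


def extract_message(event: dict) -> str:
    """Return the message field whether the event came from a direct
    invoke (plain dict) or an API Gateway proxy (JSON string in "body")."""
    if isinstance(event.get("body"), str):
        event = json.loads(event["body"])
    return event.get("message", "")
```

In the handler, replace event.get("message", "") with extract_message(event) to accept either event shape.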
- Add AWS SDK calls for operational control.
In real systems you do more than call a model. You log to CloudWatch by default through Lambda, and often write results to DynamoDB or publish events to SNS/SQS for downstream processing. Here’s a simple DynamoDB write after extraction.
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ["CLAIMS_TABLE"])


def store_claim(extracted: dict) -> None:
    table.put_item(
        Item={
            "claim_id": f"{extracted.get('claimant_name', 'unknown')}-{extracted.get('incident_date', 'na')}",
            "claimant_name": extracted.get("claimant_name"),
            "incident_date": extracted.get("incident_date"),
            "loss_type": extracted.get("loss_type"),
            "summary": extracted.get("summary"),
            "urgency": extracted.get("urgency"),
        }
    )
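Note that a claim_id built only from name and date collides as soon as the same claimant files twice on one day, and put_item silently overwrites the earlier item. A sketch of a collision-resistant key, assuming a random suffix is acceptable for your claims ops:

```python
import uuid


def make_claim_id(extracted: dict) -> str:
    """Build a partition key that stays unique even when the same
    claimant reports two incidents on the same date."""
    name = extracted.get("claimant_name") or "unknown"
    date = extracted.get("incident_date") or "na"
    # uuid4 suffix keeps keys distinct for duplicate name/date pairs
    return f"{name}-{date}-{uuid.uuid4().hex[:8]}"
```

In store_claim, use "claim_id": make_claim_id(extracted) in place of the inline f-string.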
- Wire the storage step into the Lambda flow.
This makes the function useful in production: ingest text, classify it with OpenAI for insurance use cases, then persist the result for claims ops or agent handoff.
def lambda_handler(event, context):
    message = event.get("message", "")
    if not message:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "message is required"}),
        }

    extracted = extract_claim_data(message)
    store_claim(extracted)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "stored": True,
            "result": extracted,
        }),
    }
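The model can also return text that fails json.loads inside extract_claim_data. Rather than let Lambda surface a raw stack trace to API Gateway, a small wrapper can convert parse failures into a clean 502. This is a sketch; the decorator name is my own:

```python
import json
from functools import wraps


def json_failure_guard(fn):
    """Convert model-output parse failures into a clean 502 response
    instead of an unhandled Lambda error."""
    @wraps(fn)
    def wrapper(event, context):
        try:
            return fn(event, context)
        except json.JSONDecodeError:
            return {
                "statusCode": 502,
                "body": json.dumps({"error": "model returned non-JSON output"}),
            }
    return wrapper
```

Apply it with @json_failure_guard directly above def lambda_handler.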
- Package and deploy the Lambda function.
If you’re using ZIP deployment, keep dependencies in a build directory and upload them together with your handler file. For production AI workloads, pin versions so model integration changes don’t break runtime behavior unexpectedly.
mkdir -p build
pip install openai boto3 -t build/
cp lambda_function.py build/
cd build && zip -r ../claims_lambda.zip .
# Set OPENAI_API_KEY and CLAIMS_TABLE on the function (replace "claims"
# with your table name), and raise the timeout above the 3-second default
# so the OpenAI call has time to complete.
aws lambda create-function \
  --function-name openai-insurance-claims \
  --runtime python3.11 \
  --handler lambda_function.lambda_handler \
  --role arn:aws:iam::123456789012:role/lambda-execution-role \
  --timeout 30 \
  --environment "Variables={OPENAI_API_KEY=$OPENAI_API_KEY,CLAIMS_TABLE=claims}" \
  --zip-file fileb://claims_lambda.zip
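One way to pin versions as suggested above is a requirements.txt installed into the build directory with pip install -r requirements.txt -t build/. The version numbers below are examples only; pin to whatever you actually tested:

```
# requirements.txt (example pins; replace with your tested versions)
openai==1.54.0
boto3==1.34.144
```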
Testing the Integration
Use a direct Lambda invoke to verify the end-to-end path: event input → OpenAI extraction → DynamoDB persistence → response.
import json

import boto3

lambda_client = boto3.client("lambda")

payload = {
    "message": (
        "I was rear-ended on 2026-04-19 near downtown Seattle. "
        "My name is Jordan Lee. The car has bumper damage and I need help filing a claim."
    )
}

response = lambda_client.invoke(
    FunctionName="openai-insurance-claims",
    InvocationType="RequestResponse",
    Payload=json.dumps(payload).encode("utf-8"),
)

result = json.loads(response["Payload"].read())
print(result["statusCode"])
print(json.loads(result["body"]))
Expected output:
200
{
  "stored": true,
  "result": {
    "claimant_name": "Jordan Lee",
    "incident_date": "2026-04-19",
    "loss_type": "auto collision",
    "summary": "...",
    "urgency": "..."
  }
}
Real-World Use Cases
- •
FNOL intake automation
- •Parse first notice of loss from email, chat, or web forms.
- •Route high-severity claims to human adjusters immediately.
- •
Policy document classification
- •Extract named entities from endorsements, declarations pages, and coverage letters.
- •Store structured metadata in DynamoDB or push it into a claims workflow.
- •
Claims triage agents
- •Use OpenAI for insurance-specific summarization and urgency scoring.
- •Run it on Lambda behind API Gateway for low-latency request handling and easy scaling.
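For the triage case, the urgency label the model returns can drive routing directly. A minimal sketch, where the queue names are placeholders:

```python
# Placeholder queue names: map the model's urgency label to a destination.
ROUTES = {
    "high": "adjuster-escalation",
    "medium": "standard-claims",
    "low": "standard-claims",
}


def route_claim(extracted: dict) -> str:
    """Unknown or missing urgency falls through to human review."""
    return ROUTES.get(str(extracted.get("urgency", "")).lower(), "human-review")
```

Because the fallback is human review, a label the model invents never auto-routes a claim.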
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit