How to Integrate OpenAI for payments with AWS Lambda for AI agents

By Cyprian Aarons · Updated 2026-04-21
openai-for-payments · aws-lambda · ai-agents

OpenAI for payments gives your agent a way to reason about billing, invoices, and payment-related workflows. AWS Lambda gives you the execution layer to run those workflows on demand, without managing servers. Together, you can build agents that answer payment questions, validate transaction data, trigger downstream finance jobs, and route exceptions into your back office.

Prerequisites

  • Python 3.10+
  • An OpenAI API key with access to the model you want to use
  • AWS account with:
    • Lambda enabled
    • IAM permissions for lambda:CreateFunction, lambda:InvokeFunction, and iam:PassRole
  • AWS CLI configured locally:
    • aws configure
  • boto3 installed:
    • pip install boto3 openai
  • A deployment role for Lambda with CloudWatch Logs permissions
  • Basic familiarity with JSON event payloads and serverless functions

Integration Steps

  1. Set up your Python dependencies and environment variables.

Use environment variables for both OpenAI and AWS credentials. In production, store them in AWS Secrets Manager or Parameter Store, then inject them into Lambda at runtime.

import os

os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["AWS_REGION"] = "us-east-1"
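In a real deployment you would avoid hardcoding the key and pull it from Secrets Manager at cold start instead. A minimal sketch, assuming the secret is stored as a JSON object with an `OPENAI_API_KEY` field (the secret name and layout are assumptions; adjust to your setup):

```python
import json

def load_openai_key(secret_name: str, client=None) -> str:
    """Fetch the OpenAI API key from AWS Secrets Manager.

    `client` is injectable for local tests; by default a real
    Secrets Manager client is created.
    """
    if client is None:
        import boto3  # deferred so local tests need no AWS credentials
        client = boto3.client("secretsmanager")
    secret = client.get_secret_value(SecretId=secret_name)
    # Assumes the secret value is JSON like {"OPENAI_API_KEY": "sk-..."}
    return json.loads(secret["SecretString"])["OPENAI_API_KEY"]
```

Caching the result in a module-level variable keeps you from hitting Secrets Manager on every warm invocation.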
  2. Write the OpenAI side of the agent that decides whether a payment action is needed.

This example uses the OpenAI Python SDK to classify a request and return a structured action. For payment workflows, keep the model output narrow: intent, amount, currency, and confidence.

import json
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def analyze_payment_request(user_message: str) -> dict:
    response = client.responses.create(
        model="gpt-4.1-mini",
        input=f"""
You are a payments assistant.
Extract whether this message requires a payment workflow.

Return JSON only, with keys:
- action: one of ["none", "check_status", "initiate_payment", "refund"]
- amount: number or null
- currency: string or null
- note: short explanation

Message: {user_message}
""",
    )

    # Parse the model's JSON output; fall back to a safe no-op on bad JSON.
    try:
        return json.loads(response.output_text)
    except json.JSONDecodeError:
        return {"action": "none", "amount": None, "currency": None,
                "note": "unparseable model output"}
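The model can also return an action you never offered, and a prompt-injected message is exactly where that happens. An allowlist check on the raw output before anything touches Lambda is cheap insurance; a minimal sketch, mirroring the schema in the prompt above:

```python
import json

ALLOWED_ACTIONS = {"none", "check_status", "initiate_payment", "refund"}

def validate_payment_decision(raw_text: str) -> dict:
    """Parse the model's JSON reply and reject out-of-policy actions.

    Anything malformed or outside the allowlist collapses to a no-op,
    so a bad model reply can never trigger a payment.
    """
    try:
        decision = json.loads(raw_text)
    except json.JSONDecodeError:
        return {"action": "none", "amount": None, "currency": None}

    if decision.get("action") not in ALLOWED_ACTIONS:
        decision["action"] = "none"
    return decision
```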
  3. Create an AWS Lambda function that executes the payment workflow.

This Lambda can represent your payment orchestration layer. In real systems it might call Stripe, Adyen, a ledger service, or an internal payments API.

import json
import os

import boto3

lambda_client = boto3.client("lambda", region_name=os.environ["AWS_REGION"])

def invoke_payment_lambda(payload: dict) -> dict:
    response = lambda_client.invoke(
        FunctionName="payment-orchestrator",
        InvocationType="RequestResponse",
        Payload=json.dumps(payload).encode("utf-8"),
    )

    result = json.loads(response["Payload"].read().decode("utf-8"))
    return result
  4. Deploy the Lambda handler that processes the payment event.

This is the actual Lambda function code you deploy in AWS. It accepts a structured event from your AI agent and returns a deterministic result.

import json

def lambda_handler(event, context):
    action = event.get("action")
    amount = event.get("amount")
    currency = event.get("currency")

    if action == "check_status":
        return {
            "statusCode": 200,
            "body": json.dumps({
                "status": "paid",
                "transaction_id": "txn_12345"
            })
        }

    if action == "initiate_payment":
        if not amount or not currency:
            return {
                "statusCode": 400,
                "body": json.dumps({"error": "amount and currency are required"})
            }

        return {
            "statusCode": 200,
            "body": json.dumps({
                "status": "queued",
                "payment_id": "pay_98765",
                "amount": amount,
                "currency": currency
            })
        }

    return {
        "statusCode": 200,
        "body": json.dumps({"status": "ignored"})
    }
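Getting this handler into AWS means packaging it as a zip. A minimal in-memory sketch, assuming the default `lambda_function.lambda_handler` handler name; the resulting bytes could be passed to `create_function` or `update_function_code` via boto3:

```python
import io
import zipfile

def build_deployment_package(handler_source: str) -> bytes:
    """Zip the handler source in memory, ready for Lambda's Code ZipFile field."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # Lambda's default Python handler setting is lambda_function.lambda_handler.
        zf.writestr("lambda_function.py", handler_source)
    return buf.getvalue()
```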
  5. Connect the agent decision layer to Lambda invocation.

Here the agent extracts intent with OpenAI, then calls Lambda only when needed. This keeps business logic out of the model and makes the flow auditable.

import json
import os

import boto3
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
lambda_client = boto3.client("lambda", region_name=os.environ["AWS_REGION"])

def route_payment_request(user_message: str):
    ai_response = client.responses.create(
        model="gpt-4.1-mini",
        input=f"""
Classify this message for payment handling.
Return strict JSON only with keys action, amount, currency.

Message: {user_message}
""",
    )

    # Parse the model's JSON decision; skip Lambda entirely on bad output.
    try:
        decision = json.loads(ai_response.output_text)
    except json.JSONDecodeError:
        return {"status": "ignored", "reason": "unparseable model output"}

    if decision.get("action") in (None, "none"):
        return {"status": "ignored"}

    payload = {
        "action": decision["action"],
        "amount": decision.get("amount"),
        "currency": decision.get("currency"),
        "source_text": user_message,
    }

    lambda_response = lambda_client.invoke(
        FunctionName="payment-orchestrator",
        InvocationType="RequestResponse",
        Payload=json.dumps(payload).encode("utf-8"),
    )

    return json.loads(lambda_response["Payload"].read().decode("utf-8"))
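Payment calls are the last place you want a retried invocation to double-charge. One common safeguard is a deterministic idempotency key derived from the payload, so a retry produces the same key and the orchestrator can deduplicate. The key scheme below is an assumption, not part of the AWS API, and the `payment-orchestrator` Lambda (or the downstream payment provider) would need to honor it:

```python
import hashlib
import json

def idempotency_key(payload: dict) -> str:
    """Deterministic key for a payment payload so retries don't double-charge.

    json.dumps with sort_keys gives a canonical form: the same payload
    always hashes to the same key, regardless of dict insertion order.
    """
    canonical = json.dumps(payload, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

You would attach the key to the invoke payload (e.g. as an `idempotency_key` field) and have the orchestrator skip work it has already seen.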

Testing the Integration

Run this local test script against your deployed Lambda function name.

import os
import json
import boto3

os.environ["AWS_REGION"] = "us-east-1"

client = boto3.client("lambda", region_name=os.environ["AWS_REGION"])

test_event = {
    "action": "initiate_payment",
    "amount": 49.99,
    "currency": "USD"
}

response = client.invoke(
    FunctionName="payment-orchestrator",
    InvocationType="RequestResponse",
    Payload=json.dumps(test_event).encode("utf-8"),
)

result = json.loads(response["Payload"].read().decode("utf-8"))
print(result)

Expected output:

{
  "statusCode": 200,
  "body": "{\"status\": \"queued\", \"payment_id\": \"pay_98765\", \"amount\": 49.99, \"currency\": \"USD\"}"
}

If you want to verify end-to-end behavior from the agent side, send a natural language request like:

message = "Charge customer $49.99 for invoice INV-1042"
print(route_payment_request(message))

Real-World Use Cases

  • Payment support agents that check transaction status, retry failed charges, or generate refund requests based on user messages.
  • Finance ops assistants that turn unstructured requests into Lambda jobs for reconciliation, payout batching, or invoice matching.
  • Insurance claims assistants that trigger payment-related workflows after claim approval without exposing internal systems directly to the model.

Keep learning

By Cyprian Aarons, AI Consultant at Topiax.

Want the complete 8-step roadmap?

Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
