AutoGen Tutorial (TypeScript): deploying to AWS Lambda for beginners

By Cyprian Aarons · Updated 2026-04-21

This tutorial shows how to package a small AutoGen TypeScript agent for AWS Lambda and invoke it through API Gateway. You need this when you want a serverless chat endpoint, a lightweight agent tool, or a backend function that runs only on demand instead of keeping a Node server alive.

What You'll Need

  • Node.js 20+
  • AWS account with permission to create:
    • Lambda
    • IAM roles
    • API Gateway HTTP API
  • An OpenAI API key
  • A TypeScript project initialized with:
    • typescript
    • @types/aws-lambda
    • @aws-sdk/client-secrets-manager
  • AutoGen TypeScript packages:
    • @autogen/core
    • @autogen/openai
  • AWS CLI configured locally
  • Docker installed if you want to test the Lambda container locally

Step-by-Step

  1. Create the project and install dependencies. Keep the Lambda handler small and move all agent logic into one file so deployment stays simple.
mkdir autogen-lambda && cd autogen-lambda
npm init -y
npm i @autogen/core @autogen/openai @aws-sdk/client-secrets-manager
npm i -D typescript @types/node @types/aws-lambda esbuild
npx tsc --init
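Because esbuild produces the actual bundle in a later step, tsc only needs to type-check. One workable tsconfig for that split (an assumption; adjust to your project) looks like:

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ES2022",
    "moduleResolution": "bundler",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "noEmit": true,
    "types": ["node", "aws-lambda"]
  },
  "include": ["src"]
}
```

With noEmit set, `npx tsc` becomes a pure type-check command and esbuild handles all output.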
  2. Add your environment variables and a basic AutoGen agent. For beginners, use a single assistant agent that takes a prompt and returns one response.
// src/agent.ts
import { AssistantAgent } from "@autogen/core";
import { OpenAIChatCompletionClient } from "@autogen/openai";

export async function runAgent(prompt: string): Promise<string> {
  const modelClient = new OpenAIChatCompletionClient({
    model: "gpt-4o-mini",
    apiKey: process.env.OPENAI_API_KEY!,
  });

  const agent = new AssistantAgent({
    name: "lambda_assistant",
    modelClient,
    systemMessage: "You are a concise assistant.",
  });

  const result = await agent.run([{ role: "user", content: prompt }]);
  return result.messages.at(-1)?.content?.toString() ?? "";
}
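The last-message extraction at the end of runAgent is easy to get wrong, so it can help to pull it into a pure helper you can unit-test without calling the model. The Message shape below is an assumption; AutoGen's actual result type may carry more fields:

```typescript
// Hypothetical minimal message shape; the real AutoGen type may differ.
type Message = { role: string; content?: string };

// Return the content of the last message, or "" when the list is empty
// or the last message has no content.
export function lastReply(messages: Message[]): string {
  return messages.at(-1)?.content ?? "";
}
```

With a helper like this, the final line of runAgent can become a single `return lastReply(...)` call over the result messages.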
  3. Create the Lambda handler. This version accepts JSON input from API Gateway and returns JSON output that is easy to test from curl or Postman.
// src/handler.ts
import type { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { runAgent } from "./agent";

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  let body: { prompt?: string } = {};
  try {
    body = event.body ? JSON.parse(event.body) : {};
  } catch {
    // Malformed JSON should produce a 400, not an unhandled 500.
    return {
      statusCode: 400,
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ ok: false, error: "Request body must be valid JSON." }),
    };
  }
  const prompt = body.prompt ?? "Say hello from Lambda.";

  const reply = await runAgent(prompt);

  return {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      ok: true,
      prompt,
      reply,
    }),
  };
};
  4. Add a build step for Lambda. Use esbuild so you ship one small JavaScript bundle instead of uploading your whole source tree.
{
  "name": "autogen-lambda",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "build": "esbuild src/handler.ts --bundle --platform=node --target=node20 --format=esm --outfile=dist/index.mjs"
  }
}
  5. Deploy the function to AWS Lambda. Store the OpenAI key in Lambda environment variables for the first pass; later, move it to Secrets Manager.
npm run build

zip -j function.zip dist/index.mjs

aws lambda create-function \
  --function-name autogen-ts-lambda \
  --runtime nodejs20.x \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::<YOUR_ACCOUNT_ID>:role/<LAMBDA_EXEC_ROLE> \
  --environment Variables="{OPENAI_API_KEY=$OPENAI_API_KEY}"
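If you have not created the execution role yet, it needs a trust policy that lets the Lambda service assume it, plus the AWSLambdaBasicExecutionRole managed policy so the function can write CloudWatch logs. The trust policy is standard:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Create the role with `aws iam create-role --role-name <LAMBDA_EXEC_ROLE> --assume-role-policy-document file://trust.json`, then attach the logging policy with `aws iam attach-role-policy`.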
  6. Put API Gateway in front of it. This gives you an HTTPS endpoint and makes the function callable from any client.
aws apigatewayv2 create-api \
  --name autogen-ts-api \
  --protocol-type HTTP

# After creating an integration and route in the console or CLI,
# point POST /chat to your Lambda function and grant API Gateway
# permission to invoke it (aws lambda add-permission). Then deploy
# the API and copy the invoke URL.

Testing It

Invoke the function directly first so you can isolate Lambda issues from API Gateway issues. Use the AWS CLI with a simple JSON payload containing prompt, then confirm you get back valid JSON with ok, prompt, and reply.

aws lambda invoke \
  --function-name autogen-ts-lambda \
  --cli-binary-format raw-in-base64-out \
  --payload '{"prompt":"Write one sentence about AWS Lambda."}' \
  response.json

cat response.json

If that works, call the HTTP endpoint with curl and make sure your request body is passed through correctly. Check CloudWatch logs if you get a timeout, empty response, or model authentication error.
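If you prefer a small Node script over curl, the request options can be built once and reused. INVOKE_URL below is an assumed environment variable holding the invoke URL you copied from API Gateway, and buildChatRequest is a hypothetical helper factored out so the request shape can be checked without a live endpoint:

```typescript
// Build the fetch options for POST /chat.
export function buildChatRequest(prompt: string) {
  return {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ prompt }),
  };
}

// Usage against a deployed endpoint (Node 20 has built-in fetch):
// const res = await fetch(`${process.env.INVOKE_URL}/chat`, buildChatRequest("ping"));
// console.log(res.status, await res.json());
```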

Next Steps

  • Move OPENAI_API_KEY into AWS Secrets Manager and fetch it at cold start.
  • Add tool use, then expose only safe tools inside Lambda.
  • Wrap this handler with input validation using Zod before calling AutoGen.
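For the first bullet, the usual pattern is to fetch the secret once per container and cache the promise across warm invocations. The sketch below is generic: getSecretOnce is a hypothetical helper, and in Lambda you would pass it a function that sends GetSecretValueCommand via @aws-sdk/client-secrets-manager:

```typescript
// Cache the in-flight fetch so concurrent cold-start calls and later warm
// invocations all reuse a single Secrets Manager round trip.
let cached: Promise<string> | undefined;

export function getSecretOnce(fetchSecret: () => Promise<string>): Promise<string> {
  cached ??= fetchSecret();
  return cached;
}
```

In the handler, `const apiKey = await getSecretOnce(...)` would then replace the direct `process.env.OPENAI_API_KEY` read, with the Secrets Manager call supplied as the fetcher.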

By Cyprian Aarons, AI Consultant at Topiax.
