CrewAI Tutorial (TypeScript): deploying to AWS Lambda for beginners
This tutorial shows you how to package a TypeScript CrewAI-style agent workflow into an AWS Lambda function and invoke it with API Gateway. You need this when you want your agent logic to run on demand, behind an HTTP endpoint, without managing servers.
What You'll Need
- Node.js 20+
- AWS account with permissions for:
  - Lambda
  - IAM roles
  - CloudWatch Logs
  - API Gateway
- AWS CLI configured locally
- A CrewAI-compatible TypeScript project setup
- An LLM API key:
  - OPENAI_API_KEY if you use OpenAI models
- These npm packages:
  - crewai
  - @aws-sdk/client-lambda (not required for this tutorial, but useful later)
  - esbuild
  - typescript
  - tsx
- Basic familiarity with:
  - Agents
  - Tasks
  - Crews
Step-by-Step
- Create a minimal TypeScript project and install dependencies. We'll use esbuild to bundle the Lambda handler into a single file, which keeps deployment simple and avoids module resolution issues in Lambda.
mkdir crewai-lambda-demo
cd crewai-lambda-demo
npm init -y
npm install crewai dotenv
npm install -D typescript tsx esbuild @types/node
npx tsc --init --rootDir src --outDir dist --module NodeNext --moduleResolution NodeNext --target ES2022 --strict
mkdir src
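After running tsc --init with those flags, the generated tsconfig.json should contain roughly the following compiler options (your file will also include many commented-out defaults; this is just what to check for):

```json
{
  "compilerOptions": {
    "rootDir": "src",
    "outDir": "dist",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "target": "ES2022",
    "strict": true
  }
}
```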
- Add your CrewAI workflow in a normal TypeScript module. Keep the agent and task definitions small and deterministic so Lambda stays fast and cheap.
// src/crew.ts
import { Agent, Crew, Task } from "crewai";

export async function runCrew(input: string) {
  const agent = new Agent({
    role: "Support Analyst",
    goal: "Answer user questions accurately",
    backstory: "You are a concise support analyst for internal tooling.",
    verbose: false,
    allowDelegation: false,
    llm: "gpt-4o-mini",
  });

  const task = new Task({
    description: `Answer this request in one paragraph: ${input}`,
    expectedOutput: "A short, direct answer.",
    agent,
  });

  const crew = new Crew({
    agents: [agent],
    tasks: [task],
    verbose: false,
  });

  return await crew.kickoff();
}
- Create the Lambda handler. This reads JSON input from API Gateway, calls your crew, and returns a proper HTTP response. Keep the handler thin; all business logic should live in separate modules.
// src/handler.ts
import "dotenv/config";
import { runCrew } from "./crew.js";

type ApiGatewayEvent = {
  body?: string | null;
};

export const handler = async (event: ApiGatewayEvent) => {
  const body = event.body ? JSON.parse(event.body) : {};
  const input = String(body.input ?? "Summarize why Lambda is useful for agent workflows.");
  const result = await runCrew(input);

  return {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      ok: true,
      output: String(result),
    }),
  };
};
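One caveat: JSON.parse throws on a malformed body, which would surface as a 500 from API Gateway. A minimal sketch of a more defensive parse you could drop into the handler (the parseInput name is my own, not part of CrewAI or any AWS API):

```typescript
// Hypothetical helper: fall back to a default prompt instead of throwing
// when the request body is missing, malformed, or has no usable "input" field.
type ApiGatewayEvent = { body?: string | null };

function parseInput(event: ApiGatewayEvent, fallback: string): string {
  try {
    const body = event.body ? JSON.parse(event.body) : {};
    return typeof body.input === "string" && body.input.trim().length > 0
      ? body.input
      : fallback;
  } catch {
    // Malformed JSON: use the fallback rather than surfacing a parse error.
    return fallback;
  }
}
```

With this in place, a bad request degrades to the default prompt instead of an unhandled exception in CloudWatch.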
- Add a build script that produces a single deployment artifact. AWS Lambda runs best when you give it one bundled file plus any runtime assets you truly need.
{
  "name": "crewai-lambda-demo",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "build": "esbuild src/handler.ts --bundle --platform=node --target=node20 --format=esm --outfile=dist/index.mjs",
    "test": "tsx src/local-test.ts"
  }
}
- Add a local test runner before deploying. This catches wiring problems early and lets you verify your prompt flow without waiting on AWS.
// src/local-test.ts
import "dotenv/config";
import { handler } from "./handler.js";

const response = await handler({
  body: JSON.stringify({
    input: "Explain what AWS Lambda does in one sentence.",
  }),
});

console.log(response.statusCode);
console.log(response.body);
- Deploy the bundle to AWS Lambda and wire it to an HTTP endpoint. Use the console for your first pass if you're new to Lambda; once it works, move the same steps into IaC.
npm run build
zip -j function.zip dist/index.mjs
aws lambda create-function \
  --function-name crewai-demo \
  --runtime nodejs20.x \
  --handler index.handler \
  --role arn:aws:iam::123456789012:role/lambda-execution-role \
  --zip-file fileb://function.zip \
  --timeout 30 \
  --memory-size 1024 \
  --environment Variables="{OPENAI_API_KEY=your_key_here}"
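The --role ARN above uses the placeholder account 123456789012; create-function fails unless the execution role already exists in your account. Assuming the role name lambda-execution-role, one way to create it with basic CloudWatch Logs permissions:

```shell
# Create an execution role Lambda can assume (the role name is an assumption).
aws iam create-role \
  --role-name lambda-execution-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }]
  }'

# Attach the AWS-managed policy that grants CloudWatch Logs write access.
aws iam attach-role-policy \
  --role-name lambda-execution-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
```

Copy the Arn field from the create-role output into the --role flag above.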
Testing It
First, run the local test with npm test. If that works, your CrewAI code is valid and your handler can parse requests correctly.
Then invoke the deployed Lambda from the AWS console or with the CLI using a sample event body like {"input":"Write a short answer about serverless agents."}. Check CloudWatch Logs if the function fails; most issues at this stage are missing environment variables, IAM permissions, or bundle/import problems.
If you connect API Gateway, send an HTTP POST request to the endpoint with a JSON body containing input. A successful response should return ok: true plus the model output as JSON.
Next Steps
- Move secrets out of Lambda environment variables and into AWS Secrets Manager.
- Replace manual deployment with CDK or Terraform so you can version infrastructure.
- Add retries, timeouts, and structured logging before putting this behind production traffic.
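For the retries-and-timeouts item, here is a minimal sketch of a wrapper you could put around runCrew (withRetries, attempts, and timeoutMs are names of my own, not a CrewAI or AWS SDK API):

```typescript
// Sketch: retry an async operation, racing each attempt against a timeout
// so one slow call can't consume the whole Lambda timeout budget.
async function withRetries<T>(
  fn: () => Promise<T>,
  attempts = 3,
  timeoutMs = 10_000,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await Promise.race([
        fn(),
        new Promise<never>((_, reject) =>
          setTimeout(() => reject(new Error(`attempt ${attempt} timed out`)), timeoutMs),
        ),
      ]);
    } catch (error) {
      // Remember the failure and fall through to the next attempt.
      lastError = error;
    }
  }
  throw lastError;
}
```

In the handler you would call `await withRetries(() => runCrew(input))` and log each failed attempt as a structured line, keeping attempts × timeoutMs comfortably below the Lambda --timeout you configured.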
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.