CrewAI Tutorial (Python): deploying to AWS Lambda for beginners

By Cyprian Aarons · Updated 2026-04-21

This tutorial shows you how to package a simple CrewAI Python agent and deploy it as an AWS Lambda function. You’d use this when you want an agent to run on demand from API Gateway, EventBridge, or a direct Lambda invoke without managing servers.

What You'll Need

  • Python 3.11 installed locally
  • An AWS account with permission to create:
    • Lambda functions
    • IAM roles
    • CloudWatch logs
  • AWS CLI configured locally: aws configure
  • A CrewAI project with these packages:
    • crewai
    • crewai-tools
    • boto3 (not required for this tutorial, but useful later)
  • An LLM API key:
    • OPENAI_API_KEY for OpenAI models
  • A Linux-compatible build environment for Lambda packaging
    • easiest path: Docker or AWS Cloud9
  • Basic familiarity with:
    • Python virtual environments
    • ZIP-based Lambda deployments

Step-by-Step

  1. Create a minimal CrewAI agent and wrap it in a Lambda handler.
    Keep the agent small and deterministic. For Lambda, you want short execution time and predictable outputs, not a multi-agent orchestration marathon.
# app.py
import os
from crewai import Agent, Task, Crew, Process, LLM

def build_crew():
    llm = LLM(
        model="gpt-4o-mini",
        api_key=os.environ["OPENAI_API_KEY"],
    )

    analyst = Agent(
        role="Support Analyst",
        goal="Summarize the user's request clearly",
        backstory="You write concise summaries for internal support teams.",
        llm=llm,
        verbose=False,
    )

    task = Task(
        description="Summarize this input in one sentence: {input_text}",
        expected_output="A one-sentence summary.",
        agent=analyst,
    )

    return Crew(
        agents=[analyst],
        tasks=[task],
        process=Process.sequential,
        verbose=False,
    )

def lambda_handler(event, context):
    input_text = event.get("input_text", "No input provided")
    crew = build_crew()
    result = crew.kickoff(inputs={"input_text": input_text})
    return {
        "statusCode": 200,
        "body": str(result),
    }
  2. Add your dependencies in a requirements file.
    Lambda does not install anything for you unless you package it yourself or use a container image. Start with pinned versions so your deployment is repeatable.
# requirements.txt
crewai==0.86.0
crewai-tools==0.17.0
openai==1.59.6
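Pinned versions also make it easier to keep an eye on bundle size: Lambda rejects deployment packages over 250 MB unzipped, and crewai’s dependency tree is large. A quick local check before zipping might look like this (a sketch; dir_size_mb is a hypothetical helper, and the build/ path matches the packaging step below):

```python
# size_check.py - Lambda's unzipped deployment limit is 250 MB, so it's
# worth measuring the dependency folder before you zip it.
import os

def dir_size_mb(path: str) -> float:
    """Total size of all files under path, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)

if __name__ == "__main__":
    # Run after "pip install -r requirements.txt -t build"
    print(f"build/ is {dir_size_mb('build'):.1f} MB unzipped")
```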
  3. Test the handler locally before touching AWS.
    If it fails here, packaging it for Lambda will only make debugging slower. Use a real event payload so you know the handler contract is correct.
# test_local.py
import os
from app import lambda_handler

os.environ["OPENAI_API_KEY"] = "your-openai-key-here"

event = {
    "input_text": "Customer cannot reset their password after multiple attempts."
}

response = lambda_handler(event, None)
print(response)
  4. Package the code for AWS Lambda.
    The simplest beginner-friendly route is to install dependencies into a local folder and zip everything together. Dependencies must sit at the root of the ZIP next to app.py; the python/ subfolder layout is only for Lambda layers, not function packages. On macOS or Windows, use Docker if native wheels cause problems.
mkdir -p build

pip install -r requirements.txt -t build

cp app.py build/
cd build

zip -r ../crewai-lambda.zip .
cd ..
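Before uploading, it is worth sanity-checking the archive layout, since a misplaced root is the usual cause of the "No module named crewai" failure described under Testing It. A small sketch (check_bundle is a hypothetical helper; the file names match the steps above):

```python
# check_bundle.py - sanity-check a Lambda deployment ZIP before uploading.
# Assumption: the handler module and its dependencies must sit at the
# archive root, not under a subfolder like python/.
import zipfile

def check_bundle(path: str, handler_module: str = "app.py",
                 package: str = "crewai") -> list:
    """Return a list of problems found in the deployment ZIP."""
    problems = []
    names = zipfile.ZipFile(path).namelist()
    if handler_module not in names:
        problems.append(f"{handler_module} is not at the ZIP root")
    if not any(n.startswith(package + "/") for n in names):
        problems.append(f"{package}/ is not at the ZIP root")
    return problems

if __name__ == "__main__":
    # An empty list means the bundle layout looks right.
    print(check_bundle("crewai-lambda.zip"))
```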
  5. Create the Lambda function and configure environment variables.
    Use the Python 3.11 runtime and give the function enough memory to handle model latency comfortably. Store the API key in environment variables for now; move it to Secrets Manager later. If create-function fails with a role-assumption error right after creating the role, wait a few seconds and retry: IAM changes take a moment to propagate.
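The create-role command below reads a trust-policy.json file. This short script writes the stock Lambda trust policy, which simply allows the Lambda service to assume the role (the policy document itself is standard; the script wrapper is just a convenience):

```python
# make_trust_policy.py - writes the trust-policy.json referenced by the
# "aws iam create-role" command. This is the standard trust policy that
# lets the Lambda service (lambda.amazonaws.com) assume the role.
import json

TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

with open("trust-policy.json", "w") as f:
    json.dump(TRUST_POLICY, f, indent=2)
```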
aws iam create-role \
  --role-name crewai-lambda-role \
  --assume-role-policy-document file://trust-policy.json

aws iam attach-role-policy \
  --role-name crewai-lambda-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

aws lambda create-function \
  --function-name crewai-tutorial \
  --runtime python3.11 \
  --handler app.lambda_handler \
  --zip-file fileb://crewai-lambda.zip \
  --role arn:aws:iam::<YOUR_ACCOUNT_ID>:role/crewai-lambda-role \
  --timeout 30 \
  --memory-size 512 \
  --environment Variables="{OPENAI_API_KEY=your-openai-key-here}"
  6. Invoke the function and inspect logs.
    A successful invoke proves your bundle is valid, your handler signature is correct, and CrewAI can run inside Lambda’s runtime constraints. The --cli-binary-format flag is required with AWS CLI v2 so the JSON payload is read as plain text rather than base64.
aws lambda invoke \
  --function-name crewai-tutorial \
  --cli-binary-format raw-in-base64-out \
  --payload '{"input_text":"The user wants to change their billing address."}' \
  response.json

cat response.json

Testing It

Start by checking that response.json contains a statusCode of 200 and a non-empty body. Then open CloudWatch Logs for the function and confirm there are no import errors or timeout messages.
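That first check can be scripted so repeat invokes stay honest. A minimal sketch (check_response is a hypothetical helper; the file name matches the invoke command above):

```python
# check_response.py - verify the response.json written by "aws lambda
# invoke": a 200 statusCode and a non-empty body.
import json

def check_response(path: str = "response.json") -> None:
    with open(path) as f:
        resp = json.load(f)
    assert resp.get("statusCode") == 200, f"unexpected response: {resp}"
    assert str(resp.get("body", "")).strip(), "empty body in response"

if __name__ == "__main__":
    check_response()
    print("response.json looks good")
```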

If you see dependency errors like No module named crewai, your ZIP structure is wrong (dependencies must sit at the ZIP root) or you built the wheels on the wrong platform. If you see timeouts, reduce model usage, lower complexity, or increase the Lambda timeout to match your workload.

A good next check is to invoke the function multiple times with different inputs and confirm outputs stay short and consistent. For production use, also test failure cases like missing input_text and invalid API keys.
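Those failure cases can be rehearsed without AWS by factoring validation out of the handler. A sketch (validate_event is a hypothetical helper, not part of app.py above; you would call it at the top of lambda_handler and return a 400 on failure):

```python
# validate.py - input validation you might add ahead of the crew call,
# so a missing or malformed input_text fails fast with a 400 instead of
# burning an LLM call.
def validate_event(event: dict) -> tuple:
    """Return (status_code, message) for the incoming Lambda event."""
    text = event.get("input_text")
    if not isinstance(text, str) or not text.strip():
        return 400, "input_text must be a non-empty string"
    return 200, text.strip()
```

Wiring it in keeps the handler contract explicit: a 200 means the crew actually ran, and a 400 means the caller sent a bad event.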

Next Steps

  • Move the API key from Lambda environment variables to AWS Secrets Manager.
  • Put API Gateway in front of the function so your agent can be called over HTTP.
  • Add structured JSON output from CrewAI so downstream systems can parse results reliably.


By Cyprian Aarons, AI Consultant at Topiax.
