LangGraph Tutorial (TypeScript): deploying to AWS Lambda for beginners
This tutorial shows you how to package a LangGraph TypeScript app and run it on AWS Lambda behind a simple handler. You need this when your graph is already working locally, but you want a cheap, serverless deployment path for internal tools, chat endpoints, or scheduled agent jobs.
What You'll Need
- Node.js 20+
- AWS account with permission to create:
  - Lambda functions
  - IAM roles
  - CloudWatch logs
- AWS CLI configured locally
- A TypeScript LangGraph project
- These npm packages:
  - @langchain/langgraph
  - @langchain/core
  - @langchain/openai
  - @types/aws-lambda (types for the handler)
  - esbuild
  - typescript
- An API key for your model provider, for example OPENAI_API_KEY
- Basic familiarity with:
  - async/await
  - AWS Lambda handlers
  - JSON event payloads
Step-by-Step
- Start with a minimal LangGraph graph that can run in Lambda without any local state. The key point is that Lambda should receive an input object, run the graph, and return plain JSON. A quick local check follows the graph code below.
import { StateGraph, Annotation } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

// Graph state: the raw user input and the model's final answer.
const State = Annotation.Root({
  input: Annotation<string>(),
  output: Annotation<string>(),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});

// Single node: send the input to the model and store its reply.
async function generate(state: typeof State.State) {
  const response = await model.invoke([
    {
      role: "user",
      content: state.input,
    },
  ]);
  return { output: response.content.toString() };
}

const graph = new StateGraph(State)
  .addNode("generate", generate)
  .addEdge("__start__", "generate")
  .addEdge("generate", "__end__")
  .compile();

export { graph };
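Before adding the Lambda layer, it's worth confirming the graph runs on its own. A minimal local sketch, assuming the code above lives in src/graph.ts, OPENAI_API_KEY is set in your shell, and you run it with any TypeScript runner (tsx, ts-node, or a plain compile):

import { graph } from "./graph";

// Invoke the compiled graph directly; the result has the same shape as the State annotation.
const result = await graph.invoke({ input: "Say hello in five words." });
console.log(result.output);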
- Add an AWS Lambda handler that calls the compiled graph. Keep the handler thin; all business logic should stay inside the graph so the Lambda layer stays easy to test.
import type { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { graph } from "./graph";

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  // Parse the JSON body and fall back to a default prompt if none is given.
  const body = event.body ? JSON.parse(event.body) : {};
  const input = typeof body.input === "string" ? body.input : "Hello from Lambda";

  const result = await graph.invoke({ input });

  return {
    statusCode: 200,
    headers: {
      "content-type": "application/json",
    },
    body: JSON.stringify(result),
  };
};
- Add a package.json with a build script so you can bundle everything into one file for Lambda. For beginners, bundling with esbuild is simpler than managing multiple output files and module resolution issues.
{
  "name": "langgraph-lambda",
  "private": true,
  "type": "module",
  "scripts": {
    "build": "esbuild src/handler.ts --bundle --platform=node --target=node20 --format=esm --outfile=dist/index.mjs",
    "typecheck": "tsc --noEmit"
  },
  "dependencies": {
    "@langchain/core": "^0.3.0",
    "@langchain/langgraph": "^0.2.0",
    "@langchain/openai": "^0.5.0"
  },
  "devDependencies": {
    "@types/aws-lambda": "^8.10.0",
    "@types/node": "^22.0.0",
    "esbuild": "^0.24.0",
    "typescript": "^5.6.0"
  }
}
- Create your TypeScript config and environment setup. On Lambda, environment variables are the cleanest way to pass secrets like API keys.
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "resolveJsonModule": true,
    "outDir": "./dist"
  },
  "include": ["src/**/*.ts"]
}
export OPENAI_API_KEY="your-key-here"
npm install
npm run typecheck
npm run build
- Deploy the bundled file to Lambda and wire it to an HTTP event source. The simplest beginner path is a single Lambda function with an API Gateway trigger; the commands below build the zip first, create the function, and then put an HTTP API in front of it.
# Package the bundled handler into a zip Lambda can deploy.
mkdir -p deploy
cp dist/index.mjs deploy/index.mjs
cd deploy && zip -r ../function.zip . && cd ..

# Create the function (replace the role ARN with your own execution role).
aws lambda create-function \
  --function-name langgraph-lambda \
  --runtime nodejs20.x \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::123456789012:role/lambda-execution-role \
  --environment Variables="{OPENAI_API_KEY=$OPENAI_API_KEY}"
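The create-function call alone doesn't expose an HTTP endpoint. A hedged sketch of putting an HTTP API in front of it with API Gateway's quick-create; the region and account ID in the ARN are placeholders, and the first command prints an ApiEndpoint you can call:

# Create an HTTP API that proxies every request straight to the function.
aws apigatewayv2 create-api \
  --name langgraph-api \
  --protocol-type HTTP \
  --target arn:aws:lambda:us-east-1:123456789012:function:langgraph-lambda

# Allow API Gateway to invoke the function.
aws lambda add-permission \
  --function-name langgraph-lambda \
  --statement-id apigateway-invoke \
  --action lambda:InvokeFunction \
  --principal apigateway.amazonaws.com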
- Test locally before pushing to AWS if you want faster feedback loops. You can call the compiled handler directly with a fake API Gateway event shape; a command to run the snippet follows it.
import { handler } from "./src/handler";

// Minimal fake API Gateway v2 event; only the body is used by the handler.
const event = {
  body: JSON.stringify({ input: "Write one sentence about serverless agents." }),
} as any;

// The Lambda handler type also expects a context and callback, so pass stubs.
const result = (await handler(event, {} as any, () => {})) as any;

console.log(result.statusCode);
console.log(result.body);
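One way to run that snippet, assuming you save it as test-local.ts in the project root and are fine using tsx as a one-off TypeScript runner:

# Run the local smoke test without a separate compile step.
OPENAI_API_KEY="your-key-here" npx tsx test-local.ts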
Testing It
Invoke the deployed Lambda with a simple JSON payload containing input. If everything is wired correctly, you should get back a 200 response with the graph output in the body.
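If you wired up an HTTP API as in the deploy step, curl is the quickest check. The URL below is a placeholder; substitute the ApiEndpoint that create-api returned:

curl -s -X POST "https://your-api-id.execute-api.us-east-1.amazonaws.com/" \
  -H "content-type: application/json" \
  -d '{"input": "Write one sentence about serverless agents."}'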
Check CloudWatch logs for two things (one way to follow them from the CLI is shown after this list):
- cold start time on the first invocation
- any model/API errors coming from your provider key or network configuration
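A quick way to follow those logs, assuming the function name used earlier and AWS CLI v2 (which provides aws logs tail); cold starts show up as Init Duration in the REPORT lines:

aws logs tail /aws/lambda/langgraph-lambda --follow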
If you get a timeout, increase the Lambda timeout to at least 15 seconds while testing. If you get import errors, inspect the bundled file first; most beginner issues come from incomplete bundling or mismatched module formats.
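Two hedged examples of those fixes. The first raises the timeout with the AWS CLI. The second is a common esbuild workaround when a CommonJS dependency inside the ESM bundle fails with a require-is-not-defined style error; skip it if your bundle works as-is:

# Give the function more headroom while testing.
aws lambda update-function-configuration \
  --function-name langgraph-lambda \
  --timeout 30

# Optional: shim require() for CommonJS deps pulled into the ESM bundle.
esbuild src/handler.ts --bundle --platform=node --target=node20 --format=esm \
  --banner:js="import { createRequire } from 'module'; const require = createRequire(import.meta.url);" \
  --outfile=dist/index.mjs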
A good smoke test is sending the same prompt twice and confirming both responses are valid JSON and consistent with temperature set to zero.
Next Steps
- Add memory using DynamoDB instead of relying on in-memory state.
- Split your graph into multiple nodes for tool calling and routing.
- Add API Gateway request validation so bad payloads never reach your graph.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.