LangGraph Tutorial (TypeScript): deploying to AWS Lambda for intermediate developers
This tutorial shows you how to package a LangGraph TypeScript app and deploy it to AWS Lambda behind an API Gateway HTTP API. You need this when you want a stateless, serverless agent endpoint that can handle API Gateway requests without running a long-lived Node process.
What You'll Need
- Node.js 20+
- AWS account with permission to create:
  - Lambda functions
  - API Gateway HTTP APIs
  - IAM roles
- AWS CLI configured locally
- A model provider API key (this tutorial uses OpenAI); a LangSmith key is optional, for tracing
- The following npm packages: `@langchain/langgraph`, `@langchain/openai`, `zod`, `esbuild`, `typescript`, `@types/aws-lambda`
Step-by-Step
1. Create a new TypeScript project and install the dependencies. Keep the runtime small and avoid framework baggage; Lambda cold starts will punish you for it.

```bash
mkdir langgraph-lambda && cd langgraph-lambda
npm init -y
npm i @langchain/langgraph @langchain/openai zod
npm i -D typescript esbuild @types/aws-lambda @types/node
npx tsc --init
```
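`npx tsc --init` writes a default `tsconfig.json`. Since esbuild does the actual emitting, `tsc` only needs to typecheck; a reasonable configuration, offered as a suggestion rather than a requirement, is:

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "noEmit": true
  },
  "include": ["src"]
}
```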
2. Build a minimal LangGraph app in `src/graph.ts`. This graph takes a user message, calls an LLM once, and returns the assistant response in a shape that is easy to serialize from Lambda.
```ts
// src/graph.ts
import { ChatOpenAI } from "@langchain/openai";
import { StateGraph, Annotation, START, END } from "@langchain/langgraph";
import type { BaseMessage } from "@langchain/core/messages";

// Graph state: a message list whose reducer appends on every update.
const State = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (left, right) => left.concat(right),
    default: () => [],
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
  apiKey: process.env.OPENAI_API_KEY,
});

// Single node: send the accumulated messages to the model once.
async function callModel(state: typeof State.State) {
  const response = await model.invoke(state.messages);
  return { messages: [response] };
}

const graph = new StateGraph(State)
  .addNode("callModel", callModel)
  .addEdge(START, "callModel")
  .addEdge("callModel", END);

export const app = graph.compile();
```
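Before wiring up Lambda, you can sanity-check the graph with a one-off script. A minimal sketch, where `test-graph.ts` is an arbitrary filename and `tsx` is a convenient runner that is not in the dependency list above:

```ts
// test-graph.ts: run with `npx tsx test-graph.ts`
import { HumanMessage } from "@langchain/core/messages";
import { app } from "./src/graph";

async function main() {
  const result = await app.invoke({
    messages: [new HumanMessage("Say hello in one sentence.")],
  });
  // The reducer appended the model reply after our input message.
  console.log(result.messages[result.messages.length - 1].content);
}

main();
```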
3. Add a Lambda handler in `src/handler.ts`. The handler converts the incoming HTTP request into graph state, invokes the compiled graph, and returns JSON that API Gateway can forward directly.
```ts
// src/handler.ts
import type { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { HumanMessage } from "@langchain/core/messages";
import { app } from "./graph";

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  // API Gateway delivers the request body as a string (or undefined).
  const body = event.body ? JSON.parse(event.body) : {};
  const input = body.input ?? "Say hello in one sentence.";

  const result = await app.invoke({
    messages: [new HumanMessage(input)],
  });

  // The last message in the state is the model's reply.
  const lastMessage = result.messages[result.messages.length - 1];

  return {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      output: lastMessage.content,
    }),
  };
};
```
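One optional hardening step: the handler above lets a malformed JSON body escape as an unhandled 500. A small defensive parser is one fix; this is a sketch, and `parseBody` is a hypothetical helper name rather than part of the tutorial's required code:

```ts
// Hypothetical helper: parse the request body without throwing on bad JSON.
function parseBody(raw: string | undefined): { input?: string } {
  if (!raw) return {};
  try {
    return JSON.parse(raw);
  } catch {
    // Fall back to the default prompt instead of crashing the invocation.
    return {};
  }
}

// In the handler, replace the JSON.parse line with:
// const body = parseBody(event.body);
```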
4. Add a production build config and compile everything into a single Lambda bundle. This keeps deployment simple and avoids shipping your entire `node_modules` tree as-is.
```js
// build.mjs
import esbuild from "esbuild";

await esbuild.build({
  entryPoints: ["src/handler.ts"],
  bundle: true,
  platform: "node",
  target: "node20",
  outfile: "dist/index.js",
  format: "cjs",
  sourcemap: true,
});
```
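One optional tweak, shown as an assumption rather than a required step: the Node.js 20.x Lambda runtime already ships the AWS SDK for JavaScript v3, so if you later add AWS clients (for example for DynamoDB persistence), you can mark them external and keep the bundle small:

```js
// build.mjs variant: exclude SDK v3 clients the Lambda runtime provides.
await esbuild.build({
  entryPoints: ["src/handler.ts"],
  bundle: true,
  platform: "node",
  target: "node20",
  outfile: "dist/index.js",
  format: "cjs",
  sourcemap: true,
  external: ["@aws-sdk/*"],
});
```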
5. Update your `package.json` scripts so you can build locally before uploading to Lambda. This also gives you a repeatable way to catch TypeScript or bundling issues before deployment. Leave out `"type": "module"`: the bundle is CommonJS, so that flag would break `node dist/index.js` locally, and the `.mjs` extension already makes `build.mjs` run as an ES module.
```json
{
  "name": "langgraph-lambda",
  "version": "1.0.0",
  "scripts": {
    "build": "node build.mjs",
    "typecheck": "tsc --noEmit"
  }
}
```
6. Deploy the bundle to Lambda and wire it to an HTTP API. Set `OPENAI_API_KEY` as an environment variable on the function, point the handler at `index.handler`, and make sure the runtime is Node.js 20.x.
| Setting | Value |
|---|---|
| Runtime | Node.js 20.x |
| Handler | index.handler |
| Code entry | dist/index.js |
| Env var | OPENAI_API_KEY=... |
A simple deployment path (a scripted version follows below):

- run `npm run build`
- zip `dist/index.js` plus any required runtime files if you are not using a container image
- upload the zip to Lambda
- attach an API Gateway HTTP API with proxy integration
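The AWS CLI equivalents look roughly like this. Treat it as a sketch: the function name, role ARN, account ID, and region are placeholders, and your IAM execution role must already exist.

```bash
npm run build
cd dist && zip -r ../function.zip index.js index.js.map && cd ..

# First deployment: create the function.
# The 3-second default timeout is too short for LLM calls, so raise it.
aws lambda create-function \
  --function-name langgraph-lambda \
  --runtime nodejs20.x \
  --handler index.handler \
  --timeout 30 \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::<account-id>:role/<lambda-execution-role> \
  --environment "Variables={OPENAI_API_KEY=sk-...}"

# Later deployments: just push new code.
aws lambda update-function-code \
  --function-name langgraph-lambda \
  --zip-file fileb://function.zip

# Quick-create an HTTP API that proxies everything to the function.
# You may also need `aws lambda add-permission` so API Gateway can invoke it.
aws apigatewayv2 create-api \
  --name langgraph-api \
  --protocol-type HTTP \
  --target arn:aws:lambda:<region>:<account-id>:function:langgraph-lambda
```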
Testing It
Send a POST request to your API Gateway endpoint with JSON like `{ "input": "Write one sentence about DynamoDB." }`. If everything is wired correctly, you should get back a JSON response with an `output` field containing the model's answer.
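A quick way to check from the terminal, with the endpoint URL being a placeholder for whatever API Gateway assigned you:

```bash
curl -X POST "https://<api-id>.execute-api.<region>.amazonaws.com/" \
  -H "content-type: application/json" \
  -d '{ "input": "Write one sentence about DynamoDB." }'
```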
If you get a runtime import error, check that esbuild bundled the dependency tree correctly and that your handler path matches the compiled file name. If you get an auth error, verify the Lambda environment variable is set and that your model provider key has access to the selected model.
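For errors that only reproduce in Lambda, tail the function's CloudWatch logs (AWS CLI v2; the function name matches the deployment step above):

```bash
aws logs tail /aws/lambda/langgraph-lambda --follow
```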
For local validation, run:

```bash
npm run typecheck
npm run build
node dist/index.js
```
That last command won’t execute the handler by itself, but it confirms the bundle loads without syntax or module resolution errors.
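To actually exercise the handler locally, feed it a fake API Gateway event. A minimal sketch: the `smoke-test.mjs` filename is arbitrary, the event object is a bare-bones stand-in rather than a full `APIGatewayProxyEventV2`, and `OPENAI_API_KEY` must be set in your shell:

```js
// smoke-test.mjs: run with `node smoke-test.mjs` after `npm run build`
import bundle from "./dist/index.js";

// Bare-bones stand-in for an API Gateway v2 event.
const event = {
  body: JSON.stringify({ input: "Say hello in one sentence." }),
};

// This handler ignores context and callback, so stubs are enough here.
const response = await bundle.handler(event, {}, () => {});
console.log(response);
```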
Next Steps
- Add conversation state persistence with DynamoDB instead of sending only one-turn inputs.
- Replace the single-node graph with conditional routing for tool use and fallback paths.
- Package this as a Lambda container image if your dependency tree grows or cold start tuning becomes important.
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.