LangGraph Tutorial (TypeScript): deploying with Docker for intermediate developers
This tutorial shows you how to package a LangGraph TypeScript app into a Docker image and run it the same way in local dev, CI, and production. You need this when your graph works on your machine but you want a repeatable deployment artifact with pinned dependencies, environment variables, and a clean runtime.
What You'll Need
- Node.js 20+
- Docker Desktop or Docker Engine
- An OpenAI API key
- A LangGraph TypeScript project with `@langchain/langgraph` installed
- `typescript`, `tsx`, and `@types/node`
- A `.env` file for local development
- Basic familiarity with:
  - async/await
  - LangGraph state graphs
  - environment variables
Step-by-Step
1. Create a minimal LangGraph app entrypoint.

Keep the graph small and deterministic first. The point here is to prove the container build works before you add more nodes, tools, or persistence.

```typescript
// src/index.ts
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";
import { StateGraph, Annotation } from "@langchain/langgraph";

const State = Annotation.Root({
  messages: Annotation<string[]>({
    reducer: (left, right) => left.concat(right),
    default: () => [],
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});

async function callModel(state: typeof State.State) {
  const response = await model.invoke(state.messages);
  return { messages: [response.content.toString()] };
}

const graph = new StateGraph(State)
  .addNode("model", callModel)
  .addEdge("__start__", "model")
  .addEdge("model", "__end__")
  .compile();

const result = await graph.invoke({
  messages: ["Write one short sentence about Docker."],
});
console.log(result);
```
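The reducer on the `messages` channel is what lets each node return only its new messages while the graph accumulates the full history. Stripped of LangGraph, that accumulation behaves like the sketch below — plain TypeScript with no API calls, just the same `(left, right) => left.concat(right)` logic in isolation:

```typescript
// Standalone illustration of the concat reducer used by the messages channel.
// This is not LangGraph code; the Channel type and makeMessagesChannel helper
// are inventions of this sketch.
type Channel<T> = { value: T[]; reduce: (update: T[]) => void };

function makeMessagesChannel(): Channel<string> {
  const state: Channel<string> = {
    value: [], // matches `default: () => []`
    reduce(update) {
      // Each node's partial return is concatenated onto the existing state.
      state.value = state.value.concat(update);
    },
  };
  return state;
}

const channel = makeMessagesChannel();
channel.reduce(["Write one short sentence about Docker."]); // user input
channel.reduce(["Docker packages apps into containers."]);  // model reply
console.log(channel.value.length); // 2: history accumulates, nothing is overwritten
```

Because nodes return partial updates rather than whole states, adding more nodes later never requires each node to know about the others' output.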
2. Add a proper project setup for TypeScript execution.

For Docker, you want a build step that emits plain JavaScript into `dist/`. That keeps the runtime image smaller and avoids shipping TypeScript tooling into production.

```json
{
  "name": "langgraph-docker-ts",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "tsx src/index.ts",
    "build": "tsc -p tsconfig.json",
    "start": "node dist/index.js"
  },
  "dependencies": {
    "@langchain/langgraph": "^0.2.0",
    "@langchain/openai": "^0.5.0",
    "dotenv": "^16.4.5"
  },
  "devDependencies": {
    "@types/node": "^22.0.0",
    "tsx": "^4.19.0",
    "typescript": "^5.6.3"
  }
}
```
3. Add a TypeScript config that works cleanly in containers.

Use Node ESM output so your compiled code matches the `"type": "module"` package setting. This avoids common Docker/runtime mismatches where Node refuses to load emitted files.

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "resolveJsonModule": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*.ts"]
}
```
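The mismatch this avoids is mechanical enough to express as a check. The sketch below is a hypothetical helper (`esmConfigMismatch` is not part of any library) that flags the combinations of `package.json` `"type"` and tsconfig `"module"` that typically produce `ERR_REQUIRE_ESM` or "Cannot use import statement outside a module" inside the container:

```typescript
// Hypothetical diagnostic: do the package "type" field and the tsc "module"
// setting agree on the emitted module format?
type PkgType = "module" | "commonjs";

function esmConfigMismatch(pkgType: PkgType, tsModule: string): string | null {
  // NodeNext/Node16 follow package.json; ES*/ESNext always emit ESM.
  const followsPkg = /^(nodenext|node16)$/i.test(tsModule);
  const emitsEsm = followsPkg || /^es/i.test(tsModule);
  if (pkgType === "module" && !emitsEsm) {
    return `package.json says "module" but tsc emits CommonJS ("${tsModule}")`;
  }
  if (pkgType === "commonjs" && emitsEsm && !followsPkg) {
    return `package.json says "commonjs" but tsc emits ESM ("${tsModule}")`;
  }
  return null; // settings agree
}

console.log(esmConfigMismatch("module", "NodeNext")); // null: the combination used here
console.log(esmConfigMismatch("module", "CommonJS")); // describes the mismatch
```

With `NodeNext`, tsc defers to `package.json`, which is why that pairing is the safest default for containers.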
4. Build a multi-stage Docker image.

The first stage installs dependencies and compiles TypeScript. The second stage copies only what the app needs at runtime, which is the pattern you want for most internal AI services.

```dockerfile
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json tsconfig.json ./
RUN npm ci
COPY src ./src
RUN npm run build

FROM node:20-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
CMD ["node", "dist/index.js"]
```
5. Add environment handling and a `.dockerignore`.

Don't bake secrets into the image. Pass them at runtime through environment variables so the same image can move between local Docker, ECS, Kubernetes, or an internal platform without rebuilding.

```
node_modules
dist
.env
.git
Dockerfile
docker-compose.yml
npm-debug.log*
coverage
.DS_Store
```
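Since the container now relies entirely on injected environment variables, it pays to validate them at startup so a misconfigured deployment dies immediately with a clear message instead of failing on the first model call. A minimal sketch, assuming a hypothetical `requireEnv` helper (not a LangGraph or dotenv API):

```typescript
// Hypothetical fail-fast helper for container startup.
function requireEnv(
  env: Record<string, string | undefined>,
  names: string[],
): Record<string, string> {
  const missing = names.filter((n) => !env[n]);
  if (missing.length > 0) {
    // Crash loudly at boot; orchestrators surface this far better than a
    // mid-request auth error.
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(names.map((n) => [n, env[n] as string]));
}

// Usage near the top of src/index.ts:
// const { OPENAI_API_KEY } = requireEnv(process.env, ["OPENAI_API_KEY"]);
console.log(requireEnv({ OPENAI_API_KEY: "sk-test" }, ["OPENAI_API_KEY"]).OPENAI_API_KEY);
```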
6. Run it locally with Docker.

Build once, then run with your API key injected at runtime. If this fails, fix it here before adding any extra graph nodes or persistence layers.

```bash
docker build -t langgraph-ts .
docker run --rm \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  langgraph-ts
```
Testing It
You should see a JSON-like object printed to stdout containing the `messages` array returned by the graph. If you get an auth error, confirm `OPENAI_API_KEY` is set in your shell before running Docker.

If the container exits immediately with a module error, check that `package.json` has `"type": "module"` and that `tsconfig.json` uses `NodeNext`. If the build fails inside Docker but works locally, compare your local Node version with the base image version; keep both on Node 20+.
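That version comparison is easy to script. The sketch below parses the major version out of `process.version` (e.g. `"v20.11.0"`); `nodeMajor` is an illustrative helper, not a standard API, and the image version shown is an assumption you should read off your actual base image:

```typescript
// Extract the major version from a Node version string like "v20.11.0".
function nodeMajor(version: string): number {
  const match = /^v?(\d+)\./.exec(version);
  if (!match) throw new Error(`Unrecognized Node version string: ${version}`);
  return Number(match[1]);
}

const local = nodeMajor(process.version); // your shell's Node
const image = nodeMajor("v20.19.0");      // assumed: what `node --version` prints in node:20-alpine
if (local !== image) {
  console.warn(`Local Node ${local} differs from image Node ${image}; builds may diverge.`);
}
```

Run `docker run --rm node:20-alpine node --version` to get the image-side value for real.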
For a cleaner developer loop, run the app outside Docker first:
```bash
npm install
npm run dev
```
If both local execution and container execution work, you have a deployable baseline.
Next Steps
- Add persistent state with SQLite or Postgres-backed checkpointers.
- Wrap the graph in an HTTP API using Fastify or Express so Docker runs a service instead of a one-off script.
- Add health checks and structured logging before deploying to Kubernetes or ECS.
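To make the HTTP-service step concrete, here is a minimal sketch using only `node:http` (no Fastify or Express yet). `runGraph` is a placeholder standing in for `graph.invoke` from the entrypoint; wiring in the real compiled graph is left to your project:

```typescript
import { createServer } from "node:http";

// Placeholder for the real graph call, e.g.:
// (await graph.invoke({ messages: [input] })).messages
async function runGraph(input: string): Promise<string[]> {
  return [input, `echo: ${input}`];
}

const server = createServer(async (req, res) => {
  if (req.method !== "POST" || req.url !== "/invoke") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  for await (const chunk of req) body += chunk;
  try {
    const { message } = JSON.parse(body) as { message: string };
    const messages = await runGraph(message);
    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify({ messages }));
  } catch {
    res.writeHead(400).end("Bad request");
  }
});

// Bind 0.0.0.0 so the port is reachable from outside the container;
// binding localhost only would make `docker run -p` appear broken.
server.listen(Number(process.env.PORT ?? 3000), "0.0.0.0");
```

With this in place, the Dockerfile's `CMD` stays the same and you add `EXPOSE 3000` plus `-p 3000:3000` on `docker run`.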
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.