LangChain Tutorial (TypeScript): deploying with Docker for intermediate developers

By Cyprian Aarons · Updated 2026-04-21

This tutorial shows you how to package a LangChain TypeScript app into a Docker image and run it like a real service. You need this when your local Node setup works, but you want the same behavior in staging, CI, or on a container platform.

What You'll Need

  • Node.js 20+
  • Docker Desktop or Docker Engine
  • An OpenAI API key
  • A TypeScript project with npm
  • These packages:
    • langchain
    • @langchain/openai
    • dotenv
    • typescript
    • tsx for local execution
  • Basic familiarity with:
    • async/await
    • environment variables
    • Docker images and containers

Step-by-Step

  1. Start with a minimal TypeScript project and install the runtime dependencies. Keep the app small so you can verify the container is correct before adding more LangChain logic.
mkdir langchain-docker-demo
cd langchain-docker-demo
npm init -y
npm install langchain @langchain/core @langchain/openai dotenv
npm install -D typescript tsx @types/node
npx tsc --init
  2. Create a simple LangChain script that calls the OpenAI API and prints one response. This keeps the deployment path realistic: environment variables in, text out.
// src/index.ts
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

async function main() {
  const model = new ChatOpenAI({
    model: "gpt-4o-mini",
    temperature: 0,
  });

  const response = await model.invoke([
    new HumanMessage("Write one sentence about Dockerizing TypeScript apps."),
  ]);

  console.log(response.content);
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});
  3. Add npm scripts and a clean TypeScript config so both local runs and container runs behave the same. The first block below is package.json, the second is tsconfig.json. Use CommonJS output here because it avoids extra friction when compiling and running under Node in Docker.
{
  "name": "langchain-docker-demo",
  "version": "1.0.0",
  "type": "commonjs",
  "scripts": {
    "dev": "tsx src/index.ts",
    "build": "tsc",
    "start": "node dist/index.js"
  },
  "dependencies": {
    "@langchain/openai": "^0.6.0",
    "dotenv": "^16.4.5",
    "langchain": "^0.3.0"
  },
  "devDependencies": {
    "@types/node": "^22.0.0",
    "tsx": "^4.19.2",
    "typescript": "^5.6.3"
  }
}
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "CommonJS",
    "moduleResolution": "Node",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "resolveJsonModule": true
  },
  "include": ["src"]
}
  4. Add your Dockerfile next. The pattern is simple: install dependencies, compile TypeScript, then run the compiled JavaScript in a slim runtime image.
# build stage: install all dependencies and compile TypeScript
FROM node:20-slim AS builder

WORKDIR /app

COPY package*.json tsconfig.json ./
RUN npm ci

COPY src ./src
RUN npm run build

# runtime stage: production dependencies plus compiled output only
FROM node:20-slim

WORKDIR /app

COPY package*.json ./
RUN npm ci --omit=dev

COPY --from=builder /app/dist ./dist

CMD ["node", "dist/index.js"]
  5. Pass your API key through environment variables instead of baking it into the image. That keeps secrets out of layers, logs, and git history.
# .env
OPENAI_API_KEY=your_openai_api_key_here
# build and run locally with Docker
docker build -t langchain-docker-demo .
docker run --rm --env-file .env langchain-docker-demo
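Because the key only arrives via the environment, it is worth failing fast with a clear message when it is missing, rather than getting an opaque authentication error from the first model call. A minimal sketch you could adapt near the top of src/index.ts (requireEnv is a hypothetical helper, not part of LangChain):

```typescript
// requireEnv: return a required environment variable, or throw a clear
// error before any LangChain call is made. Illustrative helper only.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value || value.trim() === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// usage near the top of src/index.ts:
// requireEnv("OPENAI_API_KEY"); // throws before the model call if the key is absent
```

ChatOpenAI reads OPENAI_API_KEY on its own; the guard just surfaces a misconfigured container immediately at startup.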
  6. If you want a tighter developer loop, test locally before building the image. That lets you catch LangChain or TypeScript issues without waiting on Docker builds every time.
npm run dev
npm run build
npm start

Testing It

Run npm run dev first and confirm you get a single natural-language sentence back from the model. Then build the image with docker build -t langchain-docker-demo . and run it using docker run --rm --env-file .env langchain-docker-demo.

If the container exits cleanly and prints a response, your deployment path is correct end to end. If it fails, check three things first: the API key value, whether dist/index.js exists after build, and whether your installed package versions match the imports in the code.

For production-style debugging, remember that docker exec only attaches to an already running container; for a one-off look inside the image, start it with an interactive shell instead, such as docker run --rm -it --entrypoint sh langchain-docker-demo. Otherwise keep the container immutable and fix issues in code or config.

Next Steps

  • Add structured output with Zod so your container returns JSON instead of plain text.
  • Wrap the model call in an HTTP API using Fastify or Express, then deploy that container behind a load balancer.
  • Add health checks and request logging so your LangChain service is observable in staging and production.
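As a dependency-free sketch of the health-check idea, here is what an endpoint could look like using Node's built-in http module rather than Fastify or Express (the /healthz path and port 3000 are assumptions, not conventions this tutorial established):

```typescript
import { createServer } from "node:http";

// Shape of the health response; uptime helps spot crash-restart loops.
function healthPayload(): { status: string; uptimeSeconds: number } {
  return { status: "ok", uptimeSeconds: Math.floor(process.uptime()) };
}

const server = createServer((req, res) => {
  if (req.url === "/healthz") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(healthPayload()));
    return;
  }
  res.writeHead(404);
  res.end("not found");
});

// server.listen(3000); // uncomment, and EXPOSE 3000 in the Dockerfile
```

Your container platform can then probe /healthz; Docker itself also supports a HEALTHCHECK instruction in the Dockerfile that could hit the same endpoint.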

By Cyprian Aarons, AI Consultant at Topiax.