LangChain Tutorial (Python): deploying with Docker for advanced developers

By Cyprian Aarons · Updated 2026-04-21

This tutorial shows you how to package a real LangChain Python app into a Docker image, run it locally, and keep the setup close to what you’d deploy in a containerized environment. You need this when your agent works on your laptop but needs the same runtime, dependencies, and environment variables in CI, staging, or production.

What You'll Need

  • Python 3.11+
  • Docker Desktop or Docker Engine
  • An OpenAI API key
  • pip and venv
  • These Python packages:
    • langchain
    • langchain-openai
    • python-dotenv
  • A basic understanding of environment variables and containers

Step-by-Step

  1. Start with a minimal LangChain app that reads its API key from the environment. Keep the code small and deterministic so the container build is easy to debug.
# app.py
import os

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

def main() -> None:
    api_key = os.environ["OPENAI_API_KEY"]

    model = ChatOpenAI(
        model="gpt-4o-mini",
        temperature=0,
        api_key=api_key,
    )

    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are a concise assistant."),
        ("user", "Summarize Docker deployment for LangChain in one sentence."),
    ])

    chain = prompt | model
    result = chain.invoke({})
    print(result.content)

if __name__ == "__main__":
    main()
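Reading the key with `os.environ["OPENAI_API_KEY"]` fails fast with a `KeyError`, but the traceback is unhelpful inside a container. A small helper can turn that into an actionable error message. This is a sketch; `require_env` is a hypothetical helper name, not part of LangChain.

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, failing fast with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"Missing required environment variable: {name}. "
            "Pass it with `docker run -e` or export it in your shell."
        )
    return value
```

In `app.py` you would then call `api_key = require_env("OPENAI_API_KEY")`, and a misconfigured container prints exactly what is missing instead of a bare `KeyError`.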
  2. Pin your dependencies. For container builds, unpinned or loosely pinned packages are how you end up with broken deployments after a transitive update.
# requirements.txt
langchain==0.3.27
langchain-openai==0.3.35
python-dotenv==1.1.1
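If you want to verify at startup that the running environment actually matches your pins, you can compare `requirements.txt` against installed distribution metadata. This is an illustrative sketch; `parse_pins` and `check_pins` are hypothetical helpers built on the standard library's `importlib.metadata`.

```python
from importlib import metadata

def parse_pins(requirements_text: str) -> dict[str, str]:
    """Parse `name==version` pins from requirements.txt content, skipping comments."""
    pins: dict[str, str] = {}
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        pins[name.strip()] = version.strip()
    return pins

def check_pins(pins: dict[str, str]) -> list[str]:
    """Return human-readable mismatches between pinned and installed versions."""
    problems = []
    for name, wanted in pins.items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            problems.append(f"{name}: not installed (want {wanted})")
            continue
        if installed != wanted:
            problems.append(f"{name}: installed {installed}, want {wanted}")
    return problems
```

Running `check_pins(parse_pins(Path("requirements.txt").read_text()))` in a container health check catches a drifted base image before your chain ever makes an API call.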
  3. Add a Dockerfile that installs dependencies first, then copies your application code. This keeps rebuilds fast because Docker can reuse the cached dependency layer when only your source changes.
FROM python:3.11-slim

WORKDIR /app

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

CMD ["python", "app.py"]
  4. Build the image locally, then run it with the API key injected at runtime. Do not bake secrets into the image; treat the container as disposable and the secret as external state.
docker build -t langchain-docker-demo .
docker run --rm \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  langchain-docker-demo
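Note that the `-e OPENAI_API_KEY="$OPENAI_API_KEY"` form expands the secret into the command line. If you script this, for example from a deploy tool, prefer the name-only `-e VAR` form so Docker reads the value from the calling environment. A sketch of building such an argv in Python (`docker_run_command` is a hypothetical helper):

```python
def docker_run_command(image: str, env_vars: list[str]) -> list[str]:
    """Build a `docker run` argv that forwards env vars by name only,
    so secret values never appear in the command line or shell history."""
    cmd = ["docker", "run", "--rm"]
    for name in env_vars:
        cmd += ["-e", name]  # name-only: Docker reads the value from our environment
    cmd.append(image)
    return cmd
```

You would then pass the result to `subprocess.run(...)` from a script that already has `OPENAI_API_KEY` exported.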
  5. If you want a cleaner local developer flow, use a .env file and load it from your shell before running Docker. This is useful when multiple services share the same environment variables.
cat > .env <<'EOF'
OPENAI_API_KEY=your_real_key_here
EOF

set -a
source .env
set +a

docker run --rm \
  -e OPENAI_API_KEY \
  langchain-docker-demo
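For local runs outside Docker, the pinned `python-dotenv` package does the same job in Python: `load_dotenv()` reads `.env` into the process environment. Under the hood the format is simple; a minimal sketch of the parsing (the hypothetical `parse_dotenv` below handles only the basic cases, while python-dotenv covers quoting, export prefixes, and interpolation):

```python
def parse_dotenv(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, ignoring blanks and # comments.
    A toy version of what python-dotenv's load_dotenv does."""
    values: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"').strip("'")
    return values
```

In real code, prefer `from dotenv import load_dotenv; load_dotenv()` at the top of `main()` for local development, and rely on `docker run -e` in containers.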
  6. For advanced setups, run as a non-root user and keep the build context lean by excluding test files, notebooks, and local artifacts with a .dockerignore. This matters once you start shipping more than one agent, and it keeps secrets like .env out of the build context entirely.
FROM python:3.11-slim

WORKDIR /app

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1

RUN useradd -m appuser
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .
USER appuser

CMD ["python", "app.py"]

Testing It

Run the container and confirm you get a model response instead of an import error or auth failure. If it fails, check three things first: the API key is present, the package versions match requirements.txt, and your Docker image actually copied app.py.

A good smoke test is to rebuild after changing only the prompt text in app.py. If Docker caching is working correctly, dependency installation should be skipped and only the application layer should refresh.

If you need more confidence, run the same command twice and compare the outputs. With temperature=0 the responses should be largely stable, though not strictly deterministic, which gives you a reasonable baseline before you wire this into CI or deploy behind an internal API.
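A tiny helper makes this comparison less eyeball-driven. This sketch (the name `stability_report` is my own) normalizes whitespace and counts distinct outputs across repeated runs; a single bucket means your baseline is stable.

```python
from collections import Counter

def stability_report(outputs: list[str]) -> Counter:
    """Count distinct whitespace-normalized outputs across repeated runs.
    One bucket in the result means every run produced the same text."""
    def normalize(s: str) -> str:
        return " ".join(s.split())
    return Counter(normalize(o) for o in outputs)
```

Feed it the captured stdout of several `docker run` invocations; if `len(report) > 1`, your output is drifting and CI assertions on exact text will be flaky.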

Next Steps

  • Add FastAPI so your LangChain chain runs behind an HTTP endpoint inside Docker.
  • Split configuration into typed settings with pydantic-settings for staging and production.
  • Add health checks and structured logging before moving this into Kubernetes or ECS.

By Cyprian Aarons, AI Consultant at Topiax.
