LlamaIndex Tutorial (TypeScript): Deploying with Docker for Beginners
This tutorial shows you how to package a TypeScript LlamaIndex app into a Docker container and run it locally with one command. You need this when you want the same runtime everywhere: your laptop, CI, staging, or a small production service.
What You'll Need
- Node.js 20+
- Docker Desktop or Docker Engine
- An OpenAI API key
- A basic TypeScript project with `npm`
- These packages:
  - `llamaindex`
  - `dotenv`
  - `typescript`
  - `tsx` or `ts-node` for local runs
- A `.env` file with your API key
Step-by-Step
1. Set up the project and install dependencies.

   Start with a clean TypeScript app and install the LlamaIndex SDK plus the tooling needed to run TypeScript directly. This keeps the example simple and avoids adding a build pipeline before you need one.

   ```shell
   mkdir llamaindex-docker-ts
   cd llamaindex-docker-ts
   npm init -y
   npm install llamaindex dotenv
   npm install -D typescript tsx @types/node
   ```
2. Add a TypeScript config and environment file.

   Docker will copy these files into the image, so keep them minimal and explicit. The `.env` file is where your OpenAI key lives; do not bake secrets into the image.

   `tsconfig.json`:

   ```json
   {
     "compilerOptions": {
       "target": "ES2022",
       "module": "NodeNext",
       "moduleResolution": "NodeNext",
       "strict": true,
       "esModuleInterop": true,
       "skipLibCheck": true,
       "outDir": "dist"
     },
     "include": ["src/**/*.ts"]
   }
   ```

   Create the `.env` file from the shell:

   ```shell
   cat > .env <<'EOF'
   OPENAI_API_KEY=your_openai_api_key_here
   EOF
   ```
3. Create a minimal LlamaIndex app in TypeScript.

   This example uses an in-memory document, indexes it, then asks a question. It is small enough to run in Docker without external services beyond the OpenAI API.

   ```typescript
   // src/index.ts
   import "dotenv/config";
   import { Document, VectorStoreIndex } from "llamaindex";

   async function main() {
     const docs = [
       new Document({
         text: "Docker packages an application and its dependencies into a portable container.",
       }),
     ];

     const index = await VectorStoreIndex.fromDocuments(docs);
     const queryEngine = index.asQueryEngine();
     const response = await queryEngine.query({
       query: "What does Docker do?",
     });

     console.log(String(response));
   }

   main().catch((error) => {
     console.error(error);
     process.exit(1);
   });
   ```
4. Add scripts so you can run it locally before Dockerizing it.

   Always verify the app works on your machine first. If it fails locally, Docker will just make debugging slower.

   ```json
   {
     "name": "llamaindex-docker-ts",
     "version": "1.0.0",
     "type": "module",
     "scripts": {
       "dev": "tsx src/index.ts",
       "start": "node dist/index.js",
       "build": "tsc"
     },
     "dependencies": {
       "dotenv": "^16.4.5",
       "llamaindex": "^0.6.0"
     },
     "devDependencies": {
       "@types/node": "^22.0.0",
       "tsx": "^4.16.2",
       "typescript": "^5.5.4"
     }
   }
   ```
5. Create a Dockerfile that installs dependencies, builds the app, and runs it.

   This is the production pattern: a multi-stage build that compiles the app in one stage, then runs only the compiled output in the final container. It keeps the runtime image smaller and avoids shipping dev tools into production.

   ```dockerfile
   FROM node:20-alpine AS build
   WORKDIR /app
   COPY package*.json tsconfig.json ./
   RUN npm ci
   COPY src ./src
   RUN npm run build

   FROM node:20-alpine
   WORKDIR /app
   ENV NODE_ENV=production
   COPY package*.json ./
   RUN npm ci --omit=dev
   COPY --from=build /app/dist ./dist
   CMD ["node", "./dist/index.js"]
   ```
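   A useful companion to a multi-stage build like this (not part of the original steps, but standard practice) is a `.dockerignore` file, so your local `node_modules`, old build output, and `.env` never enter the build context:

   ```text
   node_modules
   dist
   .env
   ```

   Excluding `.env` here matters for security as well as size: it guarantees your API key cannot be copied into an image layer by accident, since you pass it at runtime instead.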
6. Build and run the container with your environment variable passed in.

   Use `--env-file` so Docker reads your local `.env` file without hardcoding secrets into commands or images. If you prefer Compose later, this same pattern maps cleanly to `environment:` entries.

   ```shell
   docker build -t llamaindex-docker-ts .
   docker run --rm --env-file .env llamaindex-docker-ts
   ```
Testing It
You should see a response printed to stdout that answers the question about Docker using your indexed document. If you get an authentication error, check that `OPENAI_API_KEY` is set correctly in `.env`.
If the container exits immediately with a module or build error, check that `"type": "module"` in `package.json` matches your TypeScript config and that all source files are under `src/`. For dependency issues, rebuild from scratch with `docker build --no-cache -t llamaindex-docker-ts .`.
A good sanity check is to change the document text and rerun the container. If the answer changes accordingly, your index is being built inside Docker correctly.
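The checks above can be condensed into a short command sequence. This is a sketch using the image tag from this tutorial; it relies on the fact that arguments after the image name override the Dockerfile's `CMD`:

```shell
# Confirm the compiled output actually made it into the image
docker run --rm llamaindex-docker-ts ls dist

# Rebuild from a clean slate if dependency state looks stale
docker build --no-cache -t llamaindex-docker-ts .

# Run with the env file and watch stdout for the answer
docker run --rm --env-file .env llamaindex-docker-ts
```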
Next Steps
- Move from a single hardcoded document to loading files from `/data` inside the container.
- Add a `docker-compose.yml` so you can manage environment variables and volume mounts more cleanly.
- Expose this as an HTTP API with Fastify or Express so other services can call your LlamaIndex app directly.
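As a sketch of the Compose direction, here is a minimal `docker-compose.yml` covering the first two ideas. The service name and the local `./data` directory are illustrative assumptions, not part of the tutorial:

```yaml
# docker-compose.yml — hypothetical sketch for this tutorial's image
services:
  app:
    build: .
    env_file:
      - .env                # same secret-handling pattern as --env-file
    volumes:
      - ./data:/data:ro     # mount local files read-only at /data
```

With this in place, `docker compose up --build` replaces the separate `docker build` and `docker run` commands.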
Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit