# LlamaIndex Tutorial (TypeScript): Adding Authentication for Beginners
This tutorial shows you how to add authentication to a LlamaIndex TypeScript app so only approved users can call your agent or query endpoint. You need this when your index is exposed through an API, especially if you’re handling internal company data, customer records, or any other protected content.
## What You'll Need

- Node.js 18+ installed
- A TypeScript project with npm or pnpm
- LlamaIndex TypeScript packages: `llamaindex`, `express`, `jsonwebtoken`, `dotenv`, `zod`
- An OpenAI API key
- A simple auth secret for signing JWTs
- Basic familiarity with LlamaIndex documents, indexes, and query engines
## Step-by-Step

**Step 1.** Start by installing the packages and setting up your environment variables. For a beginner-friendly auth layer, JWT is enough: one endpoint issues tokens, another validates them before touching your index.

```bash
npm install llamaindex express jsonwebtoken dotenv zod
npm install -D typescript tsx @types/express @types/jsonwebtoken @types/node
```

Create a `.env` file:

```bash
OPENAI_API_KEY=your_openai_key_here
JWT_SECRET=super-long-random-secret
PORT=3000
```
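A quick guard against misconfiguration: read required variables through a helper that throws immediately when one is missing, instead of failing later with a confusing OpenAI or JWT error. A minimal sketch (the `requireEnv` name is my own, not part of any library):

```typescript
// Fail fast on missing configuration (illustrative helper).
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```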
**Step 2.** Build a tiny auth module that signs and verifies tokens. Keep it boring and explicit; the goal is to protect your LlamaIndex routes, not build a full identity system.

```typescript
// auth.ts
import jwt from "jsonwebtoken";

const JWT_SECRET = process.env.JWT_SECRET;
if (!JWT_SECRET) throw new Error("Missing JWT_SECRET");

export type AuthPayload = {
  userId: string;
  role: "user" | "admin";
};

export function signToken(payload: AuthPayload): string {
  return jwt.sign(payload, JWT_SECRET, { expiresIn: "1h" });
}

export function verifyToken(token: string): AuthPayload {
  // jwt.verify throws if the token is invalid or expired
  return jwt.verify(token, JWT_SECRET) as AuthPayload;
}
```
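If you're curious what `signToken` actually returns: an HS256 JWT is a base64url-encoded header and payload joined with a dot, plus an HMAC-SHA256 signature over that string. A stripped-down sketch using only `node:crypto` (illustration only — real code should keep using `jsonwebtoken`, which also handles claims like `exp`):

```typescript
import { createHmac } from "node:crypto";

// Encode an object as base64url JSON, as JWTs do.
function toBase64Url(obj: object): string {
  return Buffer.from(JSON.stringify(obj)).toString("base64url");
}

// Produce an HS256 token: header.payload.signature, all base64url.
function hs256Token(payload: object, secret: string): string {
  const header = toBase64Url({ alg: "HS256", typ: "JWT" });
  const body = toBase64Url(payload);
  const signature = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${signature}`;
}
```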
**Step 3.** Create a route that issues a token after validating a username and password. In production you'd check a database or identity provider; here we use hardcoded credentials so the example stays runnable.

```typescript
// server.ts
import "dotenv/config";
import express from "express";
import { z } from "zod";
import { signToken } from "./auth";

const app = express();
app.use(express.json());

const loginSchema = z.object({
  username: z.string(),
  password: z.string(),
});

app.post("/login", (req, res) => {
  const result = loginSchema.safeParse(req.body);
  if (!result.success) return res.status(400).json({ error: "Invalid body" });
  const { username, password } = result.data;
  if (username !== "demo" || password !== "demo123") {
    return res.status(401).json({ error: "Invalid credentials" });
  }
  const token = signToken({ userId: "user-123", role: "user" });
  return res.json({ token });
});
```
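One detail worth knowing even in a demo: comparing secrets with `!==` returns as soon as characters differ, which can leak timing information. Node's `crypto.timingSafeEqual` avoids that; hashing both inputs first gives them equal length, which `timingSafeEqual` requires. A sketch (the `safeEqual` helper name is my own):

```typescript
import { timingSafeEqual, createHash } from "node:crypto";

// Compare two strings in constant time. Hashing both sides to fixed-length
// buffers sidesteps timingSafeEqual's equal-length requirement.
function safeEqual(a: string, b: string): boolean {
  const ha = createHash("sha256").update(a).digest();
  const hb = createHash("sha256").update(b).digest();
  return timingSafeEqual(ha, hb);
}
```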
**Step 4.** Add a middleware that checks the bearer token before allowing access to your LlamaIndex endpoint. This is the part that actually protects retrieval and generation from unauthenticated callers.

```typescript
// server.ts (continued)
import { verifyToken } from "./auth";
import type { Request, Response, NextFunction } from "express";

function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization;
  if (!header?.startsWith("Bearer ")) {
    return res.status(401).json({ error: "Missing bearer token" });
  }
  try {
    const token = header.slice("Bearer ".length);
    (req as Request & { user?: unknown }).user = verifyToken(token);
    next();
  } catch {
    return res.status(401).json({ error: "Invalid or expired token" });
  }
}
```
**Step 5.** Wire LlamaIndex into the protected route. This example creates a small in-memory index and answers questions only after auth passes.

```typescript
// server.ts (continued)
import { Document, VectorStoreIndex } from "llamaindex";

const docs = [
  new Document({
    text: "Topiax supports secure AI workflows for financial services teams.",
    metadata: { source: "internal-note" },
  }),
];

// Top-level await requires ESM ("type": "module" in package.json)
const index = await VectorStoreIndex.fromDocuments(docs);
const queryEngine = index.asQueryEngine();

app.post("/query", requireAuth, async (req, res) => {
  const bodySchema = z.object({ question: z.string().min(1) });
  const result = bodySchema.safeParse(req.body);
  if (!result.success) return res.status(400).json({ error: "Invalid body" });
  const response = await queryEngine.query({ query: result.data.question });
  return res.json({ answer: response.toString() });
});

app.listen(Number(process.env.PORT ?? 3000), () => {
  console.log(`Server running on http://localhost:${process.env.PORT ?? 3000}`);
});
```
**Step 6.** Run the server and test the flow end to end. First get a token from `/login`, then use it on `/query`; without the token you should get a `401`.

```bash
npx tsx server.ts
```

Example request flow:

```bash
curl -s -X POST http://localhost:3000/login \
  -H 'Content-Type: application/json' \
  -d '{"username":"demo","password":"demo123"}'
```

Then call the protected query endpoint:

```bash
curl -s -X POST http://localhost:3000/query \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer YOUR_TOKEN_HERE' \
  -d '{"question":"What does Topiax support?"}'
```
## Testing It

Test three cases in order. First, hit `/query` with no `Authorization` header and confirm you get `401 Missing bearer token`. Then send an invalid or expired token and confirm you get `401 Invalid or expired token`. Finally, log in with the demo credentials and verify the response contains an answer from your LlamaIndex query engine.

If you want to be stricter, add role checks before allowing certain queries. That's useful when some users can only read public docs while admins can access sensitive collections.
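That role check can be a second middleware layered after `requireAuth`. A sketch of the idea (`requireRole` is my own name, and the minimal types below stand in for Express's so the snippet is self-contained):

```typescript
// Minimal structural types so the sketch stands alone; in the app you'd use
// Express's Request/Response/NextFunction and the AuthPayload from auth.ts.
type AuthPayload = { userId: string; role: "user" | "admin" };
type Req = { user?: AuthPayload };
type Res = { status: (code: number) => { json: (body: unknown) => void } };

// Factory: returns middleware that rejects callers whose role doesn't match.
function requireRole(role: AuthPayload["role"]) {
  return (req: Req, res: Res, next: () => void) => {
    if (req.user?.role !== role) {
      return res.status(403).json({ error: "Forbidden" });
    }
    next();
  };
}
```

In the app you'd wire it as `app.post("/admin/query", requireAuth, requireRole("admin"), handler)`, so the token check runs first and attaches `req.user`.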
## Next Steps

- Move credentials out of hardcoded values and into a real identity provider like Auth0, Clerk, or Azure AD
- Add per-user document filtering so authenticated users only retrieve data they're allowed to see
- Replace the in-memory documents with a real vector store like Pinecone, Postgres pgvector, or Qdrant
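The per-user filtering idea above can start very simply: tag each document with the roles allowed to see it, and filter before building the index for a session. A sketch of just the filtering step (the `allowedRoles` key is my own convention, not a LlamaIndex feature):

```typescript
type Role = "user" | "admin";

// Each source record declares which roles may retrieve it.
interface SourceDoc {
  text: string;
  allowedRoles: Role[];
}

// Keep only documents the caller may see; pass the survivors to
// VectorStoreIndex.fromDocuments when building that user's index.
function filterDocsForRole(docs: SourceDoc[], role: Role): SourceDoc[] {
  return docs.filter((d) => d.allowedRoles.includes(role));
}
```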
## Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies

By Cyprian Aarons, AI Consultant at Topiax.