LlamaIndex Tutorial (TypeScript): adding authentication for intermediate developers

By Cyprian Aarons · Updated 2026-04-21

This tutorial shows you how to put an authentication layer in front of a LlamaIndex TypeScript app so only verified users can ask questions and hit your index. You need this when your agent is exposed through an API, a dashboard, or a shared internal tool and you cannot let every request access your data.

What You'll Need

  • Node.js 18+
  • A TypeScript project with ts-node or a build step
  • llamaindex installed
  • express, jsonwebtoken, and dotenv
  • An OpenAI API key in .env
  • A JWT secret in .env
  • Basic familiarity with LlamaIndex query engines and Express routes

Install the packages:

npm install llamaindex express jsonwebtoken dotenv
npm install -D typescript ts-node @types/express @types/jsonwebtoken @types/node

Step-by-Step

  1. Set up your environment variables first. Keep the model key and auth secret separate; do not hardcode either one in source control.
OPENAI_API_KEY=sk-your-openai-key
JWT_SECRET=super-long-random-secret
PORT=3000
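Before the server boots, it helps to fail fast if any of these variables are missing. A minimal sketch; the requireEnv helper is an assumption for illustration, not part of the tutorial's code:

```typescript
// requireEnv throws at startup if a variable is absent, so a missing
// JWT_SECRET is caught before any request is served.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: validate everything up front, before wiring routes.
// const jwtSecret = requireEnv("JWT_SECRET");
// const openAiKey = requireEnv("OPENAI_API_KEY");
```

Calling this once at startup turns a confusing runtime auth failure into an obvious boot-time error.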
  2. Create a small authentication helper that signs and verifies JWTs. In a real bank or insurance setup, this would sit behind your SSO or identity provider, but this version is enough to protect the LlamaIndex endpoint.
import jwt from "jsonwebtoken";

const JWT_SECRET = process.env.JWT_SECRET || "dev-secret";

export type AuthTokenPayload = {
  sub: string;
  role: "user" | "admin";
};

export function signToken(payload: AuthTokenPayload): string {
  return jwt.sign(payload, JWT_SECRET, { expiresIn: "1h" });
}

export function verifyToken(token: string): AuthTokenPayload {
  return jwt.verify(token, JWT_SECRET) as AuthTokenPayload;
}
  3. Build the LlamaIndex query layer separately from auth. This keeps your retrieval logic reusable and makes it easy to test without involving HTTP.
import { Document, VectorStoreIndex } from "llamaindex";

const docs = [
  new Document({
    text: "Claims are paid after validation of policy coverage and incident details.",
  }),
  new Document({
    text: "KYC checks are required before opening a premium account.",
  }),
];

let queryEngine: Awaited<ReturnType<typeof buildQueryEngine>> | null = null;

async function buildQueryEngine() {
  const index = await VectorStoreIndex.fromDocuments(docs);
  return index.asQueryEngine();
}

export async function answerQuestion(question: string): Promise<string> {
  // Build the index lazily on first use; top-level await only works in
  // ESM, so this keeps the module usable under CommonJS ts-node setups.
  queryEngine ??= await buildQueryEngine();
  const response = await queryEngine.query({ query: question });
  return response.toString();
}
  4. Add an Express middleware that checks for a bearer token before the request reaches LlamaIndex. This is the important part: auth should fail fast, before any expensive model or retrieval call happens.
import express, { Request, Response, NextFunction } from "express";
import dotenv from "dotenv";
import { answerQuestion } from "./query";
import { verifyToken } from "./auth";

dotenv.config();

const app = express();
app.use(express.json());

function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.header("authorization");
  if (!header?.startsWith("Bearer ")) {
    return res.status(401).json({ error: "Missing bearer token" });
  }

  try {
    const token = header.slice("Bearer ".length);
    (req as Request & { user?: unknown }).user = verifyToken(token);
    next();
  } catch {
    return res.status(401).json({ error: "Invalid or expired token" });
  }
}
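The header parsing inside requireAuth can also be factored into a pure helper, which is easy to unit test without Express. A sketch; extractBearerToken is an illustrative name, not part of the tutorial's code:

```typescript
// Returns the raw token from an Authorization header value,
// or null if the header is missing or not a Bearer scheme.
export function extractBearerToken(header: string | undefined): string | null {
  if (!header?.startsWith("Bearer ")) {
    return null;
  }
  const token = header.slice("Bearer ".length).trim();
  return token.length > 0 ? token : null;
}
```

Keeping the parsing pure means the only thing the Express middleware adds is the 401 response wiring.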
  5. Expose two routes: one to mint a test token and one protected route that calls LlamaIndex. For production, replace the /login route with your identity provider flow; do not issue tokens from your app unless you really mean to own auth end-to-end.
import { signToken } from "./auth";

app.post("/login", (req, res) => {
  const { userId } = req.body as { userId?: string };
  if (!userId) {
    return res.status(400).json({ error: "userId is required" });
  }

  const token = signToken({ sub: userId, role: "user" });
  res.json({ token });
});

app.post("/ask", requireAuth, async (req, res) => {
  const { question } = req.body as { question?: string };
  if (!question) {
    return res.status(400).json({ error: "question is required" });
  }

  const answer = await answerQuestion(question);
  res.json({ answer });
});

app.listen(Number(process.env.PORT || 3000), () => {
  console.log(`Server running on port ${process.env.PORT || 3000}`);
});
  6. If you want role-based access, check the decoded token before allowing certain prompts or admin-only actions. This pattern is common when some users can query customer-facing knowledge while others can access operational documents.
app.post("/admin-ask", requireAuth, async (req, res) => {
  const user = (req as Request & { user?: { sub: string; role: string } }).user;
  if (user?.role !== "admin") {
    return res.status(403).json({ error: "Admin access required" });
  }

  const { question } = req.body as { question?: string };
  if (!question) {
    return res.status(400).json({ error: "question is required" });
  }

  const answer = await answerQuestion(question);
  res.json({ answer, requestedBy: user.sub });
});

Testing It

Start the server with npx ts-node src/server.ts, then call /login with a userId to get a token back. Use that token in an Authorization: Bearer <token> header when calling /ask.

If you omit the header, you should get a 401 Missing bearer token. If you send an invalid or expired token, you should get 401 Invalid or expired token.

A successful request should return the LlamaIndex answer JSON instead of an auth error. If you added the admin route, test it with both a normal user token and an admin token so you can confirm the 403 path works too; note that the demo /login route only ever issues user-role tokens, so you will need to sign an admin token yourself.

Next Steps

  • Replace the local JWT demo with OAuth2/OIDC via Azure AD, Okta, or Auth0.
  • Add per-user audit logging for every prompt sent to LlamaIndex.
  • Put document-level authorization in front of retrieval so users only search data they’re allowed to see.
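The last bullet can start as a plain filter applied before documents ever reach the index. A minimal sketch, assuming each document carries a hypothetical allowedRoles tag (not part of the tutorial's data model):

```typescript
type Role = "user" | "admin";

interface TaggedDoc {
  text: string;
  allowedRoles: Role[]; // hypothetical per-document authorization tag
}

// Keep only the documents the caller's role is allowed to see, then
// index or query the filtered set as usual.
export function docsForRole(docs: TaggedDoc[], role: Role): TaggedDoc[] {
  return docs.filter((doc) => doc.allowedRoles.includes(role));
}
```

With this in place, a user-role token would build or query an index over customer-facing documents only, while an admin token also sees operational ones.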

By Cyprian Aarons, AI Consultant at Topiax.
