How to Fix 'authentication failed' in LlamaIndex (TypeScript)

By Cyprian Aarons · Updated 2026-04-21
Tags: authentication-failed, llamaindex, typescript

When LlamaIndex throws authentication failed, it usually means the SDK reached the provider, but the credentials were missing, wrong, expired, or not being read from the environment. In TypeScript projects, this most often shows up when you instantiate an OpenAI-backed LlamaIndex component without a valid API key.

The error typically appears during embedding calls, chat completions, or index construction. If you see something like Error: 401 Unauthorized or OpenAIError: Authentication failed, start with your API key wiring before touching anything else.

The Most Common Cause

The #1 cause is simple: the OpenAI API key is not being passed correctly to the LlamaIndex TypeScript client.

This usually happens when:

  • .env is loaded too late
  • process.env.OPENAI_API_KEY is undefined
  • you created the client before setting env vars
  • you used the wrong environment variable name

Broken vs fixed pattern

Broken:

```ts
import { OpenAI, Settings } from "llamaindex";

const llm = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

Settings.llm = llm;
```

Fixed:

```ts
import "dotenv/config";
import { OpenAI, Settings } from "llamaindex";

const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  throw new Error("OPENAI_API_KEY is missing");
}

const llm = new OpenAI({ apiKey });

Settings.llm = llm;
```

The broken version looks fine at a glance, but if `dotenv` was never loaded, `process.env.OPENAI_API_KEY` is `undefined`. LlamaIndex then forwards that to OpenAI, and you get a failure like:

```text
Error: 401 Unauthorized
OpenAIError: Authentication failed. No API key provided.
```

If you're using embeddings too, the same issue applies there:

```ts
import "dotenv/config";
import { OpenAIEmbedding } from "llamaindex";

const embedModel = new OpenAIEmbedding({
  apiKey: process.env.OPENAI_API_KEY!,
});
```

Do not rely on the non-null assertion operator (`!`) as a fix. It only hides the problem until runtime.

Other Possible Causes

1. Wrong environment variable name

LlamaIndex TypeScript does not guess your secret name. If your app expects OPENAI_API_KEY but you set OPEN_AI_KEY, authentication fails.

```ts
// Broken
console.log(process.env.OPEN_AI_KEY);

// Fixed
console.log(process.env.OPENAI_API_KEY);
```

If you're running in Docker or CI, verify the variable exists in that runtime, not just on your laptop.
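A quick way to verify is a startup check that runs inside the container or CI job itself, not on your laptop. This is a minimal sketch; `missingEnv` is an illustrative helper name, not a LlamaIndex API:

```ts
// List which required environment variables are absent or blank
// in the *current* runtime (container, CI job, or local shell).
function missingEnv(names: string[]): string[] {
  return names.filter((name) => !process.env[name]?.trim());
}

const missing = missingEnv(["OPENAI_API_KEY"]);
if (missing.length > 0) {
  console.error("Missing env vars:", missing.join(", "));
}
```

Running this as the first statement of your entrypoint surfaces deployment-config problems before any LlamaIndex call can fail with a 401.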

2. .env file is not loaded early enough

If you import and initialize LlamaIndex before loading env vars, the client may be created with no credentials.

```ts
// Broken: with ESM, all imports are hoisted and evaluated before
// dotenv.config() runs, so any module that reads env vars at import
// time never sees the values from .env.
import { OpenAI } from "llamaindex";
import dotenv from "dotenv";

dotenv.config();

const llm = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
```

```ts
// Fixed: "dotenv/config" loads the .env file as a side effect of the
// import itself, before anything else is evaluated.
import "dotenv/config";
import { OpenAI } from "llamaindex";

const llm = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! });
```

With ESM TypeScript projects, `import "dotenv/config"` at the top is the cleanest option.

3. Using Azure/OpenRouter/another provider with the wrong client

A common mistake is pointing LlamaIndex at a non-OpenAI endpoint but still using the default OpenAI client settings.

```ts
// Broken: endpoint belongs to an Azure/OpenRouter-style setup,
// but you're still using default OpenAI auth assumptions.
new OpenAI({
  apiKey: process.env.AZURE_OPENAI_KEY!,
});
```

For Azure OpenAI, use the Azure-specific configuration supported by your stack rather than forcing it through a plain OpenAI client. The exact class depends on your LlamaIndex TS version and provider integration.

Typical misconfigurations include:

  • wrong base URL
  • wrong deployment name/model name
  • using an Azure key where an OpenAI key is expected

4. Key revoked, rotated, or expired

If the code worked yesterday and fails today with a 401, assume credential rotation first.

```ts
const llm = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!, // value may now be stale
});
```

Check:

  • secret manager version
  • CI/CD injected env vars
  • local .env
  • provider dashboard for revoked keys

The error often looks like:

```text
OpenAIError: Authentication failed due to invalid API key.
Status code: 401
```

5. Passing whitespace or quotes into the secret

This happens more than people admit. A copied key with trailing spaces or quotes can fail auth even though it looks correct.

```text
# Broken
OPENAI_API_KEY="sk-proj-abc123 "

# Fixed
OPENAI_API_KEY=sk-proj-abc123
```

If you load secrets from a file or vault export, trim them before use:

```ts
const apiKey = process.env.OPENAI_API_KEY?.trim();
```
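To be defensive about accidental wrapping quotes as well, a small normalizer helps. This is a sketch; `sanitizeSecret` is an illustrative name, not part of LlamaIndex:

```ts
// Normalize a secret copied from a file or vault export:
// trim whitespace, then strip one accidental pair of wrapping quotes.
function sanitizeSecret(raw: string | undefined): string | undefined {
  if (!raw) return undefined;
  const unquoted = raw.trim().replace(/^(["'])(.*)\1$/, "$2");
  // Returning undefined for an empty result lets callers fail fast
  // instead of sending a blank key to the provider.
  return unquoted.trim() || undefined;
}

const apiKey = sanitizeSecret(process.env.OPENAI_API_KEY);
```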

How to Debug It

  1. Print whether the key exists, not the full value

    console.log("API key present:", Boolean(process.env.OPENAI_API_KEY));
    

    If this prints false, stop there. Your issue is env loading or deployment config.

  2. Check where LlamaIndex is failing

    • During embedding creation?
    • During index construction?
    • During chat/query execution?

    That tells you whether the bad credential is attached to OpenAIEmbedding, OpenAI, or both.

  3. Inspect startup order

    Make sure this happens before any LlamaIndex imports that instantiate clients:

    import "dotenv/config";
    
  4. Test the raw provider outside LlamaIndex

    If direct provider calls fail too, it’s not a LlamaIndex bug.

    • Wrong key format?
    • Revoked secret?
    • Wrong account/project?
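For step 4, here is a minimal raw check against OpenAI's public REST API, bypassing LlamaIndex entirely. It assumes the standard `https://api.openai.com` endpoint and Node 18+ (built-in `fetch`); the `fetchFn` parameter only exists to make the helper easy to stub in tests:

```ts
// Hit the provider directly. HTTP 200 means the key is accepted;
// 401 means the credential itself is the problem, not LlamaIndex.
async function checkOpenAIKey(
  apiKey: string,
  fetchFn: typeof fetch = fetch,
): Promise<boolean> {
  const res = await fetchFn("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  console.log("Auth check status:", res.status);
  return res.ok;
}
```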

Prevention

  • Load secrets at process startup and fail fast if required env vars are missing.
  • Centralize model/client creation so every LLM and embedding instance uses the same validated config.
  • Add a startup health check that verifies auth before serving traffic.

A good production pattern looks like this:

```ts
import "dotenv/config";
import { OpenAI } from "llamaindex";

function requireEnv(name: string): string {
  const value = process.env[name]?.trim();
  if (!value) throw new Error(`${name} is missing`);
  return value;
}

export const llm = new OpenAI({
  apiKey: requireEnv("OPENAI_API_KEY"),
});
```

That avoids silent failures and makes "authentication failed" turn into a clear config error before your app gets to production traffic.


By Cyprian Aarons, AI Consultant at Topiax.