How to Fix 'authentication failed' in LangGraph (TypeScript)

By Cyprian Aarons · Updated 2026-04-21
Tags: authentication-failed, langgraph, typescript

What the error means

authentication failed in LangGraph usually means your app reached a LangGraph service or dependency, but the credentials it sent were missing, expired, or wrong. In TypeScript projects, this typically shows up when creating a Client, calling a remote graph, or wiring an LLM provider through LangChain/LangGraph with the wrong env vars.

The key thing: this is rarely a LangGraph “bug”. It’s almost always an auth/config issue in your client setup, deployment environment, or API key propagation.

The Most Common Cause

The #1 cause is a missing or incorrect LangGraph Platform API key in your TypeScript client. People often set the key in .env, but never load it into Node at runtime, or they pass the wrong variable name to the Client constructor.

Here’s the broken pattern versus the fixed pattern.

Broken vs. fixed:

  • Broken: reads an env var that is never loaded. Fixed: loads env vars before creating the client.
  • Broken: uses the wrong variable name. Fixed: uses the exact expected key.
  • Broken: sends an empty apiKey. Fixed: passes a real token.
// broken.ts
import { Client } from "@langchain/langgraph-sdk";

const client = new Client({
  apiUrl: "https://api.langgraph.dev",
  apiKey: process.env.LANGGRAPH_API_KEY, // undefined at runtime
});

await client.assistants.search();

// fixed.ts
import "dotenv/config";
import { Client } from "@langchain/langgraph-sdk";

const client = new Client({
  apiUrl: "https://api.langgraph.dev",
  apiKey: process.env.LANGGRAPH_API_KEY!,
});

const assistants = await client.assistants.search();
console.log(assistants);

If the key is wrong or missing, you’ll usually see one of these:

  • 401 Unauthorized
  • authentication failed
  • Error: Request failed with status code 401
  • LangGraphError: authentication failed

In production, this often happens because .env works locally but not in Docker, Vercel, GitHub Actions, or a serverless runtime.

Other Possible Causes

1) You’re mixing up LangGraph Cloud auth and model-provider auth

LangGraph auth and OpenAI/Anthropic auth are separate. A valid OpenAI key does not authenticate you to LangGraph Cloud.

// broken
const client = new Client({
  apiUrl: "https://api.langgraph.dev",
  apiKey: process.env.OPENAI_API_KEY, // wrong credential type
});

// fixed
const client = new Client({
  apiUrl: "https://api.langgraph.dev",
  apiKey: process.env.LANGGRAPH_API_KEY,
});

If your graph calls an LLM node, you also need the model provider key configured separately:

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

2) Your environment variable is present locally but missing in deployment

This happens constantly with Next.js, Docker, and CI/CD. The code looks fine, but runtime env injection is broken.

# broken deployment config
LANGGRAPH_APIKEY=lg_xxx   # typo: missing underscore

# fixed deployment config
LANGGRAPH_API_KEY=lg_xxx
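A small startup check can catch this class of typo before the first request fails. The helper below is a sketch, not part of any SDK (`checkEnv` and `findNearMiss` are illustrative names): it reports missing variables and flags names in the environment that differ only by an underscore or casing.

```typescript
// env-check.ts — hypothetical startup helper: report missing env vars
// and flag near-miss names like LANGGRAPH_APIKEY vs LANGGRAPH_API_KEY.

function findNearMiss(required: string, available: string[]): string | undefined {
  const normalize = (s: string) => s.toLowerCase().replace(/[_-]/g, "");
  return available.find(
    (name) => name !== required && normalize(name) === normalize(required)
  );
}

function checkEnv(
  required: string[],
  env: Record<string, string | undefined>
): string[] {
  const problems: string[] = [];
  for (const name of required) {
    if (env[name]) continue; // present and non-empty: nothing to report
    const nearMiss = findNearMiss(name, Object.keys(env));
    problems.push(
      nearMiss
        ? `Missing ${name} (did you mean to rename ${nearMiss}?)`
        : `Missing ${name}`
    );
  }
  return problems;
}

// The typo above would be caught at startup:
console.log(checkEnv(["LANGGRAPH_API_KEY"], { LANGGRAPH_APIKEY: "lg_xxx" }));
```

Run it with `process.env` at boot and throw if the returned array is non-empty; the near-miss hint turns a silent `undefined` into a one-line fix.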

In Next.js, make sure server-side code reads secrets only on the server:

// server-only file: keeps the secret out of the client bundle
import "server-only";
import { Client } from "@langchain/langgraph-sdk";

export const client = new Client({ apiKey: process.env.LANGGRAPH_API_KEY! });

3) Your API key is expired, revoked, or scoped incorrectly

Some org setups rotate keys regularly. If a teammate rotated the key last week, your local .env may still contain the old one.

const client = new Client({
  apiUrl: "https://api.langgraph.dev",
  apiKey: "lg_old_revoked_key",
});

Fix by generating a fresh key and checking whether it belongs to the correct workspace/org.
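One low-risk way to confirm which key your process is actually running with is to log a masked fingerprint and compare it against the key shown in your dashboard. `maskKey` below is a hypothetical helper, not an SDK function; it never prints the full secret.

```typescript
// Hypothetical helper: log a masked fingerprint of a secret, never the full value.
function maskKey(key: string | undefined): string {
  if (!key || key.length < 10) return "(missing or too short to be a real key)";
  // First 5 and last 4 characters are enough to compare against a dashboard.
  return `${key.slice(0, 5)}…${key.slice(-4)} (${key.length} chars)`;
}

console.log(maskKey("lg_old_revoked_key")); // → lg_ol…_key (18 chars)
```

If the fingerprint does not match the active key in your workspace, you have found the problem without ever pasting a secret into a log.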

4) You are calling the wrong endpoint for your deployment

A valid token sent to the wrong base URL can still fail auth. This happens when dev/staging/prod endpoints get mixed up.

// broken
new Client({
  apiUrl: "https://api.langgraph.dev", // not your actual workspace endpoint
  apiKey: process.env.LANGGRAPH_API_KEY!,
});

// fixed
new Client({
  apiUrl: process.env.LANGGRAPH_URL!, // exact deployed endpoint for your graph service
  apiKey: process.env.LANGGRAPH_API_KEY!,
});

Check whether you’re using:

  • LangGraph Platform hosted URL
  • Self-hosted LangGraph endpoint
  • A proxy or gateway in front of it

Each one can require different auth headers or tokens.
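One way to keep that straight is to centralize the header choice in one place. The sketch below makes its assumptions explicit: the exact header names depend on your setup (LangGraph Platform keys are commonly sent as `x-api-key`; self-hosted and gateway setups vary), so verify them against your deployment's docs before relying on this.

```typescript
// Sketch: choose auth headers per deployment type.
// All header names here are assumptions — confirm them for your setup.
type Deployment = "platform" | "self-hosted" | "proxy";

function authHeaders(kind: Deployment, token: string): Record<string, string> {
  switch (kind) {
    case "platform":
      return { "x-api-key": token }; // LangGraph Platform style (assumption)
    case "self-hosted":
      return { Authorization: `Bearer ${token}` }; // common self-hosted choice
    case "proxy":
      return { "x-gateway-token": token }; // placeholder: whatever your gateway expects
  }
}
```

If your installed SDK version supports passing extra headers on the `Client` (check its config options, e.g. a `defaultHeaders` field in recent versions), you can feed the result of `authHeaders(...)` straight into the constructor.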

How to Debug It

  1. Print what your app actually sees at runtime

    Don’t trust .env files. Log only presence/shape, not full secrets.

    console.log({
      hasLangGraphKey: !!process.env.LANGGRAPH_API_KEY,
      langGraphUrl: process.env.LANGGRAPH_URL,
    });
    
  2. Inspect the exact request failure

    Look for status code and response body. A real auth issue will usually be 401 with a message like authentication failed.

    try {
      await client.assistants.search();
    } catch (error) {
      console.error(error); // look for a 401 status and an "authentication failed" body
    }
    
  3. Verify you are using the right class and package

    For remote LangGraph access, use Client from @langchain/langgraph-sdk.
    For graph execution inside your app, use LangGraph runtime classes like StateGraph.

    If you’re trying to authenticate a remote service with a local-only graph class, you’re debugging the wrong layer.

  4. Test outside your app

    Use curl/Postman against the same endpoint with the same token. If curl fails too, your issue is credentials or endpoint config, not TypeScript code.
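As a sketch, assuming your deployment expects the key in an `x-api-key` header and exposes the platform's assistants search route (verify both against your deployment's API docs), a minimal check looks like:

```shell
# Replace the URL with your actual deployed endpoint.
# -i prints the status line: a 401 here means credentials/endpoint config,
# not your TypeScript code.
curl -i -X POST \
  -H "x-api-key: $LANGGRAPH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{}' \
  "$LANGGRAPH_URL/assistants/search"
```

If this returns 200 but your app still fails, the problem is in how the app loads or passes the key, which narrows the search considerably.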

Prevention

  • Keep all secrets in one place and load them explicitly with dotenv/config or platform-native secret injection.
  • Separate credentials by concern:
    • LANGGRAPH_API_KEY for LangGraph access
    • OPENAI_API_KEY / provider keys for model calls
  • Add startup checks so bad config fails fast:
if (!process.env.LANGGRAPH_API_KEY) {
  throw new Error("Missing LANGGRAPH_API_KEY");
}
if (!process.env.LANGGRAPH_URL) {
  throw new Error("Missing LANGGRAPH_URL");
}

If you hit authentication failed, start with the client constructor and runtime env first. In most TypeScript LangGraph setups, that’s where the problem lives.


By Cyprian Aarons, AI Consultant at Topiax.