How to Fix 'invalid API key in production' in LangChain (TypeScript)

By Cyprian Aarons · Updated 2026-04-21

When LangChain says invalid API key in production, it usually means the key that works locally is not the key your deployed app is actually using. In TypeScript projects, this almost always shows up after a deploy to Vercel, Docker, AWS, or a serverless runtime where environment variables are different from your laptop.

The error often comes from OpenAI / ChatOpenAI initialization inside LangChain, and the underlying response is usually an HTTP 401 with something like:

  • AuthenticationError: Incorrect API key provided
  • Error: 401 Unauthorized
  • OpenAIError: The api_key client option must be set either by passing apiKey or by setting the OPENAI_API_KEY environment variable

The Most Common Cause

The #1 cause is reading process.env.OPENAI_API_KEY too early, or hardcoding a value that exists locally but not in production.

This happens a lot when people instantiate the model at module scope during build time, or when they rely on .env.local and forget that production does not load that file.

Broken vs fixed

Broken pattern                       Fixed pattern
Reads env at import time             Reads env at runtime
Assumes .env.local exists in prod    Uses platform secrets/config
Creates client before env is ready   Creates client after env is loaded
// ❌ Broken
import { ChatOpenAI } from "@langchain/openai";

export const model = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  model: "gpt-4o-mini",
});

// Later in a route handler
const res = await model.invoke("Hello");

// ✅ Fixed
import { ChatOpenAI } from "@langchain/openai";

export function getModel() {
  const apiKey = process.env.OPENAI_API_KEY;

  if (!apiKey) {
    throw new Error("OPENAI_API_KEY is missing");
  }

  return new ChatOpenAI({
    apiKey,
    model: "gpt-4o-mini",
  });
}

// Later in a route handler
const model = getModel();
const res = await model.invoke("Hello");

Why this matters:

  • In Next.js, serverless functions can evaluate modules during build or cold start.
  • In Docker, the image may be built without the runtime secret.
  • In CI/CD, you may have set the variable in one environment but not another.
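
A common follow-up concern with the fixed pattern is that it creates a new client on every request. A memoized factory keeps the runtime env read but reuses one instance per process. This is a minimal sketch: `Client` is a placeholder standing in for a `ChatOpenAI` instance, and `getModelMemoized`/`createClient` are illustrative names, not LangChain APIs.

```typescript
// Placeholder client type; in a real app this would be a ChatOpenAI instance.
type Client = { apiKey: string };

let cached: Client | undefined;

// Reads OPENAI_API_KEY at first call (runtime, after env is loaded),
// then reuses the same client for every subsequent call in this process.
export function getModelMemoized(
  createClient: (apiKey: string) => Client = (apiKey) => ({ apiKey }),
): Client {
  if (cached) return cached;

  const apiKey = process.env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error("OPENAI_API_KEY is missing at runtime");
  }

  cached = createClient(apiKey);
  return cached;
}
```

On serverless platforms this caches per warm instance, which is usually what you want: cold starts still read the current env, and warm invocations skip re-initialization.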

Other Possible Causes

1) Wrong environment variable name

LangChain’s OpenAI integration expects OPENAI_API_KEY, not OPEN_AI_KEY, OPENAI_KEY, or API_KEY.

// ❌ Broken
process.env.OPEN_AI_KEY;
process.env.OPENAI_KEY;
// ✅ Fixed
process.env.OPENAI_API_KEY;

If you pass the key manually, keep the naming consistent:

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

2) The key is present locally but missing in production

This is common when .env.local works on your machine but the deployment platform has no secret configured.

# Local only
OPENAI_API_KEY=sk-proj-local-dev-key

On Vercel, AWS Lambda, Render, Fly.io, or Docker Compose, you need to set it in that platform’s secret store:

# docker-compose.yml
services:
  app:
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}

If ${OPENAI_API_KEY} resolves to empty at runtime, the client fails with the missing-key OpenAIError shown earlier; if it resolves to a wrong value, requests fail with a 401.

3) Using an expired or revoked key

A key can work for weeks and then stop after rotation or revocation. The error still looks like an auth issue, but the fix is to replace the secret.

const model = new ChatOpenAI({
  apiKey: "sk-proj-old-revoked-key",
});

Replace it with a valid secret and redeploy. If you rotate keys often, make sure old deployments are not pinned to stale secrets.
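
If you rotate keys regularly, a preflight check at deploy or startup time can catch a stale secret before users hit it. This is a sketch, not a LangChain API: `verifyOpenAIKey` is our own helper name, using OpenAI's public list-models endpoint, where a 401 means the key is invalid or revoked. The fetch implementation is injectable so the helper can be tested without network access.

```typescript
type FetchLike = (
  url: string,
  init?: { headers?: Record<string, string> },
) => Promise<{ status: number }>;

// Preflight: verify the configured key authenticates before serving traffic.
// A 401 from the list-models endpoint means the key is bad or revoked.
export async function verifyOpenAIKey(
  apiKey: string,
  fetchImpl: FetchLike = fetch,
): Promise<boolean> {
  const res = await fetchImpl("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  return res.status !== 401;
}
```

Run it once at startup and refuse to serve traffic if it returns false; that turns a silent stale-secret deploy into an immediate, visible failure.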

4) Client-side code trying to call OpenAI directly

If you put LangChain OpenAI code in browser code, you expose your key and often break auth because production bundlers strip or replace server-only env vars.

// ❌ Broken: browser/client component
"use client";

import { ChatOpenAI } from "@langchain/openai";

Keep LangChain LLM calls on the server:

// ✅ Fixed: server route / server action / backend service
import { ChatOpenAI } from "@langchain/openai";

For Next.js App Router, put it in a server route:

// app/api/chat/route.ts
import { ChatOpenAI } from "@langchain/openai";

export async function POST() {
  const model = new ChatOpenAI({
    apiKey: process.env.OPENAI_API_KEY!,
    model: "gpt-4o-mini",
  });

  return Response.json(await model.invoke("ping"));
}

How to Debug It

  1. Log whether the variable exists at runtime
    • Don’t print the full key.
    • Print presence and length only.
console.log("OPENAI_API_KEY present:", !!process.env.OPENAI_API_KEY);
console.log("OPENAI_API_KEY length:", process.env.OPENAI_API_KEY?.length ?? 0);
  2. Check where the error happens

    • If it fails during import/build, you likely initialized ChatOpenAI too early.
    • If it fails on request, inspect runtime env injection and secret config.
  3. Call OpenAI outside LangChain once

    • This isolates whether LangChain is masking a plain auth issue.
const res = await fetch("https://api.openai.com/v1/models", {
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
});

console.log(res.status);
console.log(await res.text());
  4. Confirm deployment secrets
    • Check Vercel project env vars.
    • Check Docker runtime env.
    • Check CI/CD pipeline variables.
    • Make sure staging and production do not point at different keys by mistake.
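
If you suspect staging and production point at different keys, you can compare short fingerprints in logs without ever exposing the secret. `keyFingerprint` is our own helper name, built on Node's built-in crypto module; the output is a short one-way hash, so matching fingerprints across environments means matching keys.

```typescript
import { createHash } from "node:crypto";

// Returns a short, non-reversible fingerprint of the key so you can
// compare environments in logs without exposing the secret itself.
export function keyFingerprint(key: string | undefined): string {
  if (!key) return "(missing)";
  return createHash("sha256").update(key).digest("hex").slice(0, 8);
}

console.log("OPENAI_API_KEY fingerprint:", keyFingerprint(process.env.OPENAI_API_KEY));
```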

Prevention

  • Initialize ChatOpenAI inside server request handlers or factory functions, not at module top level.
  • Fail fast if process.env.OPENAI_API_KEY is missing so bad deployments break immediately.
  • Store secrets only in your deployment platform’s secret manager; do not depend on .env.local beyond local development.
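
The fail-fast advice can be a one-function startup check: validate every required variable once when the process boots, so a misconfigured deployment crashes immediately instead of returning 401s on the first request. `requireEnv` is an illustrative helper name, not a library API.

```typescript
// Fail fast: throw at startup if a required env var is missing or blank,
// so a bad deployment breaks loudly before it serves any traffic.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value || value.trim() === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// At server startup, before creating any clients:
// const OPENAI_API_KEY = requireEnv("OPENAI_API_KEY");
```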

If you’re seeing AuthenticationError: Incorrect API key provided in LangChain TypeScript, assume environment mismatch first. In production systems, that’s usually where the bug lives.

