# How to Fix "invalid API key during development" in LangChain (TypeScript)
If you see an "invalid API key" error while running LangChain in TypeScript, the SDK is usually telling you one of two things: the key is missing, or the wrong value is being passed into the model constructor. This shows up a lot during local development when `.env` loading, import order, or environment variable names are off.

The error often appears with OpenAI-backed classes like `ChatOpenAI`, and the stack trace usually points to a request failing before the first token comes back. In practice, this is almost always a configuration issue, not a LangChain bug.
## The Most Common Cause
The #1 cause is this: you created the model before your environment variables were loaded, or you passed the wrong env var name. With LangChain JS/TS, `ChatOpenAI` reads `process.env.OPENAI_API_KEY`, so if that value is undefined at construction time, you'll get errors like:

- `Error: OpenAI API key not found`
- `Error: Incorrect API key provided`
- `401 Unauthorized`
- `invalid_api_key`
Here’s the broken pattern versus the fixed pattern.
```ts
// Broken
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  apiKey: process.env.LANGCHAIN_API_KEY,
});
console.log(await llm.invoke("Hello"));
```

```ts
// Fixed
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
console.log(await llm.invoke("Hello"));
```
A few things are wrong in the broken version:
- `LANGCHAIN_API_KEY` is not the OpenAI key LangChain needs for `ChatOpenAI`
- `dotenv` was never loaded
- if this file runs before env initialization, `process.env.OPENAI_API_KEY` will be empty
If you want to be explicit and production-safe, validate early:
```ts
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";

const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  throw new Error("OPENAI_API_KEY is missing");
}

const llm = new ChatOpenAI({ apiKey });
```
## Other Possible Causes
### 1. You used the wrong provider key for the model class

LangChain has provider-specific classes. A Google or Anthropic key will not work with `ChatOpenAI`.
```ts
// Broken
const llm = new ChatOpenAI({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
```

```ts
// Fixed
const llm = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
```
If you're using Anthropic, use `ChatAnthropic`. If you're using OpenAI, use `ChatOpenAI`.
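One way to make the wrong-key mistake harder is to centralize the provider-to-env-var mapping and fail fast with a message that names the exact variable. This is a minimal sketch; `PROVIDER_ENV_VARS` and `requireProviderKey` are illustrative names, not LangChain APIs:

```ts
// Map each provider to the env var its LangChain class expects.
const PROVIDER_ENV_VARS = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_API_KEY",
} as const;

type Provider = keyof typeof PROVIDER_ENV_VARS;

// Resolve the key for a provider, or throw naming the missing env var.
function requireProviderKey(provider: Provider): string {
  const envVar = PROVIDER_ENV_VARS[provider];
  const value = process.env[envVar]?.trim();
  if (!value) {
    throw new Error(`${envVar} is missing (required for provider "${provider}")`);
  }
  return value;
}
```

You could then construct models as `new ChatOpenAI({ apiKey: requireProviderKey("openai") })`, so a misnamed or missing variable fails loudly at startup instead of surfacing as a 401 mid-request.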
### 2. Your `.env` file is not being loaded in Node/TS

This happens when you forget to import `dotenv` or run code through a test runner that doesn't load env files automatically.
```ts
// Broken
import { ChatOpenAI } from "@langchain/openai";
```

```ts
// Fixed
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";
```
If you prefer explicit loading:
```ts
import dotenv from "dotenv";

dotenv.config();
```
Also check your runtime:
- `tsx src/index.ts`
- `node dist/index.js`
- Jest/Vitest runners may need separate env setup
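If you use Vitest, for example, one common approach is to point the config at a setup file that loads `dotenv` before any test runs. The `tests/setup-env.ts` path here is an assumption; adjust it to your project layout:

```ts
// vitest.config.ts
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    // Runs before each test file, so env vars exist before any
    // LangChain client is constructed inside a test.
    setupFiles: ["./tests/setup-env.ts"],
  },
});
```

where `tests/setup-env.ts` contains just `import "dotenv/config";`. Jest has an equivalent `setupFiles` option in its config.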
### 3. Your environment variable contains quotes or whitespace
A copied key can include invisible spaces or shell quotes that break auth.
```sh
# Broken
OPENAI_API_KEY=" sk-proj-abc123 "

# Fixed
OPENAI_API_KEY=sk-proj-abc123
```
In code, trim defensively if keys come from external config:
```ts
const apiKey = process.env.OPENAI_API_KEY?.trim();
```
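If you cannot control how the key gets pasted (CI secrets, shared config files), a small sanitizer covers both the quote and whitespace cases in one place. `sanitizeKey` is an illustrative helper, not part of LangChain:

```ts
// Strip surrounding whitespace and accidental shell quotes from a secret.
// Returns undefined for missing or effectively-empty values so callers
// can treat "blank" and "not set" the same way.
function sanitizeKey(raw: string | undefined): string | undefined {
  if (!raw) return undefined;
  const cleaned = raw.trim().replace(/^["']|["']$/g, "").trim();
  return cleaned.length > 0 ? cleaned : undefined;
}
```

Running the key through this before passing it to the model constructor turns the invisible-whitespace failure mode into either a working key or a clear "missing" state.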
### 4. You are mixing server and client environments
In Next.js or other hybrid setups, the key may exist on the server but not in browser code. LangChain model calls must happen server-side.
```ts
// Broken: client component / browser bundle
const llm = new ChatOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
```

```ts
// Fixed: server-only route handler or server action
export async function POST() {
  const llm = new ChatOpenAI({
    apiKey: process.env.OPENAI_API_KEY,
  });
  // ...invoke the model and return a response
}
```
If this code ends up in a browser bundle, your env var will be undefined unless it’s intentionally exposed — which you should not do for API keys.
## How to Debug It
- Print the resolved key source before constructing the model, and check whether it exists and whether it's empty:

  ```ts
  console.log("OPENAI_API_KEY exists:", !!process.env.OPENAI_API_KEY);
  ```

- Confirm which class you're using:
  - `ChatOpenAI` expects an OpenAI key.
  - `ChatAnthropic` expects an Anthropic key.
  - Don't pass a generic "LangChain key" unless your wrapper explicitly uses one.
- Inspect the exact runtime error. If you see `Error: OpenAI API key not found`, `401 Unauthorized`, or `Incorrect API key provided`, then the problem is auth/configuration, not prompt formatting or chain logic.
- Verify env loading in your actual execution path:
  - The file that sets up dotenv must run before any LangChain imports that instantiate clients.
  - In monorepos and tests, make sure each package/test runner loads its own env file.
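The presence checks above can be bundled into a tiny diagnostic that reports the state of a key without printing the secret itself. `describeEnvVar` is a hypothetical helper name:

```ts
// Report presence and shape of an env var without leaking its value.
function describeEnvVar(name: string): string {
  const value = process.env[name];
  if (value === undefined) return `${name}: not set`;
  if (value.trim() === "") return `${name}: set but empty`;
  if (value !== value.trim()) return `${name}: has leading/trailing whitespace`;
  return `${name}: set (${value.length} chars)`;
}

console.log(describeEnvVar("OPENAI_API_KEY"));
```

Logging the length and whitespace state, rather than the value, is usually enough to distinguish "never loaded" from "loaded but mangled" while keeping the secret out of your logs.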
## Prevention
- Validate required env vars at startup instead of letting requests fail later.
- Keep provider keys named clearly: `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`.
- Instantiate LangChain clients only after config is loaded.
- In CI and local dev, use a shared `.env.example` so missing keys are obvious before runtime.
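The first of those points can be a few lines run once at process startup, before any client is constructed. `assertEnv` is an illustrative name; pass it whichever variables your app actually needs:

```ts
// Fail fast at startup if any required env var is missing or blank.
function assertEnv(required: string[]): void {
  const missing = required.filter((name) => !process.env[name]?.trim());
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(", ")}`);
  }
}

// Call once at boot, before constructing any LangChain clients, e.g.:
// assertEnv(["OPENAI_API_KEY"]);
```

This turns the intermittent mid-request 401 into a deterministic crash with a readable message, which is much easier to debug.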
If this error keeps happening intermittently, treat it like a config bootstrap bug. In LangChain TypeScript apps, “invalid API key during development” usually means your app started with incomplete environment state, not that LangChain itself rejected a valid secret.
## Keep learning
- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.