How to Fix 'invalid API key' in LangChain (Python)

By Cyprian Aarons · Updated 2026-04-21

Tags: invalid-api-key, langchain, python

What this error actually means

invalid API key in LangChain usually means the underlying provider SDK rejected the credential before any model call happened. You’ll see it when LangChain tries to initialize ChatOpenAI, OpenAI, AzureChatOpenAI, or another provider wrapper with a missing, malformed, or mismatched key.

Typical symptoms:

  • It works in one script, fails in another
  • The key is set in your shell, but not in your IDE
  • You copied an Azure key into the OpenAI wrapper
  • The error comes from the provider SDK, not LangChain itself

The Most Common Cause

The #1 cause is using the wrong environment variable name or passing the key incorrectly.

With modern LangChain, the OpenAI wrappers expect OPENAI_API_KEY unless you explicitly pass api_key. If you set OPEN_AI_KEY, API_KEY, or rely on a .env file that never gets loaded, you’ll get errors like:

  • openai.AuthenticationError: Incorrect API key provided
  • AuthenticationError: No API key provided
  • ValueError: Did not find openai_api_key, please add an environment variable OPENAI_API_KEY
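A quick way to confirm what your process actually sees is to inspect the environment directly. Here is a small sketch; `check_openai_key` is a hypothetical helper for illustration, not part of LangChain:

```python
import os

def check_openai_key(env) -> str:
    """Diagnose the OPENAI_API_KEY entry in a mapping of env vars."""
    key = env.get("OPENAI_API_KEY")
    if key is None:
        return "missing"
    if key != key.strip():
        return "has whitespace"
    if not key:
        return "empty"
    return "ok"

print(check_openai_key(os.environ))
```

Anything other than "ok" means the wrapper will fail before a single token is generated.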

Broken vs fixed

Broken (relies on an env var name the wrapper never reads):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # fails: OPENAI_API_KEY was never set
```

```bash
export OPEN_AI_KEY="sk-..."  # wrong variable name
```

Fixed (pass the key explicitly, or export the correct name):

```python
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key=os.environ["OPENAI_API_KEY"],
)
```

```bash
export OPENAI_API_KEY="sk-..."
```

If you use `.env`, make sure it is actually loaded before instantiating the client:

```python
from dotenv import load_dotenv
load_dotenv()

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
```

Without load_dotenv(), LangChain won’t magically read your file.

Other Possible Causes

1) You’re using the wrong provider class for your key

OpenAI and Azure OpenAI keys are not interchangeable.

```python
# Wrong: Azure key with the OpenAI class
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(api_key="AZURE_KEY", model="gpt-4o-mini")
```

```python
# Right: Azure OpenAI class with Azure config
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_deployment="my-deployment",
    api_version="2024-02-15-preview",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_key="AZURE_KEY",
)
```
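If you are not sure which kind of key you were handed, the shape is a rough tell: OpenAI keys usually start with `sk-`, while classic Azure OpenAI keys are typically 32-character hex strings. This is a heuristic sketch only, and `guess_key_type` is a name invented here:

```python
def guess_key_type(key: str) -> str:
    """Heuristic only: OpenAI keys usually start with 'sk-';
    classic Azure OpenAI keys are typically 32-character hex strings."""
    key = key.strip()
    if key.startswith("sk-"):
        return "openai"
    if len(key) == 32 and all(c in "0123456789abcdef" for c in key.lower()):
        return "azure-like"
    return "unknown"
```

Treat the result as a hint for debugging, not a guarantee; key formats can change.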

2) Your key has extra whitespace or quotes

This happens a lot when copying from secrets managers or shell scripts.

```python
# Broken: hidden whitespace from a copy/paste
api_key = " sk-proj-abc123\n"
llm = ChatOpenAI(api_key=api_key)
```

Fix it by stripping the value:

```python
import os

api_key = os.getenv("OPENAI_API_KEY", "").strip()
llm = ChatOpenAI(api_key=api_key)
```
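Secrets managers and shell scripts can also leave quote characters around the value. A small defensive sketch (`clean_secret` is a hypothetical helper, not a library function) handles both cases:

```python
import os

def clean_secret(raw):
    """Strip whitespace and accidental surrounding quotes from a secret.
    Returns None if nothing usable remains."""
    if raw is None:
        return None
    value = raw.strip().strip('"').strip("'")
    return value or None

api_key = clean_secret(os.getenv("OPENAI_API_KEY"))
```

Returning None for blank values lets your startup check treat "set but empty" the same as "not set at all".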

3) You set the variable in one process, but run Python in another

A common example is setting env vars in one terminal and running code from PyCharm, VS Code, Docker, or a notebook kernel that never inherited them.

In one terminal:

```bash
export OPENAI_API_KEY="sk-..."
python app.py
```

But inside Jupyter:

```python
import os
print(os.getenv("OPENAI_API_KEY"))  # None
```

Fix by setting the variable in the same runtime, or loading it explicitly with dotenv.
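For example, inside a notebook cell you can set the variable for the kernel's own process (placeholder value shown, matching the `"sk-..."` convention above):

```python
import os

# Set the variable in this kernel's own environment.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder; use your real key

# Or, equivalently, load it from a .env file:
# from dotenv import load_dotenv
# load_dotenv()

print(os.getenv("OPENAI_API_KEY"))  # no longer None
```

Anything instantiated after this line in the same kernel will see the variable.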

4) You’re on an old LangChain/OpenAI package combination

Old imports can trigger confusing auth failures because the wrapper behavior changed across versions.

```python
# Old style that may break depending on installed versions
from langchain.chat_models import ChatOpenAI
```

Use the current package split:

```python
from langchain_openai import ChatOpenAI
```

Also check your installed versions:

```bash
pip show langchain langchain-openai openai
```

If you have mismatched major versions, upgrade together.
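You can also read the installed versions from inside Python using the standard library; `installed_versions` below is a small hypothetical helper, not part of LangChain:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_versions(packages):
    """Map each package name to its installed version string, or None."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

print(installed_versions(["langchain", "langchain-openai", "openai"]))
```

A None entry means the package isn't installed in the interpreter you're actually running, which is itself a common source of "works in one script, fails in another".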

How to Debug It

  1. Print what LangChain will actually use:

    ```python
    import os
    print(repr(os.getenv("OPENAI_API_KEY")))
    ```

    If this prints None, an empty string, or a value with spaces/newlines, you found the issue.

  2. Check whether the error comes from LangChain or the provider SDK. Look for messages like:

    • openai.AuthenticationError: Incorrect API key provided
    • AuthenticationError: No API key provided

    That means LangChain passed something down correctly, but the provider rejected it.

  3. Verify you’re using the right class

    • OpenAI: ChatOpenAI
    • Azure OpenAI: AzureChatOpenAI
    • Anthropic: ChatAnthropic

    Using the wrong wrapper often looks like an auth issue.

  4. Test outside LangChain. Try a direct SDK call with the same env var:

    ```python
    import os

    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    print(client.models.list())
    ```

    If this fails too, your problem is credential/configuration, not LangChain.

Prevention

  • Keep one canonical env var name per provider:
    • OPENAI_API_KEY
    • ANTHROPIC_API_KEY
    • AZURE_OPENAI_API_KEY
  • Load .env files at process startup and fail fast if required secrets are missing.
  • Pin compatible versions of:
    • langchain
    • langchain-openai
    • openai

If you want fewer production surprises, validate config before creating any LLM client:

```python
import os

required = ["OPENAI_API_KEY"]
missing = [k for k in required if not os.getenv(k)]
if missing:
    raise RuntimeError(f"Missing env vars: {missing}")
```

That saves you from discovering “invalid API key” halfway through a request path.
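A slightly more defensive variant (a sketch; `require_env` is a name of my own) also strips whitespace, so the copy/paste problem described earlier can't slip through the startup check:

```python
import os

def require_env(names):
    """Fail fast if any required variable is missing or blank;
    return the stripped values otherwise."""
    missing = [n for n in names if not os.getenv(n, "").strip()]
    if missing:
        raise RuntimeError(f"Missing env vars: {missing}")
    return {n: os.environ[n].strip() for n in names}
```

Call it once at process startup and pass the returned values to your clients, rather than letting each wrapper look up the environment on its own schedule.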



By Cyprian Aarons, AI Consultant at Topiax.
