If you’ve ever written `from dotenv import load_dotenv; load_dotenv()` at the top of a Python script, you’re not alone. It’s a rite of passage for developers managing environment variables—until it bites you.
This little convenience from python-dotenv promises to load your .env file into os.environ with minimal fuss, but beneath its simplicity lurks a mess of anti-patterns that clash with Python’s core principles.
It’s implicit, fragile, and a maintenance nightmare. Let’s tear it apart and build something better with Pydantic, python-decouple, and good old os.getenv().
The Sins of load_dotenv()
It’s Un-Pythonic Magic
Python’s Zen (PEP 20) chants “explicit is better than implicit,” but load_dotenv() is a sorcerer’s trick. Call it, and it silently mutates os.environ—no meaningful return value, no clear signal of success.
Did it find .env?
Did it override existing vars?
You’re left guessing.
With load_dotenv(), you’re stuck probing os.environ to see what happened. That’s not how Python rolls.
No Meaningful Return, Just Side Effects
A good function gives you something back. load_dotenv()? It hands you a bare boolean (True if it found a file) and tells you nothing about what was actually loaded. It alters global state like a rogue script kiddie. You call it, and your environment changes behind a curtain.
Want to know what it loaded? Tough luck—go fish in os.environ.
A Pythonic version might return a dict of loaded values, letting you decide what to do next. Instead, it’s a void that forces trust over transparency.
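To make the contrast concrete, here’s a minimal sketch of what a transparent loader could look like — a hypothetical helper, not part of python-dotenv (though python-dotenv’s own dotenv_values() takes a similar return-a-dict approach):

```python
from pathlib import Path

def load_env_file(path: str = ".env") -> dict[str, str]:
    """Parse a .env file and RETURN the values instead of mutating globals."""
    values: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip("'\"")
    return values

# The caller decides what happens next; any mutation is explicit and visible:
# os.environ.update(load_env_file())
```

Now loading and applying are two separate, inspectable steps.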
It Breaks Single Responsibility
The Single Responsibility Principle says one function, one job. load_dotenv() laughs at that. It locates .env, parses it, decides override rules, and shoves everything into os.environ—all in one opaque blob.
That’s four responsibilities mashed together, making it brittle and hard to debug. A saner design would split those steps, but no, it’s a monolith that does too much and hides too much.
Fragile in Production
In dev, .env files are cute. In production? They’re a liability. Platforms like Docker or Heroku inject environment variables directly—no file needed. load_dotenv() ties you to a file-based workflow that falls apart outside your laptop.
Worse, its order-dependence (call it too late, and imported modules miss the vars) turns your codebase into a timing puzzle. It’s a dev crutch that doesn’t scale.
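The order-dependence is easy to reproduce without any files. A module that reads a variable at import time snapshots whatever is there at that moment; the variable name here is illustrative:

```python
import os

os.environ.pop("MYAPP_API_KEY", None)  # start from a clean slate

# Imagine this next line lives in config.py, executed when it is imported:
API_KEY = os.getenv("MYAPP_API_KEY")  # snapshot taken NOW, so it's None

# ...and only afterwards does load_dotenv() (simulated here) fill the env:
os.environ["MYAPP_API_KEY"] = "secret"

print(API_KEY)                      # None: the import-time snapshot never updates
print(os.getenv("MYAPP_API_KEY"))   # secret: only fresh reads see the value
```

Call load_dotenv() one import too late and every module-level constant like API_KEY is silently wrong.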
The Better Way: Three Pythonic Alternatives
Ditch the magic. Here are three solutions that respect Python’s ethos, keep your code clean, and work from dev to prod.
1. Pydantic Settings: The Modern Powerhouse
Pydantic’s BaseSettings—which lives in the separate pydantic-settings package as of Pydantic 2.x—brings type safety and clarity to your config game.
```python
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str
    api_key: str

    class Config:
        env_file = ".env"  # Optional

settings = Settings()
print(settings.database_url)
```
Why It Rocks:
- Explicitly defines what you need, with types.
- Pulls from env vars or `.env` without mutating globals.
- Raises loud errors if something’s missing (no silent fails).
- Scales seamlessly to production—no file dependency required.
Trade-Off: Adds pydantic as a dependency, but the robustness is worth it.
2. Python-Decouple: Lightweight and Pragmatic
python-decouple keeps it simple, fetching values without the bloat.
```python
from decouple import config

database_url = config("DATABASE_URL")
api_key = config("API_KEY", default="fallback", cast=str)
print(database_url)
```
Why It Rocks:
- Explicit key-by-key access—no blanket loads.
- Returns values, not side effects.
- Supports defaults for resilience.
- Leaner than Pydantic, still avoids `load_dotenv()`’s traps.
Trade-Off: Another dependency, less validation than Pydantic.
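If you like decouple’s ergonomics but want zero dependencies, the key-by-key pattern is easy to sketch yourself. This is a hypothetical helper, not the decouple API:

```python
import os

_MISSING = object()  # sentinel, so None or "" can be legitimate defaults

def config(key, default=_MISSING, cast=str):
    """Fetch one env var explicitly, with an optional default and cast."""
    value = os.getenv(key)
    if value is None:
        if default is _MISSING:
            raise KeyError(f"Environment variable {key!r} is not set")
        return default
    return cast(value)

os.environ["MYAPP_PORT"] = "8080"      # demo value for the example
port = config("MYAPP_PORT", cast=int)  # explicit read, typed result
debug = config("MYAPP_DEBUG", default=False)
```

Same explicit, value-returning shape as decouple—just without the install.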
3. Export + os.getenv(): The No-Dependency Classic
Skip libraries. Set vars with export (or your platform’s equivalent) and use Python’s built-in os.getenv().
```python
from os import getenv

database_url = getenv("DATABASE_URL")
api_key = getenv("API_KEY", "default_value")
print(database_url)
```
Why It Rocks:
- Zero dependencies—pure Python.
- Explicit reads, no hidden writes.
- Works anywhere env vars are set (shell, Docker, CI).
- Total control, no file assumptions.
Trade-Off: Manual setup in dev (e.g., export MY_VAR=value), no .env parsing out of the box.
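One caveat with bare os.getenv(): a missing variable comes back as None and surfaces as a confusing error far from the real cause. A tiny wrapper (a sketch, the name is made up) fails fast at startup instead:

```python
import os

def require_env(name: str) -> str:
    """Return the variable's value, or fail loudly if it's unset."""
    value = os.getenv(name)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

os.environ.setdefault("DATABASE_URL", "postgres://localhost/dev")  # demo value
database_url = require_env("DATABASE_URL")  # crashes at startup, not mid-request
```

You get Pydantic-style loud failures with zero dependencies.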
Making It Work for You
- Big Project? Pydantic’s your friend—structured, typed, future-proof.
- Quick and Dirty? `python-decouple` gets you there without fuss.
- Minimalist? `os.getenv()` with a shell script (e.g., `source .env`) keeps it raw and real.
For dev convenience, Pydantic and python-decouple can still use .env files—but they don’t force it. With os.getenv(), pair it with a script like:
```bash
#!/bin/bash
set -a
source .env
set +a
python app.py
```
The Verdict
load_dotenv() isn’t evil—it’s just lazy. It trades short-term ease for long-term pain, breaking Python’s clarity and control. Swap it for Pydantic’s polish, python-decouple’s simplicity, or os.getenv()’s purity. Your code will thank you, your team will thank you, and you’ll never debug a mysterious env var again.
What’s your pick? Drop a comment—I’m betting on Pydantic for the win.



Top comments (4)
I’ve been using the final method a lot more lately as it mirrors a production environment more closely. I’ll still maintain a .env file but just source it so when running the script it’s in the environment and I don’t have to export each environment separately. Pydantic seems great though I’ll have to check it out, really seems better for the dev experience
I use it for ad hoc projects too. In production we often have Pydantic as we usually offer a FastAPI way to communicate with our services. Given that, Pydantic has taken over due to its validation
Adding another pain point to using load_dotenv.
String values with a "#" are treated as a comment start even if in a quoted string.
I switched to pydantic and won't be looking back.
Thanks for this article!
I want to elaborate on this: for developer convenience, I usually recommend direnv (from direnv.net) or something similar. The 12factor app should be GIVEN its environment, not MAKE its own environment.