DEV Community

Basstardd


The Two-Field Mental Model: What Python Frameworks Actually Hide From You

Frameworks hide imperative complexity behind declarative simplicity. That's fine, until it isn't; it's the price we pay for comfort and convenience. When abstractions leak, understanding the machinery underneath becomes non-optional. But in most cases we end up staring at a wall of unintelligible framework code, and that is a real problem.

You write @pytest.fixture. Pytest auto-magically handles the rest.

You write async def. FastAPI and Uvicorn's event loop figures out the rest.

You write Column(String). SQLAlchemy maps it to your database. It lets you define what looks like a class attribute, yet the values you read and write live on instances. Did we just break Python? No, it's a bit of masking that lets app devs practice declarative programming without thinking about the inner workings of the framework or library.

This is the basic deal we make with frameworks and libraries: we declare what we want, and they handle how it happens. It is mostly a good deal, until your fixture mysteriously runs twice and you are left scratching your head with no clue what could possibly be wrong. At that point, "the framework handles it" becomes "I have no idea what's going on."

About Two Fields

Most Python libraries/frameworks operate on two distinct layers:

| Layer | Nature | You See It? |
| --- | --- | --- |
| Observable Field | Declarative | Yes: this is your code |
| Hidden Field | Imperative | No: this is the machinery |

Declarative = expressing what you want (intent, structure, outcome)
Imperative = executing how it happens (step-by-step machinery)

The observable field is comfortable. The hidden field is where bugs live.

Seeing Through the Abstraction

Let's pull back the curtain on three common patterns.

1. Async/Await with FastAPI-Uvicorn

What you write:

@app.get("/data")
async def fetch_data():
    return await db.query()

Clean. Declarative. "Fetch data asynchronously."

What actually happens:

coro = fetch_data()        # Creates a coroutine object (not executed yet!)
await coro                  # Event loop: schedule, suspend, resume, complete

The async def doesn't run your function—it creates a coroutine object. That object implements the coroutine protocol (__await__, send(), throw()). The event loop drives execution by repeatedly calling send() until completion.

At the FastAPI level, you define an async route handler and a path, and that's it. Under the hood, the coroutine object has a life of its own: your function call produces a coroutine that must be scheduled, suspended, and resumed by the event loop. All of this happens somewhere deep underground, where most devs never dare to go.
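You can watch this machinery by hand. Here is a minimal sketch that drives a coroutine the way an event loop would, using a toy `Suspend` awaitable (invented for this demo) instead of a real I/O operation:

```python
class Suspend:
    """Toy awaitable: suspends the coroutine exactly once."""
    def __await__(self):
        yield "suspended"            # what send() hands back at the suspension point

async def fetch_data():
    await Suspend()                  # stand-in for a real await (e.g. a DB call)
    return "rows"

coro = fetch_data()                  # just a coroutine object -- nothing has run yet
print(coro.send(None))               # -> "suspended": ran up to the first await
try:
    coro.send(None)                  # resume; the function runs to its return
except StopIteration as done:
    print(done.value)                # -> "rows": the return value rides on StopIteration
```

This is exactly the loop's contract: call `send()` repeatedly until `StopIteration` pops out carrying the return value. An event loop just does this for thousands of coroutines, deciding when each gets its next `send()`.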


2. Pytest Fixtures

What you write:

@pytest.fixture
def db_session():
    session = create_session()
    yield session
    session.close()

Elegant. Setup, provide value, teardown.

What actually happens:

gen = fixture_func()       # Generator object created
value = next(gen)          # __next__ called → setup runs → value yielded
# ... your test executes with 'value' ...
next(gen)                  # Resumes after yield → cleanup runs → StopIteration

That yield isn't magic—it's part of the generator function and iterator protocol. Pytest calls this generator function, gets a generator, advances it to get your fixture value, runs your test, then advances it again to trigger cleanup (which ends with StopIteration).
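You can simulate pytest's side of the deal in a few lines. This sketch uses a plain dict as a fake session (there is no real `create_session` here) and drives the fixture generator exactly as described above:

```python
def db_session():
    """Stands in for a pytest fixture; the session is faked with a dict."""
    session = {"open": True}      # stand-in for create_session()
    yield session                 # pytest receives this value
    session["open"] = False       # teardown: runs when the generator resumes

gen = db_session()                # generator object created -- nothing ran yet
value = next(gen)                 # setup runs, fixture value is yielded
assert value["open"]              # "the test" uses the fixture here
try:
    next(gen)                     # resume after yield -> teardown runs
except StopIteration:
    pass                          # generator exhausted, as pytest expects
assert not value["open"]          # teardown has closed the session
```

Swap the dict for a real database session and this is, conceptually, what happens around every yield-style fixture in your test suite.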


Here it's worth noticing a "creative misuse" of generators. This is a clever use of yield, not for its intended purpose (producing multiple values), but specifically to exploit its suspend/resume behavior for cleanup.

If you look semantically, return would be more appropriate for "give back one value." But return can't do cleanup, so we "abuse" yield for its superpower: pausing execution and resuming later.

# What we WANT semantically:
def fixture():
    resource = setup()
    return resource      # ← more appropriate for "one value"
    cleanup(resource)    # ← but this is impossible

# What we MUST do:
def fixture():
    resource = setup()
    yield resource       # ← "creative" use, not really iterating
    cleanup(resource)    # ← now this works!

By the way, this pattern is so common it has a name: "generator-based resource management" or "generator-based context managers."
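The standard library ships this exact pattern as `contextlib.contextmanager`: one yield splits a generator into setup and teardown, and the teardown runs even if the body raises. A minimal sketch, with the resource faked as a dict:

```python
from contextlib import contextmanager

@contextmanager
def managed_resource():
    resource = {"closed": False}   # stand-in for real setup
    try:
        yield resource             # suspend here; the with-block body runs
    finally:
        resource["closed"] = True  # teardown runs on exit, even on exceptions

with managed_resource() as r:
    assert not r["closed"]         # resource is live inside the block
assert r["closed"]                 # cleanup ran when the block exited
```

Pytest's yield fixtures and `contextmanager` are two framework-level faces of the same generator trick.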


Now, back to the main thread of the article.

In short, you can live a long and happy life at the surface level of the fixture generator, but a lot is happening underneath. Pytest manages all of these details for you, so your test runs with the context values it needs.


3. SQLAlchemy ORM

What you write:

class User(Base):
    name = Column(String)      # Looks like a class attribute

What actually happens:

# During class creation (simplified -- SQLAlchemy's declarative mapper
# actually swaps the Column for an InstrumentedAttribute descriptor):
Column.__set_name__(User, 'name')    # the descriptor captures its attribute name

# During instantiation:
user = User(name="Alice")
Column.__set__(user, "Alice")        # Descriptor intercepts assignment

# During access:
user.name
Column.__get__(user, User)           # Descriptor intercepts, returns from instance.__dict__

That name = Column(String) ends up behind a descriptor (in SQLAlchemy's case, an InstrumentedAttribute installed by the declarative mapper). When you access user.name, Python doesn't just return a value; it calls the descriptor's __get__(). The descriptor on the class orchestrates everything; the actual data lives on the instance.

When this bites you: You try to access User.name expecting a default value, but get a Column object instead. Class-level access and instance-level access behave completely differently.
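A toy descriptor makes the class-vs-instance asymmetry concrete. This `Field` class is invented for the demo and only loosely mimics what ORM attributes do:

```python
class Field:
    """Toy descriptor, loosely mimicking ORM column attributes."""
    def __set_name__(self, owner, name):
        self.name = name                      # capture the attribute name at class creation

    def __get__(self, instance, owner):
        if instance is None:                  # class-level access: User.name
            return self                       # -> you get the Field object itself
        return instance.__dict__[self.name]   # instance-level access: u.name

    def __set__(self, instance, value):
        instance.__dict__[self.name] = value  # the data lives on the instance

class User:
    name = Field()

u = User()
u.name = "Alice"          # routed through Field.__set__
print(u.name)             # -> Alice, via Field.__get__
print(type(User.name))    # -> <class '__main__.Field'>, not a value!
```

The last two lines are the bite described above in miniature: same attribute name, completely different behavior depending on whether you go through the class or an instance.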


The Protocols Behind the Magic

The hidden field isn't random—it's systematic. Python's protocols are the machinery:

  • Descriptor Protocol: __get__, __set__, __set_name__
  • Iterator Protocol: __iter__, __next__
  • Context Manager Protocol: __enter__, __exit__
  • Coroutine Protocol: __await__, send(), throw()
  • Metaclass Machinery: __new__, __init_subclass__

Every framework you use is built on these. When you understand the protocols, you understand the frameworks.
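The iterator protocol is the easiest one to implement by hand and shows how systematic these contracts are; every for-loop is just repeated __next__ calls until StopIteration. A minimal sketch:

```python
class CountDown:
    """Implements the iterator protocol by hand -- what 'for' drives."""
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self                # an iterator returns itself

    def __next__(self):
        if self.current <= 0:
            raise StopIteration    # signals the loop to stop
        self.current -= 1
        return self.current + 1

print(list(CountDown(3)))          # -> [3, 2, 1]
```

The same discipline applies to all the protocols above: a few dunder methods with a fixed contract, and any framework can drive your objects.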


Why This Matters

For debugging: When declarative code misbehaves, the bug lives in the hidden field. You can't debug what you can't see.

For mental models: Understanding protocols makes any library transparent. New framework? Same protocols, different application.

For API design: If you build libraries, this is your job—hiding imperative complexity behind declarative interfaces.

For mastery: There's a difference between using Python and understanding Python. The gap is the hidden field.


The Practical Takeaway

You don't need to memorize every protocol. You need to know they exist and where to look.

Next time something breaks:

  1. Identify what abstraction you're using
  2. Ask: "What protocol enables this?"
  3. Look at what the machinery actually does

So is the deal worth it? Absolutely. Just know that someday you'll need to read the fine print.
