Anik Sikder
πŸ‘¨β€πŸ³ Part 4: Coroutines Waiters Who Listen

In the last episode, our restaurant became a smooth-running machine with sous-chefs, conveyor belts, and kitchen gadgets, all powered by lazy pipelines.

But what if our waiters could not only serve dishes but also take your order while serving?

Welcome to the world of coroutines, where generators become two-way communication channels.


🧠 1. From Generators β†’ Coroutines

So far, our waiters (generators) could only send food out using yield.
Now we’ll make them listen too by using .send().

🍽️ Normal generator (one-way)

def waiter():
    yield "Serving dish 1"
    yield "Serving dish 2"

w = waiter()
print(next(w))   # first dish
print(next(w))   # second dish

Output:

Serving dish 1
Serving dish 2

The waiter talks, we listen.
But what if we want to talk back?


🎀 2. Enter .send() β†’ Talking to Waiters

With .send(), you can send data into a running generator.
Let’s make the waiter respond to your requests:

def interactive_waiter():
    print("πŸ‘¨β€πŸ³ Ready to take your order.")
    while True:
        dish = yield "What would you like?"
        print(f"🍲 Serving {dish}...")

w = interactive_waiter()
print(next(w))          # start the waiter
print(w.send("pasta"))  # send order
print(w.send("steak"))

Output:

πŸ‘¨β€πŸ³ Ready to take your order.
What would you like?
🍲 Serving pasta...
What would you like?
🍲 Serving steak...
What would you like?

✨ next() starts the generator up to the first yield.
✨ send() sends a value into the paused generator.

Think of .send() as the customer talking back to the waiter mid-meal.
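One gotcha worth calling out: a coroutine must be primed with next() (or .send(None)) before it can accept real values, because there is no paused yield yet to receive them. A minimal sketch of my own (a hypothetical greeter(), not part of the restaurant) to show the difference:

def greeter():
    name = yield "Who's there?"
    yield f"Hello, {name}!"

g = greeter()
try:
    g.send("Alice")        # sending into a just-created generator...
except TypeError as err:
    print(err)             # ...fails: it hasn't reached a yield yet

g = greeter()
print(next(g))             # prime it first β†’ "Who's there?"
print(g.send("Alice"))     # now it listens β†’ "Hello, Alice!"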


🧾 3. Real-World Mini Project: Live Event Processor

Imagine we’re building a live analytics system that reacts to new events as they stream in.

def event_collector():
    total = 0
    try:
        while True:
            # Receive the next event value and report the running total
            value = yield f"Total so far: {total}"
            total += value
    except GeneratorExit:
        # Raised by .close(): one last chance to wrap up
        print(f"Final total: {total}")

Now we can feed it events dynamically:

collector = event_collector()
print(next(collector))        # start
print(collector.send(10))
print(collector.send(5))
print(collector.send(20))
collector.close()             # stop waiter

Output:

Total so far: 0
Total so far: 10
Total so far: 15
Total so far: 35
Final total: 35

πŸ‘‰ .close() politely tells the waiter β€œYou can go home now.”
πŸ‘‰ .throw() (optional) lets you raise an exception inside the generator.
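One rule that comes with .close(): once GeneratorExit arrives, the coroutine may clean up, but it must not yield again, or Python raises RuntimeError. A small sketch of my own (a hypothetical stubborn_waiter()) to illustrate:

def stubborn_waiter():
    try:
        while True:
            yield "Still serving"
    except GeneratorExit:
        yield "But I'm not done!"   # yielding after GeneratorExit is not allowed

w = stubborn_waiter()
next(w)
try:
    w.close()
except RuntimeError as err:
    print(err)                      # generator ignored GeneratorExit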


🧨 4. .throw() β†’ Throw Problems at the Waiter

Sometimes, the kitchen runs into trouble 🍳πŸ’₯.
You can throw exceptions into a coroutine to simulate errors.

def chef():
    while True:
        try:
            dish = yield
            print(f"Cooking {dish}")
        except ValueError:
            # Handle the problem but keep the coroutine alive for the next order
            print("πŸ”₯ Wrong ingredient!")

c = chef()
next(c)               # prime the chef
c.send("pasta")
c.throw(ValueError)   # inject an error at the paused yield
c.send("salad")       # the chef recovered and keeps cooking

Output:

Cooking pasta
πŸ”₯ Wrong ingredient!
Cooking salad

πŸ‘‰ throw() sends an exception inside the generator, right at the paused yield, where you can handle it gracefully. Because the try/except sits inside the while loop, the chef recovers and keeps cooking.
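A detail the chef example hides (its yield produces nothing): like .send(), .throw() resumes the coroutine and hands back whatever it yields next. A small sketch of my own, using a hypothetical waiter_with_menu(), to make that visible:

def waiter_with_menu():
    while True:
        try:
            order = yield "Menu: pasta, salad"
        except ValueError:
            order = yield "Sorry, try again. Menu: pasta, salad"
        print(f"Cooking {order}")

w = waiter_with_menu()
print(next(w))               # "Menu: pasta, salad"
print(w.throw(ValueError))   # handled inside; throw() returns the next prompt
print(w.send("salad"))       # cooks the salad, then prompts again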


🧩 5. Coroutine Pipelines β†’ Reactive Conveyor Belts

Now let’s combine our powers from earlier: delegation + coroutines.

We’ll build a live restaurant pipeline:

  • Orders stream in.
  • One coroutine filters vegetarian dishes.
  • Another coroutine logs them.

def logger():
    while True:
        item = yield
        print(f"🧾 Logged: {item}")

def vegetarian_filter(target):
    while True:
        dish = yield
        if "πŸ₯©" not in dish:
            target.send(dish)   # pass vegetarian dishes downstream

log = logger()
next(log)                       # prime the logger
veg = vegetarian_filter(log)
next(veg)                       # prime the filter

# Send live dishes
for dish in ["πŸ₯— salad", "🍝 pasta", "πŸ₯© steak", "🍰 cake"]:
    veg.send(dish)

Output:

🧾 Logged: πŸ₯— salad
🧾 Logged: 🍝 pasta
🧾 Logged: 🍰 cake

This is a real coroutine pipeline: data flows in, stage by stage, with each coroutine doing one job and passing it onward.
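Priming every stage with next() by hand gets tedious as pipelines grow. A common convenience (sketched here by me, not something the post defines) is a small decorator that creates and primes the coroutine in one step:

from functools import wraps

def coroutine(func):
    """Create the generator and advance it to the first yield, ready for .send()."""
    @wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)            # prime: run up to the first yield
        return gen
    return primer

@coroutine
def logger():
    while True:
        item = yield
        print(f"🧾 Logged: {item}")

log = logger()               # already primed, no explicit next() needed
log.send("πŸ₯— salad")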


βš™οΈ 6. Async/Await β†’ Waiters in Parallel (Modern Coroutines)

Python 3.5 gave us the async/await syntax on top of asyncio, where coroutines can multitask:
many waiters serving at once, without blocking each other!

import asyncio

async def waiter(name, delay):
    print(f"{name} started taking orders...")
    await asyncio.sleep(delay)
    print(f"{name} finished!")

async def main():
    await asyncio.gather(
        waiter("πŸ‘¨β€πŸ³ Chef 1", 2),
        waiter("πŸ‘©β€πŸ³ Chef 2", 3)
    )

asyncio.run(main())

Output:

πŸ‘¨β€πŸ³ Chef 1 started taking orders...
πŸ‘©β€πŸ³ Chef 2 started taking orders...
πŸ‘¨β€πŸ³ Chef 1 finished!
πŸ‘©β€πŸ³ Chef 2 finished!

Multiple waiters (async coroutines) handle customers simultaneously: no blocking, no chaos.
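If you want to convince yourself that gather() overlaps the waits instead of stacking them, time it. A rough sketch of my own, assuming the same 2- and 3-second delays as above, so the total comes to roughly 3 seconds rather than 5:

import asyncio
import time

async def waiter(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    await asyncio.gather(waiter("πŸ‘¨β€πŸ³ Chef 1", 2), waiter("πŸ‘©β€πŸ³ Chef 2", 3))
    print(f"Both chefs done in ~{time.perf_counter() - start:.1f}s")  # ~3.0s, not 5.0s

asyncio.run(main())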


🎨 ASCII Mental Model

 Customer
    ↓
 [ interactive coroutine ]
    ↓
 [ filter coroutine ]
    ↓
 [ logger coroutine ]

Each waiter not only serves dishes but talks, reacts, and coordinates in real time.
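Wiring up the full three-stage picture above only takes one more coroutine in front. A rough sketch of my own, reusing logger and vegetarian_filter from section 5 (run that block first) and adding a hypothetical order_taker():

def order_taker(target):
    while True:
        order = yield
        print(f"πŸ‘¨β€πŸ³ Order received: {order}")
        target.send(order)            # forward to the filter stage

log = logger()
next(log)                             # prime the logger
veg = vegetarian_filter(log)
next(veg)                             # prime the filter
front = order_taker(veg)
next(front)                           # prime the order taker

front.send("πŸ₯— salad")                # received β†’ passes the filter β†’ logged
front.send("πŸ₯© steak")                # received β†’ filtered out, never logged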


🎬 Wrap-Up: The Grand Kitchen Finale

Over the four parts, we’ve built a complete mental model of Python iteration, from solo waiters to a full restaurant orchestra:

Part | Concept                | Analogy                | Key Idea
-----|------------------------|------------------------|-----------------------------
1    | Iterables & Generators | Buffet & Waiter        | Lazy one-way serving
2    | itertools              | Kitchen Gadgets        | Tools for smart iteration
3    | Delegation             | Sous-Chefs & Pipelines | Composable generator chains
4    | Coroutines             | Interactive Waiters    | Two-way, reactive pipelines

πŸŽ‰ Final Takeaway:

Iterators feed data, generators process data, and coroutines interact with data.

Python’s iteration model isn’t just about looping; it’s a design pattern for scalable, memory-friendly, and interactive data flow.
