In the last episode, our restaurant became a smooth-running machine with sous-chefs, conveyor belts, and kitchen gadgets all powered by lazy pipelines.
But what if our waiters could not only serve dishes but also take your order while serving?
Welcome to the world of coroutines, where generators become two-way communication channels.
🧠 1. From Generators → Coroutines
So far, our waiters (generators) could only send food out using yield.
Now we'll make them listen too, using .send().
🍽️ Normal generator (one-way)
def waiter():
    yield "Serving dish 1"
    yield "Serving dish 2"

w = waiter()
print(next(w))
print(next(w))
Output:
Serving dish 1
Serving dish 2
The waiter talks, we listen.
But what if we want to talk back?
2. Enter .send(): Talking to Waiters
With .send(), you can send data into a running generator.
Let's make the waiter respond to your requests:
def interactive_waiter():
    print("👨‍🍳 Ready to take your order.")
    while True:
        dish = yield "What would you like?"
        print(f"🍲 Serving {dish}...")

w = interactive_waiter()
print(next(w))          # start the waiter
print(w.send("pasta"))  # send an order
print(w.send("steak"))
Output:
👨‍🍳 Ready to take your order.
What would you like?
🍲 Serving pasta...
What would you like?
🍲 Serving steak...
What would you like?
✨ next() starts the generator up to the first yield.
✨ send() sends a value into the paused generator.
Think of .send() as the customer talking back to the waiter mid-meal.
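One gotcha: you must call next() once before the first .send(); sending a non-None value into a brand-new generator raises a TypeError. A common convenience pattern (not part of the example above, shown here as a minimal sketch) is a small decorator that primes the coroutine for you; the name coroutine below is just an illustrative choice.

from functools import wraps

def coroutine(func):
    """Decorator that advances a new generator to its first yield automatically."""
    @wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # prime: run up to the first yield
        return gen
    return primer

@coroutine
def greeter():
    while True:
        name = yield
        print(f"Hello, {name}!")

g = greeter()    # already primed, no next() needed
g.send("Alice")  # Hello, Alice!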
🧾 3. Real-World Mini Project: Live Event Processor
Imagine we're building a live analytics system that reacts to new events as they stream in.
def event_collector():
    total = 0
    try:
        while True:
            value = yield f"Total so far: {total}"
            total += value
    except GeneratorExit:
        print(f"Final total: {total}")
Now we can feed it events dynamically:
collector = event_collector()
print(next(collector)) # start
print(collector.send(10))
print(collector.send(5))
print(collector.send(20))
collector.close() # stop waiter
Output:
Total so far: 0
Total so far: 10
Total so far: 15
Total so far: 35
Final total: 35
.close() politely tells the waiter, "You can go home now."
.throw() (optional) lets you raise an exception inside the generator.
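Under the hood, .close() works by raising GeneratorExit at the paused yield. You don't have to catch it explicitly the way the collector above does; if all you need is cleanup, a try/finally does the same job. A minimal sketch (the tally example is purely illustrative):

def tally():
    total = 0
    try:
        while True:
            value = yield total
            total += value
    finally:
        # Runs when .close() raises GeneratorExit (or when the generator is garbage-collected).
        print(f"Closing up. Final total: {total}")

t = tally()
next(t)     # prime: yields 0
t.send(3)   # yields 3
t.send(4)   # yields 7
t.close()   # prints "Closing up. Final total: 7"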
🧨 4. .throw(): Throw Problems at the Waiter
Sometimes, the kitchen runs into trouble 🍳🔥.
You can throw exceptions into a coroutine to simulate errors.
def chef():
    while True:
        try:
            dish = yield
            print(f"Cooking {dish}")
        except ValueError:
            # Handled inside the loop, so the chef keeps working afterwards.
            print("🔥 Wrong ingredient!")

c = chef()
next(c)
c.send("pasta")
c.throw(ValueError)  # inject an error into the paused coroutine
c.send("salad")
Output:
Cooking pasta
🔥 Wrong ingredient!
Cooking salad
throw() sends an exception inside the generator, where you can handle it gracefully.
🧩 5. Coroutine Pipelines: Reactive Conveyor Belts
Now let's combine our powers from earlier: delegation + coroutines.
We'll build a live restaurant pipeline:
- Orders stream in.
- One coroutine filters vegetarian dishes.
- Another coroutine logs them.
def logger():
    while True:
        item = yield
        print(f"🧾 Logged: {item}")

def vegetarian_filter(target):
    while True:
        dish = yield
        if "🥩" not in dish:
            target.send(dish)
log = logger()
next(log)

veg = vegetarian_filter(log)
next(veg)

# Send live dishes
for dish in ["🥗 salad", "🍝 pasta", "🥩 steak", "🍰 cake"]:
    veg.send(dish)
Output:
🧾 Logged: 🥗 salad
🧾 Logged: 🍝 pasta
🧾 Logged: 🍰 cake
This is a real coroutine pipeline: data flows in, stage by stage, with each coroutine doing one job and passing it onward.
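Because every stage exposes the same .send() interface, stages compose freely. As a sketch that goes beyond the original pipeline (the broadcast stage and the second logger are illustrative additions), here's how you could fan one stream out to several downstream coroutines:

def broadcast(targets):
    """Fan each incoming item out to every downstream coroutine."""
    while True:
        item = yield
        for target in targets:
            target.send(item)

kitchen_log = logger()
next(kitchen_log)
cashier_log = logger()
next(cashier_log)

fanout = broadcast([kitchen_log, cashier_log])
next(fanout)

veg = vegetarian_filter(fanout)
next(veg)

veg.send("🥗 salad")  # logged twice, once by each logger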
⚙️ 6. Async/Await: Waiters in Parallel (Modern Coroutines)
Python 3.5 introduced async/await on top of asyncio, letting coroutines multitask: many waiters serving at once, without blocking each other!
import asyncio

async def waiter(name, delay):
    print(f"{name} started taking orders...")
    await asyncio.sleep(delay)
    print(f"{name} finished!")

async def main():
    await asyncio.gather(
        waiter("👨‍🍳 Chef 1", 2),
        waiter("👩‍🍳 Chef 2", 3),
    )

asyncio.run(main())
Output:
👨‍🍳 Chef 1 started taking orders...
👩‍🍳 Chef 2 started taking orders...
👨‍🍳 Chef 1 finished!
👩‍🍳 Chef 2 finished!
Multiple waiters (async coroutines) handle customers concurrently: no blocking, no chaos.
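If you want the async version of our conveyor belt, asyncio.Queue lets a producer and a consumer run concurrently without hand-wiring .send() calls. A minimal sketch under the same restaurant framing (the function names are just illustrative):

import asyncio

async def take_orders(queue):
    for dish in ["🥗 salad", "🍝 pasta", "🍰 cake"]:
        await queue.put(dish)
        print(f"Order placed: {dish}")
    await queue.put(None)  # sentinel: the kitchen can close

async def cook(queue):
    while True:
        dish = await queue.get()
        if dish is None:
            break
        await asyncio.sleep(0.1)  # pretend cooking takes time
        print(f"🍲 Served {dish}")

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(take_orders(queue), cook(queue))

asyncio.run(main())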
🎨 ASCII Mental Model
Customer
   ↓
[ interactive coroutine ]
   ↓
[ filter coroutine ]
   ↓
[ logger coroutine ]
Each waiter not only serves dishes but talks, reacts, and coordinates in real time.
🎬 Wrap-Up: The Grand Kitchen Finale
Over these four parts, we've built a complete mental model of Python iteration, from solo waiters to a full restaurant orchestra:
| Part | Concept | Analogy | Key Idea |
|---|---|---|---|
| 1 | Iterables & Generators | Buffet & Waiter | Lazy one-way serving |
| 2 | itertools | Kitchen Gadgets | Tools for smart iteration |
| 3 | Delegation | Sous-Chefs & Pipelines | Composable generator chains |
| 4 | Coroutines | Interactive Waiters | Two-way, reactive pipelines |
Final Takeaway:
Iterators feed data, generators process data, and coroutines interact with data.
Python's iteration model isn't just about looping; it's a design pattern for scalable, memory-friendly, and interactive data flow.