Where I started and how I got here
I’ve been writing Python for about two years now. Still feels like I’m “new” because every time I think I’ve figured it out, Python throws me a new curveball.
In the beginning, it was just print statements and if-else ladders. Then came list comprehensions, then lambda, and then… well, I started breaking things just to understand how they work.
Somewhere between debugging spaghetti functions and reading other people’s cleaner code, I stumbled onto a few features that felt advanced but also cool. Not scary textbook stuff, but practical tools that instantly made my code feel smarter.
So I made a list.
These are the 7 features that made me go, “Wait, Python can do THAT?”
If you’re also somewhere between “I kinda get it” and “please don’t make me touch metaclasses,” this list is for you.
1. Walrus operator :=
the lazy genius of assignments inside expressions
When I first saw :=, I thought it was a typo. Looked like someone fell asleep on the keyboard.
Then I saw this:
while (line := file.readline()) != "":
    process(line)
And my brain short-circuited.
I used to write:
line = file.readline()
while line != "":
    process(line)
    line = file.readline()
Both work the same, but the walrus version? Cleaner. Less cognitive load. It feels like a pro move.
What is it?
The walrus operator (introduced in Python 3.8) allows you to assign and return a value in a single expression. Think of it like a shortcut that lets you reuse a value without writing an extra line.
When it clicked for me
I was filtering a massive list of objects and needed to both check if a value existed and use it without calling the function twice.
if (match := re.search(pattern, line)):
    print(match.group(0))
Before, I would’ve written:
match = re.search(pattern, line)
if match:
    print(match.group(0))
Both are valid. But one of them makes you feel like you finally speak Python like a native.
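The walrus also shines inside comprehensions, where you want to compute a value once and use it in both the filter and the output. Here is a small sketch (slow_double is a made-up stand-in for an expensive call):

```python
def slow_double(x):
    # stand-in for some expensive computation
    return x * 2

values = [1, 2, 3, 4, 5]
# compute slow_double(x) once per item, reuse the result twice
results = [y for x in values if (y := slow_double(x)) > 4]
print(results)  # [6, 8, 10]
```

Without the walrus, you'd either call slow_double twice per item or fall back to a full for-loop.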
2. Data classes: your classes, now with less boilerplate
There’s a certain pain that comes with writing class constructors in Python when you’re doing it the old-school way:
class Person:
    def __init__(self, name, age):
        self.name = name
        self.age = age
Every beginner tutorial shows you this. And it works… until you have 6 fields. Then it gets annoying. You write self.something = something a hundred times and start wondering if you're doing something wrong.
Enter @dataclass
Here’s the glow-up:
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
That’s it. No hand-written constructor. No __repr__, __eq__, or other magic methods to write yourself. Python just gives them to you.
Why it blew my mind
I was working on a side project with a bunch of models, and my classes were getting out of hand. I needed clean, readable code that wasn’t buried in boilerplate.
Using @dataclass instantly made everything more elegant. I could even set default values or make fields optional with just a few extra keystrokes.
@dataclass
class Product:
    name: str
    price: float
    in_stock: bool = True
It’s like Python saying: “Hey, I got you. Stop writing stuff I can handle.”
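Here's a quick sketch of what those generated methods buy you, reusing the Product class from above:

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float
    in_stock: bool = True

# __init__ is generated: positional or keyword arguments both work
p1 = Product("Widget", 9.99)
p2 = Product(name="Widget", price=9.99)

# __repr__ is generated: readable output for free
print(p1)  # Product(name='Widget', price=9.99, in_stock=True)

# __eq__ is generated: field-by-field comparison, not identity
print(p1 == p2)  # True
```

With a plain class, p1 == p2 would be False unless you wrote __eq__ yourself.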
Bonus: Immutability
Want your objects to be immutable (like a tuple)? Just add:
@dataclass(frozen=True)
class Config:
    debug: bool
    retries: int
Now trying to change config.debug = False will raise a FrozenInstanceError. This saved me from dumb bugs more than once.
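A minimal sketch of the frozen behavior in action, so you can see exactly what gets rejected:

```python
from dataclasses import dataclass, FrozenInstanceError

@dataclass(frozen=True)
class Config:
    debug: bool
    retries: int

config = Config(debug=True, retries=3)

try:
    config.debug = False  # assignment on a frozen dataclass is rejected
except FrozenInstanceError as err:
    print("blocked:", err)

print(config.debug)  # True -- the original value is untouched
```

If you want a changed copy, dataclasses.replace(config, debug=False) builds a new instance instead of mutating the old one.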
3. enumerate(): goodbye range(len(...)), and good riddance
I used to write stuff like this all the time:
for i in range(len(my_list)):
    print(i, my_list[i])
Not because I loved it, but because I didn’t know any better.
Then someone commented on my GitHub code:
“You know you can just use enumerate(), right?”
And boom. My brain rebooted.
Here’s the better way:
for i, item in enumerate(my_list):
    print(i, item)
What makes it better?
- More readable: you’re declaring exactly what you’re using, an index and an item.
- Less error-prone: no IndexError surprises if you refactor the list.
- Cleaner: no unnecessary range() wrapper or manual indexing.
When it helped me
I was building a CLI tool that processed user input line-by-line. I needed the line number and the content. enumerate() made the loop stupidly clean:
for line_number, line in enumerate(user_input.splitlines(), start=1):
    print(f"{line_number}: {line}")
That start=1? Chef’s kiss.
4. yield: the moment I realized return doesn’t have to be the end
At first, yield felt like Python’s way of trolling me.
I mean, look at this:
def countdown(n):
    while n > 0:
        yield n
        n -= 1
That’s not returning a list. It’s returning… something that feels like a ghost list.
When I ran it:
for number in countdown(5):
    print(number)
I got:
5
4
3
2
1
What’s happening here?
When a function uses yield, it becomes a generator.
It doesn’t return all the values at once; it pauses at each yield and resumes when you ask for the next value.
This is perfect when:
- You’re working with huge datasets
- You want memory efficiency
- You need lazy evaluation (like streaming log files or paginated APIs)
When it clicked for me
I had a CSV file with ~10 million rows (don’t ask).
Loading the whole thing into memory crashed my script faster than a triple nested loop.
So I rewrote the loader with yield:
def read_large_file(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()
Now I could iterate row by row without a memory meltdown.
Bonus: generators are resumable
Every time yield runs, it “saves” the state of the function and picks up from there next time.
It's like your function hits a pause button instead of exiting.
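You can watch the pause button in action by driving the generator manually with next(), reusing the countdown function from above:

```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

gen = countdown(3)
print(next(gen))  # 3 -- runs until the first yield, then pauses
print(next(gen))  # 2 -- resumes right after the yield, loops once more
print(next(gen))  # 1
# one more next(gen) would raise StopIteration: the generator is exhausted
```

A for-loop over a generator does exactly this under the hood: it calls next() repeatedly and stops cleanly on StopIteration.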
5. *args and **kwargs: function flexibility unlocked
This one was like discovering a hidden input slot in Python’s controller.
I’d see it in code:
def do_something(*args, **kwargs):
    pass
And I’d think:
“Cool cool… but what black magic is this and why are there asterisks?”
Turns out, it’s one of the most powerful ways to make your functions flexible, reusable, and clean.
What they do
- *args: collects extra positional arguments into a tuple
- **kwargs: collects extra keyword arguments into a dictionary
This means your function can accept as many inputs as someone throws at it and sort them out like a boss.
Example:
def show_info(*args, **kwargs):
    print("Positional:", args)
    print("Keyword:", kwargs)

show_info(1, 2, 3, name="Alex", age=25)
Output:
Positional: (1, 2, 3)
Keyword: {'name': 'Alex', 'age': 25}
When it saved me
I was writing a wrapper function around a third-party library. The underlying method took a weird mix of arguments, and I didn’t want to replicate them manually.
So I wrote:
def call_api(*args, **kwargs):
    log_request(args, kwargs)
    return third_party_api(*args, **kwargs)
No more worrying about which exact parameters to expect. It just… worked.
Bonus: you can also unpack them
You can use * and ** not only to receive arguments, but also to send them:
options = {"debug": True, "retry": 3}
start_service("localhost", 8080, **options)
It feels like Python saying: “Here’s a shortcut. Don’t make it weird.”
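Here's a runnable sketch of unpacking in both directions (start_service is a hypothetical function just for illustration):

```python
def start_service(host, port, debug=False, retry=0):
    # a made-up service starter; it just reports what it received
    msg = f"Starting on {host}:{port} (debug={debug}, retry={retry})"
    print(msg)
    return msg

args = ("localhost", 8080)
options = {"debug": True, "retry": 3}

# * unpacks the tuple into positional args, ** unpacks the dict into keyword args
start_service(*args, **options)
# Starting on localhost:8080 (debug=True, retry=3)
```

The dict keys must match the function's parameter names, or you'll get a TypeError, which is actually a nice early warning for typos in config dicts.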
6. functools.lru_cache: recursion’s best friend and your RAM’s, too
Ever written a recursive function that technically works but practically melts your CPU?
Yeah, me too.
Let me introduce you to the decorator that made me feel like a performance wizard:
from functools import lru_cache
I first tried it on a basic Fibonacci function:
@lru_cache
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
No fancy tricks. Just a simple line of magic: @lru_cache.
Before: took forever to get to fib(40)
After: instant result
What it does
lru_cache (Least Recently Used cache) stores the results of previous calls. So if the same input shows up again, Python just returns the cached answer; no need to recompute.
Perfect for:
- Recursive functions
- Expensive lookups
- Functions with repeatable input/output
When it saved my neck
I was calculating some deeply nested config dependencies in a tree structure. Re-running the same function on the same node again and again slowed everything to a crawl.
Added one line:
@lru_cache(maxsize=None)
Boom. Problem solved. Like a function with memory but without you needing to manage a cache dictionary manually.
Heads-up
- It only works with pure functions (same input = same output)
- All arguments must be hashable (so no lists or dicts directly)
lru_cache is like a brain for your function: it remembers what it’s done and doesn’t repeat itself. If only people worked that way.
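You can even ask the cache how much work it saved: cache_info() is part of the real functools.lru_cache API, and a sketch with the Fibonacci example makes the payoff visible:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155 -- instant, instead of billions of redundant calls
# hits = answers served from cache, misses = values actually computed
print(fib.cache_info())
```

There's also fib.cache_clear() for when cached results go stale, handy in tests.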
7. with and context managers: not just for opening files anymore
Like most folks, I first saw with used like this:
with open("data.txt") as f:
    contents = f.read()
So naturally, I thought it was just a shortcut for opening and closing files.
Then I saw this:
from threading import Lock

lock = Lock()

with lock:
    # critical section
    do_something()
Wait. WHAT?! You can use with on other things?
Yes. Anything that has __enter__() and __exit__() under the hood can be used in a with block.
That’s when I discovered: context managers are low-key Python gold.
What they really are
A context manager is just a way to set something up, do some work, and clean up afterward safely.
It ensures:
- Setup code runs before
- Cleanup runs after, even if there’s an error
Think: safe transactions, locks, temp files, database sessions, timing blocks…
You can even write your own
Here’s a simple custom context manager that times how long a block of code takes:
import time

class Timer:
    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        elapsed = time.perf_counter() - self.start
        print(f"Took {elapsed:.2f} seconds")

Usage:
with Timer():
    time.sleep(1.5)
Boom. Clean, reusable, and no stray try-finally blocks.
Bonus: contextlib makes it even easier
You can skip the whole class and use a generator-style context manager:
from contextlib import contextmanager

@contextmanager
def open_resource():
    print("Opening resource")
    yield
    print("Closing resource")

with open_resource():
    print("Doing stuff")
Once you realize with is just a fancy lifecycle manager, you start seeing uses for it everywhere.
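One detail worth knowing about the generator style: if you want cleanup to run even when the block raises, wrap the yield in try/finally. A small sketch:

```python
from contextlib import contextmanager

@contextmanager
def open_resource():
    print("Opening resource")
    try:
        yield
    finally:
        # runs whether the with-block succeeds or raises
        print("Closing resource")

try:
    with open_resource():
        raise ValueError("oops")
except ValueError:
    pass
# "Closing resource" was still printed before the exception propagated
```

Without the try/finally, an exception in the block would skip everything after the yield, and your cleanup would silently never run.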

Conclusion: still learning, but these made me feel clever
Look, I’m not a Python pro. I still Google basic stuff like how to reverse a list or the difference between is and == (don’t judge me).
But every time I learn a feature like the ones above, Python feels more like a language and less like a puzzle.
The best part? These aren’t obscure, academic features. They’re tools. You can start using them right now and feel the difference in how clean, fast, and flexible your code becomes.
So if you’re somewhere around year 1 or 2 of your Python journey, I hope these gave you some “aha” moments.
And if you’re already past that point? Hey, drop your favorite underrated Python feature in the comments. I’m still learning.
Helpful resources & links I actually use
- Official Python docs (surprisingly readable if you squint)
- realpython.com (best tutorials out there, period)
- Corey Schafer’s YouTube (explained half my career)
- functools.lru_cache docs
- contextlib and custom context managers
