Let's talk about code that writes code. It sounds like something from science fiction, but in Python, it's a practical set of tools you can use today. I'm going to show you eight methods that let your programs examine, modify, and even create other pieces of code. This isn't about being clever for the sake of it. It's about solving real problems, like removing tedious repetition, building adaptable systems, and giving your programs a new level of flexibility.
Think about a time you've written the same pattern of code over and over. Maybe it's checking inputs, logging function calls, or managing resources. Meta-programming offers a way out of that loop. It helps you build the tools that write the boring parts for you, so you can focus on the unique logic of your application.
We'll start with one of the most common and useful entry points.
Decorators are like wrappers you can put around your functions. They let you add behavior before and after the function runs without touching the function's own code. It's perfect for tasks you want to apply to many different functions.
import time
from functools import wraps

def timing_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start_time = time.time()
        result = func(*args, **kwargs)
        end_time = time.time()
        print(f"{func.__name__} executed in {end_time - start_time:.4f} seconds")
        return result
    return wrapper

def retry_decorator(max_attempts=3, delay=1):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exception = e
                    if attempt < max_attempts - 1:
                        time.sleep(delay)
            raise last_exception
        return wrapper
    return decorator

@timing_decorator
@retry_decorator(max_attempts=3)
def api_call(endpoint):
    # Simulate an unreliable API
    if time.time() % 2 > 1:
        raise ConnectionError("Temporary failure")
    return f"Data from {endpoint}"

result = api_call("/users")
print(f"Result: {result}")
Here, api_call gets two new behaviors. The retry_decorator will try it up to three times if it fails. The timing_decorator then records how long the whole process took. You can stack these, and more importantly, you can apply them to any other function with a simple @ symbol. I use this pattern constantly for adding logging, permission checks, or caching.
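For example, a tiny caching decorator follows the same shape. This is just a sketch, and it assumes the wrapped function takes only hashable positional arguments:

def cache_decorator(func):
    cached_results = {}  # Shared cache for this one function

    @wraps(func)
    def wrapper(*args):
        if args not in cached_results:
            cached_results[args] = func(*args)
        return cached_results[args]
    return wrapper

@cache_decorator
def slow_square(n):
    time.sleep(0.2)  # Simulate expensive work
    return n * n

print(slow_square(4))  # Computed
print(slow_square(4))  # Served from the cache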
While decorators change functions and classes after they're made, sometimes you need to change the blueprint itself. That's where metaclasses come in. A metaclass is like a template for creating classes. It lets you intervene at the moment a class is defined.
Imagine you need a guarantee that only one instance of a particular class ever exists—a Singleton. A metaclass can enforce this.
class SingletonMeta(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class DatabaseConnection(metaclass=SingletonMeta):
    def __init__(self, connection_string):
        self.connection_string = connection_string
        print(f"Connecting to {self.connection_string}")

# Both variables point to the exact same object
db1 = DatabaseConnection("postgresql://localhost/db")
db2 = DatabaseConnection("postgresql://localhost/db")
print(f"Same instance? {db1 is db2}")  # Output: True
The SingletonMeta metaclass stores instances in a dictionary. When you try to call DatabaseConnection() to create an object, the metaclass's __call__ method checks if one already exists. If it does, it returns the old one. The class itself doesn't contain this logic; the metaclass manages it behind the scenes.
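One consequence worth knowing: once the first instance exists, arguments passed on later calls are silently ignored, because the metaclass never runs __init__ again. Continuing the example above:

db3 = DatabaseConnection("postgresql://otherhost/db")
print(db3.connection_string)  # Still "postgresql://localhost/db"; the first instance wins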
You can also use metaclasses for automatic setup. Let's say you want all classes with certain attributes to get automatic validation methods.
class ValidationMeta(type):
    def __new__(cls, name, bases, attrs):
        # Copy the items first; we add new keys to attrs while looping
        for attr_name, value in list(attrs.items()):
            if isinstance(value, ValidatedAttribute):
                # staticmethod so the validator isn't bound to the instance
                attrs[f'_validate_{attr_name}'] = staticmethod(value.create_validator())
        return super().__new__(cls, name, bases, attrs)

class ValidatedAttribute:
    def __init__(self, type_hint, validator=None):
        self.type_hint = type_hint
        self.validator = validator

    def create_validator(self):
        def validator(value):
            if not isinstance(value, self.type_hint):
                raise TypeError(f"Expected {self.type_hint}, got {type(value)}")
            if self.validator and not self.validator(value):
                raise ValueError(f"Invalid value: {value!r}")
            return True
        return validator

class Product(metaclass=ValidationMeta):
    name = ValidatedAttribute(str, lambda x: len(x) > 0)
    price = ValidatedAttribute(float, lambda x: x >= 0)

    def __init__(self, name, price):
        # Use the auto-generated validation methods
        getattr(self, '_validate_name')(name)
        getattr(self, '_validate_price')(price)
        self.name = name
        self.price = price

# This will fail because the name is empty and the price is negative
try:
    product = Product("", -10.0)
except (TypeError, ValueError) as e:
    print(f"Validation worked: {e}")
When Python creates the Product class, the ValidationMeta metaclass runs. It looks for ValidatedAttribute objects and, for each one, adds a corresponding _validate_... method to the class. The Product.__init__ method can then use these methods. This keeps the validation rules declared right next to the attribute, clean and self-contained.
What happens when you try to access an attribute that doesn't exist? Normally, Python raises an AttributeError. But you can change that. The __getattr__ method is your custom handler for missing attributes.
This is incredibly useful for creating proxy objects or lazy-loading systems. A proxy object stands in for another object, controlling access to it.
class LazyLoader:
    def __init__(self):
        self._loaded_data = {}

    def __getattr__(self, name):
        if name not in self._loaded_data:
            print(f"Loading {name} from database...")
            self._loaded_data[name] = self._load_from_source(name)
        return self._loaded_data[name]

    def _load_from_source(self, key):
        time.sleep(0.5)  # Simulate slow database fetch
        return f"Data for {key}"

loader = LazyLoader()
print(f"First access: {loader.user_profile}")   # Triggers loading
print(f"Second access: {loader.user_profile}")  # Returns cached data
The first time you ask for loader.user_profile, Python can't find that attribute. So it calls __getattr__('user_profile'). Our method sees it's not in the cache, simulates a slow load from a database, stores the result, and returns it. The next time you ask, the data is already in the cache and is returned instantly. This pattern is great for expensive resources.
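The same hook also makes delegation easy: a proxy can wrap a real object and forward any attribute it doesn't define itself. A minimal sketch:

class LoggingProxy:
    def __init__(self, target):
        self._target = target

    def __getattr__(self, name):
        # Only called for attributes the proxy itself doesn't have
        print(f"Accessing {name!r} on {type(self._target).__name__}")
        return getattr(self._target, name)

numbers = LoggingProxy([3, 1, 2])
numbers.sort()          # Logged, then delegated to the underlying list
print(numbers._target)  # [1, 2, 3]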
Let's look at a configuration proxy example.
class ConfigurationProxy:
    def __init__(self, config_dict):
        self._config = config_dict
        self._accessed_keys = set()

    def __getattr__(self, name):
        if name in self._config:
            self._accessed_keys.add(name)
            return self._config[name]
        raise AttributeError(f"No configuration for '{name}'")

    def get_accessed_keys(self):
        return sorted(self._accessed_keys)

config = ConfigurationProxy({
    'database_host': 'localhost',
    'database_port': 5432
})

print(f"Host: {config.database_host}")
print(f"Accessed keys: {config.get_accessed_keys()}")
# Trying config.some_unknown_key would raise an AttributeError
Here, the proxy lets you access configuration values with dot notation (config.database_host), while secretly tracking which keys were used. This can help with auditing or detecting unused settings.
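With that tracking in place, flagging settings that were never read is a one-liner (building on the proxy above):

unused = set(config._config) - set(config.get_accessed_keys())
print(f"Unused settings: {unused}")  # {'database_port'} if only the host was read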
Code generation is where things get powerful. Instead of just modifying existing code, you can build new code as text or as an Abstract Syntax Tree (AST) and then execute it. The ast module lets you construct code programmatically.
Why would you do this? Imagine building a custom query language, or generating a batch of similar functions from a configuration file.
import ast

class CodeGenerator:
    def create_function(self, name, params, body_statements):
        # Parse a skeleton to get the function and argument nodes (their
        # exact fields vary between Python versions), then swap in our body
        skeleton = ast.parse(f"def {name}({', '.join(params)}):\n    pass")
        func_def = skeleton.body[0]
        func_def.body = list(body_statements)
        ast.fix_missing_locations(skeleton)
        code = compile(skeleton, filename='<generated>', mode='exec')
        namespace = {}
        exec(code, namespace)  # The new function lands in this namespace
        return namespace[name]

generator = CodeGenerator()

# Let's generate a simple 'add' function
add_body = [
    ast.Return(ast.BinOp(
        left=ast.Name(id='a', ctx=ast.Load()),
        op=ast.Add(),
        right=ast.Name(id='b', ctx=ast.Load())
    ))
]
generated_add = generator.create_function('generated_add', ['a', 'b'], add_body)
print(f"5 + 3 = {generated_add(5, 3)}")
We built the AST nodes for a function body that returns the sum of two arguments (a and b). The generator parses a small skeleton for the def statement, swaps in our body, compiles the tree, and executes it to create a real, working function in memory.
You can generate more complex logic too.
# Generate a 'classify' function: return 'positive' if x > 0, else 'non-positive'
if_body = [
    ast.If(
        test=ast.Compare(
            left=ast.Name(id='x', ctx=ast.Load()),
            ops=[ast.Gt()],
            comparators=[ast.Constant(value=0)]
        ),
        body=[ast.Return(ast.Constant(value='positive'))],
        orelse=[ast.Return(ast.Constant(value='non-positive'))]
    )
]
classify = generator.create_function('classify_number', ['x'], if_body)
print(f"Classify 10: {classify(10)}")
print(f"Classify -5: {classify(-5)}")
This approach is fundamental to how many frameworks work. They take your high-level declarations and generate the specific Python code needed to execute them.
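As a small sketch of that idea, here is one way to turn a plain configuration dictionary into a batch of similar functions with the same CodeGenerator. The conversion_spec names and factors are made up for illustration:

# Hypothetical spec: each entry becomes a function that multiplies its input
conversion_spec = {
    'to_cents': 100,
    'to_millis': 1000,
}

converters = {}
for func_name, factor in conversion_spec.items():
    body = [
        ast.Return(ast.BinOp(
            left=ast.Name(id='value', ctx=ast.Load()),
            op=ast.Mult(),
            right=ast.Constant(value=factor)
        ))
    ]
    converters[func_name] = generator.create_function(func_name, ['value'], body)

print(converters['to_cents'](12.5))  # 1250.0
print(converters['to_millis'](3))    # 3000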
Sometimes you don't know what code you'll need until the program is running. Dynamic imports let you load modules based on conditions, user input, or configuration files. This is the foundation of plugin architectures.
The importlib module is your tool for this.
import importlib.util
from pathlib import Path

class PluginLoader:
    def __init__(self, plugin_directory):
        self.plugin_directory = Path(plugin_directory)
        self.plugins = {}

    def discover_plugins(self):
        for module_file in self.plugin_directory.glob("*.py"):
            if module_file.stem == "__init__":
                continue
            module_name = f"plugins.{module_file.stem}"
            try:
                spec = importlib.util.spec_from_file_location(module_name, module_file)
                module = importlib.util.module_from_spec(spec)
                spec.loader.exec_module(module)
                for attr_name in dir(module):
                    attr = getattr(module, attr_name)
                    if isinstance(attr, type) and hasattr(attr, 'is_plugin'):
                        self.plugins[attr_name] = attr
                        print(f"Loaded plugin: {attr_name}")
            except Exception as e:
                print(f"Failed to load {module_file}: {e}")

    def get_plugin(self, name, *args, **kwargs):
        plugin_class = self.plugins.get(name)
        if plugin_class:
            return plugin_class(*args, **kwargs)
        raise ValueError(f"Unknown plugin: {name}")

# Example of a plugin base class you might define
class BasePlugin:
    is_plugin = True

    def execute(self, data):
        raise NotImplementedError

# Point the loader at a plugin directory (if it exists)
loader = PluginLoader("/tmp/my_plugins")
loader.discover_plugins()
This loader scans a directory for Python files, imports each one, and looks for classes marked as plugins. Your main application can then instantiate and use these plugins without ever having hardcoded their names. It makes your system extensible by others.
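For context, a plugin file dropped into that directory might look like this. The uppercase.py module and its class are hypothetical, following the is_plugin convention the loader checks for:

# /tmp/my_plugins/uppercase.py
class UppercasePlugin:
    is_plugin = True  # Marker attribute the loader looks for

    def execute(self, data):
        return data.upper()

Once discovered, your application would create it with loader.get_plugin('UppercasePlugin') and call execute() without ever importing the module by name.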
You can also import by string.
class DynamicImporter:
    @staticmethod
    def import_from_string(import_path):
        module_path, class_name = import_path.rsplit('.', 1)
        module = importlib.import_module(module_path)
        return getattr(module, class_name)

# Use a string to import the 'dumps' function from the 'json' module
json_dumper = DynamicImporter.import_from_string("json.dumps")
data = json_dumper({"key": "value"})
print(f"Dynamically imported function result: {data}")
This is how many web frameworks convert string-based settings like 'path.to.MyViewClass' into actual Python classes.
Descriptors are objects that manage attribute access (get, set, and delete) on behalf of the class they live in. The built-in @property decorator is actually a descriptor. You can create your own to add custom logic whenever an attribute is accessed or changed.
A classic use case is validated or cached attributes.
class ValidatedDescriptor:
    def __init__(self, validator=None, default=None):
        self.validator = validator
        self.default = default
        self.data = {}  # Stores data per instance

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return self.data.get(id(obj), self.default)

    def __set__(self, obj, value):
        if self.validator and not self.validator(value):
            raise ValueError(f"Invalid value: {value}")
        self.data[id(obj)] = value

    def __delete__(self, obj):
        if id(obj) in self.data:
            del self.data[id(obj)]

class Configuration:
    port = ValidatedDescriptor(
        validator=lambda x: 1 <= x <= 65535,
        default=8080
    )

config = Configuration()
print(f"Default port: {config.port}")  # 8080
config.port = 3000
print(f"Set port: {config.port}")  # 3000

try:
    config.port = 70000  # Invalid, will raise ValueError
except ValueError as e:
    print(f"Caught error: {e}")
The ValidatedDescriptor object is assigned as a class attribute port. When you do config.port = 3000, it's not a simple assignment. Python calls the descriptor's __set__ method, which runs the validator. The value is then stored in the descriptor's data dictionary, keyed by the instance's id. This keeps the data for each instance separate.
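One design note: keying a shared dictionary by id(obj) leaves stale entries behind after instances are gone, and ids can be reused. A common alternative is to let the descriptor learn its attribute name via __set_name__ and store the value on the instance itself. A minimal sketch:

class InstanceStoredDescriptor:
    def __set_name__(self, owner, name):
        # Called automatically when the owning class body is executed
        self.private_name = f"_{name}"

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self.private_name, None)

    def __set__(self, obj, value):
        setattr(obj, self.private_name, value)

class Server:
    host = InstanceStoredDescriptor()

server = Server()
server.host = "127.0.0.1"
print(server.host)  # Stored on the instance as _host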
Let's build a read-only cached property.
class CachedProperty:
    def __init__(self, func):
        self.func = func
        self.cache = {}

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        cache_key = id(obj)
        if cache_key not in self.cache:
            self.cache[cache_key] = self.func(obj)
        return self.cache[cache_key]

    def __set__(self, obj, value):
        raise AttributeError("CachedProperty is read-only")

class DataProcessor:
    def __init__(self, data):
        self.data = data

    @CachedProperty
    def expensive_summary(self):
        print("Performing heavy calculation...")
        time.sleep(1)  # Simulate hard work
        return sum(self.data) / len(self.data)

processor = DataProcessor([1, 2, 3, 4, 5])
print(f"First call: {processor.expensive_summary}")   # Calculates
print(f"Second call: {processor.expensive_summary}")  # Returns cached value
The @CachedProperty decorator wraps the method. The first time you access expensive_summary, the descriptor's __get__ runs the original function and saves the result. Every subsequent access returns the cached value instantly. It's a clean way to optimize expensive operations.
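The standard library has an equivalent, functools.cached_property (Python 3.8+), which stores the computed value on the instance rather than in a shared dictionary:

from functools import cached_property

class DataProcessorStdlib:
    def __init__(self, data):
        self.data = data

    @cached_property
    def expensive_summary(self):
        print("Performing heavy calculation...")
        return sum(self.data) / len(self.data)

p = DataProcessorStdlib([1, 2, 3, 4, 5])
print(p.expensive_summary)  # Calculates
print(p.expensive_summary)  # Cached on the instance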
Context managers, defined by the with statement, give you a reliable way to set up and tear down resources. You know them from working with files (with open('file.txt') as f:). You can build your own for any paired operations.
from contextlib import contextmanager
import tempfile
import os

class DatabaseTransaction:
    def __init__(self, connection):
        self.connection = connection

    def __enter__(self):
        print("Starting transaction")
        self.connection.execute("BEGIN")
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            print("Committing transaction")
            self.connection.execute("COMMIT")
        else:
            print("Rolling back transaction")
            self.connection.execute("ROLLBACK")
        return False  # Let any exception propagate

class MockConnection:
    def execute(self, query):
        print(f"  Executing: {query}")

conn = MockConnection()
try:
    with DatabaseTransaction(conn):
        conn.execute("INSERT INTO users VALUES ('Alice')")
        # Uncomment the next line to simulate a failure and trigger rollback
        # raise ValueError("Oops!")
except Exception:
    print("An error occurred, but the transaction was rolled back.")
The __enter__ method sets up the transaction. The __exit__ method is your cleanup hook. It receives information about any exception that happened inside the with block. If there was no exception (exc_type is None), it commits. If there was, it rolls back. The return False tells Python to not suppress the exception.
The contextlib module makes it even easier for functions.
@contextmanager
def temporary_workspace(cleanup=True):
    workspace = tempfile.mkdtemp()
    original_cwd = os.getcwd()
    os.chdir(workspace)
    print(f"Working in {workspace}")
    try:
        yield workspace  # This is where the 'with' block runs
    finally:
        os.chdir(original_cwd)
        if cleanup:
            import shutil
            shutil.rmtree(workspace)
            print(f"Cleaned up {workspace}")

with temporary_workspace() as ws:
    with open("my_file.txt", "w") as f:
        f.write("test")
    print(f"Created file in: {os.listdir(ws)}")
# The directory is now gone
The @contextmanager decorator lets you write a generator function. Everything before the yield is __enter__. The yielded value becomes the as variable. Everything after, inside the finally block, is __exit__. It's a very concise pattern.
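If you need the commit-or-rollback behavior in a generator-based manager, wrap the yield in try/except. A compact sketch of the earlier transaction idea, reusing MockConnection:

@contextmanager
def transaction(connection):
    connection.execute("BEGIN")
    try:
        yield connection
        connection.execute("COMMIT")
    except Exception:
        connection.execute("ROLLBACK")
        raise  # Re-raise so callers still see the failure

with transaction(MockConnection()) as tx:
    tx.execute("INSERT INTO users VALUES ('Bob')")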
Bytecode manipulation is an advanced area, but it reveals how Python works under the hood. The dis module lets you see the bytecode, the low-level instructions the Python interpreter executes. You can even create or modify it.
import dis
import types

def simple_function(x):
    return x * 2

print("Bytecode for simple_function:")
dis.dis(simple_function)
This will print out the step-by-step instructions. It can be helpful for debugging performance issues or understanding a tricky piece of code.
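You can also inspect the instruction stream programmatically. dis.Bytecode yields Instruction objects, so you can, for example, count the opcodes a function uses:

def join_items(items):
    result = ""
    for item in items:
        result += item
    return result

opcode_counts = {}
for instruction in dis.Bytecode(join_items):
    opcode_counts[instruction.opname] = opcode_counts.get(instruction.opname, 0) + 1
print(opcode_counts)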
You can go further and build a function object yourself from a raw code object.
def create_function_from_code():
    # The classic recipe hand-assembles bytecode such as
    # bytes([124, 0, 124, 1, 23, 0, 83, 0]) and feeds it to types.CodeType.
    # Both the opcodes and the CodeType signature change between Python
    # versions, so a portable way to get a code object is to compile source
    # and pull it out of the module's constants.
    module_code = compile("def add(a, b):\n    return a + b\n", "<generated>", "exec")
    func_code = next(
        const for const in module_code.co_consts
        if isinstance(const, types.CodeType)
    )
    # A code object plus a globals dict is all a function really needs
    return types.FunctionType(func_code, {})

my_add = create_function_from_code()
print(f"Generated add function: {my_add(7, 3)}")
This is low-level, but it shows the fundamental building blocks of a function: a code object and a globals dictionary. Some optimization and obfuscation tools work at this level.
You've seen eight different angles on meta-programming. They range from the everyday utility of decorators to the deep mechanics of bytecode. The goal isn't to use them all in every project. The goal is to know they exist.
When you face a problem involving repetitive structure, think of decorators or code generation. When you need runtime adaptability, think of dynamic imports or __getattr__. When you need to enforce rules across many classes, consider descriptors or metaclasses.
These techniques are the foundation of major frameworks like Django, SQLAlchemy, and Pytest. They let those libraries provide a clean, high-level interface while generating the complex code required to make it work. By understanding these tools, you move from just using Python to shaping it, allowing you to build systems that are more expressive, maintainable, and powerful.