If you've used Flask or FastAPI for more than a week, you've already used decorators — @app.route,
@login_required, @pytest.mark.parametrize. They feel like magic the first time, and
then someone explains what's actually happening and it clicks immediately. A decorator is just a function that
wraps another function. The @ syntax is pure syntactic sugar — @my_decorator above a
function definition is exactly equivalent to writing func = my_decorator(func) after it. That's the
whole secret. The rest is just patterns built on top of that one idea. This article builds the mental model from
scratch, then walks through the patterns you'll actually use: @functools.wraps, decorators with
arguments, class-based decorators, and a handful of real-world examples including @lru_cache,
@dataclass, and a proper @retry with exponential backoff.
Functions Are First-Class Objects
Before decorators make sense, you need to be solid on one Python fact: functions are objects. You can assign them to variables, pass them as arguments to other functions, return them from functions, and store them in lists or dicts. Nothing special happens just because something is a function.
def greet(name: str) -> str:
    return f"Hello, {name}"

# Assign to a variable — no () means we're not calling it, just referencing it
say_hello = greet
print(say_hello("Alice"))  # 'Hello, Alice'

# Pass a function as an argument
def run_twice(fn, value):
    return fn(value), fn(value)

run_twice(greet, "Bob")  # ('Hello, Bob', 'Hello, Bob')

# Return a function from another function — this is a "factory"
def make_prefixer(prefix: str):
    def prefixed_greet(name: str) -> str:
        return f"{prefix}, {name}"
    return prefixed_greet

morning_hello = make_prefixer("Good morning")
morning_hello("Carol")  # 'Good morning, Carol'

The inner function prefixed_greet "closes over" the variable prefix from
the enclosing scope — even after make_prefixer has returned, the inner function still has access
to prefix. This is a
closure,
and it's the mechanism that makes decorators tick. The
Python docs on scoping rules
explain this in detail if you want the full picture.
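You can poke at the closure machinery directly: a closure function carries a __closure__ tuple of cells, one per captured variable, and the captured names are listed in __code__.co_freevars. A small sketch, reusing the make_prefixer factory from above:

```python
def make_prefixer(prefix: str):
    def prefixed_greet(name: str) -> str:
        return f"{prefix}, {name}"
    return prefixed_greet

morning_hello = make_prefixer("Good morning")

# The captured variable lives in a closure cell on the function object
print(morning_hello.__closure__[0].cell_contents)  # 'Good morning'
# The captured variable names, in the same order as the cells
print(morning_hello.__code__.co_freevars)  # ('prefix',)
```

This is exactly the state a decorator's wrapper relies on to remember which function it wraps.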
Building a Decorator from Scratch
A decorator is a function that takes a function and returns a (usually modified) function. The classic first example is a timing decorator — it wraps any function and logs how long it took to run.
import time
import functools

def timer(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__} finished in {elapsed:.4f}s")
        return result
    return wrapper

@timer
def fetch_user_records(db, user_id: int):
    """Fetch all records for a given user from the database."""
    return db.query("SELECT * FROM records WHERE user_id = ?", user_id)

# Calling fetch_user_records is now: timer(fetch_user_records)(db, 42)
# Which is exactly what @timer desugars to

A few things to notice: the wrapper uses *args, **kwargs so it can forward any
combination of arguments to the original function without knowing its signature. It captures the return
value in result and returns it, so the wrapped function still behaves identically from the
caller's perspective — it just has an extra print side effect. Remove the timing, and you have the
skeleton of almost every decorator you'll ever write.
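To convince yourself the @ syntax is pure sugar, decorate the same function once with @ and once by rebinding it manually. A minimal sketch, using a stripped-down timer so the example stands alone:

```python
import functools

def timer(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # timing elided — the structure is what matters here
        return fn(*args, **kwargs)
    return wrapper

# Decorator syntax...
@timer
def add(a, b):
    return a + b

# ...is exactly equivalent to manual rebinding:
def add_manual(a, b):
    return a + b
add_manual = timer(add_manual)

print(add(2, 3))         # 5
print(add_manual(2, 3))  # 5
```

Both names now point at wrapper functions; the only difference is where the rebinding was written.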
Whenever you see @some_decorator above a function definition, mentally substitute it with fn = some_decorator(fn) written immediately after the def block. The two are exactly equivalent. There is no magic; it's a function call.

Why You Must Use @functools.wraps
In the example above, there's a @functools.wraps(fn) line on the wrapper. This is
not optional. Without it, your decorated function loses its identity — its __name__,
__doc__, and __qualname__ attributes all get replaced with those of the inner
wrapper function. That causes subtle breakage in a few real situations:
- Docstrings disappear. help(fetch_user_records) shows the wrapper's empty docstring instead of "Fetch all records for a given user...".
- Stack traces lie. When an exception is raised inside the wrapped function, the traceback shows wrapper instead of the real function name, which makes debugging harder.
- Introspection breaks. Tools like pytest, Flask's routing system, and inspect.signature() rely on __name__ and __wrapped__. Flask's router will raise an error if two decorated routes end up sharing the same (wrapper) endpoint name.
import functools

# Without @functools.wraps — broken
def bad_timer(fn):
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper

@bad_timer
def process_batch(batch_id: int):
    """Process a single batch job."""
    pass

print(process_batch.__name__)  # 'wrapper' ← wrong
print(process_batch.__doc__)   # None ← docstring gone

# With @functools.wraps — correct
def good_timer(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper

@good_timer
def process_batch(batch_id: int):
    """Process a single batch job."""
    pass

print(process_batch.__name__)  # 'process_batch' ← correct
print(process_batch.__doc__)   # 'Process a single batch job.' ← preserved

functools.wraps
is itself a decorator — it copies over __module__, __name__, __qualname__,
__annotations__, __doc__, and sets __wrapped__ to the original function.
Use it on every wrapper function, full stop. There's no reason not to.
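A bonus of the __wrapped__ attribute: you can reach through the wrapper to the original function, and inspect.unwrap does this for arbitrarily deep decorator stacks. A quick sketch with a hypothetical noisy decorator:

```python
import functools
import inspect

def noisy(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@noisy
def square(x: int) -> int:
    return x * x

# __wrapped__ is the undecorated function — calling it skips the wrapper
print(square.__wrapped__(4))  # 16, with no "calling square" printed
# inspect.unwrap follows __wrapped__ repeatedly for stacked decorators
print(inspect.unwrap(square) is square.__wrapped__)  # True
```

This is handy in tests when you want to exercise the raw function without the decorator's side effects.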
Decorators with Arguments
A plain decorator takes a function and returns a function. A decorator with arguments needs one more
level: a function that takes the arguments and returns a decorator. That's three levels of nesting,
and it confuses almost everyone the first time they see it. Here's a @retry decorator that retries
a function up to max_attempts times on exception:
import functools
import time

import requests

def retry(max_attempts: int = 3, delay: float = 1.0, backoff: float = 2.0):
    """
    Retry a function on exception with exponential backoff.

    Usage:
        @retry(max_attempts=5, delay=0.5, backoff=2.0)
        def call_external_api(endpoint: str) -> dict:
            ...
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            current_delay = delay
            last_exception = None
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    last_exception = exc
                    if attempt < max_attempts:
                        print(
                            f"{fn.__name__}: attempt {attempt}/{max_attempts} failed "
                            f"({exc!r}), retrying in {current_delay:.1f}s..."
                        )
                        time.sleep(current_delay)
                        current_delay *= backoff
                    else:
                        print(f"{fn.__name__}: all {max_attempts} attempts failed.")
                        raise last_exception
        return wrapper
    return decorator

@retry(max_attempts=4, delay=0.5, backoff=2.0)
def fetch_price_data(ticker: str) -> dict:
    """Fetch stock price data from external API."""
    response = requests.get(f"https://api.example.com/prices/{ticker}", timeout=5)
    response.raise_for_status()
    return response.json()

The call chain when Python processes @retry(max_attempts=4, delay=0.5, backoff=2.0):
first retry(max_attempts=4, delay=0.5, backoff=2.0) is called and returns decorator.
Then decorator(fetch_price_data) is called and returns wrapper. Finally
fetch_price_data is rebound to wrapper. So @retry(...) is
fetch_price_data = retry(...)(fetch_price_data): two calls at decoration time, and one level of wrapping around the original function.
Once you see that pattern, decorator factories stop being confusing.
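You can replay that chain by hand to make it concrete. A sketch with a deliberately trivial factory (a hypothetical repeat decorator, not the article's retry):

```python
import functools

def repeat(times: int):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return [fn(*args, **kwargs) for _ in range(times)]
        return wrapper
    return decorator

def shout(word: str) -> str:
    return word.upper()

# Step by step, no @ syntax:
dec = repeat(3)      # step 1: the factory call returns the decorator
shout = dec(shout)   # step 2: the decorator wraps the function
print(shout("hey"))  # step 3: calling the name runs the wrapper
# ['HEY', 'HEY', 'HEY']
```

Writing @repeat(3) above def shout would collapse steps 1 and 2 into one line, nothing more.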
Class-Based Decorators
You can also implement a decorator as a class by defining __call__. This is useful
when the decorator needs to maintain state between calls — a call counter, a cache, connection pools — because
instance variables are a more natural home for that state than closure variables.
import functools
import time

class RateLimiter:
    """
    Decorator that limits how often a function can be called.

    Raises RuntimeError if the function is called within `min_interval` seconds
    of the previous call.
    """
    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last_called: float = 0.0

    def __call__(self, fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            now = time.monotonic()
            since_last = now - self._last_called
            if since_last < self.min_interval:
                wait = self.min_interval - since_last
                raise RuntimeError(
                    f"{fn.__name__} called too soon — "
                    f"wait {wait:.2f}s before calling again."
                )
            self._last_called = now
            return fn(*args, **kwargs)
        return wrapper

# Usage — one call per 2 seconds
@RateLimiter(min_interval=2.0)
def send_sms_alert(phone: str, message: str) -> None:
    """Send an SMS alert via the gateway API."""
    sms_gateway.send(phone, message)

@RateLimiter(min_interval=2.0) works exactly like a decorator factory:
RateLimiter(2.0) constructs an instance, then that instance is called with
send_sms_alert because it has __call__. The instance stores
_last_called as an attribute — no closure variable juggling needed. See
PEP 318 (the original
decorator proposal) for the design rationale behind the @ syntax, and
PEP 614 for relaxed
decorator grammar that landed in Python 3.9.
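Another classic home for instance state is a call counter. This sketch shows the other class-based shape: when the decorator takes no arguments, the class receives the function directly in __init__ and the instance itself replaces the function, so it needs functools.update_wrapper rather than @functools.wraps:

```python
import functools

class CountCalls:
    """Class-based decorator that counts how many times the function runs."""
    def __init__(self, fn):
        functools.update_wrapper(self, fn)  # the instance replaces fn, so copy metadata onto it
        self.fn = fn
        self.call_count = 0

    def __call__(self, *args, **kwargs):
        self.call_count += 1
        return self.fn(*args, **kwargs)

@CountCalls
def ping() -> str:
    return "pong"

ping(); ping(); ping()
print(ping.call_count)  # 3
print(ping.__name__)    # 'ping' — preserved by update_wrapper
```

Note that a plain closure could hold the counter too, but exposing it as a readable attribute (ping.call_count) is much more natural with a class.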
Real-World Decorators from the Standard Library
Before writing your own, check if the stdlib already has what you need. Three decorators in
functools
come up constantly in production Python code.
from functools import lru_cache, cache

import requests

# @cache — unbounded memoisation (Python 3.9+)
# Caches every unique set of arguments forever.
# Good for pure functions with a small domain of inputs.
@cache
def get_country_name(country_code: str) -> str:
    """Look up a country name from ISO 3166 code. Cached after first call."""
    response = requests.get(f"https://restcountries.com/v3.1/alpha/{country_code}")
    return response.json()[0]["name"]["common"]

get_country_name("DE")  # hits the API
get_country_name("DE")  # served from cache, no network call

# @lru_cache(maxsize=N) — bounded LRU cache
# Evicts least-recently-used entries once the cache hits `maxsize`.
# Better when the input domain is large and memory is a concern.
@lru_cache(maxsize=256)
def compute_discount(base_price: float, tier: str) -> float:
    """Heavy computation — price varies by tier. Cache the top 256 combinations."""
    discount_table = load_discount_table()  # expensive DB call
    rate = discount_table.get(tier, 0.0)
    return round(base_price * (1 - rate), 2)

# Inspect cache performance
print(compute_discount.cache_info())
# CacheInfo(hits=142, misses=14, maxsize=256, currsize=14)

@dataclass is in a different category — it's a class decorator that auto-generates
__init__, __repr__, and __eq__ from your field annotations. It cuts
out a significant amount of boilerplate for data-holding classes:
from dataclasses import dataclass, field
from typing import Optional
from datetime import datetime, timezone

@dataclass
class WebhookEvent:
    event_type: str
    source_id: int
    payload: dict
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    retry_count: int = 0
    error_message: Optional[str] = None

# @dataclass generates all of this for free:
# - __init__(self, event_type, source_id, payload, received_at=..., retry_count=0, error_message=None)
# - __repr__ that shows all fields
# - __eq__ that compares field-by-field

event = WebhookEvent(event_type="order.created", source_id=9912, payload={"order_id": 44501})
print(event)
# WebhookEvent(event_type='order.created', source_id=9912, payload={...}, retry_count=0, ...)

The @property decorator turns a method into an attribute-style accessor — callers
read user.display_name instead of user.get_display_name(). Combine it with
a setter (declared via @email.setter below — the setter decorator is named after the property) to validate on write:
class UserProfile:
    def __init__(self, first_name: str, last_name: str, email: str):
        self._first_name = first_name
        self._last_name = last_name
        self._email = email.strip().lower()

    @property
    def display_name(self) -> str:
        return f"{self._first_name} {self._last_name}"

    @property
    def email(self) -> str:
        return self._email

    @email.setter
    def email(self, value: str) -> None:
        if "@" not in value:
            raise ValueError(f"Invalid email address: {value!r}")
        self._email = value.strip().lower()

profile = UserProfile("Alice", "Smith", " [email protected] ")
print(profile.display_name)  # 'Alice Smith'
print(profile.email)         # '[email protected]'
profile.email = "[email protected]"  # setter runs validation
profile.email = "not-an-email"   # raises ValueError

A @require_auth Decorator for Route Handlers
Authentication checks in web frameworks are a textbook decorator use case. Rather than duplicating the "is the user logged in?" check at the top of every route handler, you write it once as a decorator and apply it where needed. Here's the pattern, written to work with Flask but transferable to any framework:
import functools

from flask import request, jsonify, g

def require_auth(fn):
    """
    Decorator that validates a Bearer token before the route handler runs.
    Sets g.current_user on success; returns 401 JSON on failure.
    """
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        auth_header = request.headers.get("Authorization", "")
        if not auth_header.startswith("Bearer "):
            return jsonify({"error": "Missing or malformed Authorization header"}), 401
        token = auth_header[len("Bearer "):]
        user = verify_token(token)  # your token validation logic
        if user is None:
            return jsonify({"error": "Invalid or expired token"}), 401
        g.current_user = user  # available to the route handler
        return fn(*args, **kwargs)
    return wrapper

# Usage — the auth check runs before the handler body
@app.route("/api/v1/reports/<int:report_id>")
@require_auth
def get_report(report_id: int):
    report = Report.query.get_or_404(report_id)
    if report.owner_id != g.current_user.id:
        return jsonify({"error": "Forbidden"}), 403
    return jsonify(report.to_dict())

Note the order: @app.route goes first (outermost), @require_auth goes
second. This matters — see the next section. The pattern extends naturally: you could add a
@require_role("admin") decorator factory that checks g.current_user.role after
@require_auth has already verified the user exists.
Stacking Decorators — Order Matters
When you stack multiple decorators, they apply bottom-up (the decorator closest to the function applies first), but they execute top-down when the function is called. This catches people out.
@decorator_a
@decorator_b
@decorator_c
def my_function():
    pass

# This is exactly equivalent to:
my_function = decorator_a(decorator_b(decorator_c(my_function)))

# When my_function() is called:
# 1. decorator_a's wrapper runs first (outermost)
# 2. decorator_b's wrapper runs second
# 3. decorator_c's wrapper runs third (innermost, closest to the real function)
# 4. The real my_function body runs
# 5. Unwinding: decorator_c → decorator_b → decorator_a

# Practical example: order matters for @timer and @retry
# If timer is outermost, it measures total time including retry wait periods.
# If retry is outermost, it measures only the final successful call.
@timer  # measures: total time across all retry attempts + sleep
@retry(max_attempts=3, delay=1.0)
def sync_with_partner_api(partner_id: int) -> dict:
    ...

# vs.
@retry(max_attempts=3, delay=1.0)
@timer  # measures: only the final successful call
def sync_with_partner_api(partner_id: int) -> dict:
    ...

# Usually you want @timer outermost — it tells you the real wall-clock cost of the operation.
# Think about what "calling this function" means to the caller, then wrap in that order.

When you read a traceback through a stack of decorators, if every decorator applied @functools.wraps correctly, the names will reflect the original functions. If you see a sea of wrapper frames, someone forgot @functools.wraps. The Real Python decorator primer has a good walkthrough of how to debug stacked decorators.

Wrapping Up
The mental model to carry forward: @decorator is fn = decorator(fn).
Everything else — decorators with arguments, class-based decorators, stacked decorators — is a variation on
that one substitution. Use @functools.wraps on every inner wrapper, always forward
*args, **kwargs to the wrapped function, and return its result. Decorators with arguments need
three levels of nesting: the argument function, the decorator, and the wrapper. Class-based decorators are
the right tool to reach for when your decorator needs to maintain state across calls.
For further reading: the
Python glossary entry on decorators
is brief but precise. The original decorator proposal,
PEP 318, is worth reading
for context on why the @ syntax was chosen over alternatives. If you're using decorators
in a codebase that also does a lot of data processing — reading files, transforming records — the patterns
here pair naturally with what's covered in
Python File Handling. And if you're using decorators to
transform collections or build lookup structures,
Python List Comprehensions covers the data
transformation side of that picture. If your decorated functions return JSON output and you want to inspect
it quickly, the JSON Formatter on this site is handy for that.