One of the most common examples given for Python decorators is a decorator that tracks how long the wrapped function took to execute. While this is very useful in and of itself, it can cause issues when you apply it in production.
The issue I faced was that my servers were too slow, but only under production load. To find out why, I first added simple timing decorators to everything I thought might be slow, hoping to catch all the slow calls and maybe find some patterns. This approach worked in the sense that I did find the slow parts quickly, but it produced far more log output than before, and I wasn’t really interested in most of this timing information.
What I really wanted was a timing decorator that would log only when the wrapped callable took too long to execute. But there were still some calls that I always wanted to log, for accurate statistics. I figured the best way was to make my decorator take a threshold argument with a reasonable default that I could override when needed.
Now, while I have written decorators before, this was the first one that called for optional arguments. Python treats decorators that take no arguments slightly differently from those that require arguments, which complicates things a bit. The sample in the Python decorator library is almost scary! I think my approach is nice and simple, yet fairly sophisticated:
from time import time
import logging
import functools

log = logging.getLogger(__name__)

def time_slow(f=None, logger=log, threshold=0.01):
    def decorated(f):
        @functools.wraps(f)
        def wrapper(*args, **kw):
            start = time()
            try:
                ret = f(*args, **kw)
            finally:
                duration = time() - start
                if duration > threshold:
                    logger.info('slow: %s %.9f seconds',
                                f.__name__, duration)
            return ret
        return wrapper
    if f is not None:
        return decorated(f)
    return decorated
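The f=None check at the end is what handles both invocation styles: a bare @time_slow receives the function directly, while @time_slow(threshold=0) calls time_slow with only keyword arguments, so it must return the actual decorator. Stripped down to the bare mechanics (the names here are mine, purely for illustration):

```python
def flexible(f=None, greeting='hello'):
    # the real decorator, which always receives the function
    def decorated(f):
        def wrapper():
            return '%s from %s' % (greeting, f.__name__)
        return wrapper
    if f is not None:    # bare @flexible: f is the function itself
        return decorated(f)
    return decorated     # @flexible(...): f is None, return the decorator

@flexible                # equivalent to: a = flexible(a)
def a():
    pass

@flexible(greeting='hi') # equivalent to: b = flexible(greeting='hi')(b)
def b():
    pass
```

Here a() returns 'hello from a' and b() returns 'hi from b', so both spellings end up going through the same decorated function.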
This decorator can be placed on a callable without any arguments, or with a custom logger or threshold value. In other words, both of these would work:
@time_slow
def myfast():
    pass

@time_slow(threshold=0)
def myslow():
    from time import sleep
    sleep(0.001)
and typically only calling myslow would produce log output. I chose 0.01 seconds as a reasonable default threshold, but this of course depends a lot on the use case. The log line includes the slow function’s name, as well as the duration formatted with 9 decimals to avoid exponential notation, which makes the log output easier to work with (with sort -n, for example). I have used just a single threshold, but an easy improvement would be to pass a list of thresholds and log at different levels depending on the duration.