Python auto refresh cache

There are many blog posts that illustrate caching in Python.
But I could not find any that illustrates my simple idea.
I was searching for a module or a post about how to keep my cache updated every time I request the same key.
I wanted a threaded cache that answers my request immediately and then refreshes the value, so my data will always be fresh.
I know you can achieve this by setting a small TTL value, but my intention is to get the current data from the cache and have the cache refreshed in a separate thread before the next request.
I got this done by combining Python's caching and multi-threading features.
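For comparison, the small-TTL approach mentioned above would look roughly like this (a minimal sketch, assuming the third-party cachetools package; note that a call made after the entry expires blocks until the value is recomputed, instead of being refreshed in the background):

import time
from cachetools import TTLCache, cached

# Entries expire after 5 seconds; the first call after expiry blocks
# while the value is recomputed.
@cached(cache=TTLCache(maxsize=128, ttl=5))
def slow_lookup(a, b):
    time.sleep(2)  # simulate an operation that takes time
    return a + b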

Enough talking, let's write some code.

import functools
import random
import threading
import time

# Module-level dictionary used as the cache store.
my_cache = {}


def cach_it(_func=None, *, as_daemon=False):

    def decorator_cache_it(func):

        @functools.wraps(func)
        def wrapper_cach_it(*args, **kwargs):
            # Build a hashable key from the function name and its arguments
            # (kwargs items are included so different keyword values get different keys).
            key = (func.__name__, args, tuple(sorted(kwargs.items())))
            if key in my_cache:
                # Cache hit: answer immediately with the cached value...
                result = my_cache[key]

                def threaded_func():
                    my_cache[key] = func(*args, **kwargs)

                # ...then refresh the cached value in a background thread for the next call.
                t = threading.Thread(target=threaded_func, daemon=as_daemon)
                t.start()

            else:
                # Cache miss: compute the value and store it in the cache.
                result = func(*args, **kwargs)
                my_cache[key] = result
            return result
        return wrapper_cach_it

    # Support both @cach_it and @cach_it(as_daemon=True).
    if _func is None:
        return decorator_cache_it
    else:
        return decorator_cache_it(_func)


@cach_it(as_daemon=True)
def sample_func(a, b):
    time.sleep(2)  # simulate an operation that takes time
    return (a + b) * random.random()


if __name__ == "__main__":
    for i in range(10):
        print(sample_func(5, 5))

        time.sleep(1)

And here is the output:

9.57128226715124
9.57128226715124
9.57128226715124
2.886742714779955
3.8942583934686836
6.492309167842186
6.492309167842186
0.3771087685702834
6.994070882085098
9.45145128039504

The idea is simple.
I am using a plain Python dictionary as the cache for simplicity.
If the key is not found in the dictionary, execute the function and then cache the result in the dictionary.
If the key already exists, return the result from the dictionary and update the dictionary in a background thread, so the next request gets fresh data.
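
As a quick illustration, here is how you could peek into my_cache after a single call to see what the key looks like (the cached value will differ on every run because of random.random()):

sample_func(5, 5)
print(my_cache)
# e.g. {('sample_func', (5, 5), ()): 7.43}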

From the output above you can see that the same result is returned until the background thread updates the cache.

The sleep calls are used to simulate operations that take time.

I hope you can use this method for better caching in your projects.

Your feedback is highly appreciated.

Top comments (2)

Sergey Kislyakov

Hey, could you explain why there's an argument called _func in your decorator? Is it there for wrapping functions somehow, like this: cached_func = cach_it(some_func, as_daemon=True) instead of cached_func = cach_it(as_daemon=True)(some_func)? I've never seen it done that way before.

BalighMehrez

It is to make the decorator work with or without arguments, as discussed in this article:
realpython.com/primer-on-python-de...
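
A minimal sketch of what that means, assuming the cach_it decorator from the post is in scope (the function names here are just for illustration):

@cach_it                   # applied without arguments: _func receives double directly
def double(x):
    return x * 2


@cach_it(as_daemon=True)   # applied with arguments: _func is None, so the factory path runs
def triple(x):
    return x * 3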