Python cache with lru_cache

The functools.lru_cache in Python is a decorator that provides a simple mechanism for caching the results of function calls. "LRU" stands for "Least Recently Used," which is a caching strategy that discards the least recently used items first. The lru_cache decorator is useful for improving the performance of functions that are called multiple times with the same arguments by avoiding redundant calculations.

How It Works

  • When you decorate a function with @lru_cache, the results of function calls are stored in a cache.
  • If the function is called again with the same set of arguments, the cached result is returned instead of re-evaluating the function.
  • The cache has a maximum size, specified by the maxsize parameter, which determines how many unique sets of arguments/results it can store.
  • Once the cache is full, the least recently used entries are removed to make room for new ones (the sketch below walks through this).
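
As a quick sketch of this behaviour (the function slow_square and the call counter are illustrative, not part of functools), a repeated call never re-runs the body, and a full cache evicts its least recently used entry:

from functools import lru_cache

call_count = 0  # counts how many times the function body actually runs

@lru_cache(maxsize=2)
def slow_square(n):
    global call_count
    call_count += 1
    return n * n

slow_square(2)     # computed and stored in the cache
slow_square(2)     # same argument: returned from the cache, body does not run
slow_square(3)     # computed and stored; cache now holds entries for 2 and 3
slow_square(4)     # cache is full, so the least recently used entry (2) is evicted
print(call_count)  # 3 -- only three distinct computations were performed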

Requirements

  • The function being decorated must take hashable arguments, as the arguments are used as keys in the cache.
  • Immutable types like integers, strings, and tuples are hashable. Mutable types like lists or dictionaries are not (see the sketch below).
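
For instance (total below is just an illustrative name), a tuple argument can be cached, while a list raises an error:

from functools import lru_cache

@lru_cache(maxsize=None)
def total(values):
    return sum(values)

print(total((1, 2, 3)))   # 6 -- tuples are hashable, so the call is cached
# total([1, 2, 3])        # TypeError: unhashable type: 'list'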

Usage

The lru_cache decorator is part of the functools module, so you need to import it before using it.
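
A minimal usage sketch (expensive is a placeholder name for your own function):

from functools import lru_cache

@lru_cache(maxsize=128)      # 128 is also the default maxsize
def expensive(x):
    # ... some costly computation ...
    return x * x

# Since Python 3.8 the decorator can also be applied without arguments:
# @lru_cache
# def expensive(x): ...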

Parameters

  • maxsize: An integer that specifies the maximum number of results the cache can hold; the default is 128. If set to None, the cache can grow without bound. If set to a positive integer, the cache discards the least recently used entries to make room for new ones once maxsize is reached.
  • typed: If set to True, arguments of different types will be cached separately. For instance, f(3) and f(3.0) will be treated as distinct calls when typed=True, as the example below shows.
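
A small illustration of both parameters, using a throwaway function named describe:

from functools import lru_cache

@lru_cache(maxsize=None, typed=True)
def describe(x):
    return f"{x!r} has type {type(x).__name__}"

print(describe(3))                     # 3 has type int
print(describe(3.0))                   # 3.0 has type float -- a separate entry because typed=True
print(describe.cache_info().currsize)  # 2 -- cache_info() is covered further below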

Example

Here is a simple example demonstrating how to use lru_cache:

from functools import lru_cache

@lru_cache(maxsize=32)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)


# Example usage
print([fibonacci(n) for n in range(10)])

In this example:

  • We compute the Fibonacci sequence using a recursive function.
  • The @lru_cache(maxsize=32) decorator caches the results of fibonacci() so already-computed Fibonacci numbers are not recalculated.
  • This is highly beneficial: without caching, the recursive calls would repeat the same calculations many times over, slowing down the computation for larger values of n.
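
To make the saving concrete, here is an illustrative comparison that counts how often each function body runs (naive_fibonacci, cached_fibonacci, and the counters are ad-hoc names, not part of the original example):

from functools import lru_cache

naive_calls = 0

def naive_fibonacci(n):
    # Same recursion as above, but without the cache
    global naive_calls
    naive_calls += 1
    if n < 2:
        return n
    return naive_fibonacci(n - 1) + naive_fibonacci(n - 2)

cached_calls = 0

@lru_cache(maxsize=None)
def cached_fibonacci(n):
    global cached_calls
    cached_calls += 1
    if n < 2:
        return n
    return cached_fibonacci(n - 1) + cached_fibonacci(n - 2)

naive_fibonacci(20)
cached_fibonacci(20)
print(naive_calls)   # 21891 -- the naive version repeats work exponentially
print(cached_calls)  # 21    -- the cached body runs once per distinct n (0 through 20)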

Other Considerations

  • The decorator is thread-safe, making it useful in multi-threaded applications.
  • You can inspect cache statistics with cache_info() and clear the cache with cache_clear(); both are exposed as attributes of the decorated function.

# View cache statistics: a named tuple of (hits, misses, maxsize, currsize)
print(fibonacci.cache_info())
# Clear the cache (this also resets the statistics)
fibonacci.cache_clear()

This makes lru_cache a powerful tool for optimizing performance-critical parts of a Python application where repeated computations with identical inputs occur.