Leveraging `functools.lru_cache` for Memoization
This snippet illustrates how to use `functools.lru_cache` to memoize the results of a function, significantly improving performance for computationally intensive tasks with repeated inputs.
Understanding Memoization with `lru_cache`
Memoization is an optimization technique where you store the results of expensive function calls and reuse them when the same inputs occur again. `functools.lru_cache` provides a decorator that automatically memoizes a function's results, using a Least Recently Used (LRU) cache to manage the stored values. This is particularly beneficial for recursive functions or functions called frequently with the same arguments.
Code Example: Memoizing a Recursive Fibonacci Function
This code defines a recursive `fibonacci` function, which is known to be computationally expensive for larger values of `n` due to repeated calculations. The `@functools.lru_cache(maxsize=None)` decorator memoizes the results. When `fibonacci(10)` is called the first time, the result is calculated and stored in the cache. The second time `fibonacci(10)` is called, the result is retrieved directly from the cache, avoiding redundant calculations and improving performance significantly. `maxsize=None` means the cache can grow without limit.
```python
import functools

@functools.lru_cache(maxsize=None)  # maxsize=None means an unbounded cache
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Calculate Fibonacci numbers
print(fibonacci(10))  # Output: 55
print(fibonacci(20))  # Output: 6765
print(fibonacci(10))  # Output: 55 (retrieved from cache)
```
Concepts Behind the Snippet
`lru_cache` implements a Least Recently Used (LRU) cache. The cache stores recent function calls and their results. When the cache is full (if `maxsize` is set), the least recently used entry is discarded to make room for new entries. The decorator wraps the original function, intercepting calls, checking for cached results, and storing new results as needed.
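The eviction behavior is easiest to see with a deliberately tiny cache; the `square` function below is just a placeholder for something expensive:

```python
import functools

@functools.lru_cache(maxsize=2)  # keep only the 2 most recently used results
def square(n):
    return n * n

square(1)  # miss: computed and cached
square(2)  # miss: computed and cached
square(1)  # hit: served from cache; marks 1 as most recently used
square(3)  # miss: cache is full, so the least recently used entry (2) is evicted
square(2)  # miss again: 2 was evicted above

print(square.cache_info())  # CacheInfo(hits=1, misses=4, maxsize=2, currsize=2)
```

Note that the hit on `square(1)` is what saves entry 1 from eviction: the LRU policy tracks usage, not insertion order.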
This illustrates dynamic programming principles, specifically memoization which improves performance by reducing computational redundancy.
Real-Life Use Case
Consider a web application that needs to retrieve data from a database or an external API. Caching the results of these calls can significantly reduce the load on the database/API and improve the response time of the application. You can use `lru_cache` to memoize the results of the data retrieval functions, so that subsequent requests for the same data can be served from the cache.
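A sketch of that pattern is below; the `fetch_user` function and its `time.sleep` delay are hypothetical stand-ins for a real database or API call:

```python
import functools
import time

@functools.lru_cache(maxsize=128)
def fetch_user(user_id):
    # Simulates a slow database or API round trip.
    time.sleep(0.1)
    return {"id": user_id, "name": f"user-{user_id}"}

start = time.perf_counter()
fetch_user(42)  # slow: goes to the "database"
first = time.perf_counter() - start

start = time.perf_counter()
fetch_user(42)  # fast: served from the cache
second = time.perf_counter() - start

print(f"first call: {first:.3f}s, cached call: {second:.6f}s")
```

Keep in mind that `lru_cache` never expires entries, so for data that can change you would pair this with periodic `cache_clear()` calls or reach for a TTL cache from a library such as `cachetools`.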
Another common use case is parsing configuration files. Parsing can be expensive, especially for large files. By memoizing the parsing function, you can ensure that the file is only parsed once, even if it's needed multiple times throughout the application's lifecycle.
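A minimal sketch of that idea, assuming a JSON configuration file (created here only to keep the example self-contained):

```python
import functools
import json
import pathlib
import tempfile

@functools.lru_cache(maxsize=None)
def load_config(path):
    # Parsed once per distinct path; later calls reuse the cached dict.
    return json.loads(pathlib.Path(path).read_text())

# Write a small config file for the demonstration.
cfg = pathlib.Path(tempfile.mkdtemp()) / "app.json"
cfg.write_text('{"debug": true, "workers": 4}')

a = load_config(str(cfg))  # parses the file
b = load_config(str(cfg))  # returned from the cache, no parsing
print(a is b)  # True: both names refer to the same cached object
```

Because the cached object is shared between all callers, treat it as read-only; mutating it would silently change what every later call receives.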
Best Practices
- Only memoize pure functions: the same inputs must always produce the same output, with no side effects.
- Make sure every argument is hashable; lists and dicts cannot serve as cache keys.
- Prefer a bounded `maxsize` (the default is 128) unless you know the set of distinct inputs stays small.
- Monitor hit rates with `cache_info()` and call `cache_clear()` when cached data becomes stale.
Interview Tip
In an interview, be prepared to explain how memoization works and why `lru_cache` is an effective way to implement it. Emphasize its performance benefits for computationally intensive tasks with repeated inputs. Understanding caching and LRU eviction strategies is crucial. Also mention the limitations of `lru_cache` and when it is not appropriate (e.g., functions with side effects, unhashable arguments, or results that change over time).
When to Use `functools.lru_cache`
Use `functools.lru_cache` when:
- The function is pure: the same arguments always produce the same result, and there are no side effects.
- The function is expensive to compute and is called repeatedly with the same arguments.
- All arguments are hashable.
- The memory cost of storing results is acceptable relative to the cost of recomputing them.
Memory Footprint
The memory footprint of `lru_cache` depends on the `maxsize` and the size of the cached results. Larger `maxsize` values will consume more memory. Be mindful of the memory usage, especially for unbounded caches (`maxsize=None`). Regularly monitor the cache size using `cache_info()` and clear the cache when it's no longer needed.
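For example, `cache_info()` reports hits, misses, and the current cache size, and `cache_clear()` releases every stored entry:

```python
import functools

@functools.lru_cache(maxsize=None)
def double(n):
    return 2 * n

for n in (1, 2, 1, 3, 1):
    double(n)

# Three distinct arguments were cached; two calls were repeats.
print(double.cache_info())  # CacheInfo(hits=2, misses=3, maxsize=None, currsize=3)

double.cache_clear()  # release all cached entries
print(double.cache_info())  # CacheInfo(hits=0, misses=0, maxsize=None, currsize=0)
```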
Alternatives
Alternatives to `functools.lru_cache` include:
- A hand-rolled dictionary cache, which gives full control over key construction and eviction.
- `functools.cache` (Python 3.9+), a thinner equivalent of `lru_cache(maxsize=None)`.
- `functools.cached_property`, for caching a computed attribute on an instance.
- Third-party libraries such as `cachetools` (TTL, LFU, and other eviction policies), or an external store such as Redis when the cache must be shared across processes.
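As an illustration of the simplest alternative, a minimal hand-rolled memoization decorator might look like this (note it has no eviction policy, so the cache grows without bound):

```python
# A bare-bones memoization decorator built on a plain dict.
def memoize(func):
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

@memoize
def fibonacci(n):
    return n if n <= 1 else fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(30))  # 832040, computed in linear time thanks to the cache
```

Unlike `lru_cache`, this version ignores keyword arguments and offers no introspection, which is exactly the bookkeeping the standard-library decorator handles for you.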
Pros
- A single decorator line adds memoization with no manual bookkeeping.
- The standard-library implementation is fast and thread-safe.
- Built-in introspection and control via `cache_info()` and `cache_clear()`.
Cons
- All arguments must be hashable.
- An unbounded cache (`maxsize=None`) can grow without limit.
- Cached entries keep references to their results alive, delaying garbage collection.
- There is no expiry, making it unsuitable for functions with side effects or results that change over time.
FAQ
What happens if I call a function decorated with `lru_cache` with unhashable arguments?
You will get a `TypeError`, because `lru_cache` uses the arguments as keys in a dictionary (the cache), and dictionary keys must be hashable.
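For instance, passing a tuple works, while passing an equivalent list raises the error:

```python
import functools

@functools.lru_cache(maxsize=None)
def total(values):
    return sum(values)

print(total((1, 2, 3)))  # 6: a tuple is hashable, so it can be a cache key

try:
    total([1, 2, 3])  # a list is unhashable
except TypeError as exc:
    print(f"TypeError: {exc}")
```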
How can I clear the cache of a function decorated with `lru_cache`?
You can use the `cache_clear()` method of the decorated function. For example, if `my_function` is decorated with `@lru_cache`, call `my_function.cache_clear()`.