PyLRU: Least Recently Used (LRU) Cache
Pylru is a pure-Python library implementing a Least Recently Used (LRU) cache. It provides a simple dictionary-like interface via `lrucache` and includes classes for wrapping existing dictionary-like objects (`WriteThroughCacheManager`, `WriteBackCacheManager`) and functions (`FunctionCacheManager`, `lrudecorator`). The library is efficient: the basic operations (lookup, insert, delete) all run in constant time. Releases are infrequent and driven by maintenance needs.
Common errors
- KeyError: 'non_existent_key'
  - cause: Accessing a key that is not in the cache using dictionary-style bracket notation (e.g., `cache['key']`).
  - fix: Use `value = cache.get('key', default_value)` to supply a fallback, or test membership first: `if 'key' in cache: value = cache['key']`.
- Decorated function re-computes results even for the same inputs.
  - cause: Mutable objects passed as arguments to a function decorated with `@lrudecorator`. Cache keys are built from the (hashable) arguments, so an object whose hash changes between calls produces cache misses, while one whose hash stays the same despite changed internal state produces stale hits.
  - fix: Pass only immutable arguments (numbers, strings, tuples of immutable types). If mutable data must be used, convert it to an immutable representation (e.g., a tuple of its contents) before calling the decorated function.
Warnings
- gotcha Modifying a pylru cache (insert, lookup, delete) during iteration (e.g., in a `for` loop) can lead to undefined behavior as it changes the internal order. If iteration is needed while modification might occur, convert `cache.keys()` or `cache.items()` to a `list` first.
- gotcha The `@lrudecorator` only works reliably with functions whose arguments are hashable and whose equality is determined by the argument values, not their identity or mutable state. Passing mutable objects (lists, dictionaries, custom objects without proper `__hash__` and `__eq__` implementations) as arguments can lead to cache misses or stale data.
- gotcha Calling `len(cached)` on a `WriteBackCacheManager` instance triggers an internal `sync()` (dirty entries are flushed to the store so the count is accurate), which can hurt performance if called frequently, depending on the underlying store object.
Install
- pip install pylru
Imports
- lrucache
from pylru import lrucache
- lrudecorator
from pylru import lrudecorator
- WriteThroughCacheManager
from pylru import WriteThroughCacheManager
- FunctionCacheManager
from pylru import FunctionCacheManager
- lruwrap
from pylru import lruwrap
Quickstart
import pylru
# Create an LRU cache with a maximum size of 3
cache_size = 3
cache = pylru.lrucache(cache_size)
# Insert items into the cache
cache['apple'] = 1
cache['banana'] = 2
cache['cherry'] = 3
print(f"Cache after initial inserts: {list(cache.items())}")
# Access 'apple' - it becomes the most recently used
print(f"Accessed 'apple': {cache['apple']}")
print(f"Cache order after accessing 'apple': {list(cache.keys())}")
# Insert a new item - 'banana' (least recently used) should be evicted
cache['date'] = 4
print(f"Cache after adding 'date': {list(cache.items())}")
# Test for membership
print(f"Is 'cherry' in cache? {'cherry' in cache}")
print(f"Is 'banana' in cache? {'banana' in cache}")
# Using the lrudecorator (the cache size is a positional argument)
@pylru.lrudecorator(10)
def fibonacci(n):
    print(f"Computing fibonacci({n})...")
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print("\n--- Decorator Example ---")
print(f"Fib(3): {fibonacci(3)}")  # Computes fib(0)..fib(3)
print(f"Fib(2): {fibonacci(2)}")  # From cache
print(f"Fib(3): {fibonacci(3)}")  # From cache
print(f"Fib(4): {fibonacci(4)}")  # Only fib(4) is newly computed