VERSIONS
● Note: sample code tested using Python 3.10
OVERVIEW
● A cache is a performance trade-off:
● Gains speed
● Costs memory
● Expensive calculations, or data that is costly to fetch, are stored in memory
for later re-use
● Many programs follow a principle of locality:
● More likely to use data that was recently accessed
● Python provides the @lru_cache decorator to quickly provide caching
functionality to almost any function
NEXT UP...
Caching policies
TABLE OF CONTENTS
1. Overview
2. Caching
3. LRU In Python
4. More lru_cache Features
5. Quick Intro To Decorators
6. LRU + Time Expiration Decorator
7. Summary
CACHING
● Software often exhibits a property of locality
● Recently used things are more likely to be used again
● Content fetched from memory
● Content fetched from disk
● Computed values
● Content fetched from the network
● Caching uses some memory to store a result in anticipation of its re-use
● Useful when the cost of computing or fetching something is high
● Caching happens at several levels:
● Hardware strategies built into your computer
● Memory strategies built into your Operating System
● Software strategies you can use in your code
CACHING
[Diagram sequence: main memory drawn as ten numbered slots (0–9) of binary data. A request for slot 5 fetches its contents from memory; a request for slot 2 places a copy of that slot in the cache; a later request ("?") is checked against the cache first before falling back to memory.]
WHAT TO KEEP
● Several styles of cache with different rules about what to keep:
● FIFO: First-In/First-Out
● LIFO: Last-In/First-Out
● LRU: Least Recently Used
● MRU: Most Recently Used
● LFU: Least Frequently Used
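The policies above differ only in which entry gets discarded when the cache is full. A small sketch (with a hypothetical `simulate` helper) contrasting FIFO and LRU on the same request sequence, chosen so the two policies end up keeping different entries:

```python
from collections import OrderedDict

def simulate(policy, requests, capacity=3):
    """Return the final cache contents under a FIFO or LRU policy."""
    cache = OrderedDict()
    for key in requests:
        if key in cache:
            if policy == "LRU":
                cache.move_to_end(key)  # a hit refreshes recency; FIFO ignores hits
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the entry at the front
            cache[key] = True
    return list(cache)

requests = [3, 5, 9, 3, 1, 4]
print(simulate("FIFO", requests))  # [9, 1, 4] -- 3 evicted: inserted earliest
print(simulate("LRU", requests))   # [3, 1, 4] -- 3 kept: it was re-used
```

FIFO evicts 3 because it was the first entry inserted; LRU keeps 3 because the later hit made it recently used and evicts 9 instead.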
FIFO CACHING
[Diagram: request sequence 3 5 3 9 1 4 fills a small cache; when full, the entry inserted earliest is evicted first.]
LIFO CACHING
[Diagram: same request sequence; when full, the entry inserted most recently is evicted first.]
LRU CACHING
[Diagram: same request sequence; when full, the entry that has gone unused the longest is evicted first.]
NEXT UP...
LRU in Python
USE CASES
● Your hardware and your Operating System will use caching to improve
performance
● Python provides an LRU Cache decorator
● Possible uses:
● Long calculations
● Network access
FIBONACCI SEQUENCE
1 1 2 3 5 8 ...
(each term is the sum of the two preceding terms)
LRU DECORATOR
● The functools module provides the lru_cache decorator
● Wrap your function for instant caching
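A minimal example, using the Fibonacci sequence from the previous slide. Without the cache the naive recursion is exponential; with it, each value is computed once:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # maxsize=None: cache every result, no eviction
def fib(n):
    """Naive recursive Fibonacci -- exponential time without caching."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))  # 354224848179261915075, returned almost instantly
```

The uncached version of `fib(100)` would effectively never finish; the cached version makes roughly 100 real calls.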
RESTRICTIONS
● Cached results are stored in a dict, keyed by the call’s arguments
● All wrapped function arguments must be hashable
● A different keyword-argument order produces a separate key
Keys
foo(a=3, b=4) != foo(b=4, a=3)
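Both restrictions can be observed directly with `cache_info()`; the counts below assume CPython's current key-building behavior, which preserves keyword order:

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def foo(a, b):
    return a + b

foo(a=3, b=4)   # miss: result computed and stored
foo(a=3, b=4)   # hit: identical key
foo(b=4, a=3)   # same result, but a MISS -- keyword order changes the key
print(foo.cache_info())  # CacheInfo(hits=1, misses=2, maxsize=32, currsize=2)

try:
    foo([1, 2], [3, 4])   # lists are unhashable, so no key can be built
except TypeError:
    print("unhashable arguments raise TypeError")
```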
NEXT UP...
More lru_cache features
CACHE OPTIONS
● Additional arguments:
● maxsize -- limits how many entries are kept; defaults to 128 (None means unbounded)
● typed -- when True, arguments of different types (e.g. 3 and 3.0) are cached separately
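A quick demonstration of typed: with typed=True, an int argument and an equal float argument get separate cache entries:

```python
from functools import lru_cache

@lru_cache(maxsize=128, typed=True)
def double(x):
    return x * 2

double(3)     # cached under the int key
double(3.0)   # typed=True: the float 3.0 gets its own entry
print(double.cache_info())  # misses=2, hits=0 -- no sharing between types
```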
LRU IMPLEMENTATION
● Parts of functools are implemented in both Python and C
● You can read the Python source to see how lru_cache is built
● Thread safe
● The cache is a dictionary, for quick hit/miss lookup
● Keys are made from the wrapped function’s arguments
● A circular doubly linked list tracks which entry is least recently used
● Built using lists of lists
● All of this takes up memory!
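A toy sketch of the LRU policy itself, using `collections.OrderedDict` -- for illustration only; CPython's real implementation uses a plain dict plus a hand-built circular doubly linked list for speed:

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache illustrating the policy, not CPython's implementation."""

    def __init__(self, maxsize=128):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None                   # cache miss
        self._data.move_to_end(key)       # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(maxsize=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" is now most recently used
cache.put("c", 3)       # cache full: evicts "b"
print(cache.get("b"))   # None -- "b" was evicted
```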
NEXT UP...
Decorators Tangent
WRITING DECORATORS
● A decorator is a function that wraps another function
● You can write your own decorators
● Typically used when pre- and post-call behavior is needed
● Example: timing how long a function took
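The timing example mentioned above might look like this -- `timed` and `slow_add` are illustrative names, not part of any library:

```python
import functools
import time

def timed(func):
    """Decorator that reports how long each call took."""
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()          # pre-condition: note the start time
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start  # post-condition: measure and report
        print(f"{func.__name__} took {elapsed:.6f}s")
        return result
    return wrapper

@timed
def slow_add(a, b):
    time.sleep(0.1)
    return a + b

slow_add(2, 3)  # prints something like: slow_add took 0.100123s
```

Note the use of `functools.wraps`: without it, the wrapped function would report `wrapper` as its name, which also confuses debugging and documentation tools.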
NEXT UP...
LRU + time expiration decorator
AUGMENTING LRU
● Some data is time sensitive
● Don’t want to return a cached result if it is too old
● Recency of use needs to be combined with an expiration time
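One common way to combine lru_cache with expiration is to fold a time bucket into the cache key, so old entries simply stop matching. A sketch with a hypothetical `lru_cache_with_ttl` helper:

```python
import functools
import time

def lru_cache_with_ttl(maxsize=128, ttl=60):
    """Hypothetical helper: lru_cache plus time-based expiration.
    Entries expire implicitly because the hidden time bucket changes."""
    def decorator(func):
        @functools.lru_cache(maxsize=maxsize)
        def _cached(_bucket, *args, **kwargs):
            return func(*args, **kwargs)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bucket = int(time.monotonic() // ttl)  # new value every ttl seconds
            return _cached(bucket, *args, **kwargs)

        wrapper.cache_info = _cached.cache_info  # expose cache statistics
        return wrapper
    return decorator

calls = 0

@lru_cache_with_ttl(maxsize=32, ttl=60)
def slow_square(x):
    global calls
    calls += 1
    return x * x

slow_square(4)
slow_square(4)   # served from the cache: same time bucket
print(calls)     # 1 -- the function body ran only once
```

One caveat of this bucket approach: an entry cached just before a bucket boundary expires almost immediately, so the ttl is an upper bound rather than a guarantee.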
NEXT UP...
Summary
SUMMARY
● Caching can deliver significant speed improvements at the cost of memory
● There are a variety of caching policies
● LRU -- Least Recently Used: keeps recently accessed entries and evicts the
ones unused the longest
● Python provides the @lru_cache decorator to turn almost any function
into a cached version
● Decorators can wrap other decorators to augment their capabilities
FURTHER INVESTIGATION
● Cache Replacement Policies:
https://en.wikipedia.org/wiki/Cache_replacement_policies
● Python Coding Interviews; Tips & Best Practices: functools Module
https://realpython.com/lessons/functools-module/
● Primer on Python Decorators:
https://realpython.com/primer-on-python-decorators/