Ryan Febriansyah
1 Guides
1.1 Purpose
1.2 Installation
1.3 Testing
2 Usage
2.1 Using LRUCache(capacity=128, seconds=60*15)
2.2 set() method
2.3 get_dict() method
2.4 get_duration() method
2.5 get_lru_element() method
2.6 get_capacity() method
2.7 get_cache() method
2.8 get_ttl() method
2.9 clear_all() method
2.10 clear_cache_key() method
2.11 is_empty() method
2.12 @lru_cache(capacity=128) decorator
2.13 @lru_cache_time(capacity=128, seconds=60*15) decorator
2.14 Enable thread_safe parameter
3 Further Examples
3.1 Backported with Django
3.2 Backported with Flask
4 References
4.1 lru.lrucache
4.2 lru.utils
4.3 lru.decorators
5 Caveats
6 Miscellaneous
7 Roadmap
Index
CHAPTER
ONE
GUIDES
LRUCache is a package for tracking in-memory data stores using a cache replacement algorithm (LRU cache). The priority for storing or removing data is based on a min-max heap / basic priority queue rather than the OrderedDict module provided by Python.
1.1 Purpose
The purpose of this package is, at a minimum, to be able to dynamically track, insert, and remove the least frequently used data in memory or in an element. Another purpose: with the Python decorator or the corresponding method, it is also possible to figure out whether the cache is full or not (so-called LRU eviction). Since the min-max heap gives O(1) access to its minimum and maximum elements (with O(log n) insertion), it is also possible to efficiently access the stored in-memory data based on how frequently it is used.
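The eviction behaviour described above can be sketched in a few lines. The package itself tracks priority with a min-max heap, so the OrderedDict-based TinyLRU class below is only a hypothetical illustration of LRU semantics, not this package's implementation:

```python
from collections import OrderedDict

# Illustrative sketch only: the package tracks priority with a min-max
# heap, but the same LRU eviction semantics can be shown with an
# ordered mapping. TinyLRU is a hypothetical name, not this package's API.
class TinyLRU:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.store = OrderedDict()

    def set(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)     # refresh recency
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)         # mark as recently used
        return self.store[key]

cache = TinyLRU(capacity=2)
cache.set(1, "foo")
cache.set(2, "bar")
cache.get(1)          # key 1 is now most recently used
cache.set(3, "baz")   # capacity exceeded: key 2 is evicted
```

Here key 2, the least recently used entry, is the one evicted once the capacity of 2 is exceeded.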
1.2 Installation
LRUCache only works on Python version 3.5 and above, and you can install it with pip.
Note that since the latest version (v1.0.1) this package only supports Python version 3.6 and above; Python 3.5 support will be dropped in the next release.
1.3 Testing
To run the tests, you can use the command python -m unittest tests
CHAPTER
TWO
USAGE
Here is a short explanation of how to use this LRU cache. Below is a simple configuration along with explanations of the several methods provided by this package.
2.1 Using LRUCache(capacity=128, seconds=60*15)
First, initialize an LRUCache object. When you don't pass any arguments, the maximum capacity of the cache defaults to 128 and the maximum duration of the cache defaults to 15 minutes. For example:
from lru.lrucache import LRUCache

foo = LRUCache(capacity=128, seconds=60*15)
2.2 set() method
Set an object to be cached in the cache element, given the key parameter as an integer and a value parameter. For example:
foo.set(1, "foobar")
foo.set(2, "bar")
get() method
Get an object based on its key in the cache element and its access time, given the key parameter as an integer. This get() method can also be used to track which objects are called often; these will later be identified as recently used objects in the cache element. An object that is often called by this method will be placed at the front of the cache element returned by get_lru_element(). For example:
foo.get(1)
foo.get(1) # you can iterate for calling an object
foo.get(2)
2.3 get_dict() method
Method that returns the whole dictionary of objects in the cache element. For example:
foo.get_dict()
2.4 get_duration() method
Method for getting the duration of the cache. Returns True if the duration has exceeded the expired time, otherwise returns False when the duration is at or below the expired time. The expired time is set to 3600 seconds by default. For example:
foo.get_duration()
2.5 get_lru_element() method
Method for retrieving objects based on their key in the cache element and the duration of access to the dictionary. If an object has not been called by the get() method, objects with a short access time are placed at the beginning of the cache element; if an object has been called by get(), its placement depends on how many times it has been called. In this case, it is considered recently used. For example:
foo.get_lru_element()
2.6 get_capacity() method
Get the cache capacity. Returns True if the cache is full, otherwise returns False when the cache is not full. For example:
foo.get_capacity()
2.7 get_cache() method
Get a cached element based on its key. Returns True if the element has the key, otherwise returns False when the element doesn't have the key. Given the key parameter as an integer. For example:
foo.get_cache(1)
2.8 get_ttl() method
Get the time-to-live (TTL) duration of a cached object. Returns a value that is the remaining time out of the cache duration set previously. Given the key parameter as an integer. The countdown is reduced by one second at a time over the cache duration we set before: if you set it to 5 seconds, calling this method a second later will display 4, which is the remaining duration of our cache, and so on until the displayed result is None. For example:
foo.get_ttl(1)
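The countdown described above can be sketched as plain arithmetic on timestamps. The remaining_ttl function below is a hypothetical illustration, not the package's get_ttl() implementation:

```python
import time

# Hypothetical sketch of a TTL countdown like the one described:
# the remaining time is the configured duration minus the elapsed
# time, and None once it has run out.
def remaining_ttl(stored_at, seconds, now=None):
    now = time.monotonic() if now is None else now
    left = int(seconds - (now - stored_at))
    return left if left > 0 else None

t0 = 100.0                                               # pretend store time, t=100s
ttl_after_1s = remaining_ttl(t0, seconds=5, now=101.0)   # 4 seconds left
ttl_after_6s = remaining_ttl(t0, seconds=5, now=106.0)   # expired
```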
2.9 clear_all() method
Remove all cached objects in the cache element. For example:
foo.clear_all()
2.10 clear_cache_key() method
Remove a cached object in the element based on its key. Given the key parameter as an integer for removing the cached object. For example:
foo.clear_cache_key(1)
2.11 is_empty() method
Check whether the current cache element is empty or not. Returns True if the cache element is empty and False when the cache element contains objects. For example:
foo.is_empty()
2.12 @lru_cache(capacity=128) decorator
A Python decorator using the LRUCache class to cache the objects within a function. The default capacity is 128 if you do not define it. For example:
@lru_cache(capacity=5)
def test_lru(x):
print("Calling f(" + str(x) + ")")
return x
test_lru.set(1, "foo")
test_lru.set(2, "test")
test_lru.set(3, "foos")
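To see what a decorator like this does under the hood, here is a hypothetical, self-contained sketch of a capacity-bound caching decorator (tiny_lru_decorator is illustrative only and not part of this package):

```python
import functools

# Hypothetical sketch of a capacity-bound caching decorator (not this
# package's implementation): remember results per argument and evict
# the least recently used entry once capacity is reached.
def tiny_lru_decorator(capacity=5):
    def wrap(func):
        cache = {}
        order = []
        @functools.wraps(func)
        def inner(x):
            if x in cache:
                order.remove(x)
                order.append(x)        # refresh recency
                return cache[x]
            if len(cache) >= capacity:
                oldest = order.pop(0)  # evict least recently used
                del cache[oldest]
            result = func(x)
            cache[x] = result
            order.append(x)
            return result
        return inner
    return wrap

calls = []

@tiny_lru_decorator(capacity=2)
def square(x):
    calls.append(x)    # records real (non-cached) invocations
    return x * x

square(2)
square(2)              # served from the cache: no new invocation
square(3)
```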
2.13 @lru_cache_time(capacity=128, seconds=60*15) decorator
A Python decorator for the LRUCache class with an expiry time for the cached objects. This is a mock only and probably not ready to be bumped into a major version; if you want to try it and you get an error or an unexpected result, please raise an issue. For example:
@lru_cache_time(capacity=5, seconds=15)
def test_lru(x):
print("Calling f(" + str(x) + ")")
return x
test_lru.set(1, "foo")
test_lru.set(2, "test")
The difference in setting the cache duration with or without the decorator lies in when we set the duration value. By using the @lru_cache_time decorator, the cache will at least be compacted and dynamically cleared once the duration exceeds the maximum duration (15 minutes by default).
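The expiry rule can be sketched with a timestamp per entry. The TimedEntry class below is a hypothetical illustration of time-based eviction, not this package's code:

```python
import time

# Hypothetical sketch of time-based expiry (not the package's code):
# each entry remembers when it was stored, and a lookup past the
# allowed duration treats the entry as expired.
class TimedEntry:
    def __init__(self, value, seconds):
        self.value = value
        self.expires_at = time.monotonic() + seconds

    def expired(self):
        return time.monotonic() >= self.expires_at

entry = TimedEntry("foobar", seconds=0.05)
fresh = entry.expired()        # just stored: not yet expired
time.sleep(0.1)
stale = entry.expired()        # past the duration: expired
```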
2.14 Enable thread_safe parameter
By setting the thread_safe parameter to True, it becomes possible to safely call a function from several threads at once. For example, if we create a shared task (functions a and b) where the shared task invokes a resource such as an object from function c, the object can safely be called and executed from both functions a and b (without the thread_safe parameter, executing two functions against one resource can end in a race or a deadlock). Note that enabling thread_safe may reduce performance. For example:
foo = LRUCache(capacity=128, seconds=60*15, thread_safe=True)
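What the thread_safe option guards against can be sketched with Python's own threading.Lock. The code below is a self-contained illustration, not this package's implementation:

```python
import threading

# Hypothetical sketch of what a thread_safe option guards against:
# two workers mutate one shared store; the lock keeps each update
# atomic so no increment is lost.
store = {"hits": 0}
lock = threading.Lock()

def worker():
    for _ in range(10_000):
        with lock:                 # serialize access to the store
            store["hits"] += 1

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# store["hits"] is exactly 20000 because every update was serialized
```

Without the lock, the two workers could interleave their read-modify-write steps and lose increments; this is the kind of hazard the thread_safe parameter is meant to avoid, at some cost in performance.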
CHAPTER
THREE
FURTHER EXAMPLES
3.1 Backported with Django
As a further example, hopefully this package can be backported with Python web frameworks such as Django or Flask, which can be implemented and supported in the JSON field area, since the return value of this LRUCache is a dictionary, which maps very naturally to the JSON type. For simple usage in Django, you must add LRUCache to the installed applications within the Django settings like this:
INSTALLED_APPS = [
...
'lru',
...
]
Then you can create a function that wraps the objects you want to cache with JsonResponse from the Django API. The result of this function will return your objects in the form of a JSON dict in your web browser, for example (the view below is an illustrative sketch; the cached key and value are arbitrary):
# views.py
from django.http import JsonResponse
from lru.lrucache import LRUCache

foo = LRUCache(capacity=128, seconds=60*15)

def test_lru(request):
    foo.set(1, "foobar")
    return JsonResponse(foo.get_dict(), safe=False)
Don't forget to pass our view function into the Django urls patterns:
# urls.py
urlpatterns = [
path('', views.test_lru),
]
After that you can run the Django server and see it in a web browser at your endpoint URL. Please remember that, at the moment, each time you wrap the object with JsonResponse you need to set the safe parameter to False, because an object set with LRUCache is not stored as a dict type, while JsonResponse itself expects a dict type.
3.2 Backported with Flask
Simple usage in Flask can probably be achieved by combining the LRUCache decorator with the Flask router decorator. Ensure you have already installed Flask with pip install flask, and then create a new file such as the following example (the route decorator is placed on top here so that Flask registers the cached function):
# app.py
from flask import Flask
from lru.decorators import lru_cache_time

app = Flask(__name__)

@app.route("/")
@lru_cache_time(seconds=60)
def hello():
    return "Hello World!"

if __name__ == "__main__":
    app.run()
After that, you can run the Flask app with a command like python app.py and then open it in the browser at the localhost address shown. As a note, the @lru_cache_time decorator itself will set the cache capacity and the duration of the cache. Since here we only return plain text / HTML with a Hello World output, I assume we don't need to set the cache capacity. Normally, using LRUCache at the web-environment level can be done under assumptions like:
• You want to build web-based streaming services
• You do object modeling in the database, such as creating objects for a song that is least or most recently heard.
• You set the song object with LRUCache, so that every time you open your site, that song will appear.
Other examples of using this LRU cache: displaying the books most often found by search keywords, dynamically removing objects that are least used or not used at all, storing data about users who frequently visit our site, and many others.
CHAPTER
FOUR
REFERENCES
4.1 lru.lrucache
class lru.lrucache.LRUCache(capacity: int = 128, seconds: int = 900, thread_safe: bool = False)
Initial class representing the LRUCache, given several parameters:
Parameters
• capacity – param to set the cache capacity; the default is 128
• seconds – param to set the duration for storing the cache; the default is 900 seconds (15 minutes)
• thread_safe – param to enable/disable the thread-safe option; the default is False
clear_all()
Clear all cached objects in the element.
clear_cache_key(key: int) → None
Clear a cached object in the element based on its key.
get(key: int) → int
Get an object based on its key in the cache element.
Parameters key – given key parameter as an integer
get_cache(key: int) → int
Get a cached element based on its key; returns True if the element has the key, otherwise returns False when the element doesn't have the key.
get_capacity()
Get the cache capacity; returns True if the cache is full, otherwise returns False when the cache is not full.
get_dict()
Returns the cache element as a dict type.
get_duration(expired_time: int = 3600) → int
Get the duration of the cache; returns True if the duration has exceeded the expired time, otherwise returns False when the duration is at or below the expired time.
get_lru_element()
Returns a dict type based on the keys in the cache element.
get_ttl(key: int) → Optional[Union[int, bool]]
Get the time-to-live of an object based on its cache key. Returns False if the object doesn't have a key or its time-to-live has expired.
is_empty()
Check whether the cache element is empty or not; returns True if it is empty, otherwise returns False if it is not empty.
4.2 lru.utils
class lru.utils.BypassThreadSafe
Helper class used when the thread_safe option is enabled.
4.3 lru.decorators
@lru_cache(capacity=3)
def foo(x):
pass
@lru_cache_time(capacity=3, seconds=60)
def foo(x):
pass
CHAPTER
FIVE
CAVEATS
CHAPTER
SIX
MISCELLANEOUS
Why use object-level caching instead of a filtering method or a get method based on an API? Ideally, a cache is fast: reading or accessing data from a cache takes less time than reading the data from anywhere else. If, for example, you use filtering methods, you are accessing and fetching the object from your database. Caching the object, by contrast, improves performance by keeping recent or most-used data in memory rather than repeatedly computing the object from the database.
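The point above can be illustrated with Python's built-in functools.lru_cache (not this package): the first lookup pays the "database" cost, and repeated lookups are served from memory:

```python
import functools

# Hypothetical illustration: the counter stands in for expensive
# database queries; only the first call for a given id reaches the
# "database", the rest come from the cache.
calls = {"db": 0}

@functools.lru_cache(maxsize=128)
def fetch_song(song_id):
    calls["db"] += 1               # stands in for an expensive DB query
    return (song_id, "song-%d" % song_id)

fetch_song(1)
fetch_song(1)
fetch_song(1)   # still only one "database" hit: the rest were cached
```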
CHAPTER
SEVEN
ROADMAP
Since this package is already at a stable version (v1.0.1), the next updates will bring improvements in several features such as:
• [x] Use classes as decorators for caching the objects
• [x] Add expired time for caching objects
• [x] Add thread safe parameter
• [ ] Scale the LRUCache capacity
• [ ] Backported and integrated with Django request and response
• [x] Write unittest for LRUCache
• [ ] Improve code coverage up to 90 %
PYTHON MODULE INDEX
l
lru.decorators
lru.lrucache
lru.utils
INDEX
B
BypassThreadSafe (class in lru.utils)
C
clear_all() (lru.lrucache.LRUCache method)
clear_cache_key() (lru.lrucache.LRUCache method)
G
get() (lru.lrucache.LRUCache method)
get_cache() (lru.lrucache.LRUCache method)
get_capacity() (lru.lrucache.LRUCache method)
get_dict() (lru.lrucache.LRUCache method)
get_duration() (lru.lrucache.LRUCache method)
get_lru_element() (lru.lrucache.LRUCache method)
get_ttl() (lru.lrucache.LRUCache method)
I
is_empty() (lru.lrucache.LRUCache method)
L
lru.decorators (module)
lru.lrucache (module)
lru.utils (module)
lru_cache() (in module lru.decorators)
lru_cache_time() (in module lru.decorators)
LRUCache (class in lru.lrucache)
M
module
    lru.decorators
    lru.lrucache
    lru.utils
S
set() (lru.lrucache.LRUCache method)
T
ttl() (lru.lrucache.LRUCache property)