https://medium.com/lambda-automotive/python-and-lru-cache-f812bbdcbb51
from functools import lru_cache

Option 1: @lru_cache(maxsize=128, typed=False)  # the defaults: maxsize=128, not sensitive to argument type
Option 2: @lru_cache(maxsize=4)
Option 3: @lru_cache(None)  # maxsize is unlimited

When to use: functions that are repeatedly called with the same arguments. No repetition, no benefit.
https://docs.python.org/3/library/functools.html
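A minimal sketch of the idea (the recursive Fibonacci function is only an illustrative example, not from the original post):

from functools import lru_cache

@lru_cache(maxsize=None)  # Option 3: unbounded cache
def fib(n):
    # Naive recursion repeats the same sub-calls, so caching pays off.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040, computed with 31 real calls instead of ~2.7 million
print(fib.cache_info())  # CacheInfo(hits=28, misses=31, maxsize=None, currsize=31)

cache_info() is handy for checking whether the cache is actually being hit; if hits stays at 0, the arguments never repeat and the cache is just overhead.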