Complete Guide to Caching in Python | by Kay Jan Wong | Dec, 2023

How does caching work, and ways to cache your functions

Photo by Nana Smirnova on Unsplash

When a function is called repeatedly with the same arguments, the same computation is performed each time. Memoization is useful in such scenarios: the results of function calls are 'saved' for future use, which saves time and makes the code less computationally expensive. Caching is the more general term for storing any data for faster reuse.
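As a concrete illustration of memoization, the naive recursive Fibonacci function recomputes the same subproblems exponentially many times; decorating it with the standard library's `functools.lru_cache` makes each distinct argument be computed only once:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache: remember every distinct argument
def fibonacci(n: int) -> int:
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(35))  # fast, since each subproblem is computed exactly once
```

Without the decorator, `fibonacci(35)` makes tens of millions of recursive calls; with it, only 36 distinct values are ever computed.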

This article will touch on the different caching strategies, caching considerations, and how to enable and implement different types of caching in your scripts, both with a Python package and with your own implementation!

There are several strategies for caching based on your needs, such as:

  • Least Recently Used (LRU): removes the least recently used data; the most common type of caching
  • Least Frequently Used (LFU): removes the least frequently used data
  • First-In-First-Out (FIFO): removes the oldest data
  • Last-In-First-Out (LIFO): removes the newest data
  • Most Recently Used (MRU): removes the most recently used data
  • Random Replacement (RR): removes randomly selected data
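To make the eviction logic concrete, here is a minimal sketch of the LRU strategy built on `collections.OrderedDict` (the class name `LRUCache` and its `get`/`put` methods are illustrative, not from any particular library):

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: evicts the least recently used key when full."""

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark key as most recently used
        return self._data[key]

    def put(self, key, value) -> None:
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used entry


cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" becomes the most recently used entry
cache.put("c", 3)      # capacity exceeded: "b" is evicted, not "a"
print(cache.get("b"))  # None
```

The other strategies differ only in which entry `put` evicts: FIFO pops the oldest insertion regardless of access, MRU pops with `last=True`, and RR picks a random key.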

When using caching in your applications, you should consider the memory footprint of the cache, since it stores additional data. If you are deciding between different implementations, in terms of architecture and data structures, there are a few timing considerations, such as:

  • Access time: For arguments that have been computed before, results should be accessed quickly in O(1) time
  • Insertion time: For new arguments, data should be inserted into the cache, preferably in O(1) time (depending on implementation, some may take O(n) time, choose wisely!)
  • Deletion time: In the case…
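For most use cases, the standard library's `functools.lru_cache` already satisfies these timing goals, with O(1) average-time lookups and insertions backed by a hash table. A quick sketch of measuring the difference between a cache miss and a cache hit (the 0.1-second sleep stands in for an arbitrary expensive computation):

```python
import time
from functools import lru_cache


@lru_cache(maxsize=128)  # LRU eviction once 128 distinct arguments are cached
def slow_square(n: int) -> int:
    time.sleep(0.1)  # simulate an expensive computation
    return n * n


start = time.perf_counter()
slow_square(4)  # miss: computed from scratch, takes ~0.1 s
miss_time = time.perf_counter() - start

start = time.perf_counter()
slow_square(4)  # hit: returned straight from the cache
hit_time = time.perf_counter() - start

print(slow_square.cache_info())  # reports hits, misses, and current size
print(hit_time < miss_time)      # the cached call is far faster
```

`cache_info()` is useful for checking whether your cache is actually being hit in practice, and `cache_clear()` resets it when stale results become a concern.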
