How a Librarian 📚 Stays Super Fast: A Fun Guide to Caching


Understanding Caching with a Librarian Analogy

We’ve touched on caching in other articles, but let’s quickly revisit what it really is.


Recap of Caching

Assume you are a librarian. Your job is to retrieve books from the shelf and hand them over to users. Now, let’s say a particular book becomes popular, and many people ask for it repeatedly. Every time, you have to walk to the shelf, pick the book, and return to the counter. Eventually, you realize it’s more efficient to keep the frequently requested books near you at the counter.

Keeping data near you for faster access and retrieval is called caching.
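The librarian’s shortcut translates directly into code. Below is a minimal, hypothetical sketch in Python, where `shelf` stands in for slow storage and `counter` is the cache; all names are illustrative, not a real API:

```python
# Slow storage (the shelf) and a fast lookup area (the counter).
shelf = {"harry_potter": "Book: Harry Potter", "dune": "Book: Dune"}
counter = {}  # the cache

def fetch_book(title):
    if title in counter:          # cache hit: no trip to the shelf
        return counter[title]
    book = shelf[title]           # cache miss: the slow walk to the shelf
    counter[title] = book         # keep it nearby for next time
    return book
```

The first `fetch_book("dune")` walks to the shelf; every later request for the same title is served straight from the counter.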


The Problem with Limited Space

While caching makes your job easier, it also comes with limitations. Imagine your counter can only hold 10 books at any time. What happens when a new book is requested, but your counter is full? To make space for the new book, you’ll need to remove one of the existing books.

How do you decide which book to remove?

In this article, we’ll explore strategies to help solve this problem. These strategies are simple yet powerful and will help you become an efficient librarian.


1. Random Strategy

The first method is straightforward: pick any random book, put it back on the shelf, and make space for the new book.

  • Every time you need to add a new book, randomly remove one from your counter.

  • This approach works, but as you can imagine, it’s highly inefficient.

  • You might remove a book that is frequently requested, leading to extra trips to the shelves.

While simple, the random strategy is not the most effective.
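Sketched in Python, random eviction needs nothing more than `random.choice`; `CAPACITY`, `counter`, and `add_book` are illustrative names:

```python
import random

CAPACITY = 3   # the counter only holds a few books
counter = {}   # title -> book

def add_book(title, book):
    if title not in counter and len(counter) >= CAPACITY:
        victim = random.choice(list(counter))  # pick any cached book
        del counter[victim]                    # send it back to the shelf
    counter[title] = book
```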


2. First In, First Out (FIFO)

In this approach, you treat the counter as a queue:

  • The first book you added to the counter is the first one you remove when space is needed.

  • For example, if your counter has 10 books and a new request comes in, you simply remove the oldest book to make space.

Pros:

  • Easy to implement.

Cons:

  • It doesn’t consider the popularity of books.

  • You might remove a book that is still in high demand.
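One way to sketch FIFO eviction in Python is with `collections.OrderedDict`, whose insertion order doubles as the queue (names are illustrative):

```python
from collections import OrderedDict

CAPACITY = 3
counter = OrderedDict()  # insertion order = arrival order

def add_book(title, book):
    if title not in counter and len(counter) >= CAPACITY:
        counter.popitem(last=False)  # evict the book that arrived first
    counter[title] = book
```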


3. Least Recently Used (LRU)

This strategy takes a smarter approach:

  • You prioritize keeping books that were recently requested.

  • If a book hasn’t been used in a while, it gets removed first.

For example:

  • Let’s say a Harry Potter book is frequently requested. It stays on the counter, while a book that hasn’t been requested recently is removed.

This method ensures you’re only keeping books that are actively in demand.
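An LRU counter can be sketched with Python’s `OrderedDict`: every hit moves the book to the “most recent” end, and eviction takes the oldest end. Names here are illustrative; for caching function results, the standard library already ships `functools.lru_cache`.

```python
from collections import OrderedDict

CAPACITY = 3
counter = OrderedDict()  # least recently used first, most recent last

def fetch_book(title, shelf):
    if title in counter:
        counter.move_to_end(title)   # mark as just used
        return counter[title]
    if len(counter) >= CAPACITY:
        counter.popitem(last=False)  # evict the least recently used
    counter[title] = shelf[title]
    return counter[title]
```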


4. Least Frequently Used (LFU)

In the LFU strategy, you focus on how often a book is requested over a longer period of time.

  • Instead of looking at recent usage, you track the frequency of requests for each book.

  • Books that are requested the least often are removed first.

For instance:

  • A Harry Potter book might not have been requested this week (as per LRU, it would be removed), but over the past month, it was frequently requested. With LFU, the Harry Potter book stays because of its long-term popularity.

Alternative Perspective:
Rather than removing the least frequently used books, you can flip the strategy:

  • Keep the most frequently used books and remove everything else.
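A simple LFU sketch keeps a request count per title and evicts the title with the smallest count. The linear scan is fine for a 10-book counter, though production LFU caches use cleverer bookkeeping (names are illustrative):

```python
CAPACITY = 3
counter = {}    # title -> book
requests = {}   # title -> how often it has ever been requested

def fetch_book(title, shelf):
    requests[title] = requests.get(title, 0) + 1
    if title in counter:
        return counter[title]
    if len(counter) >= CAPACITY:
        victim = min(counter, key=lambda t: requests[t])  # least requested
        del counter[victim]
    counter[title] = shelf[title]
    return counter[title]
```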

5. Most Frequently Used (MFU)

Despite the similar name, this strategy flips LFU on its head:

  • When space is needed, you remove the book with the highest request count.

  • The assumption is that its burst of demand has passed: everyone who wanted the Harry Potter book has already read it, so a less-requested book is more likely to be asked for next.

MFU is rarely the right default, but it can pay off for access patterns where items are read heavily once and then rarely requested again.
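In the cache-replacement literature, MFU eviction mirrors LFU: the entry with the highest request count goes first, on the assumption that its burst of demand has already been served. A minimal Python sketch of that rule (names are illustrative):

```python
CAPACITY = 3
counter = {}    # title -> book
requests = {}   # title -> request count

def fetch_book(title, shelf):
    requests[title] = requests.get(title, 0) + 1
    if title in counter:
        return counter[title]
    if len(counter) >= CAPACITY:
        victim = max(counter, key=lambda t: requests[t])  # most requested
        del counter[victim]
    counter[title] = shelf[title]
    return counter[title]
```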


Conclusion

These five caching strategies—Random, FIFO, LRU, LFU, and MFU—help determine how to manage limited storage efficiently. Each method has its pros and cons, and the choice depends on the specific requirements of the scenario.

By implementing these strategies, you ensure faster service and become a more efficient librarian.
