Lazy Loading
  • Date: 2024-09-17

AWS ElastiCache - Lazy Loading



There are different ways to populate and maintain a cache. These different ways are known as caching strategies. A gaming site maintaining leaderboard data needs a different strategy than a news website displaying trending stories. In this chapter we will study one such strategy, known as Lazy Loading.

When data is requested by the application, the request searches for the data in the ElastiCache cache. There are two possibilities: either the data exists in the cache or it does not. Accordingly, we classify the situation into the following two categories.

Cache Hit

    The application requests the data from the cache.

    The cache query finds that up-to-date data is available in the cache.

    The result is returned to the requesting application.

Cache Miss

    The application requests the data from the cache.

    The cache query finds that up-to-date data is not available in the cache.

    A null is returned by the cache query to the requesting application.

    The application then requests the data directly from the database and receives it.

    The requesting application then updates the cache with the new data it received from the database.

    The next time the same data is requested, it falls into the cache hit scenario above.
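The hit and miss flows above can be sketched in a few lines of Python. This is a minimal illustration using plain dictionaries as hypothetical stand-ins for the ElastiCache node and the backing database; it is not the ElastiCache API, and all names here are invented for the example.

```python
# Hypothetical in-memory stand-ins for the cache node and the database.
cache = {}
database = {"user:1": "Alice", "user:2": "Bob"}

def get(key):
    """Lazy loading: consult the cache first, fall back to the database."""
    value = cache.get(key)       # 1. request the data from the cache
    if value is not None:
        return value, "hit"      # cache hit: return the cached copy
    value = database.get(key)    # 2. cache miss: query the database
    if value is not None:
        cache[key] = value       # 3. write the fetched copy into the cache
    return value, "miss"

# The first read misses and populates the cache; the second read hits.
print(get("user:1"))  # ('Alice', 'miss')
print(get("user:1"))  # ('Alice', 'hit')
```

Note that the cache is only ever written as a side effect of a miss; nothing is loaded ahead of time, which is what makes the strategy "lazy".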

The above scenarios are depicted in the diagram below.

[Diagram: lazy loading — cache hit and cache miss flow]

Advantages of Lazy Loading

    Only requested data is cached − Since most data is never requested, lazy loading avoids filling up the cache with data that isn't requested.

    Node failures are not fatal − When a node fails and is replaced by a new, empty node, the application continues to function, though with increased latency. As requests are made to the new node, each cache miss results in a query of the database and a copy of the data being added to the cache, so that subsequent requests are served from the cache.

Disadvantages of Lazy Loading

    Cache miss penalty − Each cache miss results in three trips: first, the initial request for the data from the cache; second, the query of the database for the data; and finally, the write of the data to the cache. This can cause a noticeable delay in the data reaching the application.

    Stale data − If data is only written to the cache when there is a cache miss, data in the cache can become stale since there are no updates to the cache when data is changed in the database. This issue is addressed by the Write Through and Adding TTL strategies, which we will see in the next chapters.
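To see how a TTL bounds staleness, the lazy-loading sketch can be extended so that each cached entry carries an expiry time. This is a minimal illustration with hypothetical names and an artificially short TTL, using a plain dictionary rather than a real cache client; the Write Through and TTL strategies themselves are covered in the following chapters.

```python
import time

cache = {}          # key -> (value, expiry timestamp); hypothetical in-memory cache
database = {"headline": "v1"}
TTL_SECONDS = 0.05  # unrealistically short, so the example runs quickly

def get_with_ttl(key):
    """Lazy loading plus TTL: expired entries are reloaded, bounding staleness."""
    entry = cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                                   # fresh cache hit
    value = database.get(key)                             # miss or expired: reload
    cache[key] = (value, time.monotonic() + TTL_SECONDS)  # cache with an expiry
    return value

print(get_with_ttl("headline"))  # 'v1', loaded from the database
database["headline"] = "v2"      # the database changes behind the cache
print(get_with_ttl("headline"))  # still 'v1' until the TTL expires
time.sleep(0.1)
print(get_with_ttl("headline"))  # 'v2' after expiry
```

The trade-off is that a shorter TTL reduces staleness but increases the cache miss rate, and therefore the load on the database.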
