Smart Cache: Improvement over memcached
You all know memcached. It's a lightweight key-value caching mechanism that can be distributed over multiple machines.
Basically you have two main methods: set and get. When you set a value, you can define an expiration time (TTL). When the time expires, the key-value pair is removed from memory.
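To make these semantics concrete, here is a minimal in-memory sketch of set/get with a TTL. This is not a real memcached client (the class name is mine); it only illustrates the expiration behavior described above:

```php
<?php
// Minimal in-memory illustration of memcached-style set/get with a TTL.
// Not a real memcached client -- just the expiration semantics.
class TtlCache
{
    private array $store = [];

    public function set(string $key, $value, int $ttl): void
    {
        // Remember the value together with its expiration timestamp.
        $this->store[$key] = ['value' => $value, 'expiresAt' => time() + $ttl];
    }

    public function get(string $key)
    {
        if (!isset($this->store[$key])) {
            return false; // miss: memcached's get also returns false
        }
        if (time() >= $this->store[$key]['expiresAt']) {
            unset($this->store[$key]); // expired: the pair is removed
            return false;
        }
        return $this->store[$key]['value'];
    }
}

$cache = new TtlCache();
$cache->set('greeting', 'hello', 60); // valid for 60 seconds
var_dump($cache->get('greeting'));    // string(5) "hello"
```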
However, this can have several downsides, depending on the use case.
Let's say you have a recommendation engine, and after a user buys an item, you want to suggest what else they might be interested in. Now let's say running the recommendation algorithm takes five seconds of computation time. As a performant website, you don't want to hold up HTML generation for five seconds. So what some sites usually do is set the cache TTL to 24 hours (one cache entry per item), and accept the fact that the first user in every 24-hour window will be the scapegoat who "suffers" those five seconds.
This is problematic. What if your website relies on the "long tail" concept, and you have many items? This can lead to a situation where you hit this cache "miss" almost every moment.
You could solve this problem by running a cron job or a background timer task that recomputes the recommended items for every item in the pool every 24 hours. This way, the real-time user never has to wait. But this has problems as well:
- Your CPU consumption won't be balanced: every 24 hours a heavy task will run.
- What's the point of running the algorithm for every item? What if some items are visited frequently while others are not?
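For reference, the cron-based approach would be driven by something like the following crontab entry (the script path and schedule are hypothetical):

```
# Recompute recommendations for every item in the pool, once a day at 03:00.
0 3 * * * php /var/www/scripts/precompute_recommendations.php
```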
I took a different approach on Top7. We created a mechanism that we call Smart Cache, which sits on top of memcached. It is implemented in PHP, but could easily be implemented in other languages as well (PHP is more challenging because it doesn't support threads). The idea is that the underlying cache never expires. When the time-to-live supplied by the programmer expires, the smart cache returns the stale value and renews it asynchronously, in the background. This ensures that users won't wait at all.
The smart cache has one c'tor method and one get function, which is a proxy function (a magic function).
The c'tor: SmartCache(boolean blockOnEmptyCache, int timeToLive):
blockOnEmptyCache indicates what to do when the underlying cache is empty. If it is false, the get function simply throws an exception on first invocation, and it is up to the programmer to decide how to handle it. For example, you can choose not to show the recommended items, but only to the first user. If you set it to true, the function blocks (for five seconds in our use case), updates the underlying cache, and returns the result.
timeToLive indicates how long the value is valid. On subsequent get invocations, the smart cache checks whether the cached value is still valid. If it is, it returns the result. If not, it immediately returns the stale value and updates the cache in the background.
How is the cache value updated? The get function is a magic function, so, for example, when you call:
$smartCache->getRecommendedItems($itemId, $userId)
and the timeToLive has expired, the smart cache simply calls this function in the background and updates the cache. It also ensures that if concurrent get invocations occur, only one API function invocation will execute. So, to summarize, it guarantees that no matter what happens, only one API call per set of function parameters will occur in every timeToLive window. This has the great benefit of reducing load on the API server, and it can mitigate spikes of traffic. With regular memcached, if you get 10 requests at the same time on the first invocation, what would probably happen is that you would execute the API function 10 times and then update the cache 10 times. Only then would you get your timeToLive of function-free time, then the cache value would die, and so on.
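As a sketch of the mechanism, here is my own simplified, single-process reconstruction, not the Top7 code. The backend callback, the exception type, and the in-memory single-flight flag are assumptions; a real implementation would run the refresh asynchronously and coordinate the lock through memcached itself:

```php
<?php
// Simplified, single-process sketch of the Smart Cache idea.
// Assumptions (not the real Top7 code): the expensive API call is passed in
// as a callback, and the "background" refresh is modeled as a queue of
// pending refresh jobs; a real implementation would execute them
// asynchronously and hold the single-flight lock in memcached.
class SmartCache
{
    private array $store = [];           // key => ['value' => ..., 'freshUntil' => ...]
    private array $refreshing = [];      // single-flight flags, one per key
    public array $pendingRefreshes = []; // stand-in for queued background jobs

    public function __construct(
        private bool $blockOnEmptyCache,
        private int $timeToLive,
        private \Closure $apiCall        // the expensive function being cached
    ) {}

    // Magic proxy: e.g. $cache->getRecommendedItems($itemId, $userId)
    public function __call(string $method, array $args)
    {
        $key = $method . ':' . serialize($args);

        if (!isset($this->store[$key])) {                // cold cache
            if (!$this->blockOnEmptyCache) {
                throw new \RuntimeException("empty cache for $key");
            }
            return $this->refresh($key, $method, $args); // block, e.g. 5 seconds
        }

        $entry = $this->store[$key];
        if (time() < $entry['freshUntil']) {             // still valid
            return $entry['value'];
        }

        // Stale: return the old value immediately and schedule exactly one
        // refresh per key, no matter how many callers race through here.
        if (empty($this->refreshing[$key])) {
            $this->refreshing[$key] = true;
            $this->pendingRefreshes[] = [$key, $method, $args];
        }
        return $entry['value'];
    }

    private function refresh(string $key, string $method, array $args)
    {
        $value = ($this->apiCall)($method, $args);       // the 5-second call
        $this->store[$key] = ['value' => $value, 'freshUntil' => time() + $this->timeToLive];
        unset($this->refreshing[$key]);
        return $value;
    }
}
```

On the first call this blocks (or throws, when blockOnEmptyCache is false); once a value goes stale, every concurrent caller still gets the old value instantly while a single refresh is queued for that key.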
So basically the Smart Cache encapsulates the function that should be called when the time expires.
Let's summarize what we get from Smart Cache:
- Instead of caching the results coming back from API functions ourselves, it encapsulates both the cache and the API invocation, so typical usage looks like this:
$smartCache = new SmartCache(true, 3600);
$smartCache->getRecommendedItems($itemId, $userId);
- Users don't block when a cached value expires.
- The API result is cached immediately.
- It protects the API invocation against peaks of concurrent users.
- The API will be executed only when needed. If only 25% of the items are reached in a week, you save 75% of the CPU time.