2-random cache eviction
December 29, 2025
The basic LRU cache is straightforward: it evicts the least recently used key-value pair. But there are variations and optimizations that cater to specific needs, such as tracking recency with timestamps to drive eviction decisions.
One such variation: instead of tracking the exact LRU ordering, pick 2 random entries and evict whichever of the two was least recently used. This is called the 2-random cache eviction technique.
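Here is a minimal sketch of the idea in Python (not production code; the class name and the integer "clock" used to track recency are my own choices): each entry remembers when it was last used, and on overflow we sample two random keys and evict the older of the two.

```python
import random
from itertools import count

class TwoRandomCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.clock = count()   # monotonically increasing "timestamp"
        self.data = {}         # key -> (value, last_used)

    def get(self, key):
        value, _ = self.data[key]
        self.data[key] = (value, next(self.clock))  # refresh recency
        return value

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Sample up to 2 random keys; evict the least recently used one.
            candidates = random.sample(list(self.data), min(2, len(self.data)))
            victim = min(candidates, key=lambda k: self.data[k][1])
            del self.data[victim]
        self.data[key] = (value, next(self.clock))
```

Note that there is no linked list or ordered structure to maintain here, only a plain dict and one integer per entry, which is where the memory and bookkeeping savings over exact LRU come from.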
These randomized techniques have been shown to compare surprisingly well against exact LRU, while using less memory and CPU overhead, since there is no need to maintain a precisely ordered list of entries.
If this piques your curiosity, you can read more in this article -> Why are Randomized Algorithms better than LRU?
In fact, some popular in-memory databases use this kind of sampling-based eviction policy; Redis, for example, approximates LRU by sampling a few random keys rather than tracking exact order. Check this out: Redis
