A Simple LRU Cache

Recently, while working on a module, I needed a very lightweight, fixed-size LRU cache. There are plenty of caching libraries around, but I decided to try a simple Map-based solution. Surprisingly, the approach using LinkedHashMap turned out to be the simplest.

Here is the snippet, with validation and other boilerplate removed for simplicity.

import java.util.LinkedHashMap;
import java.util.Map;

public class LruCache {
    // Cache data holder; LinkedHashMap maintains entries in access order
    private final Map<Object, Object> cache;

    // Maximum number of entries before the eldest one is evicted
    private final int maxCacheSize;

    public LruCache(int maxSize) {
        this.maxCacheSize = maxSize;
        cache = new LinkedHashMap<Object, Object>(maxSize, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Object, Object> eldest) {
                // Evict the least recently accessed entry once the cap is exceeded
                return size() > maxCacheSize;
            }
        };
    }

    public void put(Object key, Object value) {
        cache.put(key, value);
    }

    public Object get(Object key) {
        return cache.get(key);
    }
}

The implementation is fairly simple. The third constructor argument of LinkedHashMap is the key: passing "true" switches the map to access order, so entries are ordered by how recently they were accessed rather than when they were inserted. Combined with removeEldestEntry returning true once the size exceeds maxCacheSize, the least recently used entry is evicted automatically on the next put.
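As a quick sanity check, here is an illustrative usage sketch (not part of the original snippet) with the cache capped at two entries:

LruCache cache = new LruCache(2);
cache.put("a", 1);
cache.put("b", 2);
cache.get("a");                     // touch "a" so it becomes the most recently used
cache.put("c", 3);                  // exceeds the cap, so the eldest entry ("b") is evicted
System.out.println(cache.get("b")); // null
System.out.println(cache.get("a")); // 1

Note that a plain LinkedHashMap is not thread-safe, so concurrent use would need external synchronization or a wrapper such as Collections.synchronizedMap.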
