Multithreaded LRU Cache in Java
A Least Recently Used (LRU) cache is a data structure that holds a fixed number of items and evicts the least recently used item when the cache reaches its capacity. It is useful for keeping frequently accessed data in memory, so that access time is reduced.
When implementing an LRU cache in a multithreaded environment, proper synchronization is essential to prevent race conditions and keep the cache consistent.
There are several ways to implement a multithreaded LRU cache in Java, including:
Synchronized Map: The simplest approach is to guard a Map and a LinkedList with synchronized methods. The Map stores the key-value pairs, and the LinkedList tracks access order: whenever an item is accessed, it is moved to the front of the list, and when the cache exceeds its capacity, the least recently used item is removed from the back of the list and from the Map.
Access-Ordered LinkedHashMap: Java's LinkedHashMap can be constructed in access order, and overriding its removeEldestEntry method yields a built-in eviction policy that drops the least recently used entry automatically. LinkedHashMap is not thread-safe on its own, so for multithreaded use it must be wrapped (for example with Collections.synchronizedMap) or guarded by a lock.
ReentrantLock: Another approach is to guard the cache with an explicit ReentrantLock. Each operation acquires the lock before touching the cache and releases it afterwards, ensuring that only one thread mutates the cache at a time.
Method 1: Synchronized Map
SynchronizedLRUCache.java
import java.util.LinkedList;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SynchronizedLRUCache {
    private final int capacity;
    private final Map<Integer, Integer> cache;
    private final LinkedList<Integer> order; // most recently used at the front

    public SynchronizedLRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new ConcurrentHashMap<>();
        this.order = new LinkedList<>();
    }

    public synchronized int get(int key) {
        if (!cache.containsKey(key)) return -1;
        // Move the accessed key to the front of the recency list.
        order.remove(Integer.valueOf(key));
        order.addFirst(key);
        return cache.get(key);
    }

    public synchronized void put(int key, int value) {
        if (cache.containsKey(key)) {
            order.remove(Integer.valueOf(key));
        }
        cache.put(key, value);
        order.addFirst(key);
        // Evict the least recently used entry when over capacity.
        if (cache.size() > capacity) {
            int removedKey = order.removeLast();
            cache.remove(removedKey);
        }
    }
}
Output:
Suppose we have the following code to test the SynchronizedLRUCache:
SynchronizedLRUCache cache = new SynchronizedLRUCache(2);
cache.put(1, 1);
cache.put(2, 2);
System.out.println(cache.get(1)); // Output: 1
cache.put(3, 3);
System.out.println(cache.get(2)); // Output: -1
cache.put(4, 4);
System.out.println(cache.get(1)); // Output: -1
System.out.println(cache.get(3)); // Output: 3
System.out.println(cache.get(4)); // Output: 4
Explanation:
In this implementation, both get and put are synchronized methods, so only one thread at a time can touch the Map and the LinkedList. The Map stores the key-value pairs, and the LinkedList tracks recency: get moves the accessed key to the front of the list, and put inserts the new key at the front. When the cache exceeds its capacity, the least recently used key is removed from the back of the list and from the Map. Note that LinkedList.remove(Object) is O(n), so this approach is simple but not the fastest for large caches.
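To see the synchronization at work, the cache can be exercised from several threads at once. The following is a minimal, self-contained sketch (the class names LruStressTest and Cache, the thread count, and the key range are all illustrative; Cache is a compact copy of the SynchronizedLRUCache above). After all workers finish, the cache must still be within its capacity:

```java
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;
import java.util.Random;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class LruStressTest {
    // Compact copy of the synchronized cache above, kept here so the sketch is self-contained.
    static class Cache {
        private final int capacity;
        private final Map<Integer, Integer> map = new HashMap<>();
        private final LinkedList<Integer> order = new LinkedList<>();

        Cache(int capacity) { this.capacity = capacity; }

        synchronized int get(int key) {
            if (!map.containsKey(key)) return -1;
            order.remove(Integer.valueOf(key));
            order.addFirst(key);
            return map.get(key);
        }

        synchronized void put(int key, int value) {
            if (map.containsKey(key)) order.remove(Integer.valueOf(key));
            map.put(key, value);
            order.addFirst(key);
            if (map.size() > capacity) map.remove(order.removeLast());
        }

        synchronized int size() { return map.size(); }
    }

    public static void main(String[] args) throws InterruptedException {
        Cache cache = new Cache(100);
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int t = 0; t < 4; t++) {
            pool.submit(() -> {
                Random rnd = new Random();
                for (int i = 0; i < 10_000; i++) {
                    int key = rnd.nextInt(500);
                    cache.put(key, key);
                    cache.get(rnd.nextInt(500));
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        // Because every method is synchronized, the size never exceeds the capacity.
        System.out.println("size = " + cache.size() + " (capacity 100)");
    }
}
```

Without the synchronized keywords, concurrent puts could interleave between the size check and the eviction, letting the cache grow past its capacity or corrupting the LinkedList.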
Method 2: Access-Ordered LinkedHashMap
ConcurrentHashMapLRUCache.java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class ConcurrentHashMapLRUCache {
    private final int capacity;
    private final Map<Integer, Integer> cache;

    public ConcurrentHashMapLRUCache(int capacity) {
        this.capacity = capacity;
        // An access-order LinkedHashMap evicts the eldest (least recently used)
        // entry automatically via removeEldestEntry. LinkedHashMap itself is not
        // thread-safe, so the map is wrapped for multithreaded use.
        this.cache = Collections.synchronizedMap(
            new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                    return size() > capacity;
                }
            });
    }

    public int get(int key) {
        return cache.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cache.put(key, value);
    }
}
Output:
Suppose we have the following code to test the ConcurrentHashMapLRUCache:
ConcurrentHashMapLRUCache cache = new ConcurrentHashMapLRUCache(2);
cache.put(1, 1);
cache.put(2, 2);
System.out.println(cache.get(1)); // Output: 1
cache.put(3, 3);
System.out.println(cache.get(2)); // Output: -1
cache.put(4, 4);
System.out.println(cache.get(1)); // Output: -1
System.out.println(cache.get(3)); // Output: 3
System.out.println(cache.get(4)); // Output: 4
Explanation:
In this implementation, the cache is a LinkedHashMap constructed in access order, so its iteration order reflects how recently each entry was used. Overriding removeEldestEntry provides a built-in eviction policy: the least recently used entry is dropped automatically when the size exceeds the capacity. The get method retrieves the value of a key (recording the access), and the put method adds a new key-value pair. Note that LinkedHashMap is not thread-safe by itself, so it must be wrapped (for example with Collections.synchronizedMap) or guarded by a lock in a multithreaded environment.
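Method 3: ReentrantLock
ReentrantLockLRUCache.java
The ReentrantLock approach listed at the top was not shown above; here is a minimal sketch (the class name ReentrantLockLRUCache is illustrative). It reuses an access-ordered LinkedHashMap for the LRU bookkeeping and guards every operation with an explicit lock, releasing it in a finally block so the lock is freed even if an exception is thrown:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

public class ReentrantLockLRUCache {
    private final Map<Integer, Integer> cache;
    private final ReentrantLock lock = new ReentrantLock();

    public ReentrantLockLRUCache(int capacity) {
        // Access-order LinkedHashMap keeps entries in least-to-most recently used order.
        this.cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity;
            }
        };
    }

    public int get(int key) {
        lock.lock();
        try {
            // getOrDefault on a LinkedHashMap records the access in access-order mode.
            return cache.getOrDefault(key, -1);
        } finally {
            lock.unlock(); // always release, even if an exception is thrown
        }
    }

    public void put(int key, int value) {
        lock.lock();
        try {
            cache.put(key, value);
        } finally {
            lock.unlock();
        }
    }
}
```

Compared with synchronized methods, an explicit ReentrantLock behaves the same for this simple case but offers extras such as tryLock, lockInterruptibly, and an optional fairness policy, at the cost of slightly more verbose code.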
In conclusion, a multithreaded LRU cache is a useful data structure for keeping frequently accessed data in memory. When implementing one in Java, synchronization is essential to prevent race conditions and keep the cache consistent. The main approaches are synchronized methods guarding a Map and a recency list, an access-ordered LinkedHashMap with external synchronization, and a ReentrantLock protecting the cache.