Coding: Implement an LRU Cache and discuss concurrency
Design and implement an in-memory Least Recently Used (LRU) cache data structure.
The cache should:
- Have a fixed capacity N specified at construction time
- Store key-value pairs where keys and values are integers (for simplicity)
- Support the following operations in average O(1) time:
  - get(key):
    - If the key exists, return its value and mark this key as the most recently used
    - If the key does not exist, return -1
  - put(key, value):
    - Insert or update the key with the given value
    - If inserting a new key causes the number of keys to exceed capacity N, evict the least recently used key from the cache before inserting
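To make the required semantics concrete, here is a minimal sketch of the behavior in Python. The class name `LRUCache` and the use of `collections.OrderedDict` are assumptions for illustration; an interview answer would typically build the recency structure by hand (see the follow-up tasks below).

```python
from collections import OrderedDict

class LRUCache:
    """Minimal behavioral sketch: OrderedDict keeps insertion order,
    which we reuse as recency order (front = least recently used)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.data:
            return -1
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key: int, value: int) -> None:
        if key in self.data:
            self.data.move_to_end(key)      # updating also refreshes recency
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict least recently used

# Expected behavior with capacity N = 2:
cache = LRUCache(2)
cache.put(1, 10)
cache.put(2, 20)
assert cache.get(1) == 10    # key 1 is now most recently used
cache.put(3, 30)             # over capacity: evicts key 2, the LRU entry
assert cache.get(2) == -1    # key 2 was evicted
assert cache.get(3) == 30
```

Note that `get` is not a pure read here: it mutates the recency order, which matters for the concurrency follow-up.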
Clarify and then implement the following:
- Describe the data structures you would use to achieve O(1) time for both get and put.
- Implement the LRU cache class with constructor(capacity), get(key), and put(key, value) methods.
- Explain the time and space complexity of your implementation.
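One reference answer to these tasks combines a hash map (O(1) key lookup) with a doubly linked list (O(1) reordering and eviction). The sketch below is a sample solution, not the only valid one; names like `Node`, `_remove`, and `_add_front` are illustrative. Both `get` and `put` run in O(1) average time, and space is O(N) for the map plus the list nodes.

```python
class Node:
    """Doubly linked list node; stores the key so eviction can
    also delete the matching hash-map entry in O(1)."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key: int = 0, value: int = 0):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    """Hash map + doubly linked list. Head side = most recently used,
    tail side = least recently used. Sentinels avoid edge cases."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}                       # key -> Node
        self.head = Node()                  # sentinel at the MRU end
        self.tail = Node()                  # sentinel at the LRU end
        self.head.next, self.tail.prev = self.tail, self.head

    def _remove(self, node: Node) -> None:
        """Unlink a node from the list in O(1)."""
        node.prev.next, node.next.prev = node.next, node.prev

    def _add_front(self, node: Node) -> None:
        """Insert a node right after the head sentinel (MRU position)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_front(node)               # mark as most recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            self._remove(self.map[key])     # drop the old node
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev            # least recently used node
            self._remove(lru)
            del self.map[lru.key]
```

Each operation does a constant number of pointer updates and one dictionary access, hence O(1) average time for both `get` and `put`.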
Follow-up (conceptual and design, not necessarily full code):
- How would you modify your LRU cache so that it is safe to use from multiple threads concurrently?
- Discuss possible approaches such as coarse-grained locking, fine-grained locking, or using concurrent data structures.
- Talk about the trade-offs between simplicity, contention, and performance.
- What race conditions or consistency issues might occur if multiple threads call get and put at the same time, and how does your design avoid them?
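As one possible starting point for the follow-up, the simplest safe design is coarse-grained locking: a single mutex guarding both the hash map and the recency structure, so every `get`/`put` executes atomically. The class name `ThreadSafeLRUCache` is an assumption for this sketch. Note that because `get` mutates recency order, a plain read-write lock does not help here; without the lock, a `get` reordering the list could race a `put` that is evicting, corrupting the links or double-evicting.

```python
import threading
from collections import OrderedDict

class ThreadSafeLRUCache:
    """Coarse-grained locking sketch: one lock makes each get/put atomic,
    so the size check and the eviction can never race a concurrent insert,
    and the recency reordering inside get cannot interleave with put."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key: int) -> int:
        with self._lock:
            if key not in self._data:
                return -1
            self._data.move_to_end(key)     # even "reads" mutate recency order
            return self._data[key]

    def put(self, key: int, value: int) -> None:
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)  # evict LRU under the same lock
```

The trade-off is contention: every thread serializes on one lock. Finer-grained designs (e.g. sharding the cache into several independently locked segments, as Java's `ConcurrentHashMap` historically did) reduce contention but complicate maintaining a single global LRU order, which is a good discussion point for the trade-offs question above.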