This question evaluates understanding of concurrent data structures, synchronization strategies, and LRU cache semantics within the Coding & Algorithms domain, testing the ability to design an in-memory cache with O(1) average operations and explicit thread-safety guarantees.
Design and implement an in-memory LRU (Least Recently Used) cache that supports concurrent access.
Implement a cache with fixed capacity supporting:
- `get(key) -> value | -1`: Return the value if present; otherwise return `-1`. A successful `get` marks the entry as most recently used.
- `put(key, value)`: Insert or update the key. If the cache exceeds `capacity`, evict the least recently used entry.
All operations should be O(1) average time.
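A minimal single-threaded sketch of these semantics, assuming Python and `collections.OrderedDict` (the class name `LRUCache` is illustrative, not prescribed by the problem):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()  # insertion order doubles as recency order

    def get(self, key: int) -> int:
        if key not in self._data:
            return -1
        self._data.move_to_end(key)  # a hit marks the entry most recently used
        return self._data[key]

    def put(self, key: int, value: int) -> None:
        if key in self._data:
            self._data.move_to_end(key)  # updating also refreshes recency
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the least recently used entry
```

`OrderedDict` gives O(1) average lookup plus O(1) `move_to_end`/`popitem`, matching the required bounds; an equivalent hand-rolled design pairs a hash map with a doubly linked list.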
Now extend the cache so it can be safely used by multiple threads calling `get` and `put` concurrently. You should guarantee correct behavior even when threads interleave `get` and `put` on the same key.
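One way to meet the thread-safety requirement is a coarse-grained lock around every operation; note that even `get` mutates recency order, so reads must take the lock too. A sketch assuming Python's `threading.Lock` (the name `ConcurrentLRUCache` is illustrative):

```python
import threading
from collections import OrderedDict

class ConcurrentLRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()  # one lock guards both map and recency order

    def get(self, key: int) -> int:
        with self._lock:
            if key not in self._data:
                return -1
            self._data.move_to_end(key)  # recency update must happen under the lock
            return self._data[key]

    def put(self, key: int, value: int) -> None:
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)  # evict LRU atomically with the insert
```

A single lock serializes all operations, which is correct but limits throughput; finer-grained designs (sharding by key hash, or lock-free recency approximations) trade strict LRU order for concurrency.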
Constraints:
- `1 <= capacity <= 10^6`
- Up to `10^7` operations