Problem
Design and implement an in-memory LRU (Least Recently Used) cache that supports concurrent access.
Basic requirements
Implement a cache with fixed capacity supporting:
- `get(key) -> value | -1`: Return the value if present; otherwise return `-1`. A successful `get` marks the entry as **most recently used**.
- `put(key, value)`: Insert or update the key. If the cache exceeds `capacity`, evict the **least recently used** entry.
All operations should be O(1) average time.
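As a point of reference, here is a minimal single-threaded sketch using Python's `collections.OrderedDict`, whose hash map plus doubly linked list gives O(1) average `get`/`put`. The class and method bodies are illustrative, not a prescribed solution.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal single-threaded LRU cache (not yet thread-safe)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()  # key -> value, ordered oldest to newest

    def get(self, key):
        if key not in self._data:
            return -1
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)  # refresh recency before overwrite
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used
```

An equivalent hand-rolled version would pair a hash map with an explicit doubly linked list of nodes; `OrderedDict` simply packages that combination.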
Concurrency requirements
Now extend the cache so it can be safely used by multiple threads calling `get`/`put` concurrently.
You should:
- Define the thread-safety guarantees (e.g., linearizable operations; a coarse-lock baseline is sketched after this list).
- Aim to **minimize lock contention** under high read/write concurrency.
- Clarify how you handle races between `get` and `put` on the same key.
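One possible baseline, assuming a single mutex is acceptable: guard every operation with one lock. This makes each operation linearizable by construction (the lock admits one operation at a time) and resolves `get`/`put` races on the same key by serializing them, at the cost of all threads contending on one lock.

```python
import threading
from collections import OrderedDict

class ConcurrentLRUCache:
    """Coarse-grained locking: every operation takes one mutex.

    Linearizable by construction, and get/put races on the same key
    are simply serialized. The trade-off is a single contended lock.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key):
        with self._lock:
            if key not in self._data:
                return -1
            self._data.move_to_end(key)  # a read also mutates recency order
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)  # evict least recently used
```

Note that even `get` writes to the recency order, so a plain reader-writer lock does not remove contention; reducing it requires finer-grained designs such as sharding or approximate LRU.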
Constraints (assume)
- `1 <= capacity <= 10^6`
- Up to `10^7` operations
- Keys are hashable (e.g., integers or strings)
Deliverables
- Data structures you would use.
- API and behavior under concurrency (including any acceptable trade-offs).
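One trade-off worth naming explicitly in the deliverables: shard the keyspace across several independent locked caches. Contention drops roughly by the shard count, but eviction becomes per-shard, so global LRU order is only approximated and capacity is split among shards. A minimal sketch, reusing the hypothetical `ConcurrentLRUCache` above:

```python
class ShardedLRUCache:
    """Sharded LRU: hash keys across independent locked caches.

    Each shard evicts independently, so the globally least recently
    used entry is not always the one evicted (approximate LRU).
    """

    def __init__(self, capacity: int, shards: int = 16):
        per_shard = max(1, capacity // shards)
        self._shards = [ConcurrentLRUCache(per_shard) for _ in range(shards)]

    def _shard_for(self, key):
        return self._shards[hash(key) % len(self._shards)]

    def get(self, key):
        return self._shard_for(key).get(key)

    def put(self, key, value):
        self._shard_for(key).put(key, value)
```

Operations remain linearizable per shard; whether approximate global eviction is acceptable is exactly the kind of trade-off the deliverables ask you to state.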