Implement LRU and LFU caches
Company: Verkada
Role: Software Engineer
Category: Coding & Algorithms
Difficulty: medium
Interview Round: Onsite
Implement two in-memory cache classes with fixed capacity:
1. **Recent-use cache**
- `get(key)`: return the value for `key`, or `-1` if the key does not exist.
- `put(key, value)`: insert the key, or update its value if it already exists.
- When the cache is full, evict the **least recently used** entry.
- Target time complexity: **O(1)** average for both operations.
2. **Frequency-based cache**
- Support the same `get` and `put` API.
- When the cache is full, evict the entry with the **lowest access frequency**.
- If multiple entries have the same frequency, evict the **least recently used** among them.
- Target time complexity: **O(1)** average for both operations.
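The recent-use cache above can be sketched in Python with an `OrderedDict`, which gives O(1) average-time reordering and eviction from either end. This is one reasonable approach, not the only acceptable one (a hand-rolled hash map plus doubly linked list is the classic interview answer):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()  # key -> value, ordered oldest -> newest

    def get(self, key: int) -> int:
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark key as most recently used
        return self.data[key]

    def put(self, key: int, value: int) -> None:
        if key in self.data:
            self.data.move_to_end(key)  # update counts as a use
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

For example, with capacity 2, after `put(1, 1)`, `put(2, 2)`, `get(1)`, and `put(3, 3)`, key `2` is evicted because it is the least recently used.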
Be prepared to explain the data structures you choose and how you would test edge cases such as updates, repeated reads, and eviction order.
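One common O(1) design for the frequency-based cache keeps a hash map from frequency to an ordered bucket of keys, plus a running minimum frequency. The sketch below uses Python's `OrderedDict` for the per-frequency buckets so ties break by least-recent use; the helper name `_touch` is an illustrative choice, not part of the required API:

```python
from collections import defaultdict, OrderedDict

class LFUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.vals = {}                            # key -> value
        self.freqs = {}                           # key -> access frequency
        self.buckets = defaultdict(OrderedDict)   # freq -> keys in LRU order
        self.min_freq = 0                         # lowest frequency present

    def _touch(self, key: int) -> None:
        # Move key from its current frequency bucket to the next one.
        f = self.freqs[key]
        del self.buckets[f][key]
        if not self.buckets[f]:
            del self.buckets[f]
            if self.min_freq == f:
                self.min_freq = f + 1
        self.freqs[key] = f + 1
        self.buckets[f + 1][key] = None

    def get(self, key: int) -> int:
        if key not in self.vals:
            return -1
        self._touch(key)
        return self.vals[key]

    def put(self, key: int, value: int) -> None:
        if self.capacity <= 0:
            return
        if key in self.vals:
            self.vals[key] = value
            self._touch(key)  # an update counts as an access
            return
        if len(self.vals) >= self.capacity:
            # Evict the least recently used key in the lowest-frequency bucket.
            evict, _ = self.buckets[self.min_freq].popitem(last=False)
            if not self.buckets[self.min_freq]:
                del self.buckets[self.min_freq]
            del self.vals[evict]
            del self.freqs[evict]
        self.vals[key] = value
        self.freqs[key] = 1
        self.buckets[1][key] = None
        self.min_freq = 1  # a new key always starts at frequency 1
```

Each operation touches only hash maps and the ends of ordered buckets, so both `get` and `put` stay O(1) on average; the `min_freq` counter avoids scanning for the lowest occupied frequency at eviction time.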
Quick Answer: This question evaluates understanding of cache eviction policies and data structure design, focusing on LRU and LFU semantics and the ability to support O(1) average-time get/put operations.