Top C++ Interview Questions for Quants: Implement LRU Cache

by Clement D.
Interview question for quant

One of the most common C++ interview questions for quantitative finance roles is the LRU (Least Recently Used) Cache. It looks simple at first, but it tests a candidate's ability to design efficient data structures, balance time and space complexity, and use the C++ Standard Library effectively. Let's work through it step by step!

1. Problem Statement

Design and implement a Least Recently Used (LRU) Cache in C++. The cache should support the following operations:

  1. get(key) → Return the value if the key exists in the cache; otherwise indicate a miss. Accessing a key marks it as the most recently used.
  2. put(key, value) → Insert or update a key-value pair. If the cache exceeds its capacity, it must evict the least recently used item.

Requirements:

  • Both operations should run in O(1) average time complexity.
  • The cache should be limited to a fixed capacity defined at construction.
  • Keys are unique; calling put with an existing key updates its value.
  • Iterators or pointers must remain valid during reordering.
  • The design should be clean, modern C++, using STL where appropriate.

This problem is a classic interview favorite because it tests understanding of hash maps, linked lists, and how to combine data structures for performance-critical systems.

2. Implementation

Here is one possible implementation:

#include <list>
#include <unordered_map>
#include <optional>
#include <iostream>
#include <string>

template <class Key, class Value>
class LRUCache {
public:
    explicit LRUCache(std::size_t capacity) : cap_(capacity) {}

    // Return value if present; moves the entry to the front (most-recently used).
    std::optional<Value> get(const Key& key) {
        auto it = idx_.find(key);
        if (it == idx_.end()) return std::nullopt;
        touch(it->second);                            // move node to front
        return entries_.front().second;               // value after touch
    }

    // Insert or update; moves/creates entry as most-recently used.
    void put(const Key& key, const Value& value) {
        auto it = idx_.find(key);
        if (it != idx_.end()) {
            // update value and move to front
            it->second->second = value;
            touch(it->second);
            return;
        }
        // evict if needed (guard the zero-capacity edge case)
        if (cap_ == 0) return;
        if (entries_.size() >= cap_) {
            const Key& k_evict = entries_.back().first;
            idx_.erase(k_evict);
            entries_.pop_back();
        }
        // emplace new at front
        entries_.emplace_front(key, value);
        idx_[key] = entries_.begin();
    }

    bool contains(const Key& key) const { return idx_.count(key) != 0; }
    std::size_t size() const { return entries_.size(); }

private:
    using Node = std::pair<Key, Value>;
    using List = std::list<Node>;
    using Iter = typename List::iterator;

    void touch(Iter it) {
        // move node to front (MRU)
        entries_.splice(entries_.begin(), entries_, it);
    }

    std::size_t cap_;
    List entries_;                          // front = most-recently used
    std::unordered_map<Key, Iter> idx_;     // key -> node iterator
};


// -----------------------------
// Example main() for testing
// -----------------------------
int main() {
    LRUCache<int, std::string> cache(2);

    cache.put(1, "one");
    cache.put(2, "two");

    if (auto v = cache.get(1)) {
        std::cout << "Get 1: " << *v << "\n";  // prints "one"
    }

    cache.put(3, "three"); // evicts key 2

    if (auto v = cache.get(2)) {
        std::cout << "Get 2: " << *v << "\n";
    } else {
        std::cout << "Get 2: miss\n";          // prints "miss"
    }

    if (auto v = cache.get(3)) {
        std::cout << "Get 3: " << *v << "\n";  // prints "three"
    }

    return 0;
}

The cache is built from two core structures: a std::list that maintains the usage order (most recently used at the front, least recently used at the back), and a std::unordered_map from keys to list iterators, giving O(1) access to list nodes. When get is called, the accessed node is spliced to the front of the list. When put is called, we either update an existing node and move it to the front, or insert a new one at the front. If inserting would exceed the capacity, the node at the back (the least recently used) is evicted first. This combination ensures that both operations run in O(1) average time.

3. Compilation and Execution

To compile the code, prepare a CMakeLists.txt:

cmake_minimum_required(VERSION 3.10)
project(lrucache)
set(CMAKE_CXX_STANDARD 17)
add_executable(lrucache lrucache.cpp)

and compile via cmake:

mkdir build
cd build
cmake ..
make

Then, you can execute it with:

$ ./lrucache
Get 1: one
Get 2: miss
Get 3: three

4. Access the code on GitHub

The full code for one of the top C++ interview questions, along with a README, is available for you to clone, compile, and run:

https://github.com/cppforquants/lrucache
