This question evaluates understanding of incremental aggregation and state management, including handling overlapping predicates, caching expensive or asynchronous computations, and algorithmic complexity considerations in time-series PnL updates.
You are asked to modify an existing class without changing its public API:
    class Table:
        def __init__(self, filters, times):
            pass

        def add_trade(self, trade):
            pass

        def view_result(self) -> list[list[int]]:
            pass

        def set_global_filter(self, global_filter):
            pass
Assume the following:
- filters is a list of predicates, where each predicate takes a trade and returns True or False.
- times is an ordered list of time points such as t0, t1, t2, ....
- get_pnl(trade, time) returns the PnL of a trade at a given time.
- global_filter is an extra predicate applied in addition to the row filter. It is initially None.
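As a concrete illustration of overlapping row filters (the dict-based trade representation here is a hypothetical choice; the spec only requires that predicates accept a trade and return True or False):

```python
# Hypothetical trade representation: a plain dict.
trade = {"symbol": "AAPL", "qty": 100}

# Overlapping row filters: one trade may satisfy several rows at once.
filters = [
    lambda t: t["symbol"] == "AAPL",  # row 0: AAPL trades
    lambda t: t["qty"] >= 50,         # row 1: large trades (overlaps row 0)
]

times = ["t0", "t1", "t2"]

matching_rows = [i for i, f in enumerate(filters) if f(trade)]
print(matching_rows)  # this trade contributes to both rows
```

Because rows can overlap, a single trade's PnL may be added into multiple rows of the result, which matters for the incremental design later.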
view_result() should return a 2D list where:
- row index i corresponds to filters[i]
- column index j corresponds to times[j]
- result[i][j] is the sum of get_pnl(trade, times[j]) over all trades that satisfy filters[i], and also satisfy global_filter if it is set
The starter implementation keeps add_trade() trivial and recomputes view_result() from scratch with nested loops over all trades, filters, and times.
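A minimal sketch of that starter shape (get_pnl is stubbed here because the original does not define it; the dict-based trade is the same hypothetical representation as above):

```python
def get_pnl(trade, time):
    # Stand-in for the real PnL function, assumed to exist elsewhere.
    return trade["pnl"][time]

class Table:
    """Naive baseline: stores trades and recomputes everything on query."""

    def __init__(self, filters, times):
        self.filters = filters
        self.times = times
        self.trades = []
        self.global_filter = None

    def add_trade(self, trade):
        self.trades.append(trade)  # O(1): just remember the trade

    def set_global_filter(self, global_filter):
        self.global_filter = global_filter

    def view_result(self):
        # O(trades * filters * times) work, repeated on every call.
        result = [[0] * len(self.times) for _ in self.filters]
        for trade in self.trades:
            if self.global_filter is not None and not self.global_filter(trade):
                continue
            for i, f in enumerate(self.filters):
                if f(trade):
                    for j, t in enumerate(self.times):
                        result[i][j] += get_pnl(trade, t)
        return result
```

This is correct but pays the full triple-loop cost, plus a get_pnl call per (trade, time) pair, on every query.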
Redesign the internals so that:
- view_result() runs in O(1) time.
- add_trade(trade) incrementally updates any internal state needed for fast queries.
- set_global_filter(global_filter) correctly changes the active filter and keeps subsequent results accurate.
- get_pnl(trade, time) becomes asynchronous / callback-based and expensive, so repeated calls should be avoided by caching results inside the class.
Describe or implement a clean state management strategy, including how to handle overlapping filters and changes to the global filter without returning stale results.
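One possible shape for the redesigned internals is sketched below. It is not the definitive answer: get_pnl_async (a callback-based API stubbed to complete immediately) and the dict-based trade representation are assumptions. The key ideas are: maintain the result matrix incrementally, cache each trade's PnL vector exactly once, and rebuild row sums from the cache (no PnL calls) when the global filter changes, so results are never stale.

```python
def get_pnl_async(trade, time, callback):
    # Stand-in for the assumed async API: here it completes immediately.
    callback(trade["pnl"][time])

class Table:
    def __init__(self, filters, times):
        self.filters = filters
        self.times = times
        self.global_filter = None
        self.trades = []
        # PnL vectors are fetched once per trade via the expensive async
        # call and never recomputed, even when filters change.
        self.pnl_cache = {}  # id(trade) -> [pnl at times[j] for each j]
        self.result = [[0] * len(times) for _ in filters]

    def _passes_global(self, trade):
        return self.global_filter is None or self.global_filter(trade)

    def _apply(self, trade, sign):
        # Add (+1) or remove (-1) a trade's cached PnL vector to every row
        # whose filter it satisfies; overlapping filters are handled
        # naturally because the trade is applied to each matching row.
        pnls = self.pnl_cache[id(trade)]
        for i, f in enumerate(self.filters):
            if f(trade):
                row = self.result[i]
                for j, p in enumerate(pnls):
                    row[j] += sign * p

    def add_trade(self, trade):
        self.trades.append(trade)
        pnls = [None] * len(self.times)
        remaining = [len(self.times)]

        def on_pnl(j, value):
            pnls[j] = value
            remaining[0] -= 1
            if remaining[0] == 0:  # all callbacks have arrived
                self.pnl_cache[id(trade)] = pnls
                if self._passes_global(trade):
                    self._apply(trade, +1)

        for j, t in enumerate(self.times):
            get_pnl_async(trade, t, lambda v, j=j: on_pnl(j, v))

    def set_global_filter(self, global_filter):
        # Rebuild row sums from the cache: O(trades * filters * times)
        # arithmetic but zero PnL calls, and no stale entries survive.
        self.global_filter = global_filter
        self.result = [[0] * len(self.times) for _ in self.filters]
        for trade in self.trades:
            if id(trade) in self.pnl_cache and self._passes_global(trade):
                self._apply(trade, +1)

    def view_result(self):
        # O(1) apart from copying: sums were maintained incrementally.
        return [row[:] for row in self.result]
```

Trades whose callbacks have not all fired yet are simply absent from the cache; they enter the sums only once complete, and they check the then-current global filter at that moment, which avoids one class of staleness bugs.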