This multi-part question evaluates proficiency in streaming and online algorithms, DAG-based scheduling and parallelism reasoning, and constrained dynamic programming/optimization for sliding-window resource limits, within the Coding & Algorithms domain.
You were asked multiple algorithmic questions.
You receive an infinite stream of integers (positive or negative). A real number S is also given.
After each new integer arrives, output the longest contiguous subarray within the elements seen so far whose average equals S.
Clarify/assume for the interview version: after each arrival, report the answer as an index pair (start_index, end_index), or indicate that no such subarray exists yet.
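A standard approach: a subarray has average S exactly when the subarray of (x − S) values sums to zero, which reduces the problem to finding two equal prefix sums as far apart as possible. A minimal sketch (the function name `longest_avg_stream` is mine; it uses `Fraction` to avoid float-equality pitfalls, assuming S is given exactly):

```python
from fractions import Fraction

def longest_avg_stream(stream, S):
    # Transform each x to x - S: a subarray averages S iff its
    # transformed sum is 0, i.e. two prefix sums are equal.
    S = Fraction(S)
    first = {Fraction(0): -1}   # prefix sum -> earliest index seen
    prefix = Fraction(0)
    best = None                 # (start, end) of longest match so far
    out = []
    for i, x in enumerate(stream):
        prefix += x - S
        if prefix in first:
            j = first[prefix]
            if best is None or i - j > best[1] - best[0]:
                best = (j + 1, i)
        else:
            first[prefix] = i   # keep only the earliest occurrence
        out.append(best)
    return out
```

Each arrival is handled in O(1) amortized time, so the whole stream costs O(n) time and O(n) space for the prefix-sum map.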
You are given N tasks. Each task i has a duration time[i]. Some tasks depend on others, forming a directed acyclic graph (DAG).
Compute the total time required to finish all tasks subject to dependencies.
(Interviewers often clarify one of these variants; state which you’re solving.)
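Assuming the common "unlimited workers" reading, the answer is the critical-path length of the DAG, computable with a topological sweep. A sketch under that assumption (the name `total_time` and the `deps[i]` = "prerequisites of task i" format are mine):

```python
from collections import deque

def total_time(time, deps):
    # deps[i] lists the tasks that must finish before task i starts.
    # With unlimited workers, completion time = longest path (critical path).
    n = len(time)
    children = [[] for _ in range(n)]
    indeg = [0] * n
    for i, ds in enumerate(deps):
        for d in ds:
            children[d].append(i)
            indeg[i] += 1
    finish = [0] * n            # finish[i] accumulates max finish of i's deps
    q = deque(i for i in range(n) if indeg[i] == 0)
    seen = 0
    while q:                    # Kahn's algorithm
        u = q.popleft()
        seen += 1
        finish[u] += time[u]    # earliest start of u + its duration
        for v in children[u]:
            finish[v] = max(finish[v], finish[u])
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    assert seen == n, "dependency graph has a cycle"
    return max(finish)
```

If the intended variant is instead a single worker, the answer is simply the sum of all durations.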
M parallel workers: Now you have M identical workers that can run tasks in parallel once their dependencies are met.
Compute the total completion time.
(Again clarify/assume: whether M ≥ N, which reduces to the unlimited-worker case, or M < N, which requires an actual scheduling policy.)
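For M < N, a common interview answer is an event-driven simulation of greedy list scheduling (exact minimization is NP-hard in general, so this is a heuristic unless the interviewer fixes the policy). A sketch with assumed names `completion_time` and `deps[i]` = prerequisites of task i:

```python
import heapq
from collections import deque

def completion_time(time, deps, M):
    # Greedy list scheduling: whenever a worker is free, start any ready task.
    n = len(time)
    children = [[] for _ in range(n)]
    indeg = [0] * n
    for i, ds in enumerate(deps):
        for d in ds:
            children[d].append(i)
            indeg[i] += 1
    ready = deque(i for i in range(n) if indeg[i] == 0)
    running = []                # min-heap of (finish_time, task)
    now = 0
    done = 0
    while done < n:
        # Fill idle workers with ready tasks.
        while ready and len(running) < M:
            t = ready.popleft()
            heapq.heappush(running, (now + time[t], t))
        # Advance to the next task completion.
        now, t = heapq.heappop(running)
        done += 1
        for v in children[t]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return now
```

With M = 1 this degenerates to the sum of all durations; with M ≥ N it matches the critical-path answer.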
You have a time-ordered sequence of ads over N time slots. Ad i has revenue r[i].
Choose which ads to show (e.g., pick a subset of positions, keeping the original order) to maximize total revenue, subject to:
in every window of T consecutive time slots, the total revenue of the shown ads inside that window is ≤ M.
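One way to enforce the sliding-window cap is a DP whose state is the show/skip pattern of the last T − 1 slots, checking the window ending at each position as it slides; this is only practical for small T (up to 2^(T−1) states). A sketch under the assumptions that revenues are nonnegative and that the name `max_ad_revenue` is mine:

```python
def max_ad_revenue(r, T, M):
    # dp maps a tuple of the last min(i, T-1) show/skip decisions (1 = shown)
    # to the best total revenue achieving that suffix pattern.
    dp = {(): 0}
    for i in range(len(r)):
        ndp = {}
        for state, best in dp.items():
            for show in (0, 1):
                decisions = state + (show,)          # covers slots ending at i
                start = i - len(decisions) + 1
                wsum = sum(r[j] for j, d in
                           zip(range(start, i + 1), decisions) if d)
                if wsum > M:                          # window cap violated
                    continue
                nstate = decisions[-(T - 1):] if T > 1 else ()
                val = best + (r[i] if show else 0)
                if val > ndp.get(nstate, -1):
                    ndp[nstate] = val
        dp = ndp
    return max(dp.values())
```

Checking only the window that ends at the current slot suffices, because every T-slot window ends at exactly one position; with nonnegative revenues the partial windows near the start are also safe to check.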
Follow-up: