This question evaluates algorithmic problem-solving with arrays and visibility relations, assessing competency in array traversal, handling of relative ordering constraints, and designing time- and space-efficient approaches.
You are given an array heights of length n, where heights[i] is the height of the i-th person standing in a line from left to right.
For each index i, compute how many people to the right of i are visible to person i.
Person i can see person j (j > i) if every person k between them is shorter than both of them, i.e.

heights[k] < min(heights[i], heights[j]) for all k with i < k < j.
Equivalently (often easier to reason about): looking right from i, person i sees each successive running maximum that is shorter than heights[i], plus the first person whose height is greater than or equal to heights[i]; visibility stops beyond that point.
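A direct O(n^2) check of the definition above can be sketched as follows (the function name is illustrative, not part of the problem):

```python
def visible_counts_bruteforce(heights):
    # O(n^2) reference: person i sees person j iff everyone strictly
    # between them is shorter than min(heights[i], heights[j]).
    n = len(heights)
    ans = [0] * n
    for i in range(n):
        tallest_between = 0  # max height strictly between i and j
        for j in range(i + 1, n):
            if tallest_between < min(heights[i], heights[j]):
                ans[i] += 1
            tallest_between = max(tallest_between, heights[j])
            # Once someone at least as tall as person i appears,
            # no one further right can be visible to i.
            if heights[j] >= heights[i]:
                break
    return ans
```

This is fine for validating small cases by hand, but is quadratic in the worst case (e.g. a strictly decreasing line).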
Input: an array heights of size n.

Output: an array ans of size n, where ans[i] is the number of visible people to the right of person i.
Example:
heights = [10, 6, 8, 5, 11, 9]
Output: [3, 1, 2, 1, 1, 0]
Constraints:
1 <= n <= 100000
1 <= heights[i] <= 10^9
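With n up to 10^5, a quadratic scan can be too slow in the worst case. A standard monotonic-stack pass from right to left runs in O(n) time and O(n) space: each person pops everyone shorter than themselves (each pop is one visible person), and the first remaining person on the stack, if any, is also visible. A sketch under these assumptions (names are illustrative):

```python
def visible_counts(heights):
    # Right-to-left pass keeping a strictly decreasing stack of the
    # heights of people to the right of the current person.
    n = len(heights)
    ans = [0] * n
    stack = []
    for i in range(n - 1, -1, -1):
        # Everyone on the stack shorter than person i is visible to them.
        while stack and stack[-1] < heights[i]:
            stack.pop()
            ans[i] += 1
        if stack:
            # The first person at least as tall as i is visible too.
            ans[i] += 1
            if stack[-1] == heights[i]:
                # An equal-height neighbor shadows person i for anyone
                # further left, so keep only one copy on the stack.
                stack.pop()
        stack.append(heights[i])
    return ans
```

Each height is pushed and popped at most once, giving the O(n) bound; the equal-height pop keeps the stack strictly decreasing, which matters because the problem statement does not guarantee distinct heights.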