Problem
You are given an array heights of length n, where heights[i] is the height of the i-th person standing in a line from left to right.
For each index i, compute how many people to the right of i are visible to person i.
Visibility rule
Person i can see person j (j > i) if every person k with i < k < j is shorter than both i and j:
- heights[k] < min(heights[i], heights[j]) for all k between them.
Equivalently (often easier to reason about): scanning right from i, person i sees exactly those people who are taller than everyone standing between them (i.e., each new running maximum), up to and including the first person whose height is at least heights[i]; nobody beyond that person is visible.
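To make the rule concrete, here is a direct O(n^2) reference check that applies the definition literally (a sketch; the function name `visible_counts_bruteforce` is my own):

```python
def visible_counts_bruteforce(heights):
    """O(n^2) reference: for each i, count j > i visible per the rule."""
    n = len(heights)
    ans = [0] * n
    for i in range(n):
        tallest_between = 0  # max height strictly between i and j seen so far
        for j in range(i + 1, n):
            # j is visible iff everyone between i and j is shorter than both
            if tallest_between < min(heights[i], heights[j]):
                ans[i] += 1
            tallest_between = max(tallest_between, heights[j])
    return ans
```

Tracking the running maximum between i and j makes the inner loop O(1) per candidate j, so the whole check is quadratic rather than cubic.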
Input
- Integer array heights (size n).
Output
- Integer array ans of size n, where ans[i] is the number of visible people to the right of person i.
Example
- Input: heights = [10, 6, 8, 5, 11, 9]
- Output: [3, 1, 2, 1, 1, 0]
Constraints
- 1 <= n <= 100000
- 1 <= heights[i] <= 10^9
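With n up to 100000, a quadratic scan can be too slow in the worst case; the standard approach for this kind of visibility counting is a right-to-left pass with a monotonically decreasing stack, giving O(n) time. A sketch (function name `visible_counts` is my own):

```python
def visible_counts(heights):
    """O(n): scan right-to-left, keeping a decreasing stack of heights."""
    n = len(heights)
    ans = [0] * n
    stack = []  # heights to the right of i, decreasing from bottom to top
    for i in range(n - 1, -1, -1):
        # Everyone shorter than person i on the stack is visible to i,
        # and once popped can never be seen by anyone further left.
        while stack and stack[-1] < heights[i]:
            stack.pop()
            ans[i] += 1
        # The first remaining person (height >= heights[i]) is also visible.
        if stack:
            ans[i] += 1
        stack.append(heights[i])
    return ans
```

Each person is pushed and popped at most once, which is what bounds the total work at O(n) despite the nested-looking while loop.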