You are helping to optimize the capacity of a cloud system.
There are n servers, where n is always even. The memory capacity of the i-th server is given by an integer array memory of length n.
When there are n = 2 * x servers:
- Exactly x servers will be chosen as primary servers.
- The remaining x servers will be backup servers.

For every primary server P, there must exist a distinct backup server B such that memory[B] >= memory[P].
(Each server can be used as either primary or backup in exactly one such pair, so the n servers are partitioned into x (primary, backup) pairs.)
The system memory capacity is defined as the sum of the memory capacities of all primary servers:
capacity = sum(memory[P] over all primary servers P)
Your task is to determine the maximum possible system memory capacity that can be obtained by:
- Partitioning the n servers into n/2 disjoint (primary, backup) pairs, and
- Respecting the constraint memory[backup] >= memory[primary] within each pair.
Input
- An even integer n (the number of servers).
- An integer array memory of length n, where memory[i] is the memory capacity of the i-th server.
You may assume:
- 2 <= n <= 2 * 10^5, and n is even.
- 1 <= memory[i] <= 10^9.
Output
- Return a single integer: the maximum possible system memory capacity.
Complexity requirement
Design an algorithm efficient for n up to about 2 * 10^5 (i.e., significantly better than O(n^2) time).
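One way to meet this bound is an O(n log n) greedy: since each primary must be matched to a backup with at least as much memory, the capacity is the sum of the per-pair minima, and a standard exchange argument suggests that sorting and pairing adjacent elements (taking the smaller of each adjacent pair as the primary) maximizes that sum. A minimal sketch under that assumption, with an illustrative function name:

```python
def max_system_capacity(memory):
    """Sort ascending and pair adjacent elements (a[0], a[1]), (a[2], a[3]), ...
    making the smaller element of each pair the primary. The answer is then
    the sum of the elements at even indices of the sorted array."""
    memory_sorted = sorted(memory)
    return sum(memory_sorted[0::2])  # even-index elements are the primaries

print(max_system_capacity([4, 2, 1, 3]))  # sorted: [1, 2, 3, 4] -> 1 + 3 = 4
```

Sorting dominates the cost, so the whole routine runs in O(n log n) time and O(n) extra space, comfortably within the stated limit of n up to 2 * 10^5.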