1D k-Center (Minimax L1) Clustering
Problem
You are given n integer points on a number line and an integer k (1 ≤ k ≤ n). Place k cluster centers on the real line to minimize the maximum L1 (absolute) distance from any point to its nearest center.
- Distance between a point x and a center c is |x − c|.
- Centers may be placed anywhere on the real line (not necessarily at input points).
- Output the minimum possible maximum distance (the optimal radius).
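To make the objective concrete, here is a small worked instance (the points and center placement below are illustrative, not part of the problem): for points 1, 2, 5 and k = 2, placing centers at 1.5 and 5 gives radius 0.5, which is optimal for this instance.

```python
points = [1, 2, 5]
centers = [1.5, 5.0]  # a hypothetical placement for k = 2

# The objective value: the largest distance from any point to its nearest center.
radius = max(min(abs(x - c) for c in centers) for x in points)
print(radius)  # 0.5
```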
Goal
Return the optimal radius R* such that all points can be covered by k intervals of radius R* around the chosen centers, and no smaller radius suffices.
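The coverage view above suggests a feasibility test: for a candidate radius R, sweep the sorted points left to right, start a new interval of length 2R at each uncovered point, and check that at most k intervals are needed. A minimal sketch (the function name `can_cover` is ours, not from the statement):

```python
def can_cover(sorted_pts, k, r):
    """Greedy check: can all points be covered by at most k intervals of radius r?
    Assumes sorted_pts is sorted in ascending order."""
    used = 0
    i = 0
    n = len(sorted_pts)
    while i < n:
        used += 1
        if used > k:
            return False
        # An interval anchored so its left end sits at sorted_pts[i]
        # (center at sorted_pts[i] + r) covers everything up to this bound.
        right = sorted_pts[i] + 2 * r
        while i < n and sorted_pts[i] <= right:
            i += 1
    return True
```

Anchoring each interval's left edge at the leftmost uncovered point is safe: any cover can be shifted right until it does this without uncovering anything, so the greedy count is minimal.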
Notes
- Think of this as covering all sorted points with k intervals of length 2R.
- Typical approach: sort, binary search on R, and use a greedy feasibility check.
- If you must return an integer and the true optimum is a half-integer, you can either output a floating-point value (e.g., with 1e-6 precision) or scale coordinates by 2 to keep everything integral.
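Putting the notes together, one way to stay fully integral is to binary-search the doubled radius d = 2R over integers: the greedy check compares d only against differences of input points, so the minimal feasible d is itself an integer difference. A hedged sketch (the name `k_center_1d` is ours):

```python
def k_center_1d(points, k):
    """Return the optimal radius R* for 1D k-center with integer inputs.
    Binary-searches the doubled radius d = 2R, which is always an integer."""
    pts = sorted(points)

    def feasible(d):
        # Greedy cover with intervals of length d (i.e., radius d / 2).
        used, i = 0, 0
        while i < len(pts):
            used += 1
            if used > k:
                return False
            right = pts[i] + d
            while i < len(pts) and pts[i] <= right:
                i += 1
        return True

    lo, hi = 0, pts[-1] - pts[0]  # d = 0 and d = full span bracket the answer
    while lo < hi:
        mid = (lo + hi) // 2
        if feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    return lo / 2  # R* = d / 2, exact as a half-integer
```

This avoids any floating-point tolerance: the only division happens once, at the very end.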