You are given n customer orders; each order i = [a_i, p_i], where a_i is the arrival time and p_i is the preparation time. A single chef processes orders non-preemptively. Whenever the chef becomes idle and multiple orders are waiting, the chef picks the earliest-arrived one.

Compute the average waiting time across all customers, where waiting_i = finish_i − a_i and finish_i is the time the chef completes order i. Return a floating-point value.

Constraints: 1 ≤ n ≤ 10^5; 0 ≤ a_i < 10^9; 1 ≤ p_i ≤ 10^4.

Follow-ups:
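A short simulation makes the policy concrete. Because the chef always takes the earliest-arrived waiting order, serving orders in non-decreasing arrival order is equivalent, so a sort plus a single pass suffices. The sketch below assumes the input arrives as a list of [a_i, p_i] pairs; the function name average_waiting_time is illustrative, not from the statement.

    def average_waiting_time(orders):
        # Sort by arrival time: for one non-preemptive chef, serving in
        # arrival order matches "pick the earliest-arrived waiting order".
        # Tie-breaking among equal arrival times is unspecified in the
        # statement; sorted() breaks such ties by p_i here.
        orders = sorted(orders)
        now = 0          # time at which the chef next becomes free
        total_wait = 0
        for a, p in orders:
            now = max(now, a) + p    # start at max(idle time, arrival), cook for p
            total_wait += now - a    # waiting_i = finish_i - a_i
        return total_wait / len(orders)

For example, orders [[1, 2], [2, 5], [4, 3]] finish at times 3, 8, and 11, giving waits of 2, 6, and 7 and an average of 5.0. The sort dominates at O(n log n), comfortably within the n ≤ 10^5 limit. Python integers cannot overflow; in a fixed-width language, use a 64-bit accumulator, since the total wait can reach roughly 10^5 × 2·10^9, which exceeds 32 bits.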