You are given n pickup orders for a grocery delivery counter. Each order i is described by orders[i] = [arrival_i, service_i] in minutes, where arrival_i is when the shopper arrives and service_i is how long a single clerk needs to hand off the order. The counter opens at time 0 and the clerk processes one order at a time; when the clerk is idle and the next shopper has not yet arrived, the clerk waits. Compute the average waiting time across all shoppers, where a shopper's waiting time runs from arrival until the hand-off finishes (so it includes service time). Arrival times are given in non-decreasing order; if they are not, sort the orders by arrival time first.

Follow-ups:
(a) Extend the solution to k >= 1 identical clerks, where each arriving order is assigned to the clerk that becomes available earliest. Return the average waiting time and justify your data structures and time complexity.
(b) Discuss how your approach scales for very large n (e.g., n up to 2e5) and state the big-O time and space complexities.
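
A minimal sketch of the single-clerk sweep, assuming Python; the function name average_waiting_time and the sample call are illustrative, not taken from the original statement:

```python
def average_waiting_time(orders):
    # Single clerk: walk the orders in arrival order and track when the clerk frees up.
    orders = sorted(orders)              # sort by arrival time in case the input is unsorted
    free_at = 0                          # time at which the clerk next becomes available
    total_wait = 0
    for arrival, service in orders:
        start = max(free_at, arrival)    # the clerk may sit idle until the shopper arrives
        free_at = start + service        # the clerk is busy until the hand-off finishes
        total_wait += free_at - arrival  # waiting time includes service time
    return total_wait / len(orders)

print(average_waiting_time([[1, 2], [2, 5], [4, 3]]))  # 5.0
```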
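For follow-up (a), one plausible extension keeps a min-heap of the times at which each of the k clerks becomes free; popping the minimum gives the clerk that is available earliest. A sketch under the same assumptions (Python, illustrative names):

```python
import heapq

def average_waiting_time_k(orders, k):
    # k identical clerks: a min-heap of per-clerk free times; each order, taken in
    # arrival order, goes to whichever clerk frees up earliest.
    orders = sorted(orders)
    free_at = [0] * k                        # all clerks are free when the counter opens
    total_wait = 0
    for arrival, service in orders:
        earliest = heapq.heappop(free_at)    # clerk that becomes available soonest
        start = max(earliest, arrival)       # that clerk may still wait for the shopper
        finish = start + service
        heapq.heappush(free_at, finish)
        total_wait += finish - arrival       # includes service time
    return total_wait / len(orders)
```

In these sketches the sort costs O(n log n), the heap operations cost O(n log k), and the extra space is O(k) (O(1) in the single-clerk case), which is what follow-up (b) asks about for n up to 2e5.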