Implement a function that returns all unique value triplets (a, b, c), taken from three distinct positions of the input array, that sum to a target T, with strong performance and deduplication guarantees.
Inputs: integer array A of length n (2 ≤ n ≤ 200,000), values in [−10^9, 10^9]; integer T (|T| ≤ 10^9).
Outputs: a list of value triplets (a, b, c), each in nondecreasing order (a ≤ b ≤ c), with the overall list sorted lexicographically; no duplicate triplets allowed.
Constraints and sub-questions:
- Provide an O(n^2)-time, O(1) extra-space (beyond the output) solution for T = 0 using sorting plus two pointers. Prove the correctness of your duplicate-skipping logic in adversarial cases such as A = [0, 0, 0, 0, 0]. (An illustrative sketch follows the list.)
- Generalize to arbitrary T and to k-sum (k ≥ 4) with pruning. What complexity do you achieve, and how do you prevent integer overflow? (A recursive sketch follows the list.)
- Streaming/scale twist: if n can be 5×10^6 and memory is capped at 512 MB, outline an external-memory approach (e.g., chunked sort + k-way merge, or hashed buckets) and estimate its I/O complexity. How would you approximate the count of unique triplets summing to a given T using sketches if exact enumeration is infeasible? (A chunked-sort sketch follows the list.)
- Bonus: given a frequency map F of values instead of A, compute the number of unique triplets summing to T in O(m^2), where m = |support(F)|, carefully handling cases where values repeat (e.g., triplets of the form (x, x, y) with x ≠ y, or (x, x, x)). Provide formulas and edge-case handling. (A counting sketch follows the list.)
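
For the first sub-question, a minimal Python sketch of the sort-plus-two-pointers approach with duplicate skipping; the function name and the explicit target parameter are illustrative, not part of the spec:

```python
def three_sum(A, T=0):
    """Return all unique value triplets (a, b, c), a <= b <= c, with a + b + c == T.

    O(n^2) time after an in-place sort; O(1) extra space beyond the output.
    """
    A.sort()                          # nondecreasing order; also yields lexicographic output
    n = len(A)
    result = []
    for i in range(n - 2):
        if i > 0 and A[i] == A[i - 1]:
            continue                  # skip duplicate anchors: this value was already handled
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = A[i] + A[lo] + A[hi]
            if s < T:
                lo += 1
            elif s > T:
                hi -= 1
            else:
                result.append((A[i], A[lo], A[hi]))
                lo += 1
                hi -= 1
                while lo < hi and A[lo] == A[lo - 1]:
                    lo += 1           # skip duplicate middle values
                while lo < hi and A[hi] == A[hi + 1]:
                    hi -= 1           # skip duplicate high values
    return result
```

On the adversarial input A = [0, 0, 0, 0, 0] with T = 0, the anchor skip and the two inner skips together emit (0, 0, 0) exactly once.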
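For the k-sum generalization, one possible recursive shape (names hypothetical): reduce k-sum to (k−1)-sum and bottom out at the two-pointer 2-sum, pruning branches whose smallest or largest achievable sum already misses the target. Worst-case time is roughly O(n^(k−1)) after the initial sort. In Python, integers are arbitrary precision; in a fixed-width language, 64-bit accumulators suffice here, since a sum of up to k values bounded by 10^9 in magnitude stays far below the int64 range.

```python
def k_sum(A, T, k):
    """Return unique nondecreasing k-tuples of values from A that sum to T."""
    A = sorted(A)
    n = len(A)

    def recurse(start, target, k):
        results = []
        if k == 2:                                  # base case: two pointers
            lo, hi = start, n - 1
            while lo < hi:
                s = A[lo] + A[hi]
                if s < target:
                    lo += 1
                elif s > target:
                    hi -= 1
                else:
                    results.append([A[lo], A[hi]])
                    lo += 1
                    while lo < hi and A[lo] == A[lo - 1]:
                        lo += 1                     # skip duplicate low values
            return results
        for i in range(start, n - k + 1):
            if i > start and A[i] == A[i - 1]:
                continue                            # skip duplicate leading values
            if A[i] + A[i + 1] * (k - 1) > target:
                break                               # even the smallest completion overshoots
            if A[i] + A[n - 1] * (k - 1) < target:
                continue                            # even the largest completion falls short
            for tail in recurse(i + 1, target - A[i], k - 1):
                results.append([A[i]] + tail)
        return results

    return [tuple(t) for t in recurse(0, T, k)] if k >= 2 and n >= k else []
```

With k = 3 and arbitrary T this subsumes the previous sketch: k_sum(A, T, 3) yields the same triplets as three_sum(A, T).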
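For the streaming/scale twist, a rough sketch of the run-generation and k-way-merge phases using heapq.merge over temporary run files; the run_size value and the one-integer-per-line file format are placeholder assumptions. The standard external merge-sort estimate is O((n/B) log_{M/B}(n/B)) block I/Os for the sort, for block size B and memory M.

```python
import heapq
import tempfile

def sorted_runs(values, run_size=1_000_000):
    """Phase 1: split the input stream into sorted runs spilled to temp files."""
    paths, run = [], []

    def spill():
        run.sort()
        with tempfile.NamedTemporaryFile("w", delete=False, suffix=".run") as f:
            f.write("\n".join(map(str, run)) + "\n")
            paths.append(f.name)
        run.clear()

    for v in values:
        run.append(v)
        if len(run) >= run_size:
            spill()
    if run:
        spill()
    return paths

def merged(paths):
    """Phase 2: lazily yield all values in globally sorted order (k-way merge)."""
    files = [open(p) for p in paths]
    try:
        yield from heapq.merge(*((int(line) for line in f) for f in files))
    finally:
        for f in files:
            f.close()
```

The merged stream can be re-spilled to a single sorted file and swept with two pointers (one forward and one backward cursor per anchor), or partitioned into hashed/range buckets so that each bucket's candidate pairs fit in memory.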
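For the bonus, a sketch of the O(m^2) count over the support of F, assuming F is a dict mapping value to multiplicity; because each distinct value combination is counted once, the repeated-value cases reduce to multiplicity checks:

```python
def count_unique_triplets(F, T):
    """Count distinct value triplets x <= y <= z with x + y + z == T.

    F maps value -> multiplicity; runs in O(m^2) over m = |support(F)|.
    """
    support = sorted(F)
    count = 0
    for i, x in enumerate(support):
        for y in support[i:]:              # enforce x <= y
            z = T - x - y
            if z < y or z not in F:        # enforce y <= z and membership in the support
                continue
            if x == y == z:
                ok = F[x] >= 3             # three copies of one value
            elif x == y:                   # here z > y, so only x repeats
                ok = F[x] >= 2
            elif y == z:                   # here x < y, so only y repeats
                ok = F[y] >= 2
            else:
                ok = True                  # three distinct values, one copy each
            if ok:
                count += 1
    return count
```

For example, with F = {0: 5} and T = 0 the only combination is (0, 0, 0), admissible because F[0] ≥ 3, so the count is 1.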