This question evaluates understanding of linear regression invariance under invertible linear transformations, multivariate linear algebra, and how regularization (ridge and lasso) affects the stability of coefficients and predictions, in the Statistics & Math domain.
You fit Model 1: y ~ X1 + X2. You also fit Model 2 on the transformed design Z = [X1 − X2, X1 + X2] = X T, where T = [[1, 1], [−1, 1]] (2×2, invertible).
a) Prove that the OLS predictions ŷ are identical for Model 1 and Model 2 for any invertible T, and derive the coefficient mapping b_Z = T^{-1} b_X.
b) If you use ridge (λ||b||₂²) or lasso (λ||b||₁) instead of OLS, do the coefficients and predictions remain invariant under this T? State the conditions precisely (e.g., ridge predictions are invariant under orthonormal transforms but not under arbitrary scalings; lasso is invariant only under signed permutations, not under general rotations).
c) For ridge, write the closed-form solutions and show when ŷ is unchanged; for lasso, give a concrete counterexample where the predictions differ.
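A numerical sanity check of parts (a) and the ridge case in (c) can be sketched in NumPy. The simulated data, seed, and λ below are illustrative assumptions, not part of the question; note that this particular T satisfies TᵀT = 2I, so ridge on Z at penalty λ matches ridge on X at penalty λ/2, and predictions at the *same* λ generally differ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 2))                      # illustrative design matrix
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=n)

# Columns of T form the combinations: Z = [X1 - X2, X1 + X2] = X T
T = np.array([[1.0, 1.0], [-1.0, 1.0]])
Z = X @ T

# Part (a): OLS fits (no intercept, matching the algebra in the question)
b_X, *_ = np.linalg.lstsq(X, y, rcond=None)
b_Z, *_ = np.linalg.lstsq(Z, y, rcond=None)
assert np.allclose(X @ b_X, Z @ b_Z)             # identical predictions
assert np.allclose(b_Z, np.linalg.solve(T, b_X)) # b_Z = T^{-1} b_X

# Part (c), ridge closed form: b = (A'A + lam I)^{-1} A'y
def ridge(A, y, lam):
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

lam = 1.0
yhat_X = X @ ridge(X, y, lam)
yhat_Z = Z @ ridge(Z, y, lam)

# T'T = 2I (orthogonal columns, not orthonormal), so at the same lam
# the ridge predictions differ ...
print("max |yhat_X - yhat_Z| at same lam:", np.max(np.abs(yhat_X - yhat_Z)))
# ... but ridge on Z at lam equals ridge on X at lam/2
assert np.allclose(yhat_Z, X @ ridge(X, y, lam / 2.0))
```

A lasso counterexample for (c) would need an iterative solver (e.g. coordinate descent); the OLS and ridge identities above, by contrast, follow directly from the closed forms.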