This question evaluates competence in high-dimensional linear regression, penalized estimation (L1/L2/elastic net), preprocessing and feature handling, cross-validation-based model selection, behavior with correlated predictors, and implications for post-selection inference.
You must fit a linear regression with p = 500 predictors and n = 600 observations. What failure modes do you expect, and why does OLS overfit when p is comparable to n? Write the L1-regularized (Lasso) objective, explain its geometric effect on the coefficients, and discuss its behavior under correlated predictors (grouping vs. instability). Describe how you would select λ (K-fold cross-validation with a one-standard-error rule), how you would preprocess the features (centering, scaling, handling of categorical variables), and how you would evaluate generalization (nested CV, held-out R²). Contrast L1 with L2 and elastic net for p ≈ n and p ≥ n, and note the implications for inference (post-selection bias).
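For reference, one common parameterization of the L1-regularized objective the question asks for (scaling conventions on the loss vary, e.g. 1/2n vs. 1/n):

\hat{\beta} = \arg\min_{\beta_0,\, \beta \in \mathbb{R}^p} \; \frac{1}{2n} \sum_{i=1}^{n} \bigl(y_i - \beta_0 - x_i^\top \beta\bigr)^2 \;+\; \lambda \sum_{j=1}^{p} |\beta_j|

A minimal sketch of the λ-selection and preprocessing steps the question describes, assuming scikit-learn and a pandas DataFrame X with numeric and categorical columns plus a response vector y; the function name, alpha grid, and fold count are illustrative assumptions, not part of the question.

```python
# Sketch: Lasso lambda selection via K-fold CV with a one-standard-error rule,
# with centering/scaling of numeric features and one-hot encoding of categoricals.
import numpy as np
from sklearn.compose import make_column_selector, make_column_transformer
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

def fit_lasso_one_se(X, y, alphas=np.logspace(-3, 1, 50), k=10, random_state=0):
    # Standardize numeric columns and one-hot encode categoricals so the L1
    # penalty acts on coefficients of comparable scale.
    preprocess = make_column_transformer(
        (StandardScaler(), make_column_selector(dtype_include=np.number)),
        (OneHotEncoder(handle_unknown="ignore"),
         make_column_selector(dtype_include=object)),
    )
    cv = KFold(n_splits=k, shuffle=True, random_state=random_state)

    mean_mse, se_mse = [], []
    for alpha in alphas:
        model = make_pipeline(preprocess, Lasso(alpha=alpha, max_iter=50_000))
        mse = -cross_val_score(model, X, y, cv=cv,
                               scoring="neg_mean_squared_error")
        mean_mse.append(mse.mean())
        se_mse.append(mse.std(ddof=1) / np.sqrt(k))
    mean_mse, se_mse = np.array(mean_mse), np.array(se_mse)

    # One-SE rule: among alphas whose CV error is within one standard error of
    # the minimum, take the largest (most regularized, sparsest) alpha.
    best = np.argmin(mean_mse)
    threshold = mean_mse[best] + se_mse[best]
    alpha_1se = max(a for a, m in zip(alphas, mean_mse) if m <= threshold)

    final = make_pipeline(preprocess, Lasso(alpha=alpha_1se, max_iter=50_000))
    return final.fit(X, y), alpha_1se
```

To estimate generalization rather than just tune λ, this selection loop would itself sit inside an outer resampling loop (nested CV) or be scored on a held-out set via R².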