Bias–Variance Tradeoff: Intuition, Derivation, and Practical Tuning
Task
Explain the bias–variance tradeoff at two levels and connect it to model tuning.
Requirements
- Intuition: In two simple sentences for a non-technical stakeholder, explain the bias–variance tradeoff.
- Rigor: Derive the expected test MSE decomposition at a fixed x, showing
  E[(y − ŷ)^2] = irreducible_noise + bias^2 + variance,
  and interpret each term.
- Application: Using either ridge regression or decision trees:
  a) Qualitatively explain how training set size and regularization (e.g., lambda or max_depth) affect bias and variance.
  b) Provide a concrete numeric example where increasing regularization raises bias, lowers variance, and reduces test error; show the numbers.
  c) Name two diagnostics you would plot to choose the operating point (e.g., learning curves, validation curves) and one method to cut variance with minimal bias increase (e.g., bagging).