Explain and quantify the bias–variance tradeoff
Company: Roku
Role: Data Scientist
Category: Machine Learning
Difficulty: medium
Interview Round: Technical Screen
Explain the bias–variance tradeoff to a non-technical stakeholder in two simple sentences. Then make it rigorous: derive the expected test MSE decomposition at a fixed x, E[(y − ŷ)^2] = σ^2 (irreducible noise) + bias^2 + variance, and interpret each term. Using either ridge regression or decision trees: (i) explain qualitatively how training size and regularization (e.g., lambda or max_depth) move bias and variance; (ii) provide a concrete numeric example where increasing regularization raises bias, lowers variance, and reduces test error; show the numbers; (iii) name two diagnostics you would plot to pick the operating point (e.g., learning curves, validation curves) and one method to cut variance with minimal bias increase (e.g., bagging).
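A compact version of the requested derivation, as a sketch: assume y = f(x) + ε with E[ε] = 0 and Var(ε) = σ^2, and a predictor f̂ fit on a random training set drawn independently of ε (this setup is standard but not spelled out in the question itself).

```latex
% Sketch of the decomposition at a fixed x; requires amsmath.
\[
\begin{aligned}
\mathbb{E}\!\left[(y-\hat{f}(x))^2\right]
 &= \mathbb{E}\!\left[(f(x)-\hat{f}(x))^2\right] + \sigma^2
    \quad\text{(cross term vanishes because } \mathbb{E}[\varepsilon]=0\text{)}\\
 &= \left(f(x)-\mathbb{E}[\hat{f}(x)]\right)^2
    + \mathbb{E}\!\left[\left(\hat{f}(x)-\mathbb{E}[\hat{f}(x)]\right)^2\right] + \sigma^2
    \quad\text{(add and subtract } \mathbb{E}[\hat{f}(x)]\text{; cross term vanishes again)}\\
 &= \underbrace{\mathrm{Bias}\big[\hat{f}(x)\big]^2}_{\text{systematic miss of the true } f}
    + \underbrace{\operatorname{Var}\big[\hat{f}(x)\big]}_{\text{sensitivity to the training sample}}
    + \underbrace{\sigma^2}_{\text{irreducible noise}}
\end{aligned}
\]
```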
Quick Answer: At a fixed x, expected test MSE decomposes as irreducible noise (σ^2) + bias^2 + variance. Heavily regularized or simple models (large lambda, shallow max_depth) have high bias and low variance; flexible models have the reverse, and more training data mainly shrinks variance. Pick the operating point by plotting a validation curve (error vs. lambda or max_depth) and a learning curve (error vs. training size) and choosing the setting that minimizes held-out error; bagging averages models fit on bootstrap resamples to cut variance with minimal increase in bias.
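A minimal simulation sketch for part (ii), assuming scikit-learn and NumPy are available; the sine target, noise level, sample size, polynomial degree, and alpha grid are illustrative choices, not part of the original question.

```python
# Estimate bias^2, variance, and expected test MSE for ridge regression
# at several regularization strengths, by refitting on many resampled training sets.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)            # hypothetical ground-truth signal

sigma = 0.3                                  # irreducible noise std
n_train, n_trials = 30, 200                  # small n_train so variance is visible
x_test = np.linspace(0, 1, 50)[:, None]      # fixed evaluation points
f_test = true_f(x_test).ravel()

for alpha in [1e-4, 1e-2, 1e0]:              # regularization (lambda) sweep
    preds = np.empty((n_trials, len(x_test)))
    for t in range(n_trials):
        x_tr = rng.uniform(0, 1, (n_train, 1))
        y_tr = true_f(x_tr).ravel() + rng.normal(0, sigma, n_train)
        model = make_pipeline(PolynomialFeatures(degree=9), Ridge(alpha=alpha))
        preds[t] = model.fit(x_tr, y_tr).predict(x_test)

    mean_pred = preds.mean(axis=0)
    bias2 = np.mean((mean_pred - f_test) ** 2)   # avg squared bias over test x
    variance = preds.var(axis=0).mean()          # avg prediction variance over test x
    test_mse = bias2 + variance + sigma ** 2     # expected test MSE via the decomposition
    print(f"alpha={alpha:g}: bias^2={bias2:.4f}, var={variance:.4f}, "
          f"test MSE={test_mse:.4f}")
```

Under these assumptions the typical pattern is bias^2 rising and variance falling as alpha grows, with an intermediate alpha minimizing the summed test MSE. The same sweep plotted against alpha is a validation curve, plotting error against n_train gives a learning curve, and bagging (averaging models fit on bootstrap resamples) attacks the variance term at little cost in bias.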