Use a confusion matrix to choose a model metric
Company: Microsoft
Role: Data Scientist
Category: Statistics & Math
Difficulty: easy
Interview Round: Technical Screen
Quick Answer: This question evaluates understanding of confusion matrix components, the mapping of Type I and Type II errors to false positives and false negatives, and the selection and interpretation of classification metrics (accuracy, precision, recall, F1, ROC-AUC, PR-AUC, and calibration), along with threshold choice under asymmetric costs and practical pitfalls such as class imbalance, data leakage, shifting base rates, and miscalibration. It is commonly asked in Statistics & Math interviews for Data Scientist roles to assess the ability to translate model performance into business impact, testing both conceptual understanding and practical, cost-sensitive decision-making.
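The core ideas the question probes can be sketched in a few lines of plain Python: derive confusion-matrix counts at a decision threshold, compute precision, recall, and F1 from them, and observe how moving the threshold trades one error type for the other. The labels and scores below are hypothetical toy data, not from any real model.

```python
def confusion_counts(y_true, y_score, threshold):
    """Return (tp, fp, fn, tn) for binary labels at the given threshold."""
    tp = fp = fn = tn = 0
    for y, s in zip(y_true, y_score):
        pred = 1 if s >= threshold else 0
        if pred == 1 and y == 1:
            tp += 1
        elif pred == 1 and y == 0:
            fp += 1  # Type I error (false positive)
        elif pred == 0 and y == 1:
            fn += 1  # Type II error (false negative)
        else:
            tn += 1
    return tp, fp, fn, tn

def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy data (hypothetical): true labels and model scores.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_score = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.6, 0.55]

# Raising the threshold trades recall for precision -- the knob you
# tune when false positives and false negatives have different costs.
for t in (0.5, 0.7):
    tp, fp, fn, tn = confusion_counts(y_true, y_score, t)
    p, r, f1 = precision_recall_f1(tp, fp, fn)
    print(f"threshold={t}: TP={tp} FP={fp} FN={fn} TN={tn} "
          f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")
```

In an interview answer, this is the point to connect to the business case: if false negatives are costlier (e.g. missed fraud), lower the threshold to favor recall; if false positives are costlier (e.g. spurious alerts), raise it to favor precision.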