
Use confusion matrix to choose model metric

Last updated: Mar 29, 2026

Quick Overview

This question tests understanding of confusion matrix components; the mapping of Type I/Type II errors to false positives and false negatives; the selection and interpretation of classification metrics (accuracy, precision/recall, F1, ROC-AUC, PR-AUC, calibration); threshold choice under asymmetric costs; and practical pitfalls such as class imbalance, data leakage, shifting base rates, and miscalibration. It is commonly asked in Statistics & Math interviews for Data Scientist roles to assess the ability to translate model performance into business impact, probing both conceptual understanding and cost-sensitive decision-making in practice.


Company: Microsoft

Role: Data Scientist

Category: Statistics & Math

Difficulty: easy

Interview Round: Technical Screen



Scenario

You built a binary classifier (e.g., fraud detection, churn risk, medical screening, spam).

You are given a confusion matrix on a validation set:

  • True Positives (TP)
  • False Positives (FP)
  • True Negatives (TN)
  • False Negatives (FN)
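These four counts determine the basic classification metrics. A minimal sketch, using hypothetical counts chosen for illustration (they are not given in the question):

```python
# Hypothetical confusion-matrix counts on a validation set (illustrative only).
TP, FP, TN, FN = 80, 20, 880, 20

accuracy  = (TP + TN) / (TP + FP + TN + FN)    # fraction of all predictions correct
precision = TP / (TP + FP)                     # of flagged positives, how many are real
recall    = TP / (TP + FN)                     # of real positives, how many were caught
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(accuracy, precision, recall, f1)  # → 0.96 0.8 0.8 0.8
```

Note how accuracy (0.96) looks far better than precision and recall (0.8) because the negative class dominates; this is the class-imbalance pitfall the question asks about.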

Questions

  1. Explain what each confusion matrix cell means in the context of the business scenario.
  2. Define Type I and Type II errors and map them to FP/FN.
  3. Which evaluation metrics would you report (e.g., accuracy, precision/recall, F1, ROC-AUC, PR-AUC, calibration, expected cost)? When is each appropriate?
  4. How would you choose an operating threshold given asymmetric costs?
  5. What pitfalls should you watch for (class imbalance, data leakage, shifting base rates, calibration issues)?
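One common way to set an operating threshold under asymmetric costs is to sweep candidate thresholds on validation data and pick the one minimizing expected cost. A minimal sketch; the scores, labels, and cost ratio below are assumptions for illustration only:

```python
# Assumed asymmetric costs: a missed positive (FN) costs 50x a false alarm (FP).
COST_FP, COST_FN = 1.0, 50.0

# Hypothetical validation scores and true labels (illustrative only).
scores = [0.05, 0.20, 0.35, 0.50, 0.65, 0.80, 0.95]
labels = [0,    0,    1,    0,    1,    1,    1]

def expected_cost(threshold):
    """Total cost if we flag every example scoring at or above the threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return COST_FP * fp + COST_FN * fn

# Sweep candidate thresholds and keep the cheapest one.
candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
best = min(candidates, key=expected_cost)
print(best, expected_cost(best))  # → 0.3 1.0
```

With well-calibrated predicted probabilities, the expected-cost-minimizing threshold also has a closed form, COST_FP / (COST_FP + COST_FN), here 1/51 ≈ 0.02; this only holds if the scores are calibrated, which is one reason the calibration pitfall matters.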

