PracHub

Choose Classification Metrics Under Asymmetric Costs

Last updated: Mar 29, 2026

Quick Overview

This question evaluates understanding of binary classification metrics and model-operating decisions. It covers confusion-matrix-derived measures (precision, recall/sensitivity, specificity, false positive and false negative rates), summary metrics (ROC-AUC, PR-AUC), Type I/II errors, threshold selection, prevalence drift, probability calibration, and cost-sensitive trade-offs. Interviewers use it to assess a data scientist's ability to reason about model performance under class imbalance and asymmetric business costs; it belongs to the Statistics & Math / Machine Learning model-evaluation domain and requires both conceptual understanding and practical application.
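One point worth internalizing before the full question: accuracy alone can look excellent on an imbalanced dataset. A minimal illustration, using made-up counts for a 1%-prevalence problem:

```python
# Hypothetical 1%-prevalence dataset: a model that always predicts
# "negative" scores 99% accuracy while catching zero positives.
n_pos, n_neg = 10, 990          # illustrative counts, 1% prevalence

tp, fn = 0, n_pos               # the "always negative" model misses every positive
tn, fp = n_neg, 0               # ...and never raises a false alarm

accuracy = (tp + tn) / (tp + tn + fp + fn)
recall = tp / (tp + fn)

print(f"accuracy = {accuracy:.2%}")   # 99.00%
print(f"recall   = {recall:.2%}")     # 0.00%
```

This is why the question pushes toward precision, recall, and PR-AUC rather than raw accuracy when the positive class is rare.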

  • medium
  • Microsoft
  • Statistics & Math
  • Data Scientist

Choose Classification Metrics Under Asymmetric Costs

Company: Microsoft

Role: Data Scientist

Category: Statistics & Math

Difficulty: medium

Interview Round: Technical Screen



Posted: Feb 25, 2026, 12:00 AM

You are evaluating a binary classification model for a business problem.

Explain how to use a confusion matrix to compute and interpret:

  • precision,
  • recall (sensitivity),
  • specificity,
  • false positive rate,
  • false negative rate,
  • ROC-AUC,
  • PR-AUC.
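A minimal sketch of how the first five ratios fall out of the four confusion-matrix cells (the counts below are made up for illustration):

```python
# Hypothetical confusion-matrix counts for a binary classifier.
tp, fp = 80, 30     # predicted positive: correct / incorrect
fn, tn = 20, 870    # predicted negative: incorrect / correct

precision   = tp / (tp + fp)   # of flagged items, the share truly positive
recall      = tp / (tp + fn)   # of actual positives, the share caught (sensitivity)
specificity = tn / (tn + fp)   # of actual negatives, the share correctly cleared
fpr         = fp / (fp + tn)   # Type I error rate = 1 - specificity
fnr         = fn / (fn + tp)   # Type II error rate = 1 - recall

print(precision, recall, specificity, fpr, fnr)
```

ROC-AUC and PR-AUC are not single-threshold quantities: they summarize these ratios as the decision threshold sweeps from 0 to 1 (ROC plots recall against FPR; the PR curve plots precision against recall). In practice libraries such as scikit-learn compute them from scores via `roc_auc_score` and `average_precision_score`.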

Also answer the following:

  1. What is the difference between Type I and Type II errors?
  2. How does your metric choice change across scenarios such as spam filtering, medical screening, fraud detection, and ad click prediction?
  3. Why can accuracy be misleading under class imbalance?
  4. How do decision thresholds, prevalence drift, and probability calibration affect model evaluation?
  5. If false positives and false negatives have different business costs, how would you choose an operating threshold?
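For question 5, one standard approach is to choose the threshold that minimizes expected cost. If the model's probabilities are well calibrated and the per-error costs are C_FP and C_FN, the break-even rule is to predict positive when p ≥ C_FP / (C_FP + C_FN). A sketch under those assumptions, with made-up costs and validation data, that also does an empirical sweep:

```python
# Illustrative costs: missing a positive is 10x worse than a false alarm.
C_FP, C_FN = 1.0, 10.0

# Closed-form threshold for a *calibrated* classifier:
# predict positive when p >= C_FP / (C_FP + C_FN).
t_closed = C_FP / (C_FP + C_FN)        # 1/11 ~= 0.091

# Empirical sweep over candidate thresholds on a toy validation set.
labels = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.15, 0.05]

def expected_cost(threshold):
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    return C_FP * fp + C_FN * fn

# Evaluate the cost at every observed score and keep the cheapest cutoff.
best = min(scores, key=expected_cost)
print(t_closed, best, expected_cost(best))
```

The empirical sweep is the safer choice when calibration is doubtful, since it optimizes the realized cost on held-out data rather than trusting the probabilities; note how the high C_FN pushes the chosen threshold well below 0.5.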

