Explain Logistic Regression Fundamentals
Amazon · Sep 6, 2025 · Machine Learning Engineer · Onsite · Machine Learning
Logistic Regression from First Principles
Assumptions and Notation
Binary classification with labels y ∈ {0, 1} and features x ∈ ℝ^d.
Linear score z = wᵀx + b; predicted probability p(x) = P(y = 1 | x).
Tasks
Derive logistic regression starting from a Bernoulli likelihood and the logit link.
Derive the negative log-likelihood (log-loss) and gradients for w and b.
Explain the logit link function and how decision thresholds relate to costs.
Discuss probability calibration: when it works, how to measure it, and how to fix miscalibration.
Compare L1 vs L2 regularization (optimization, sparsity, correlation, calibration).
Explain effects of feature scaling and modeling interactions.
Describe strategies for handling class imbalance.
Choose appropriate evaluation metrics for imbalanced data.
Interpret coefficients via odds and odds ratios.
Identify common failure modes and guardrails.
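The log-loss and gradient tasks above can be sketched numerically. A minimal NumPy implementation, assuming the notation from the problem statement (the toy data, learning rate, and iteration count are illustrative choices, not part of the question):

```python
import numpy as np

def sigmoid(z):
    # Numerically stable logistic function sigma(z) = 1 / (1 + exp(-z))
    return np.where(z >= 0, 1.0 / (1.0 + np.exp(-z)), np.exp(z) / (1.0 + np.exp(z)))

def nll(w, b, X, y):
    # Average negative log-likelihood (log-loss) of the Bernoulli model:
    # per sample, log(1 + exp(z)) - y * z, written stably via logaddexp
    z = X @ w + b
    return np.mean(np.logaddexp(0.0, z) - y * z)

def gradients(w, b, X, y):
    # d/dw NLL = X^T (p - y) / n,  d/db NLL = mean(p - y)
    p = sigmoid(X @ w + b)
    return X.T @ (p - y) / len(y), np.mean(p - y)

# Hypothetical toy data: two Gaussian blobs in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]

# Plain gradient descent on the log-loss
w, b = np.zeros(2), 0.0
for _ in range(500):
    gw, gb = gradients(w, b, X, y)
    w -= 0.1 * gw
    b -= 0.1 * gb
```

The gradient form X^T(p − y)/n falls out of differentiating the log-loss through the sigmoid; a finite-difference check is a quick way to confirm the derivation.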
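For the threshold and coefficient-interpretation tasks, the standard decision-theoretic results can be written out directly (the function names here are illustrative):

```python
import math

def bayes_threshold(c_fp, c_fn):
    # Predict y = 1 when the expected cost of a positive call is lower:
    # (1 - p) * c_fp < p * c_fn  =>  p > c_fp / (c_fp + c_fn)
    return c_fp / (c_fp + c_fn)

def odds_ratio(w_j):
    # exp(w_j) is the multiplicative change in the odds P(y=1) / P(y=0)
    # for a one-unit increase in feature x_j, holding other features fixed
    return math.exp(w_j)
```

With equal costs the threshold is 0.5; when false negatives are nine times costlier than false positives, it drops to 0.1. Note that odds-ratio interpretation assumes reasonably calibrated probabilities and features on comparable scales.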
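Calibration (one of the tasks above) can be measured without any library support: the Brier score and a binned reliability table are a minimal sketch, assuming predicted probabilities `p` and binary labels `y` as arrays:

```python
import numpy as np

def brier_score(p, y):
    # Mean squared error between predicted probabilities and outcomes;
    # lower is better, and it rewards both calibration and sharpness
    return np.mean((p - y) ** 2)

def reliability_table(p, y, n_bins=10):
    # Bin predictions and compare mean predicted probability to the
    # observed positive rate; a well-calibrated model has the two
    # roughly equal in every populated bin
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (p >= lo) & (p <= hi) if hi == 1.0 else (p >= lo) & (p < hi)
        if mask.any():
            rows.append((p[mask].mean(), y[mask].mean(), int(mask.sum())))
    return rows  # (mean predicted, observed rate, count) per bin
```

Summing bin-weighted |predicted − observed| gives expected calibration error; Platt scaling or isotonic regression on a held-out set is the usual fix when the gap is large.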
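For the class-imbalance task, one common strategy is cost-sensitive reweighting of the log-loss. A hypothetical sketch of the reweighted gradients (the function name and `pos_weight` parameter are illustrative):

```python
import numpy as np

def weighted_log_loss_grads(w, b, X, y, pos_weight):
    # Cost-sensitive log-loss: each positive example is scaled by
    # pos_weight (e.g. n_neg / n_pos to balance the classes), so
    # sample i contributes s_i * (p_i - y_i) to the gradient
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    s = np.where(y == 1, pos_weight, 1.0)
    r = s * (p - y)
    return X.T @ r / s.sum(), r.sum() / s.sum()
```

A caveat worth stating in the interview: reweighting (like resampling) shifts the intercept and inflates minority-class probabilities, so probabilities should be re-calibrated, or the decision threshold moved, before they are used downstream.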