You are given a short ML fundamentals assessment with three parts.
Part A — Precision/Recall/F1
A binary classifier on a dataset produced the following confusion-matrix counts:

- True Positives (TP) = 40
- False Positives (FP) = 10
- False Negatives (FN) = 20
- True Negatives (TN) = 130
- Compute precision, recall, and F1 (a verification sketch follows this list).
- If you raise the decision threshold, what typically happens to precision and recall (and why)?
- In an imbalanced dataset where positives are rare, when would you optimize for precision vs. recall?
Part B — Ensemble learning (True/False)
For each statement, mark whether it is generally True or False, and briefly justify.
- Bagging primarily reduces variance.
- Bagging uses bootstrap sampling (sampling with replacement) to train each base model (illustrated in the sketch after this list).
- Boosting trains base learners independently and can be fully parallelized without changing the algorithm.
- Ensembles can improve generalization because averaging/voting can cancel out uncorrelated errors.
- If the base learners are perfectly correlated (always make the same predictions), bagging will provide large gains.
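To make the bootstrap-sampling statement concrete, here is a minimal, self-contained Python sketch of the resampling step at the heart of bagging (the dataset size and seed are illustrative assumptions, not part of the assessment):

```python
import random

random.seed(0)  # deterministic demo

def bootstrap_sample(dataset):
    """Draw len(dataset) points WITH replacement -- the 'bootstrap' in bagging.
    Each base model of a bagged ensemble trains on one such sample."""
    return [random.choice(dataset) for _ in range(len(dataset))]

# Illustrative dataset of 1,000 example indices (an assumption for this demo).
data = list(range(1000))

# Roughly 63.2% (i.e. 1 - 1/e) of the distinct points land in any one
# bootstrap sample; the rest of the draws are duplicates. Every base model
# therefore sees a slightly different dataset, which decorrelates their
# errors -- and averaging/voting over decorrelated models reduces variance.
sample = bootstrap_sample(data)
print(len(set(sample)) / len(data))  # ≈ 0.632
```

This is also why the perfectly-correlated case in the last statement is degenerate: if resampling does not decorrelate the learners, averaging has nothing to cancel.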
Part C — Forward pass of a small neural network
Compute the network output for the following fully connected network. Use the sigmoid activation σ(t) = 1 / (1 + e^(−t)) at both layers.
- Input x = [1.0, 2.0]^T
- Hidden layer (2 units): h = σ(W1 x + b1)
- W1 = [[0.5, 1.0], [−1.0, 0.5]] (rows of the 2×2 weight matrix), b1 = [0.0, −0.5]^T
- Output layer (1 unit): ŷ = σ(W2 h + b2)
- W2 = [1.5, −2.0], b2 = 0.1
Return ŷ as a decimal rounded to 4 decimal places.
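To verify the arithmetic, here is a minimal NumPy sketch of the forward pass (NumPy is an assumed dependency, not specified in the assessment; any matrix library would do):

```python
import numpy as np

def sigmoid(t):
    """Elementwise logistic sigmoid: 1 / (1 + e^(-t))."""
    return 1.0 / (1.0 + np.exp(-t))

x = np.array([1.0, 2.0])                   # input vector
W1 = np.array([[0.5, 1.0], [-1.0, 0.5]])   # hidden-layer weights (2x2)
b1 = np.array([0.0, -0.5])                 # hidden-layer biases
W2 = np.array([1.5, -2.0])                 # output-layer weights
b2 = 0.1                                   # output-layer bias

h = sigmoid(W1 @ x + b1)      # pre-activation is [2.5, -0.5]
y_hat = sigmoid(W2 @ h + b2)  # scalar network output

print(round(float(y_hat), 4))  # the requested 4-decimal answer
```

Working by hand: W1 x + b1 = [2.5, −0.5], so h = [σ(2.5), σ(−0.5)] ≈ [0.9241, 0.3775]; the output pre-activation is 1.5·0.9241 − 2.0·0.3775 + 0.1 ≈ 0.7311, and ŷ = σ(0.7311) ≈ 0.6751.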