You have a 70-minute assessment with several ML-fundamentals multiple-choice questions. Answer the following (show calculations where applicable).
1) Confusion matrix vs. TPR/FPR
You are given four candidate confusion matrices (binary classification). For each matrix, rows are Actual and columns are Predicted.
Option A
- TP=80, FN=20, FP=10, TN=90

Option B

Option C
- TP=50, FN=50, FP=5, TN=95

Option D
- TP=90, FN=10, FP=30, TN=70
Target requirements:
- True Positive Rate (TPR / Recall) = 0.80
- False Positive Rate (FPR) = 0.10
Question: Which option(s) satisfy both requirements?
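To check the options mechanically, recall that TPR = TP / (TP + FN) and FPR = FP / (FP + TN). Below is a minimal Python sketch of that check, using the counts given above (Option B is left out because its counts are not listed):

```python
# Compute TPR and FPR for each candidate confusion matrix.
# TPR = TP / (TP + FN), FPR = FP / (FP + TN)
options = {
    "A": {"TP": 80, "FN": 20, "FP": 10, "TN": 90},
    "C": {"TP": 50, "FN": 50, "FP": 5,  "TN": 95},
    "D": {"TP": 90, "FN": 10, "FP": 30, "TN": 70},
}

for name, m in options.items():
    tpr = m["TP"] / (m["TP"] + m["FN"])
    fpr = m["FP"] / (m["FP"] + m["TN"])
    print(f"Option {name}: TPR={tpr:.2f}, FPR={fpr:.2f}")
# Option A: TPR=0.80, FPR=0.10
# Option C: TPR=0.50, FPR=0.05
# Option D: TPR=0.90, FPR=0.30
```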
2) Sigmoid neuron calculation
A single neuron computes:
$$z = w^\top x + b$$
$$\hat{y} = \sigma(z) = \frac{1}{1 + e^{-z}}$$
Given:
- $w = [0.5, -1.0]$
- $x = [2.0, 1.0]$
- $b = 0$
Question: Compute $z$ and $\hat{y}$ (to 3 decimal places).
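As a sanity check on the arithmetic, here is a minimal pure-Python sketch of the same computation (no external libraries; variable names are illustrative):

```python
import math

# Weights, input, and bias from the question.
w = [0.5, -1.0]
x = [2.0, 1.0]
b = 0.0

# z = w.x + b
z = sum(wi * xi for wi, xi in zip(w, x)) + b

# Sigmoid activation: sigma(z) = 1 / (1 + e^(-z))
y_hat = 1.0 / (1.0 + math.exp(-z))

print(f"z = {z}")              # 0.5*2.0 + (-1.0)*1.0 + 0 = 0.0
print(f"y_hat = {y_hat:.3f}")  # sigma(0) = 0.500
```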
3) Activation functions
Consider the following activation functions: sigmoid, tanh, ReLU, Leaky ReLU, softmax.
Questions:
- Which activation is most commonly used in the output layer for multi-class single-label classification?
- Give two reasons ReLU-like activations are commonly used in the hidden layers of deep networks.
- Name one common pitfall of sigmoid/tanh in deep hidden layers and explain why it happens.
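For the saturation behavior the last question points at, a small sketch comparing derivative magnitudes makes it concrete (pure Python; the $z$ values are arbitrary illustrative inputs):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def d_sigmoid(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # peaks at 0.25 when z = 0

def d_relu(z):
    return 1.0 if z > 0 else 0.0  # gradient is 1 for all positive inputs

# Sigmoid's derivative shrinks rapidly for large |z| (saturation),
# while ReLU passes a full gradient of 1 for any positive input.
for z in [0.0, 2.0, 5.0, 10.0]:
    print(f"z={z:5.1f}  d_sigmoid={d_sigmoid(z):.6f}  d_relu={d_relu(z):.1f}")
```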