PracHub

Evaluate TPR/FPR, sigmoid, and activations

Last updated: Mar 29, 2026

Quick Overview

This question evaluates competency in binary classification metrics (confusion matrices, TPR/FPR), basic neural-network forward computation (a sigmoid neuron's pre-activation z and output), and knowledge of activation-function selection and pitfalls.


Company: Goldman Sachs

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: hard

Interview Round: Take-home Project



Asked: Oct 10, 2025

You have a 70-minute assessment covering several ML-fundamentals questions. Answer the following, showing calculations where applicable.

1) Confusion matrix vs. TPR/FPR

You are given four candidate confusion matrices (binary classification). For each matrix, rows are Actual and columns are Predicted.

Option A

  • TP=80, FN=20, FP=10, TN=90

Option B

  • TP=45, FN=5, FP=45, TN=5

Option C

  • TP=50, FN=50, FP=5, TN=95

Option D

  • TP=90, FN=10, FP=30, TN=70

Target requirements:

  • True Positive Rate (TPR / Recall) = 0.80
  • False Positive Rate (FPR) = 0.10

Question: Which option(s) satisfy both requirements?
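As a quick sanity check (not part of the assessment itself), both rates can be computed mechanically for each candidate matrix with a short Python sketch, using TPR = TP / (TP + FN) and FPR = FP / (FP + TN):

```python
# Compute TPR and FPR for each candidate confusion matrix.
options = {
    "A": dict(TP=80, FN=20, FP=10, TN=90),
    "B": dict(TP=45, FN=5, FP=45, TN=5),
    "C": dict(TP=50, FN=50, FP=5, TN=95),
    "D": dict(TP=90, FN=10, FP=30, TN=70),
}

def rates(m):
    """Return (TPR, FPR) for a confusion matrix given as a dict."""
    tpr = m["TP"] / (m["TP"] + m["FN"])  # recall: TP over all actual positives
    fpr = m["FP"] / (m["FP"] + m["TN"])  # FP over all actual negatives
    return tpr, fpr

for name, m in options.items():
    tpr, fpr = rates(m)
    print(f"Option {name}: TPR={tpr:.2f}, FPR={fpr:.2f}")
```

Comparing each printed pair against the targets (TPR = 0.80, FPR = 0.10) makes the answer immediate.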

2) Sigmoid neuron calculation

A single neuron computes:

\[ z = w^\top x + b \]

\[ \hat{y} = \sigma(z) = \frac{1}{1+e^{-z}} \]

Given:

  • \(w = [0.5, -1.0]\)
  • \(x = [2.0, 1.0]\)
  • \(b = 0\)

Question: Compute \(z\) and \(\hat{y}\) (to 3 decimal places).
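To verify your arithmetic, the forward pass is a few lines of plain Python (only `math.exp` from the standard library is needed):

```python
import math

# Forward pass of a single sigmoid neuron with the given parameters.
w = [0.5, -1.0]
x = [2.0, 1.0]
b = 0.0

z = sum(wi * xi for wi, xi in zip(w, x)) + b  # dot product plus bias
y_hat = 1.0 / (1.0 + math.exp(-z))            # sigmoid activation

print(f"z = {z:.3f}, y_hat = {y_hat:.3f}")
```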

3) Activation functions

Consider the following activation functions: sigmoid, tanh, ReLU, Leaky ReLU, softmax.

Questions:

  1. Which activation is most commonly used in the output layer for multi-class single-label classification?
  2. Give two reasons ReLU-like activations are commonly used in hidden layers of deep networks.
  3. Name one common pitfall of sigmoid/tanh in deep hidden layers and why it happens.
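For intuition on these questions, the following sketch (assuming plain-Python implementations of sigmoid and softmax, not any particular framework) shows how the sigmoid gradient saturates away from zero, the mechanism behind vanishing gradients in deep sigmoid/tanh stacks, and how softmax normalizes logits into a probability distribution for multi-class single-label output layers:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # derivative peaks at 0.25 when z = 0

# The gradient shrinks rapidly for large |z| (saturation).
for z in [0.0, 2.0, 5.0, 10.0]:
    print(f"z={z:4.1f}  sigmoid'={sigmoid_grad(z):.6f}")

def softmax(logits):
    """Map raw logits to a probability distribution (sums to 1)."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([2.0, 1.0, 0.1]))
```

The saturation printed above is why ReLU-like activations, whose gradient is constant on the positive side, train more reliably in deep hidden layers.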

