
Explain learning-rate fluctuation and vanishing gradients

Last updated: Mar 29, 2026

Quick Overview

This question evaluates a candidate's understanding of optimization dynamics (how the learning rate affects training stability) and of backpropagation-related issues (vanishing gradients) in deep neural networks, testing how well they reason about training behavior and gradient propagation.

  • easy
  • Pinterest
  • Machine Learning
  • Machine Learning Engineer


Company: Pinterest

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: easy

Interview Round: Technical Screen

ML Fundamentals

Answer the following conceptual questions:

  1. Learning rate vs. training stability: Why can training metrics (loss/accuracy) fluctuate or oscillate when the learning rate is too large? What happens when it is too small? (See the first sketch below.)
  2. Vanishing gradients in fully connected networks: In a deep fully connected network trained with backpropagation, are vanishing gradients more likely to affect layers closer to the input or closer to the output? Explain why, and name common mitigations. (See the second sketch below.)
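A quick way to build intuition for item 1 is to run plain gradient descent on a one-dimensional quadratic and compare step sizes. This is a minimal illustrative sketch, not part of the original question; the objective f(x) = 0.5·x², the starting point, and the learning rates are arbitrary choices.

```python
def gradient_descent(lr, steps=6, x0=5.0):
    """Minimize f(x) = 0.5 * x**2 (so f'(x) = x) with a fixed learning rate."""
    xs = [x0]
    for _ in range(steps):
        grad = xs[-1]                  # gradient of 0.5 * x^2 at the current iterate
        xs.append(xs[-1] - lr * grad)  # standard gradient-descent update
    return xs

# Too large: each step overshoots the minimum, the iterate flips sign and grows,
# so the loss oscillates (and here diverges) instead of decreasing smoothly.
print("lr=2.5 :", [round(x, 2) for x in gradient_descent(2.5)])

# Too small: every step moves in the right direction, but progress is very slow.
print("lr=0.01:", [round(x, 2) for x in gradient_descent(0.01)])

# Reasonable: fast, monotone convergence toward the minimum at x = 0.
print("lr=0.5 :", [round(x, 2) for x in gradient_descent(0.5)])
```

For this quadratic the update is x ← (1 − lr)·x, so it converges only when 0 < lr < 2; past that threshold every step overshoots and the iterates oscillate with growing magnitude. In real training, mini-batch noise combined with a too-large learning rate shows up the same way: a loss curve that bounces around or blows up rather than descending steadily.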
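For item 2, a small NumPy sketch makes the asymmetry visible: stack many fully connected sigmoid layers, backpropagate a gradient from the output, and print its norm at each depth. The depth, width, weight scale, and dummy output gradient below are illustrative assumptions, not taken from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A deep stack of fully connected sigmoid layers (sizes chosen for illustration).
depth, width = 20, 64
weights = [rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
           for _ in range(depth)]

# Forward pass, keeping every layer's activation for the backward pass.
activations = [rng.normal(size=(width, 1))]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass starting from a dummy gradient of ones at the network output.
grad = np.ones((width, 1))
grad_norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = grad * a * (1.0 - a)   # through the sigmoid: its derivative is at most 0.25
    grad = W.T @ grad             # through the linear layer
    grad_norms.append(float(np.linalg.norm(grad)))

# grad_norms[0] is just below the output; grad_norms[-1] is back at the input.
for i in range(0, depth, 5):
    print(f"{i + 1:2d} layer(s) below the output: ||grad|| = {grad_norms[i]:.2e}")
```

Each backward step multiplies the gradient by the sigmoid's derivative (at most 0.25) and by the transposed weight matrix, so unless the weights are large the gradient magnitude decays roughly geometrically with depth, and the layers closest to the input receive the smallest updates. Common mitigations include ReLU-family activations, careful weight initialization, batch or layer normalization, and residual (skip) connections.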


Related Interview Questions

  • Explain overfitting, underfitting, and regularization - Pinterest (hard)
  • Answer core ML fundamentals questions - Pinterest (hard)
  • Implement Naive Bayes classifier from scratch - Pinterest (hard)
  • Implement bagging with decision trees - Pinterest (hard)
  • Explain bias–variance, overfitting, and vanishing gradients - Pinterest (medium)

