PracHub

Explain BatchNorm, optimizers, and L1/L2

Last updated: Mar 29, 2026

Quick Overview

This question tests core machine learning fundamentals: Batch Normalization, optimizer behavior (SGD, Momentum, RMSProp, Adam), and regularization (L1 vs. L2). It is common in technical interviews because these topics reveal whether a candidate understands optimization dynamics, the training-versus-inference distinction, and the trade-offs between generalization and sparsity — testing both conceptual understanding and practical application.



Company: Snapchat

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: medium

Interview Round: Onsite



Related Interview Questions

  • Explain Overfitting and Transformer Attention - Snapchat (medium)
  • Discuss ML Project Tradeoffs - Snapchat (medium)
  • Model an ads ranking system - Snapchat (medium)
  • Explain CLIP, contrastive losses, and retrieval limits - Snapchat (medium)
  • Explain Core ML Concepts - Snapchat (medium)
Date: Feb 11, 2026

Prompt

Answer the following ML fundamentals questions:

  1. Batch Normalization (BatchNorm):
    • What trainable parameters does BatchNorm have?
    • What statistics are used during training vs inference?
    • Why does BatchNorm help optimization, and what are common pitfalls?
  2. Optimizers: Compare SGD, Momentum, RMSProp, and Adam.
    • What problem does each address?
    • When might plain SGD generalize better than Adam?
  3. Regularization: Compare L1 and L2 regularization.
    • How do they affect weights and sparsity?
    • How do they relate to MAP estimation (priors) if you know that framing?

Solution
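A minimal, stdlib-only sketch of the BatchNorm behavior the first sub-question asks about (the `BatchNorm1D` class and its attribute names here are hypothetical, not a library API). The two trainable parameters are the scale `gamma` and shift `beta`; training normalizes with the current batch's mean and variance while updating running averages, and inference reuses those frozen running statistics.

```python
import math

class BatchNorm1D:
    """Toy BatchNorm for a single feature (illustrative, not a library API)."""

    def __init__(self, momentum=0.1, eps=1e-5):
        self.gamma = 1.0          # trainable scale
        self.beta = 0.0           # trainable shift
        self.running_mean = 0.0   # EMA of batch means, used at inference
        self.running_var = 1.0    # EMA of batch variances
        self.momentum = momentum
        self.eps = eps

    def forward(self, batch, training):
        if training:
            # Training: normalize with the *current batch* statistics
            # and update the running averages as a side effect.
            mean = sum(batch) / len(batch)
            var = sum((x - mean) ** 2 for x in batch) / len(batch)
            self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * mean
            self.running_var = (1 - self.momentum) * self.running_var + self.momentum * var
        else:
            # Inference: use the frozen running statistics instead.
            mean, var = self.running_mean, self.running_var
        return [self.gamma * (x - mean) / math.sqrt(var + self.eps) + self.beta
                for x in batch]

bn = BatchNorm1D()
out = bn.forward([1.0, 2.0, 3.0], training=True)   # batch stats: mean 2, var 2/3
```

This also makes a common pitfall visible: if a model is never switched to inference mode, every forward pass on a small or unrepresentative batch keeps perturbing the statistics it normalizes with.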
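The four optimizer update rules can be compared side by side. Below is a hedged, stdlib-only sketch on a 1-D quadratic (the function names and the toy problem are mine, not from any library): SGD follows the raw gradient; Momentum accumulates a velocity to damp oscillation and accelerate consistent directions; RMSProp divides by a running RMS of the gradient to adapt the step size; Adam combines both ideas and adds bias correction for the zero-initialized moment estimates.

```python
import math

def sgd(g, state, lr=0.1):
    return -lr * g

def momentum(g, state, lr=0.1, mu=0.9):
    # Velocity accumulates past gradients (heavy-ball style).
    state["v"] = mu * state.get("v", 0.0) + g
    return -lr * state["v"]

def rmsprop(g, state, lr=0.1, rho=0.9, eps=1e-8):
    # Running average of squared gradients rescales the step per parameter.
    state["s"] = rho * state.get("s", 0.0) + (1 - rho) * g * g
    return -lr * g / (math.sqrt(state["s"]) + eps)

def adam(g, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # First and second moment estimates, with bias correction for t small.
    t = state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * g
    state["v"] = b2 * state.get("v", 0.0) + (1 - b2) * g * g
    m_hat = state["m"] / (1 - b1 ** t)
    v_hat = state["v"] / (1 - b2 ** t)
    return -lr * m_hat / (math.sqrt(v_hat) + eps)

# Minimize f(w) = w^2 (gradient 2w) with each optimizer from w = 5.
results = {}
for step_fn in (sgd, momentum, rmsprop, adam):
    w, state = 5.0, {}
    for _ in range(300):
        w += step_fn(2 * w, state)
    results[step_fn.__name__] = w
```

On a convex toy problem like this, all four land at or hover near the minimum. The interview point that plain SGD sometimes generalizes better than Adam is about deep networks, where Adam's per-parameter rescaling can converge quickly to solutions that generalize worse; it does not show up on a 1-D quadratic.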
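For the regularization sub-question, the mechanical difference can be shown on a single weight (a stdlib sketch; the soft-threshold/proximal step is the standard way L1 is optimized, but the function names are mine). L2 adds `lam * w` to the gradient, so weights shrink multiplicatively toward zero but never reach it exactly; the L1 proximal step subtracts a fixed threshold and snaps anything smaller to exactly zero, which is where sparsity comes from. In the MAP framing, L2 corresponds to a zero-mean Gaussian prior on the weights and L1 to a Laplace prior.

```python
def l2_step(w, grad, lr=0.1, lam=0.5):
    # d/dw [loss + (lam/2) * w^2] adds lam*w: multiplicative shrinkage.
    return w - lr * (grad + lam * w)

def l1_prox_step(w, grad, lr=0.1, lam=0.5):
    # Plain gradient step, then the proximal operator of lam*|w|
    # (soft-thresholding).
    w = w - lr * grad
    thresh = lr * lam
    if w > thresh:
        return w - thresh
    if w < -thresh:
        return w + thresh
    return 0.0  # weights inside the threshold are snapped exactly to zero

# With zero data gradient, only the regularizers act on a small weight.
w_l2, w_l1 = 0.04, 0.04
for _ in range(100):
    w_l2 = l2_step(w_l2, grad=0.0)
    w_l1 = l1_prox_step(w_l1, grad=0.0)
# w_l2 shrinks geometrically but stays positive; w_l1 hits exactly 0.0.
```

The `return 0.0` branch is the whole story of L1 sparsity in miniature: the penalty's constant-magnitude pull can win outright against a small weight, while L2's pull vanishes as the weight approaches zero.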



© 2026 PracHub. All rights reserved.