PracHub

Explain Overfitting and Transformer Attention

Last updated: May 14, 2026

Quick Overview

This question evaluates understanding of model generalization and regularization techniques alongside Transformer self-attention and positional encoding. It assesses the ability to diagnose overfitting, apply appropriate mitigation strategies, and interpret attention mechanisms.

  • medium
  • Snapchat
  • Machine Learning
  • Machine Learning Engineer
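
The overfitting diagnosis mentioned above usually comes down to comparing training and validation error. A minimal illustrative sketch (the data, split sizes, and polynomial degrees here are arbitrary choices, not part of the question): an over-flexible model drives training error toward zero while validation error stays high or grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function.
x = rng.uniform(-1, 1, 30)
y = np.sin(3 * x) + rng.normal(0, 0.2, 30)

# Hold out the last 10 points for validation.
x_train, y_train = x[:20], y[:20]
x_val, y_val = x[20:], y[20:]

def poly_mse(degree):
    # Fit a polynomial on the training split only, then score both splits.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    val_mse = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
    return train_mse, val_mse

# A modest model vs. a deliberately over-flexible one.
for d in (3, 15):
    train_mse, val_mse = poly_mse(d)
    print(f"degree={d:2d}  train={train_mse:.4f}  val={val_mse:.4f}")
```

The overfitting signature is the gap: the degree-15 fit has lower training error than the degree-3 fit, but a much larger validation error relative to its own training error.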

Explain Overfitting and Transformer Attention

Company: Snapchat

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: medium

Interview Round: Technical Screen

You are interviewing for a machine learning engineering role. Answer the following ML fundamentals questions clearly and compare different modeling settings.

  1. What is overfitting? How would you detect it using training and validation metrics?
  2. How would you reduce overfitting in a linear model?
  3. How would you reduce overfitting in a deep neural network?
  4. Explain the structure of Transformer self-attention. What are queries, keys, and values, and how are attention weights computed?
  5. Why does a Transformer need positional information? Describe at least two ways positional information can be added.
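
For the self-attention part of the question, the core computation is scaled dot-product attention: softmax(QKᵀ/√d_k)·V. A hedged numpy sketch, where the random projection matrices stand in for weights that would be learned in practice:

```python
import numpy as np

rng = np.random.default_rng(1)

seq_len, d_model, d_k = 4, 8, 8

X = rng.normal(size=(seq_len, d_model))   # token embeddings (one row per position)
W_q = rng.normal(size=(d_model, d_k))     # placeholder for a learned query projection
W_k = rng.normal(size=(d_model, d_k))     # placeholder for a learned key projection
W_v = rng.normal(size=(d_model, d_k))     # placeholder for a learned value projection

# Queries, keys, and values are linear projections of the same input sequence.
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
scores = Q @ K.T / np.sqrt(d_k)

# Numerically stable row-wise softmax: each row becomes a distribution
# over positions the query attends to.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

output = weights @ V                      # weighted mixture of value vectors

print(weights.sum(axis=-1))               # each row sums to 1
print(output.shape)                       # (4, 8)
```

Each output row is a convex combination of value vectors, weighted by how strongly that position's query matches every position's key; the 1/√d_k scaling keeps the logits from saturating the softmax as d_k grows.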

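
On the positional-encoding part: because attention is permutation-invariant, position must be injected explicitly. One standard option is the fixed sinusoidal encoding from the original Transformer paper; the other common option is a learned per-position embedding table added to the token embeddings the same way. A short sketch of the sinusoidal variant:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model // 2)
    angles = pos / (10000 ** (2 * i / d_model))  # (seq_len, d_model // 2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dimensions
    pe[:, 1::2] = np.cos(angles)                 # odd dimensions
    return pe

pe = sinusoidal_positional_encoding(seq_len=10, d_model=16)
print(pe.shape)  # (10, 16)
```

The encoding is added element-wise to the token embeddings before the first attention layer; different frequencies per dimension give each position a distinct, smoothly varying signature.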

Related Interview Questions

  • Discuss ML Project Tradeoffs - Snapchat (medium)
  • Model an ads ranking system - Snapchat (medium)
  • Explain BatchNorm, optimizers, and L1/L2 - Snapchat (medium)
  • Explain CLIP, contrastive losses, and retrieval limits - Snapchat (medium)
  • Explain Core ML Concepts - Snapchat (medium)
Apr 29, 2026


