Explain XGBoost depth, regularization, and dropout
Company: Uber
Role: Machine Learning Engineer
Category: Machine Learning
Difficulty: medium
Interview Round: Onsite
Quick Answer: This question evaluates understanding of model complexity and regularization across gradient-boosted decision trees and neural networks: how tree depth affects bias, variance, and computational cost; how L1/L2 penalties and weight decay modify the training objective and the learned parameters; and how dropout behaves differently at training time versus inference time. It is commonly asked to assess reasoning about overfitting, generalization, and computational and deployment trade-offs. It tests domain knowledge in supervised learning and regularization, and requires primarily conceptual understanding with some practical-application considerations.
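A minimal sketch of the two regularization mechanisms the answer refers to, in plain Python with no dependencies: XGBoost's L2-regularized optimal leaf weight w* = -G / (H + lambda), a simplified version of its L1 soft-thresholding, and inverted dropout that is a no-op at inference. Function names and the toy gradient sums are illustrative, not from any library API.

```python
import random

# XGBoost's optimal leaf weight under L2 regularization (lambda) is
# w* = -G / (H + lambda), where G and H are the sums of first- and
# second-order gradients over the samples in the leaf. Larger lambda
# shrinks leaf weights toward zero.
def leaf_weight_l2(grad_sum, hess_sum, reg_lambda):
    return -grad_sum / (hess_sum + reg_lambda)

# L1 regularization (alpha) soft-thresholds the gradient sum instead,
# zeroing out leaves whose accumulated gradient is weaker than alpha.
def leaf_weight_l1(grad_sum, hess_sum, reg_lambda, reg_alpha):
    if grad_sum > reg_alpha:
        return -(grad_sum - reg_alpha) / (hess_sum + reg_lambda)
    if grad_sum < -reg_alpha:
        return -(grad_sum + reg_alpha) / (hess_sum + reg_lambda)
    return 0.0

# Inverted dropout: during training, drop each activation with
# probability p and scale survivors by 1/(1-p); at inference the
# layer is a plain identity pass, so no rescaling is needed.
def dropout(activations, p, training):
    if not training:
        return list(activations)  # dropout is a no-op at inference
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]
```

The L2 form shows why lambda behaves like shrinkage on leaf values, while the L1 form shows why alpha induces sparsity (small-gradient leaves collapse to exactly zero), mirroring their effect on weights in neural networks.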