
Implement bagging with decision trees

Last updated: Mar 29, 2026

Quick Overview

This question evaluates a candidate's competency in ensemble learning and practical implementation: bootstrap sampling, decision-tree base learners, aggregation via majority voting with explicit tie-breaking, and data handling using pure Python lists and the standard library.



Company: Pinterest

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: hard

Interview Round: Take-home Project


Related Interview Questions

  • Explain overfitting, underfitting, and regularization - Pinterest (hard)
  • Answer core ML fundamentals questions - Pinterest (hard)
  • Implement Naive Bayes classifier from scratch - Pinterest (hard)
  • Explain bias–variance, overfitting, and vanishing gradients - Pinterest (medium)
  • Explain learning-rate fluctuation and vanishing gradients - Pinterest (easy)
Posted: Feb 9, 2026

Implement a simple bagging (bootstrap aggregating) classifier that uses decision trees as base learners.

You are given a template with a DecisionTree class that supports at least:

  • fit(X, y)
  • predict(X)

Implement these methods for a BaggingClassifier (do not use NumPy; use Python lists and the standard library):

  1. bootstrapping(X, y, seed=None): sample n=len(X) training examples with replacement and return (X_sample, y_sample).
  2. fit(X, y): for n_estimators trees, create a bootstrap sample and train one tree per sample.
  3. predict(X): aggregate predictions from all trees and output the final prediction per row (e.g., majority vote for classification). Specify how you break ties.

Assume X is a list of feature vectors and y is a list of class labels.
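The three requirements above can be sketched in pure Python as follows. This is a minimal illustrative solution, not the official one: `MajorityStub` is a hypothetical stand-in for the template's `DecisionTree` (it simply predicts the majority class of its training labels), the per-tree seed offset is one reproducibility scheme among many, and the tie-break rule (smallest tied label wins, assuming labels are orderable) is an example of the explicit choice the prompt asks you to state.

```python
import random
from collections import Counter


class MajorityStub:
    """Hypothetical stand-in for the template's DecisionTree.

    It predicts the majority class of its training labels; the real
    template class would learn actual splits, but only the fit/predict
    interface matters for the bagging logic below.
    """

    def fit(self, X, y):
        self.label = Counter(y).most_common(1)[0][0]

    def predict(self, X):
        return [self.label for _ in X]


class BaggingClassifier:
    def __init__(self, n_estimators=10, seed=None):
        self.n_estimators = n_estimators
        self.seed = seed
        self.trees = []

    def bootstrapping(self, X, y, seed=None):
        """Sample n = len(X) row indices with replacement, keeping
        each feature vector paired with its label."""
        rng = random.Random(seed)
        n = len(X)
        idx = [rng.randrange(n) for _ in range(n)]
        return [X[i] for i in idx], [y[i] for i in idx]

    def fit(self, X, y):
        self.trees = []
        for t in range(self.n_estimators):
            # Offset the base seed per tree (illustrative choice) so runs
            # are reproducible but each tree sees a different sample.
            tree_seed = None if self.seed is None else self.seed + t
            X_s, y_s = self.bootstrapping(X, y, seed=tree_seed)
            tree = MajorityStub()  # in the real template: DecisionTree()
            tree.fit(X_s, y_s)
            self.trees.append(tree)

    def predict(self, X):
        # all_preds has shape n_estimators x n_rows; zip(*...) transposes
        # it so we can vote column by column (one vote set per input row).
        all_preds = [tree.predict(X) for tree in self.trees]
        final = []
        for row_preds in zip(*all_preds):
            counts = Counter(row_preds)
            top = max(counts.values())
            # Tie-break: among classes with the top vote count, pick the
            # smallest label -- deterministic, assumes orderable labels.
            final.append(min(c for c, v in counts.items() if v == top))
        return final
```

One design note: sampling indices (rather than rows) keeps `X` and `y` aligned for free, and passing a distinct seed per tree is what makes the ensemble both reproducible and diverse; reusing one seed for every bootstrap would train `n_estimators` copies of the same tree.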

