PracHub

Implement Naive Bayes classifier from scratch

Last updated: Mar 29, 2026

Quick Overview

This question evaluates understanding of probabilistic classification, specifically competence in estimating class priors, feature likelihood parameters, and computing posterior scores for Naive Bayes variants.


Company: Pinterest

Role: Machine Learning Engineer

Category: Machine Learning

Difficulty: hard

Interview Round: Take-home Project



Related Interview Questions

  • Explain overfitting, underfitting, and regularization - Pinterest (hard)
  • Answer core ML fundamentals questions - Pinterest (hard)
  • Implement bagging with decision trees - Pinterest (hard)
  • Explain bias–variance, overfitting, and vanishing gradients - Pinterest (medium)
  • Explain learning-rate fluctuation and vanishing gradients - Pinterest (easy)

Implement a Naive Bayes classifier from scratch (you may use NumPy).

Write a class with:

  • fit(X, y) : estimate class priors and feature likelihood parameters.
  • predict(X) : compute posteriors (or unnormalized log-posteriors) and output the most likely class.

Specify which variant you implement (e.g., Multinomial Naive Bayes for count features, Bernoulli NB for binary features, or Gaussian NB for continuous features). Your implementation should clearly compute:

  • prior \(P(y=c)\)
  • likelihood parameters for \(P(x \mid y=c)\)
  • posterior scoring using Bayes’ rule with numerical stability (typically log-probabilities).
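Concretely, under the naive conditional-independence assumption, each class is scored by the unnormalized log-posterior from Bayes' rule (dropping the evidence term \(\log P(x)\), which is constant across classes):

\[
\log P(y=c \mid x) \;=\; \log P(y=c) \;+\; \sum_{j} \log P(x_j \mid y=c) \;+\; \text{const}
\]

`predict` returns the class maximizing this score; working in log space avoids the numerical underflow that comes from multiplying many small probabilities.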

Solution

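For reference, a minimal sketch of one possible answer: the Multinomial variant for count features, assuming NumPy and add-one (Laplace) smoothing. The class name `MultinomialNB` and the `alpha` parameter are illustrative choices, not specified by the question.

```python
import numpy as np

class MultinomialNB:
    """Multinomial Naive Bayes for count features, with Laplace smoothing."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha  # smoothing constant to avoid zero likelihoods

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        n_classes, n_features = len(self.classes_), X.shape[1]
        self.log_prior_ = np.empty(n_classes)
        self.log_likelihood_ = np.empty((n_classes, n_features))
        for i, c in enumerate(self.classes_):
            Xc = X[y == c]
            # prior P(y = c): fraction of training rows belonging to class c
            self.log_prior_[i] = np.log(len(Xc) / len(X))
            # likelihood P(x_j | y = c): smoothed per-feature count proportions
            counts = Xc.sum(axis=0) + self.alpha
            self.log_likelihood_[i] = np.log(counts / counts.sum())
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # unnormalized log-posterior per class:
        # log P(y=c) + sum_j x_j * log P(x_j | y=c)
        scores = X @ self.log_likelihood_.T + self.log_prior_
        return self.classes_[np.argmax(scores, axis=1)]
```

As a quick smoke test, fitting on four toy count vectors (two per class, each class dominated by a different feature) and predicting new vectors with the same dominant features recovers the expected labels. Bernoulli and Gaussian variants follow the same `fit`/`predict` skeleton, differing only in how the per-class likelihood parameters are estimated and scored.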

