PracHub

Implement common neural network layers in PyTorch

Last updated: Mar 29, 2026

Quick Overview

This question evaluates proficiency in implementing core neural-network primitives (ReLU, numerically-stable Softmax, LayerNorm, BatchNorm with running statistics, and RMSNorm), numerical stability and normalization concepts, and understanding of autograd-compatible tensor operations.

  • medium
  • Startups.Com
  • Coding & Algorithms
  • Machine Learning Engineer

Implement common neural network layers in PyTorch

Company: Startups.Com

Role: Machine Learning Engineer

Category: Coding & Algorithms

Difficulty: medium

Interview Round: Onsite



Startups.Com, posted Mar 10, 2026

Implement the following neural-network building blocks **from scratch in PyTorch**, using only basic tensor ops such as `matmul`, `sum`, `mean`, `var`, `exp`, `max`, `reshape`, `transpose`, and broadcasting. Do **not** call the `torch.nn.*` or `torch.nn.functional.*` implementations of the same ops.

  1. ReLU
  • Input: tensor `x` of any shape.
  • Output: `y = max(x, 0)`.
  2. Softmax (numerically stable)
  • Input: tensor `x` and an integer `dim`.
  • Output: softmax along `dim`.
  • Must be stable for large/small values.
  3. LayerNorm
  • Input: `x` with shape `[*, normalized_dim]` (or, more generally, normalize over the last `k` dims), learnable `gamma`, `beta`, and `eps`.
  • Output: normalized tensor.
  4. BatchNorm (training and inference)
  • Input: `x` (e.g., `[N, C, H, W]`), learnable `gamma`, `beta`, `eps`, `momentum`, plus `running_mean` and `running_var`.
  • Implement:
    • Training forward: compute batch stats and update running stats.
    • Inference forward: use running stats.
  5. RMSNorm
  • Input: `x`, learnable `gamma`, `eps`.
  • Normalize by RMS (no mean subtraction).
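One way the five components can be sketched is shown below. This is a minimal illustration using only basic tensor ops; the function signatures and shape conventions are my own choices, not part of the question. Note that PyTorch's BatchNorm stores the *unbiased* variance in `running_var` while normalizing with the *biased* batch variance, which the sketch reproduces.

```python
import torch

def relu(x):
    # elementwise max(x, 0); torch.where keeps autograd support
    return torch.where(x > 0, x, torch.zeros_like(x))

def softmax(x, dim):
    # subtract the per-slice max so exp() never overflows;
    # this shifts every logit but leaves the softmax unchanged
    x = x - x.max(dim=dim, keepdim=True).values
    e = x.exp()
    return e / e.sum(dim=dim, keepdim=True)

def layer_norm(x, gamma, beta, eps=1e-5, k=1):
    # normalize over the last k dims, per sample
    dims = tuple(range(x.dim() - k, x.dim()))
    mu = x.mean(dim=dims, keepdim=True)
    var = x.var(dim=dims, keepdim=True, unbiased=False)
    return gamma * (x - mu) / torch.sqrt(var + eps) + beta

def batch_norm(x, gamma, beta, running_mean, running_var,
               training, momentum=0.1, eps=1e-5):
    # x: [N, C, H, W]; stats are per-channel, reduced over N, H, W
    if training:
        dims = (0, 2, 3)
        mean = x.mean(dim=dims)
        var = x.var(dim=dims, unbiased=False)   # biased var for normalizing
        with torch.no_grad():
            n = x.numel() / x.size(1)           # samples per channel
            running_mean.mul_(1 - momentum).add_(momentum * mean)
            # running_var holds the unbiased estimate (Bessel-corrected)
            running_var.mul_(1 - momentum).add_(momentum * var * n / (n - 1))
    else:
        mean, var = running_mean, running_var
    shape = (1, -1, 1, 1)                       # broadcast per-channel params
    return (gamma.view(shape) * (x - mean.view(shape))
            / torch.sqrt(var.view(shape) + eps) + beta.view(shape))

def rms_norm(x, gamma, eps=1e-6):
    # scale by the root-mean-square of the last dim; no mean subtraction
    rms = torch.sqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return gamma * x / rms
```

Because every step is an ordinary differentiable tensor op, autograd flows through all five functions with no custom backward pass.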

For each component, describe how you would verify correctness (e.g., compare to a reference implementation, run gradient checks / finite-difference checks, test edge cases such as fp16, tiny variance, and different tensor shapes).

© 2026 PracHub. All rights reserved.