Sequence Modeling Architectures and MPC (Technical Screen)
You worked on a sequence-modeling project involving multivariate time-series signals and multi-step prediction/control. Address the following:
- Architecture usage
  - Explain how you used RNNs, LSTMs, and Transformers in your project (what the task was, why each model was chosen, and what changed across iterations).
- Capability and training comparison
  - Compare RNNs, LSTMs, and Transformers for sequence modeling: handling long-range dependencies, data/computation needs, latency, and robustness (a minimal attention-vs-recurrence sketch follows this list).
  - Discuss training considerations: optimization, stability, batching/streaming, masking, teacher forcing vs. autoregression, and regularization (see the teacher-forcing sketch after this list).
- Transformer design choice
  - Specify which Transformer architecture you used (encoder-only, decoder-only, or encoder–decoder) and why it fit your task.
- MPC fundamentals
  - Briefly explain Model Predictive Control (MPC): cost function, constraints, prediction/control horizons, and the receding horizon principle (a standard formulation follows this list).
- Combining MPC with learning
  - Discuss how MPC can be integrated with learning-based models (e.g., learned dynamics, uncertainty handling, imitation/distillation, safety); see the learned-dynamics planning sketch after this list.
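
For the capability comparison, the sketch below is one illustrative reference point (not part of the original screen): a GRU processes the sequence in T strictly sequential steps, so long-range information must survive many gated updates, while single-head causal self-attention (shown here without learned projections, with assumed shapes) connects any two positions in one parallel pass at O(T²) score cost. All names and dimensions are assumptions.

```python
import torch
import torch.nn.functional as F

T, D = 128, 64                      # sequence length, model width (assumed)
x = torch.randn(1, T, D)            # one multivariate time series: (batch, time, features)

# RNN view: O(T) *sequential* steps; information from step 0 must survive
# T-1 gated updates to influence step T-1, so long-range paths are indirect.
rnn = torch.nn.GRU(D, D, batch_first=True)
h_rnn, _ = rnn(x)

# Causal self-attention view: every step attends to all earlier steps in one
# parallel pass; the path between any two positions has length O(1), at the
# cost of O(T^2) score computation and memory.
q = k = v = x                                   # single head, no projections, for brevity
scores = q @ k.transpose(-2, -1) / D ** 0.5     # (1, T, T) pairwise scores
causal = torch.ones(T, T).tril().bool()         # mask out attention to future steps
scores = scores.masked_fill(~causal, float("-inf"))
h_attn = F.softmax(scores, dim=-1) @ v          # (1, T, D)
```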
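For the training-considerations item, here is a minimal sketch of teacher forcing versus autoregressive rollout for multi-step prediction; the one-step MLP predictor, shapes, and function names are hypothetical stand-ins, not the project's actual model.

```python
import torch
import torch.nn.functional as F

# Hypothetical one-step predictor: maps the last observed step to the next one.
model = torch.nn.Sequential(torch.nn.Linear(8, 64), torch.nn.ReLU(), torch.nn.Linear(64, 8))

def teacher_forced_loss(y):
    """Training: condition every step on the *ground-truth* previous step.
    All T-1 predictions are computed in one parallel pass, which is fast and
    stable but mismatched with how the model is used at test time."""
    pred = model(y[:, :-1])                    # inputs y_0 .. y_{T-2}
    return F.mse_loss(pred, y[:, 1:])          # targets y_1 .. y_{T-1}

@torch.no_grad()
def autoregressive_rollout(y0, steps):
    """Inference: feed each prediction back in as the next input, so errors
    can compound over the horizon (exposure bias)."""
    preds, y = [], y0
    for _ in range(steps):
        y = model(y)
        preds.append(y)
    return torch.stack(preds, dim=1)           # (batch, steps, features)

y = torch.randn(4, 32, 8)                      # (batch, time, features)
loss = teacher_forced_loss(y)
future = autoregressive_rollout(y[:, -1], steps=10)
```

The train/test mismatch this exposes is one standard motivation for scheduled sampling or rollout-based losses.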
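For the MPC fundamentals item, a standard finite-horizon formulation (a common textbook form, not taken from the original screen) over prediction horizon N, with stage cost ℓ, terminal cost V_f, dynamics f, and constraint sets 𝒳, 𝒰:

```latex
\begin{aligned}
\min_{u_0,\dots,u_{N-1}} \quad & \sum_{k=0}^{N-1} \ell(x_k, u_k) \;+\; V_f(x_N)
  && \text{(stage costs + terminal cost over the prediction horizon)} \\
\text{s.t.} \quad & x_{k+1} = f(x_k, u_k), \quad x_0 = x(t)
  && \text{(model rolls the state forward from the current measurement)} \\
& x_k \in \mathcal{X}, \quad u_k \in \mathcal{U}
  && \text{(state and input constraints)}
\end{aligned}
```

The receding horizon principle: only the first optimized input u_0* is applied; at the next sample the horizon shifts forward and the problem is re-solved from the newly measured state.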
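For the final item, one common integration pattern is sampling-based MPC over a learned dynamics model (random-shooting planning). The sketch below is a minimal illustration under assumed shapes; `learned_dynamics` is a placeholder for a trained network (in practice often an ensemble, whose disagreement provides an uncertainty signal), and the quadratic `stage_cost` is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def learned_dynamics(x, u):
    """Placeholder for a learned one-step model x_{t+1} = f_theta(x_t, u_t)."""
    return x + 0.1 * u                          # stand-in linear dynamics

def stage_cost(x, u):
    """Assumed quadratic cost: drive the state to zero with small inputs."""
    return np.sum(x ** 2, axis=-1) + 0.01 * np.sum(u ** 2, axis=-1)

def random_shooting_mpc(x0, horizon=15, n_samples=512, u_dim=2, u_max=1.0):
    """Sample candidate action sequences, roll each out through the learned
    model, score the summed cost, and return only the FIRST action of the
    best sequence (receding horizon)."""
    u_seqs = rng.uniform(-u_max, u_max, size=(n_samples, horizon, u_dim))
    x = np.broadcast_to(x0, (n_samples, x0.shape[-1])).copy()
    cost = np.zeros(n_samples)
    for t in range(horizon):
        cost += stage_cost(x, u_seqs[:, t])
        x = learned_dynamics(x, u_seqs[:, t])
    return u_seqs[np.argmin(cost), 0]           # apply u_0*, re-plan next step

x = np.array([1.0, -0.5])
for _ in range(5):                              # closed-loop receding-horizon execution
    u = random_shooting_mpc(x)
    x = learned_dynamics(x, u)                  # environment step (here: same model)
```

Constraints can be enforced by clipping or rejecting sampled actions, and imitation/distillation amounts to training a fast policy on the planner's state-action pairs.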