{"blocks": [{"key": "0264c1b0", "text": "Question", "type": "header-two", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab285", "text": "Explain the interpretation of eigenvectors in PCA.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab286", "text": "Describe how Gini impurity is used in decision trees.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab287", "text": "Write the Bellman equation and explain its role in reinforcement learning.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab288", "text": "Explain dropout as a regularization technique in neural networks.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab289", "text": "Describe gradient clipping, the vanishing-gradient problem, and how ResNets help.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab28a", "text": "Why are many deep-learning loss surfaces non-convex?", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}, {"key": "3aaab28b", "text": "Explain attention mechanisms and scaling in Transformers.", "type": "ordered-list-item", "depth": 0, "inlineStyleRanges": [], "entityRanges": [], "data": {}}], "entityMap": {}}