Explain Transformers, attention, decoding, RL, and evaluation