Machine Learning - Self-Attention & Transformer
Cramming ML: https://www.youtube.com/watch?v=Ye018rCVvOo&list=PLJV_el3uVTsMhtt7_Y6sgTHGHp1Vb2P2J

Self-Attention

The inputs discussed so far have each been a single vector. In many cases, however, the model takes a set of vectors as input, also called a Vector Set or Sequence (a minimal sketch follows below). These cases can further be divided into three categories: each Vector
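As a minimal sketch of the "a set of vectors as input" idea, the example below maps each word of a toy sentence to a vector and stacks them into a sequence. The sentence, the embedding table, and the dimension are made-up illustrations, not taken from the lecture:

```python
import numpy as np

# Hypothetical toy example: a sentence represented as a Vector Set / Sequence.
# Each word is mapped to a made-up embedding vector of dimension d = 4.
embedding = {
    "this": np.array([0.1, 0.3, -0.2, 0.5]),
    "is":   np.array([0.0, 0.7,  0.1, -0.1]),
    "a":    np.array([0.4, -0.2, 0.3, 0.0]),
    "cat":  np.array([0.9, 0.1, -0.5, 0.2]),
}

sentence = ["this", "is", "a", "cat"]

# The model's input is no longer one vector but a set of vectors:
# a matrix of shape (sequence length, vector dimension).
X = np.stack([embedding[w] for w in sentence])
print(X.shape)  # (4, 4): 4 vectors, each of dimension 4
```

The key point is only the shape: instead of a single vector of length d, the model now receives a variable-length sequence of such vectors.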