The Illustrated Transformer
※ Source: https://jalammar.github.io/illustrated-transformer/

Attention is a concept that helped improve the performance of neural machine translation applications. The Transformer is a model that uses attention to boost the speed with which these models can be trained. The Transformer outperforms the Google Neural Machine Translation model in specific tasks. The biggest benefit, however, comes from how The Transformer lends itself to parallelization.