Research/NLP_Stanford
-
Self-Attention & Transformers (Research/NLP_Stanford, 2024. 6. 22. 19:08)
※ Writing while taking a course 「Stanford CS224N NLP with Deep Learning」 ※
Lecture 8 - Self-Attention & Transformers
https://www.youtube.com/watch?v=LWMzyfvuehA&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&index=9
https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1234/slides/cs224n-2023-lecture08-transformers.pdf
-
Neural Machine Translation (Research/NLP_Stanford, 2024. 6. 22. 12:50)
※ Writing while taking a course 「Stanford CS224N NLP with Deep Learning」 ※
Lecture 7 - Translation, Seq2Seq, Attention
https://www.youtube.com/watch?v=0LixFSa7yts&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&index=6&t=2101s
Sequence-to-sequence models are an example of conditional language models. Previously, the main thing we were doing was just to start at the beginning of the sentence and generate a..
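The excerpt above frames seq2seq as a conditional language model: each output token is predicted from the source sentence plus the prefix generated so far. A minimal sketch of that decoding loop, assuming a hypothetical `next_token_probs` callable standing in for the trained model (names and token ids here are illustrative, not from the post):

```python
import numpy as np

def greedy_decode(next_token_probs, src, bos=0, eos=1, max_len=10):
    """Greedy decoding for a conditional LM: every step conditions on the
    source `src` and on the target prefix generated so far.

    `next_token_probs(src, prefix)` is an assumed interface that returns a
    probability vector over the target vocabulary.
    """
    y = [bos]
    for _ in range(max_len):
        probs = next_token_probs(src, y)  # P(y_t | y_<t, x)
        nxt = int(np.argmax(probs))       # greedy: take the argmax token
        y.append(nxt)
        if nxt == eos:
            break
    return y
```

The loop makes the factorization P(y | x) = ∏_t P(y_t | y_<t, x) concrete: the conditioning on `src` is what distinguishes this from an unconditional language model that only sees the prefix.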
-
Sequence-to-Sequence Model (Research/NLP_Stanford, 2024. 6. 22. 12:26)
※ Writing while taking a course 「Stanford CS224N NLP with Deep Learning」 ※
Lecture 7 - Translation, Seq2Seq, Attention
https://www.youtube.com/watch?v=0LixFSa7yts&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&index=6&t=2101s
Neural machine translation means you're using a neural network to do machine translation. But in practice it has meant slightly more than that: it has meant that we're going to buil..
-
Bidirectional and Multi-layer RNNs (Research/NLP_Stanford, 2024. 6. 22. 10:16)
※ Writing while taking a course 「Stanford CS224N NLP with Deep Learning」 ※
Lecture 6 - Simple and LSTM RNNs
https://www.youtube.com/watch?v=0LixFSa7yts&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&index=6&t=2101s
We can regard the hidden states as a representation of a word in context. Below, we just have a word vector for "terribly", but we then looked at our context and we've created a hid..
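The excerpt's point is that a hidden state is a word representation in context, and a bidirectional RNN gives each position context from both sides. A minimal sketch under assumed interfaces (`step_fwd`, `step_bwd` are hypothetical one-step RNN functions, not from the post):

```python
import numpy as np

def bidirectional_states(xs, step_fwd, step_bwd, h0):
    """Run a forward and a backward RNN over the sequence `xs`, then
    concatenate the two hidden states at each position, so every word's
    representation sees its left and right context."""
    # Left-to-right pass.
    fwd, h = [], h0
    for x in xs:
        h = step_fwd(h, x)
        fwd.append(h)
    # Right-to-left pass, then flip back to sequence order.
    bwd, h = [], h0
    for x in reversed(xs):
        h = step_bwd(h, x)
        bwd.append(h)
    bwd.reverse()
    # Per-position concatenation: [forward state; backward state].
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
```

Note the forward state at position t has seen x_1..x_t while the backward state has seen x_t..x_T, so the concatenation covers the whole sentence.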
-
Secret of LSTM (Research/NLP_Stanford, 2024. 6. 22. 09:22)
※ Writing while taking a course 「Stanford CS224N NLP with Deep Learning」 ※
Lecture 6 - Simple and LSTM RNNs
https://www.youtube.com/watch?v=0LixFSa7yts&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&index=6&t=2101s
To understand why something different is happening here, the thing to notice is that the cell state from t-1 passes right through to become the cell state at time t, without very ..
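The excerpt's observation, that the cell state from t-1 flows almost unchanged into the cell state at time t, can be sketched with the standard LSTM cell equations. Parameter names (`W_f`, `b_f`, etc.) and the concatenated-input layout are conventional assumptions, not taken from the post:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(c_prev, h_prev, x, params):
    """One LSTM step using the standard gate equations."""
    W_f, b_f, W_i, b_i, W_o, b_o, W_c, b_c = params
    z = np.concatenate([h_prev, x])
    f = sigmoid(W_f @ z + b_f)        # forget gate
    i = sigmoid(W_i @ z + b_i)        # input gate
    o = sigmoid(W_o @ z + b_o)        # output gate
    c_tilde = np.tanh(W_c @ z + b_c)  # candidate cell state
    # The key line: c_prev passes through additively, only scaled
    # element-wise by f, so when f is near 1 the cell state (and its
    # gradient) flows through nearly untouched.
    c = f * c_prev + i * c_tilde
    h = o * np.tanh(c)
    return c, h
```

With the forget gate saturated near 1 and the input gate near 0, `c` is essentially `c_prev`, which is the pass-through behavior the excerpt describes.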