notes/science/nlp/seq2seq.txt
Ihar Hancharenka 5dff80e88e first
2023-03-27 16:52:17 +03:00

2021
Astahov - Seq2Seq Webinar (in Russian)
https://www.youtube.com/watch?v=d8A1nxoZDDk
https://drive.google.com/file/d/1OPeH7gNKrvonDueh2Lay6dpDvRcnvQ9_/view
2020
https://www.geeksforgeeks.org/understanding-of-openseq2seq/
https://nvidia.github.io/OpenSeq2Seq/html/machine-translation.html
https://nvidia.github.io/OpenSeq2Seq/html/speech-recognition.html
2019
https://habr.com/ru/post/440472/
2018
https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
https://habr.com/ru/post/486158/
https://habr.com/company/otus/blog/432302/
https://habr.com/company/otus/blog/430780/
https://medium.com/analytics-vidhya/https-medium-com-tomkenter-why-care-about-byte-level-seq2seq-models-in-nlp-26bcf05dd7d3
Sequence to Sequence models: Attention Models
https://www.youtube.com/watch?v=oiNFCbD_4Tk