https://lena-voita.github.io/nlp_course.html

Jain - 2022 - Introduction to Transformers for NLP
!!! concise

RazinkovEvgeniy - AI From Fundamentals to Transformers
https://www.youtube.com/playlist?list=PL6-BrcpR2C5Q1ivGTQcglILJG6odT2oCY

AI/Run Agents Design and Integration VCN
https://engage.cloud.microsoft/main/threads/eyJfdHlwZSI6IlRocmVhZCIsImlkIjoiMzQzMDU0NjE2MzYyMTg4OCJ9

https://medium.com/@shreya.rao/list/ae6c27de1640
https://towardsdatascience.com/author/shreya-rao/
https://medium.com/data-science-collective/deep-learning-illustrated-part-7-attention-accab5fa31dc
https://medium.com/data-science-collective/im-deep-learning-illustrated-part-6-seq2seq-793d67b63c79
https://towardsdatascience.com/deep-learning-illustrated-part-5-long-short-term-memory-lstm-d379fbbc9bc6/
https://towardsdatascience.com/deep-learning-illustrated-part-4-recurrent-neural-networks-d0121f27bc74/
https://towardsdatascience.com/deep-learning-illustrated-part-3-convolutional-neural-networks-96b900b0b9e0/
https://towardsdatascience.com/deep-learning-illustrated-part-2-how-does-a-neural-network-learn-481f70c1b474/
https://towardsdatascience.com/neural-networks-illustrated-part-1-how-does-a-neural-network-work-c3f92ce3b462/

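Rough note on the "how does a neural network learn" part above: one gradient-descent step for a single linear neuron with squared-error loss, in pure Python with toy numbers, just to pin down the update rule.

# One gradient step for y_hat = w*x + b with loss L = (y_hat - y)^2 (toy example).
x, y = 2.0, 7.0          # single training example
w, b = 0.5, 0.0          # initial parameters
lr = 0.1                 # learning rate

y_hat = w * x + b        # forward pass: prediction
loss = (y_hat - y) ** 2  # squared error

grad_y_hat = 2 * (y_hat - y)   # dL/dy_hat
grad_w = grad_y_hat * x        # chain rule: dL/dw
grad_b = grad_y_hat * 1.0      # dL/db

w -= lr * grad_w         # move parameters against the gradient
b -= lr * grad_b
print(loss, w, b)        # repeating the step shrinks the loss
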
Thompson - The ChatGPT Prompt Book
https://medium.com/@adria.cabello/the-evolution-of-language-models-a-journey-through-time-3179f72ae7eb
https://voicebot.ai/large-language-models-history-timeline/
https://aws.amazon.com/what-is/large-language-model/
https://explore.skillbuilder.aws/learn/public/learning_plan/view/2068/generative-ai-learning-plan-for-developers
GoogleCloudTech - Introduction to Generative AI (22:07)
https://www.youtube.com/watch?v=G2fqAlgmoPo
AndrejKarpathy - Intro to Large Language Models (59:47)
https://www.youtube.com/watch?v=zjkBMFhNj_g
https://www.cloudskillsboost.google/paths/118
https://www.linkedin.com/learning/what-is-generative-ai/generative-ai-is-a-tool-in-service-of-humanity
https://learn.microsoft.com/en-us/training/paths/introduction-generative-ai/
https://microsoft.github.io/generative-ai-for-beginners/#/

ai - word2vec, embeddings
https://jalammar.github.io/illustrated-word2vec/
https://habr.com/ru/companies/otus/articles/787116/
https://habr.com/ru/articles/778048/
https://habr.com/ru/companies/mws/articles/770202/
!!! need to read
https://neptune.ai/blog/exploratory-data-analysis-natural-language-processing-tools
https://neptune.ai/blog/wasserstein-distance-and-textual-similarity

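Quick sketch of the idea behind the word2vec / embeddings links above: words become dense vectors, and "similar" means high cosine similarity. The vectors below are made-up toy values, not trained ones.

import numpy as np

# Toy embedding table (hand-picked 4-d vectors, not a trained word2vec model).
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.8, 0.9, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.7]),
}

def cosine(a, b):
    # Cosine similarity: dot product of the normalized vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb["king"], emb["queen"]))  # high: related words
print(cosine(emb["king"], emb["apple"]))  # low: unrelated words
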
llm learning
https://mellain.github.io/
https://mellain.github.io/data/YNDX_Meetup_LLM_Train_Slides_2024.pdf
https://www.youtube.com/watch?v=T3dCGPaCu5w
http://www.machinelearning.ru/wiki/index.php?title=%D0%9C%D0%B0%D1%82%D0%B5%D0%BC%D0%B0%D1%82%D0%B8%D1%87%D0%B5%D1%81%D0%BA%D0%B8%D0%B5_%D0%BC%D0%B5%D1%82%D0%BE%D0%B4%D1%8B_%D0%B0%D0%BD%D0%B0%D0%BB%D0%B8%D0%B7%D0%B0_%D1%82%D0%B5%D0%BA%D1%81%D1%82%D0%BE%D0%B2_%28%D0%BA%D1%83%D1%80%D1%81_%D0%BB%D0%B5%D0%BA%D1%86%D0%B8%D0%B9%29_/_%D0%BE%D1%81%D0%B5%D0%BD%D1%8C_2020
https://github.com/MelLain/mipt-python
https://www.postman.com/postman/published-postman-templates/documentation/ae2ja6x/postman-echo

ai - transformers
https://habr.com/ru/users/Kouki_RUS/
https://jalammar.github.io/illustrated-transformer/
https://arxiv.org/abs/1706.03762
https://habr.com/ru/companies/ruvds/articles/725618/
https://habr.com/ru/companies/ruvds/articles/723538/
http://jalammar.github.io/illustrated-transformer/
https://habr.com/ru/articles/486358/
https://peterbloem.nl/blog/transformers
https://habr.com/ru/companies/wunderfund/articles/592231/
http://nlp.seas.harvard.edu/2018/04/03/attention.html
Warmerdam - Why Transformers Work (44:06)
https://www.youtube.com/watch?v=QHkpGtDySqM
https://www.youtube.com/watch?v=cFUSjztXbL8
https://www.youtube.com/playlist?list=PLIXJ-Sacf8u60G1TwcznBmK6rEL3gmZmV
https://huggingface.co/learn/nlp-course/chapter1/1
Stanford
https://www.youtube.com/playlist?list=PLXmxd248wv0RC-YsuPDcgkcTgLyaxUOMW
https://www.youtube.com/watch?v=G9qbEp8b-gY
https://www.youtube.com/watch?v=d90rvZQo6ZA
https://www.youtube.com/watch?v=kTuZIF0Psnc

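Minimal numpy sketch of the scaled dot-product attention from the "Attention Is All You Need" paper linked above: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. Shapes and values are toy, single-head, no masking.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (n_q, n_k): query-key similarity, scaled
    weights = softmax(scores, axis=-1)        # each query gets a distribution over keys
    return weights @ V                        # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query positions, d_k = 4
K = rng.normal(size=(5, 4))   # 5 key positions
V = rng.normal(size=(5, 4))   # one value vector per key
print(attention(Q, K, V).shape)   # (3, 4)
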
transformers/rag
llamaindex

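The retrieval-augmented generation loop behind the llamaindex note, as a library-free sketch (embed() is a placeholder, not a real llamaindex API): embed the documents once, retrieve the top-k closest to the question, and prepend them to the prompt before calling the LLM.

import numpy as np

docs = [
    "Transformers use self-attention over token embeddings.",
    "RAG retrieves documents and feeds them to the model as context.",
    "word2vec learns word vectors from co-occurrence statistics.",
]

def embed(text):
    # Placeholder embedding: bag-of-characters histogram, unit-normalized.
    # A real pipeline would call an embedding model here.
    v = np.zeros(256)
    for ch in text.lower():
        v[ord(ch) % 256] += 1.0
    return v / (np.linalg.norm(v) + 1e-9)

doc_vecs = np.stack([embed(d) for d in docs])        # index is built once

def retrieve(question, k=2):
    scores = doc_vecs @ embed(question)              # cosine, since vectors are unit-norm
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

question = "What does RAG do?"
context = "\n".join(retrieve(question))
prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
print(prompt)   # this prompt would then be sent to the LLM
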
llm
SashaRush - Large Language Models in Five Formulas (58:01)
https://www.youtube.com/watch?v=KCXDr-UOb9A
https://link.excalidraw.com/p/readonly/aBWlNjEckdUlrszwwo6V
https://github.com/srush/LLM-Talk/tree/main
https://github.com/srush/LLM-Talk/blob/main/Tutorial.pdf
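
Side note on LLM evaluation, the kind of formula the "Five Formulas" talk above works through: perplexity is exp of the average negative log-probability per token. Toy probabilities below.

import math

# Model-assigned probabilities for each token of a held-out sequence (toy numbers).
token_probs = [0.25, 0.10, 0.60, 0.05]

nll = -sum(math.log(p) for p in token_probs) / len(token_probs)  # average negative log-likelihood
perplexity = math.exp(nll)
print(perplexity)   # 1.0 would be a perfect model; higher means more "surprised"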