Stanford CS224N: NLP with Deep Learning Winter 2021 Lecture 1 – Intro & Word Vectors

To learn more about Stanford's professional and graduate programs in artificial intelligence, visit: https://stanford.io/3w46jar

This lecture is about:
1. The course (10 min)
2. Human language and word meaning (15 min)
3. Introduction to the word2vec algorithm (15 min)
4. Word2vec objective function gradients (25 min)
5. Basics of optimization (5 min)
6. Looking at word vectors (10 min or less)
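As background for items 3 and 4, the skip-gram word2vec objective discussed in the lecture can be written in its standard form (notation follows the usual convention of a "center" vector $v_c$ and an "outside" vector $u_o$ per word):

```latex
J(\theta) = -\frac{1}{T} \sum_{t=1}^{T} \;
\sum_{\substack{-m \le j \le m \\ j \ne 0}}
\log P(w_{t+j} \mid w_t),
\qquad
P(o \mid c) = \frac{\exp\!\left(u_o^{\top} v_c\right)}
                   {\sum_{w \in V} \exp\!\left(u_w^{\top} v_c\right)}
```

Differentiating the log-probability with respect to $v_c$ (the chain-rule step near the end of the lecture) gives $\frac{\partial}{\partial v_c} \log P(o \mid c) = u_o - \sum_{x \in V} P(x \mid c)\, u_x$, i.e. the observed outside vector minus the model's expected outside vector.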

Key learning point: The (really surprising!) result that the meaning of words can be represented quite well by a large vector of real numbers.
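To make this concrete, here is a minimal sketch of how similarity between word vectors is typically measured (cosine similarity). The 4-dimensional vectors below are made-up toy values purely for illustration; real word2vec vectors are learned from text and usually have 100–300 dimensions.

```python
import math

# Hypothetical toy "word vectors" (illustrative values only, not learned).
vectors = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.1, 0.8, 0.2],
    "apple": [0.1, 0.2, 0.1, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(vectors["king"], vectors["queen"]))
print(cosine_similarity(vectors["king"], vectors["apple"]))
```

With well-trained vectors, this simple geometric measure is what lets nearest-neighbor lookups recover meaning, the "really surprising" result the lecture highlights.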

This course teaches:
1. The foundations of effective modern deep learning methods applied to NLP: first the basics, then the main methods used in NLP (recurrent networks, attention, transformers, etc.)
2. A general understanding of human languages and the difficulties in understanding and producing them
3. An understanding of, and the ability to build, systems (in PyTorch) for some of the key problems in NLP: word meaning, dependency parsing, machine translation, and question answering

For more information about this course, visit: https://online.stanford.edu/courses/cs224n-natural-lingual-processing-deep-learning
To follow the course schedule and syllabus, visit: http://web.stanford.edu/class/cs224n/

Professor Christopher Manning
Thomas M. Siebel Professor of Machine Learning, Professor of Linguistics and Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

0:00 Introduction
1:43 Goals
3:10 Human language
10:07 Google Translate
10:43 GPT
14:13 Meaning
16:19 Wordnet
19:11 Word relationships
20:27 Distributional semantics
23:33 Word embeddings
27:31 Word2vec
37:55 How to minimize loss
39:55 Interactive whiteboard
41:10 Progress
48:50 Chain rule

If you find this video helpful, please share it with your friends and family.