In the autumn of 2019, the participants of the Natural Language Processing SIG are following the course CS224n: Natural Language Processing with Deep Learning from Stanford University. This is an overview of the lessons that we have viewed so far.
We meet on Thursdays, 14:00-15:30, in room Compass (or Collab).
Links to:
- course video (Bias in AI)
- Join meeting
Planned slide selection:
- 00:00-56:03 Bias in AI
Links to:
- course video (Constituency Parsing, TreeRNNs)
- Join meeting
Planned slide selection:
- 02:37-1:15:40 Constituency Parsing, TreeRNNs
Links to:
- course video (Multitask Learning)
- Join meeting
Planned slide selection:
- 00:00-1:11:53 Multitask Learning
Link to: course video (Coreference resolution)
Planned slide selection:
- 01:40-1:19:20 Coreference resolution
Skipped: 00:00-01:40 course announcements
Link to: course video (Natural Language Generation)
Planned slide selection:
- 01:09-1:19:36 Natural Language Generation
Skipped: 00:00-01:09 course announcements
Link to: course video (Transformers and Self-Attention)
Planned slide selection:
- 00:00-53:47 Transformers and Self-Attention
Paper tip: oLMpics - On what Language Model Pre-training Captures
Link to: course video (Contextual Word Representations)
Planned slide selection:
- 05:00-1:20:18 Contextual Word Representations and Pre-training: ELMo and BERT
Skipped: 00:00-05:00 course announcements
Recommended notebook related to the paper Attention Is All You Need
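The notebook itself is linked above; as a quick reminder of the core operation from that paper, here is a minimal NumPy sketch of scaled dot-product attention (the shapes and random inputs are illustrative assumptions, not taken from the notebook):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in the paper."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ V                             # weighted sum of the values

# Toy example: 3 query positions attending over 4 key/value positions, d_k = 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (3, 8)
```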
Link to: course video (Subword Models)
Planned slide selection:
- 03:52-1:15:29 Subword models
Skipped: 00:00-03:52 course announcements
Link to: course video (Convolutional Networks for NLP)
Planned slide selection:
- 07:56-1:20:18 Convolutional NN
Skipped: 00:00-07:56 remarks that good practice for deep NNs is still shifting & a PyTorch NLP book tip
Upcoming:
- Subword Models
- Contextual Word Embeddings
- Transformers and Self-Attention
- Natural Language Generation
- Coreference Resolution
- Multitask Learning
- Constituency Parsing, TreeRNNs
- Bias in AI
- Future of NLP + Deep Learning
Link to: course video (Question Answering)
Planned slide selection:
- 06:19-1:21:00 Question Answering with NN
Skipped: 00:00-06:19 course announcements
Link to: course video (Practical Tips for Projects)
Planned slide selection:
- 36:36-59:10: Gated Recurrent Units, Vocabulary sizes
- 59:10-1:07:40: Evaluation of Machine Translation (optional)
- 1:11:44-1:18:08: Data set splits (optional)
Skipped: 00:00-36:36, 1:07:40-1:11:44, 1:18:08-END: course announcements
Link to: course video (Translation, Seq2Seq, Attention)
Planned slide selection:
- 01:06-END: Statistical Machine Translation and Neural Machine Translation
Skipped: 00:00-01:06 course announcements
Link to: course video (Vanishing Gradients, Fancy RNNs)
Planned slide selection:
- 03:06-END: RNN problems & solutions
Skipped: 00:00-03:06: course announcements
Link to: course video (Language Models and RNNs)
Planned slide selection:
- 15:07-END: Language models & recurrent neural networks
Skipped: 00:00-15:07: language models, n-gram models, and smoothing
Link to: course video (Dependency parsing)
Planned slide selection:
- 50:54-END: Dependency parsing & Neural parsing
Skipped: 00:00-50:54: Introduction to NLP
Link to: course video (Backpropagation)
Planned slide selection:
- 00:00-END: Backpropagation & Tips
Also:
- Gensim word vector demo: [code] [data] (see the sketch after this list)
- Andrej Karpathy's blog post Yes you should understand backprop
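The exact demo code and data sit behind the links above; as a rough stand-in, here is a minimal sketch of what a Gensim word vector demo typically looks like, assuming the pretrained glove-wiki-gigaword-50 vectors from Gensim's built-in downloader rather than the data file linked above:

```python
import gensim.downloader as api

# Load small pretrained GloVe vectors (downloaded on first use, ~66 MB).
wv = api.load("glove-wiki-gigaword-50")

# Nearest neighbours in the embedding space.
print(wv.most_similar("bread", topn=5))

# The classic analogy: king - man + woman ~ queen.
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```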
Link to: course video (Neural networks)
Planned slide selection:
- 00:00-02:02: Course plan
- 04:44-END: Feed forward networks and named entity recognition
Link to: course video (Word Vectors and Word Senses)
Planned slide selection:
- 00:00-END: Word vectors and word senses
Link to: course video (Introduction and Word Vectors)
Planned slide selection:
- 03:52: Course goals
- 09:00: Assignments
- 10:00-44:00: Introduction
- 1:14:08-END: Assignment 1 (download from course page)
Skipped: 00:00-10:00 and 44:00-1:14:08.