
CS-552: Modern Natural Language Processing

Course Description

Natural language processing is ubiquitous in modern intelligent technologies, serving as a foundation for language translators, virtual assistants, search engines, and many other applications. In this course, we cover the foundations of modern methods for natural language processing, such as word embeddings, recurrent neural networks, transformers, and pretraining, and how they can be applied to important tasks in the field, such as machine translation and text classification. We also cover issues with these state-of-the-art approaches (such as robustness, interpretability, and sensitivity), identify their failure modes in different NLP applications, and discuss analysis and mitigation techniques for these issues.

Class

Lectures: Wednesdays 11:15-13:00 [STCC - Cloud C] & Thursdays 13:15-14:00 [CE16]
Exercise Sessions: Thursdays 14:15-16:00 [CE11]
Project Assistance (not every week): Wednesdays 13:15-14:00 [STCC - Cloud C]
QA Forum & Announcements: Ed Forum [link]
Grades: Moodle [link]

All lectures will be given in person and live streamed on Zoom. The Zoom link is available on the Ed Forum (pinned post). Note that, in the event of a technical failure during the lecture, the live Zoom stream may not be able to continue.

Recordings of the lectures will be made available on SwitchTube. We will reuse some of last year's recordings, and we may record a few new lectures where the content has changed.

Lecture Schedule

Week 1 (Instructor: Antoine Bosselut)
  21 Feb: Introduction | Building a simple neural classifier [slides]
  22 Feb: Neural LMs: word embeddings [slides]

Week 2 (Instructor: Antoine Bosselut)
  28 Feb: LM basics | Neural LMs: Fixed Context Models [slides]
  29 Feb: Neural LMs: RNNs, Backpropagation, Vanishing Gradients; LSTMs [slides]

Week 3 (Instructor: Antoine Bosselut)
  6 Mar: Seq2seq + decoding + attention | Transformers [slides]
  7 Mar: Transformers + Greedy Decoding; GPT [slides]

Week 4 (Instructors: Antoine Bosselut, Simin Fan)
  13 Mar: [Online only] Pretraining: ELMo, BERT, MLM, task generality | Transfer Learning: Introduction [slides]
  14 Mar: Assignment 1 Q&A

Week 5 (Instructor: Antoine Bosselut)
  20 Mar: Transfer Learning: Dataset Biases [slides]
  21 Mar: Generation: Task [slides]

Week 6 (Instructor: Antoine Bosselut)
  27 Mar: Generation: Decoding and Training [slides]
  28 Mar: Generation: Evaluation [slides]

EASTER BREAK

Week 7 (Instructor: Antoine Bosselut)
  10 Apr: In-context Learning - GPT-3 + Prompts | Instruction Tuning [slides]
  11 Apr: Project Description

Week 8 (Instructor: Antoine Bosselut)
  17 Apr: [Online only] Scaling laws | Model Compression [slides]
  18 Apr: No class

Week 9 (Instructor: Anna Sotnikova)
  24 Apr: Ethics in NLP: Bias / Fairness and Toxicity, Privacy, Disinformation [slides]
  25 Apr: No class (Project work; A1 Grade Review Session)

Week 10 (Instructors: Negar Foroutan, Kayo Yin)
  1 May: Tokenization: BPE, WP, Char-based | Multilingual LMs
  2 May: Guest Lecture: Kayo Yin

Week 11 (Instructor: Gail Weiss)
  8 May: Syntactic and Semantic Tasks (NER) | Interpretability: BERTology
  9 May: No class (Project work; A2 Grade Review Session)

Week 12 (Instructor: Antoine Bosselut)
  15 May: Reading Comprehension | Retrieval-augmented LMs
  16 May: No class (Project work; A2 Grade Review Session)

Week 13 (Instructors: Syrielle Montariol, Antoine Bosselut)
  22 May: Multimodality: L & V
  23 May: Looking forward

Week 14
  29 May & 30 May: No class (Project work; A3 Grade Review Session)

Exercise Schedule

Week 1
  22 Feb: Setup + Word embeddings [code] (Mete Ismayilzada)

Week 2
  29 Feb: Word embeddings review; Language and Sequence-to-sequence models [code] (Mete Ismayilzada, Badr AlKhamissi)

Week 3
  6 Mar: Assignment 1 Q&A (Mete Ismayilzada)
  7 Mar: Language and Sequence-to-sequence models review; Attention + Transformers [code] (Badr AlKhamissi)

Week 4
  13 Mar: [Online only] Pretraining S2S: BART, T5 [slides] (Antoine Bosselut)
  14 Mar: Attention + Transformers review; Pretraining and Transfer Learning Pt. 1 [code] (Badr AlKhamissi, Simin Fan)

Week 5
  20 Mar: No lecture
  21 Mar: Pretraining and Transfer Learning Pt. 1 review; Transfer Learning Pt. 2 [code] (Simin Fan)

Week 6
  27 Mar: Assignment 2 Q&A (Simin Fan, Silin Gao)
  28 Mar: Transfer Learning Pt. 2 review; Text Generation & Assignment 2 Q&A [code] (Simin Fan, Deniz Bayazit, Silin Gao)

EASTER BREAK

Week 7
  10 Apr: Assignment 3 Q&A (Badr AlKhamissi, Deniz Bayazit)
  11 Apr: Text Generation review; In-context Learning [code] (Deniz Bayazit, Mete Ismayilzada)

Week 8
  17 Apr: No lecture
  18 Apr: Assignment 3 Q&A; A1 Grade Review Session (Badr AlKhamissi, Deniz Bayazit, Mete Ismayilzada)

Week 9
  24 & 25 Apr: Project (TA meetings on-demand)

Week 10
  1 & 2 May: Project (TA meetings on-demand)

Week 11
  8 & 9 May: Project; Milestone 1 Feedback (TA meetings on-demand)

Week 12
  15 & 16 May: Project (TA meetings on-demand)

Week 13
  22 May: A3 Grade Review Session (Badr AlKhamissi, Deniz Bayazit)
  23 May: Project (TA meetings on-demand)

Week 14
  30 May: Project; Milestone 2 Feedback (TA meetings on-demand)

Exercise Session format:

Note: Please make sure you have already completed the setup prerequisites to run the coding parts of the exercises. You can find the instructions here.

Grading:

Your grade in the course will be computed according to the following guidelines.

Submission Format

Assignment and project release announcements will be made on Ed. Your work will be submitted as a repository created by GitHub Classroom. Clicking the assignment link (announced on its release date) will automatically create a repository under your username (make sure it matches the one on the CS-552 GitHub registration form). Your last push to the repository will be considered your final submission, with its timestamp determining any late days (see the policy below).

All large files, such as model checkpoints, need to be pushed to the repository with Git LFS. Large files can take time to upload, so please avoid last-minute uploads that could delay your submission. We also suggest using Colab as a free GPU resource. You can find tutorials on all of these resources here.

Late Days Policy

All assignments and milestones are due at 23:59 on their due date. As we understand that circumstances can make it challenging to abide by these due dates, you will receive 7 late days over the course of the semester, to be allocated to the assignments and project milestones as you see fit. No further extensions will be granted. The only exception to this rule is the final report, code, and data, for which no extensions will be granted beyond June 14th. We calculate late days automatically from your last commit, so you do not need to inform us. For group projects, late days are deducted from each team member individually: a student with no late days remaining will lose points for the late submission, while teammates who still have late days will continue to use theirs and lose no points.
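To make the per-member deduction concrete, here is a toy Python sketch of the logic described above. It is illustrative only (the function and student names are hypothetical, not the course's grading script), and it assumes the number of late days for a submission has already been computed from the last-commit timestamp; how a student with some, but not enough, remaining late days is handled is a simplification on our part.

```python
# Toy sketch of the group late-day deduction described above.
# Illustrative only: this is NOT the official grading script.
def apply_late_days(remaining, days_late):
    """remaining: dict of student -> late days left; days_late: how late this submission is."""
    outcome = {}
    for student, budget in remaining.items():
        if budget >= days_late:
            # This student covers the delay with their own late days: no penalty.
            remaining[student] = budget - days_late
            outcome[student] = f"uses {days_late} late day(s), no penalty"
        else:
            # Not enough late days left: this student takes the late-submission penalty.
            outcome[student] = "late-submission penalty applies"
    return outcome

# Example: a milestone submitted 2 days late by a team of three.
print(apply_late_days({"student_a": 5, "student_b": 2, "student_c": 0}, days_late=2))
```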

Assignments (40%):

There will be three assignments throughout the course. They will be released and due according to the following schedule:

Assignment 1 (10%)

Assignment 2 (15%)

Assignment 3 (15%)

Project (60%):

The project will be divided into two milestones and a final submission. Each milestone is worth 15% of the final grade, with the remaining 30% allocated to the final report. Each team will be supervised by one of the course TAs or AEs.
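Putting the assignment and project weights together, the overall grade works out as follows (a quick sketch, assuming each component is graded on the same scale; A1-A3 denote the assignment scores, M1-M2 the milestone scores, and F the final report score):

$$\text{final grade} = 0.10\,A_1 + 0.15\,A_2 + 0.15\,A_3 + 0.15\,M_1 + 0.15\,M_2 + 0.30\,F$$

The first three terms make up the 40% assignment share and the last three the 60% project share.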

More details on the content of the project and the deliverables of each milestone will be released at a later date.

Milestone 1:

Milestone 2:

Final Deliverable:

Contacts

Please email us at nlp-cs552-spring2024-ta-team [at] groupes [dot] epfl [dot] ch for any administrative questions, rather than emailing TAs individually. All course content questions need to be asked via Ed.

Lecturer: Antoine Bosselut

Teaching assistants: Negar Foroutan Eghlidi, Badr AlKhamissi, Deniz Bayazit, Beatriz Borges, Zeming (Eric) Chen, Simin Fan, Silin Gao, Mete Ismayilzada