Natural language processing is ubiquitous in modern intelligent technologies, serving as a foundation for language translators, virtual assistants, search engines, and many other applications. In this course, we cover the foundations of modern methods for natural language processing, such as word embeddings, recurrent neural networks, transformers, and pretraining, and how they can be applied to important tasks in the field, such as machine translation and text classification. We also cover issues with these state-of-the-art approaches (such as robustness, interpretability, and sensitivity), identify their failure modes in different NLP applications, and discuss analysis and mitigation techniques for these issues.
Platform | Where & when |
---|---|
Lectures | Wednesdays: 11:15-13:00 [STCC - Cloud C] & Thursdays: 13:15-14:00 [CE16] |
Exercise Session | Thursdays: 14:15-16:00 [CE11] |
Project Assistance (not every week) | Wednesdays: 13:15-14:00 [STCC - Cloud C] |
Q&A Forum & Announcements | Ed Forum [link] |
Grades | Moodle [link] |
All lectures will be given in person and live streamed on Zoom. The link to the Zoom is available on the Ed Forum (pinned post). Please note that, in the event of a technical failure during the lecture, the live Zoom stream may not be able to continue.
Recordings of the lectures will be made available on SwitchTube. We will reuse some of last year's recordings, and we may record new ones for lectures whose content has changed.
Week | Date | Topic | Instructor |
---|---|---|---|
Week 1 | 22 Feb | Setup + Word embeddings [code] | Mete Ismayilzada |
Week 2 | 29 Feb | Word embeddings review; Language and Sequence-to-sequence models [code] | Mete Ismayilzada, Badr AlKhamissi |
Week 3 | 6 Mar | Assignment 1 Q&A | Mete Ismayilzada |
Week 3 | 7 Mar | Language and Sequence-to-sequence models review; Attention + Transformers [code] | Badr AlKhamissi |
Week 4 | 13 Mar | [Online only] Pretraining S2S: BART, T5 [[slides][4s]] | Antoine Bosselut |
Week 4 | 14 Mar | Attention + Transformers review; Pretraining and Transfer Learning Pt. 1 [code] | Badr AlKhamissi, Simin Fan |
Week 5 | 20 Mar | No lecture | - |
Week 5 | 21 Mar | Pretraining and Transfer Learning Pt. 1 review; Transfer Learning Pt. 2 [code] | Simin Fan |
Week 6 | 27 Mar | Assignment 2 Q&A | Simin Fan, Silin Gao |
Week 6 | 28 Mar | Transfer Learning Pt. 2 review; Text Generation & Assignment 2 Q&A [code] | Simin Fan, Deniz Bayazit, Silin Gao |
EASTER BREAK | | | |
Week 7 | 10 Apr | Assignment 3 Q&A | Badr AlKhamissi, Deniz Bayazit |
Week 7 | 11 Apr | Text Generation review; In-context Learning [code] | Deniz Bayazit, Mete Ismayilzada |
Week 8 | 17 Apr | No lecture | - |
Week 8 | 18 Apr | Assignment 3 Q&A; A1 Grade Review Session | Badr AlKhamissi, Deniz Bayazit, Mete Ismayilzada |
Week 9 | 24 & 25 Apr | Project | TA meetings on-demand |
Week 10 | 1 & 2 May | Project | TA meetings on-demand |
Week 11 | 8 & 9 May | Project Milestone 1 Feedback | TA meetings on-demand |
Week 12 | 15 & 16 May | Project | TA meetings on-demand |
Week 13 | 22 May | Project | Badr AlKhamissi, Deniz Bayazit |
Week 13 | 23 May | A3 Grade Review Session | TA meetings on-demand |
Week 14 | 30 May | Project Milestone 2 Feedback | TA meetings on-demand |
Note: Please make sure you have already completed the setup prerequisites needed to run the coding parts of the exercises. You can find the instructions here.
Your grade in the course will be computed according to the following guidelines.
Assignment and project release announcements will be made on Ed. Your work will be submitted as a repository created by GitHub Classroom. Clicking the assignment link (announced on its release date) will automatically create a repository under your username (make sure it matches the one on the CS-552 GitHub registration form). Your last push to the repository will be considered your final submission, with its timestamp determining any late days (see below for the policy).
All large files, such as model checkpoints, must be pushed to the repository with Git LFS. Large files can take time to upload, so please avoid last-minute uploads that could delay your submission. We also recommend Colab as a free GPU resource. You can find tutorials on all of these resources here.
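For reference, here is a minimal Git LFS sketch, assuming Git LFS is already installed on your machine; the `*.ckpt` pattern and `model.ckpt` filename are placeholders for whatever large files your submission actually contains:

```bash
# Enable the Git LFS hooks in your clone (one-time setup)
git lfs install

# Track large files by pattern; "*.ckpt" is only an example, adjust to your file format
git lfs track "*.ckpt"

# The tracking rules live in .gitattributes, which must be committed alongside the large files
git add .gitattributes model.ckpt
git commit -m "Add model checkpoint via Git LFS"
git push
```

Run `git lfs track` before committing the large files; files committed without LFS tracking are stored as regular Git objects and may exceed GitHub's file size limits.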
All assignments and milestones are due at 23:59 on their due date. As we understand that circumstances can make it challenging to meet these deadlines, you will receive 7 late days over the course of the semester, to be allocated to the assignments and project milestones as you see fit. No further extensions will be granted. The only exception to this rule concerns the final report, code, and data: no extensions will be granted beyond June 14th. We will automatically calculate late days based on your last commit, so you do not need to inform us. For group projects, as long as every member still has late days, we will deduct them individually from each member. If one member has no late days left, that student will lose points for the late submission; the other team members will continue to use their late days (i.e., no points will be deducted from them). Once you have used all your allotted late days, the penalty is a 25% deduction of the grade per late day.
There will be three assignments throughout the course. They will be released and due according to the following schedule:
The project will be divided into two milestones and a final submission. Each milestone will be worth 15% of the final grade, with the remaining 30% allocated to the final report. Each team will be supervised by one of the course TAs or AEs.
More details on the content of the project and the deliverables of each milestone will be released at a later date.
Please email us at nlp-cs552-spring2024-ta-team [at] groupes [dot] epfl [dot] ch for any administrative questions, rather than emailing TAs individually. All course content questions need to be asked via Ed.
Lecturer: Antoine Bosselut
Teaching assistants: Negar Foroutan Eghlidi, Badr AlKhamissi, Deniz Bayazit, Beatriz Borges, Zeming (Eric) Chen, Simin Fan, Silin Gao, Mete Ismayilzada