Week 9: Recap Week
Weekly Materials
- Professor's lecture slides (PDF)
- Applications in NLP (at the end of the lecture; self-study)
- TA practice slides: hands-on tutorials and practice exercises
- Discussion of the previous exercises (with the TA)
Overview
This week provides a comprehensive review of the most important concepts covered in the first 8 weeks of the course. It's designed to consolidate learning and prepare students for the advanced topics in the remaining weeks.
Learning Objectives
- Consolidate understanding of fundamental machine learning concepts
- Review supervised learning approaches: regression and classification
- Revisit optimization techniques and gradient descent methods
- Strengthen knowledge of deep learning fundamentals
- Practice with sequence modeling and neural networks
Topics Reviewed
- Supervised Learning - Regression: Linear and polynomial regression recap
- Supervised Learning - Classification: k-NN, Naive Bayes, Decision Trees
- Optimization Methods: Gradient Descent and Stochastic Gradient Descent (a short sketch follows this list)
- Deep Learning Foundations: Feed-forward networks and backpropagation
- Sequence Modeling: RNNs, LSTMs, and their applications
- Applications in NLP: Brief introduction (self-study)
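As a small refresher tying the regression and optimization items together, here is a minimal sketch of batch gradient descent applied to linear regression. The synthetic data, learning rate, and iteration count are arbitrary choices for illustration, not values taken from the course notebooks.

```python
# A minimal sketch: batch gradient descent for 1-D linear regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 + noise (slope and intercept chosen arbitrarily)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=100)

# Append a bias column so the intercept is learned as an ordinary weight.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])
w = np.zeros(Xb.shape[1])

lr = 0.1
for step in range(500):
    residuals = Xb @ w - y                   # predictions minus targets
    grad = 2.0 * Xb.T @ residuals / len(y)   # gradient of the mean squared error
    w -= lr * grad                           # gradient descent update

print("learned (slope, intercept):", w)
```

Stochastic gradient descent follows the same update rule, but estimates the gradient from a single example or a small minibatch at each step instead of the full dataset.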
Schedule
- Lecture: Monday, November 10, 2025 (10:15 - 12:00)
- Practice Session: Monday, November 10, 2025 (16:30 - 18:00)
- TA Session: Review and Q&A on previous exercises
Recap Materials
- Comprehensive notebooks covering all major topics
- Step-by-step implementation reviews
- Problem-solving strategies and best practices
- Common pitfalls and how to avoid them (see the example after this list)
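To make the "common pitfalls" item concrete, below is a minimal sketch of one frequently cited pitfall, data leakage from preprocessing. The specific pitfall, dataset, and model are illustrative assumptions rather than examples taken from the course notebooks.

```python
# A minimal sketch: avoid fitting preprocessing (e.g., a scaler) on the full dataset,
# which leaks test-set statistics into training. A pipeline fits the scaler on the
# training data only and reuses it on the held-out data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)                     # scaler is fit on X_train only
print("test accuracy:", model.score(X_test, y_test))
```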
Key Learning Reinforcements
- Model Selection: Choosing appropriate algorithms for different problems
- Evaluation Metrics: Understanding when to use different performance measures
- Overfitting Prevention: Regularization and validation strategies (see the sketch after this list)
- Practical Implementation: Framework usage and debugging techniques
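As a brief illustration of combining regularization with a validation strategy, the sketch below uses cross-validation to compare ridge penalty strengths. The dataset, candidate alpha values, and scoring metric are assumptions chosen for illustration, not prescriptions from the course.

```python
# A minimal sketch: 5-fold cross-validation to compare regularization strengths
# for ridge regression; higher alpha means stronger regularization.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

for alpha in [0.01, 0.1, 1.0, 10.0]:
    scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5, scoring="r2")
    print(f"alpha={alpha:<5} mean R^2 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```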
Assignment
There is no new assignment this week. Focus on reviewing and strengthening your understanding of the material covered so far, and use the time to catch up on any missed exercises.