WEEK 4: SUPERVISED LEARNING - REGRESSION

Supervised Learning, Linear Regression, Gradient Descent, and Exercise 2 distribution

October 06, 2025


Learning Objectives

  • Understand the fundamental concepts of supervised learning
  • Master linear and polynomial regression techniques
  • Learn optimization methods including gradient descent
  • Apply regression models to real-world datasets
  • Understand model complexity and overfitting

Topics Covered

  • Supervised Learning: The general framework and approach
  • Linear Regression: Single and multiple variables
  • Gradient Descent: Optimization algorithm for model training
  • Polynomial Regression: Non-linear relationships
  • Tuning Model Complexity: Bias-variance tradeoff
  • Practical Applications: Stock market prediction (if time permits)
  • Data Manipulation: Introduction to Pandas
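As a preview of the lecture material, linear regression trained with batch gradient descent can be sketched in a few lines of NumPy. The toy data, learning rate, and iteration count below are illustrative choices, not values from the course:

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.5, size=50)

# Model h(x) = w * x + b, initialized at zero
w, b = 0.0, 0.0
lr = 0.01          # learning rate
n = len(X)

for _ in range(2000):
    y_hat = w * X + b
    # Gradients of the mean squared error cost with respect to w and b
    dw = (2.0 / n) * np.sum((y_hat - y) * X)
    db = (2.0 / n) * np.sum(y_hat - y)
    w -= lr * dw
    b -= lr * db

print(w, b)  # should end up close to the true slope 2 and intercept 1
```

The same loop generalizes to multiple features by replacing the scalar `w` with a weight vector and the products with matrix operations.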

Schedule

  • Lecture: Monday, October 6, 2025 (10:15 - 12:00)
  • Practice Session: Monday, October 6, 2025 (16:30 - 18:00)
  • TA Session: Discussion of exercises and finger exercises

Key Concepts

  • Cost functions and loss minimization
  • Normal equation vs gradient descent
  • Feature scaling and normalization
  • Model evaluation metrics (MSE, MAE, R²)
  • Overfitting vs underfitting
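To illustrate the "normal equation vs gradient descent" comparison and the evaluation metrics listed above, the closed-form least-squares solution and MSE, MAE, and R² can all be computed directly with NumPy. The synthetic data here is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X[:, 0] - 2.0 + rng.normal(0, 1.0, size=100)

# Normal equation: theta = (A^T A)^{-1} A^T y, with a column of ones for the bias.
# Solving the linear system is preferred over explicitly inverting A^T A.
A = np.hstack([np.ones((100, 1)), X])
theta = np.linalg.solve(A.T @ A, A.T @ y)   # [intercept, slope]

# Evaluation metrics on the training data
y_hat = A @ theta
mse = np.mean((y - y_hat) ** 2)
mae = np.mean(np.abs(y - y_hat))
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

Unlike gradient descent, the normal equation needs no learning rate or iteration count, but it scales poorly when the number of features is very large.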

Practical Skills

  • Implementing regression from scratch
  • Using scikit-learn for regression tasks
  • Data preprocessing with Pandas
  • Model evaluation and validation
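A minimal sketch of the scikit-learn workflow, with Pandas handling the preprocessing step. The DataFrame columns and values are hypothetical, purely for illustration:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical housing-style dataset (column names and values are made up)
df = pd.DataFrame({
    "size_m2": [50, 70, 80, 100, 120, 60, 90, 110],
    "rooms":   [2, 3, 3, 4, 5, 2, 4, 4],
    "price":   [150, 210, 235, 300, 360, 175, 270, 330],
})

# Preprocessing with Pandas: select features and standardize them
X = df[["size_m2", "rooms"]]
X = (X - X.mean()) / X.std()
y = df["price"]

# Fit and evaluate with scikit-learn
model = LinearRegression().fit(X, y)
preds = model.predict(X)
print(f"R^2 on training data: {r2_score(y, preds):.3f}")
```

In practice the model would be evaluated on a held-out test set rather than the training data, which is part of the validation material covered in the practice session.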

Assignments

  • Exercise 2: Distributed this week
  • Complete finger exercises with TA guidance
  • Practice implementing gradient descent