WEEK 7: DEEP LEARNING BASICS

Deep learning basics, MLPs, feed-forward networks; Exercise 4 distributed

October 27, 2025


Learning Objectives

  • Understand the fundamentals of neural networks and deep learning
  • Learn about multi-layer perceptrons (MLPs) and their architecture
  • Master feed-forward networks and their training process
  • Understand stochastic gradient descent (SGD) and backpropagation
  • Recognize overfitting issues and regularization techniques
  • Apply deep learning to real-world problems

Topics Covered

  • Deep Learning Basics: Neural network fundamentals and motivation
  • Multi-layer Perceptron (MLP): Architecture and design principles
  • Feed-forward Networks: Information flow and network topology
  • Network Training: Stochastic Gradient Descent (SGD) optimization
  • Error Backpropagation: Algorithm for computing gradients
  • Overfitting Prevention: Regularization techniques and best practices
  • Introduction to TensorFlow: Applied to supervised learning problems (a minimal model is sketched after this list)
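
The sketch below shows the kind of model these topics build toward: a small MLP trained with SGD on a binary classification problem, using the TensorFlow/Keras API. The synthetic data, layer widths, and hyperparameters are illustrative assumptions, not values specified by the course.

    # A minimal MLP in TensorFlow/Keras. Illustrative only: the data,
    # layer sizes, and hyperparameters are placeholder assumptions.
    import numpy as np
    import tensorflow as tf

    # Synthetic binary-classification data standing in for a real dataset.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 20)).astype("float32")
    y = (X[:, 0] + X[:, 1] > 0).astype("float32").reshape(-1, 1)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(64, activation="relu"),    # hidden layer 1
        tf.keras.layers.Dense(64, activation="relu"),    # hidden layer 2
        tf.keras.layers.Dense(1, activation="sigmoid"),  # output probability
    ])

    # Plain SGD and cross-entropy loss, matching the topics above.
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2)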

Schedule

  • Lecture: Monday, October 27, 2025 (10:15 - 12:00)
  • Practice Session: Monday, October 27, 2025 (16:30 - 18:00)
  • TA Session: Discussion of exercises and neural network implementations

Key Concepts

  • Perceptron: Single neuron model and limitations
  • Universal Approximation Theorem: Theoretical foundation of neural networks
  • Activation Functions: ReLU, sigmoid, tanh, and their properties
  • Loss Functions: Mean squared error for regression, cross-entropy for classification (both are sketched in code after this list)
  • Optimization: Gradient descent variants and learning rates
  • Regularization: Dropout, weight decay, early stopping
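
As a quick reference for the activation and loss bullets above, the NumPy sketch below writes each function out directly. It is meant for intuition only; deep learning frameworks implement numerically hardened versions of these.

    # Common activations and losses, written out in NumPy for intuition.
    import numpy as np

    def relu(z):     # max(0, z) elementwise; simple and non-saturating for z > 0
        return np.maximum(0.0, z)

    def sigmoid(z):  # squashes to (0, 1); standard for binary outputs
        return 1.0 / (1.0 + np.exp(-z))

    def tanh(z):     # squashes to (-1, 1); zero-centered, unlike sigmoid
        return np.tanh(z)

    def mse(y_true, y_pred):  # mean squared error, typical for regression
        return np.mean((y_true - y_pred) ** 2)

    def binary_cross_entropy(y_true, p, eps=1e-12):  # typical for classification
        p = np.clip(p, eps, 1.0 - eps)               # guard against log(0)
        return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

    z = np.array([-2.0, 0.0, 2.0])
    print(relu(z), sigmoid(z), tanh(z))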

Practical Skills

  • Building neural networks from scratch
  • Implementing the backpropagation algorithm (see the from-scratch sketch after this list)
  • Using TensorFlow for deep learning tasks
  • Network architecture design and hyperparameter tuning
  • Debugging and troubleshooting neural networks
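
To make "from scratch" concrete, here is a minimal sketch of a one-hidden-layer MLP trained with backpropagation and gradient descent in NumPy. The toy dataset, hidden width, and learning rate are assumptions chosen for illustration, and full-batch updates are used for brevity where SGD would sample mini-batches.

    # One-hidden-layer MLP with hand-coded backpropagation (NumPy sketch).
    # Assumes a tanh hidden layer, sigmoid output, and cross-entropy loss.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                               # toy inputs
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)    # XOR-like labels

    n_hidden, lr = 16, 0.5
    W1 = rng.normal(scale=0.5, size=(2, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

    for epoch in range(2000):
        # Forward pass: tanh hidden layer, sigmoid output probability.
        h = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

        # Backward pass: for sigmoid + cross-entropy, dL/dlogits = p - y.
        d_logits = (p - y) / len(X)
        dW2 = h.T @ d_logits
        db2 = d_logits.sum(axis=0)
        d_h = d_logits @ W2.T * (1.0 - h ** 2)   # tanh'(a) = 1 - tanh(a)^2
        dW1 = X.T @ d_h
        db1 = d_h.sum(axis=0)

        # Gradient descent update (full batch here for brevity).
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print("training accuracy:", ((p > 0.5) == y).mean())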

Assignments

  • Exercise 4: Distributed this week; covers a neural network implementation
  • Practice building and training MLPs
  • Explore different activation functions and architectures

Further Reading

  • Deep Learning textbook chapters on MLPs and backpropagation
  • TensorFlow documentation and tutorials