Weekly Materials
- Lecture slides, week 7: Professor's lecture slides (PDF)
- TA Practice Slides: Hands-on tutorials and practice exercises
- Exercise 4: Distribution of Exercise sheet 4
- References & Resources: Deep Learning materials
- Discussion of the previous exercises (with TA)
Additional Notes
Week 7: Deep Learning Basics
Learning Objectives
- Understand the fundamentals of neural networks and deep learning
- Learn about multi-layer perceptrons (MLPs) and their architecture
- Master feed-forward networks and their training process
- Understand stochastic gradient descent (SGD) and backpropagation
- Recognize overfitting issues and regularization techniques
- Apply deep learning to real-world problems
Topics Covered
- Deep Learning Basics: Neural network fundamentals and motivation
- Multi-layer Perceptron (MLP): Architecture and design principles
- Feed-forward Networks: Information flow and network topology
- Network Training: Stochastic Gradient Descent (SGD) optimization
- Error Backpropagation: Algorithm for computing gradients
- Overfitting Prevention: Regularization techniques and best practices
- Introduction to TensorFlow: Applied to supervised learning problems
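The "Introduction to TensorFlow" topic above could be sketched, for instance, as a small Keras MLP fit to a toy supervised-learning problem. The dataset, layer sizes, and hyperparameters here are illustrative assumptions, not taken from the course materials:

```python
import numpy as np
import tensorflow as tf

# Toy regression data: learn y = 2x + 1 with a little noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(256, 1)).astype("float32")
y = (2 * X + 1 + 0.05 * rng.standard_normal((256, 1))).astype("float32")

# A small feed-forward network (MLP) with one hidden layer.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# SGD is the optimizer covered in the lecture; mean squared error
# is a standard loss for regression.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
              loss="mse")
model.fit(X, y, epochs=20, batch_size=32, verbose=0)
```

After training, `model.predict` on new inputs should approximate the underlying linear function; Keras handles the backpropagation and SGD updates internally.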
Schedule
- Lecture: Monday, October 27, 2025 (10:15 - 12:00)
- Practice Session: Monday, October 27, 2025 (16:30 - 18:00)
- TA Session: Discussion of exercises and neural network implementations
Key Concepts
- Perceptron: Single neuron model and limitations
- Universal Approximation Theorem: Theoretical foundation of neural networks
- Activation Functions: ReLU, sigmoid, tanh, and their properties
- Loss Functions: Mean squared error, cross-entropy for different tasks
- Optimization: Gradient descent variants and learning rates
- Regularization: Dropout, weight decay, early stopping
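The activation functions listed above can be written out directly; the following NumPy sketch (added here for illustration) shows each function and the property that matters for training:

```python
import numpy as np

def relu(z):
    # max(0, z): cheap to compute, and its gradient is 1 for z > 0,
    # which avoids the saturation problem of sigmoid/tanh.
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes inputs into (0, 1); saturates (gradient near 0)
    # for large |z|, which can slow down learning in deep stacks.
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Zero-centred squashing into (-1, 1); also saturates for large |z|.
    return np.tanh(z)

z = np.array([-2.0, 0.0, 2.0])
print(relu(z), sigmoid(z), tanh(z))
```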
Practical Skills
- Building neural networks from scratch
- Implementing backpropagation algorithm
- Using TensorFlow for deep learning tasks
- Network architecture design and hyperparameter tuning
- Debugging and troubleshooting neural networks
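As one way to practice "building neural networks from scratch" and "implementing backpropagation", a minimal hand-derived example might look like the sketch below. The XOR dataset (the classic problem a single perceptron cannot solve), hidden-layer size, and learning rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with sigmoid activations and MSE loss.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Loss before training, for comparison.
h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
initial_loss = float(((out - y) ** 2).mean())

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule applied layer by layer.
    d_out = (out - y) * out * (1 - out)   # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated to hidden layer
    # Gradient-descent update (full batch here; SGD would sample a subset).
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
loss = float(((out - y) ** 2).mean())
```

The two `d_*` lines are the whole backpropagation algorithm for this architecture: each is the derivative of the loss with respect to a layer's pre-activation, obtained by multiplying the downstream error by the local sigmoid derivative `s * (1 - s)`.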
Assignments
- Exercise 4: distributed this week; covers neural network implementation
- Practice building and training MLPs
- Explore different activation functions and architectures
Further Reading
- Deep Learning textbook chapters on MLPs and backpropagation
- TensorFlow documentation and tutorials