Machine Learning

BSc - Autumn Semester
442173, Lectures and exercises, 5.0 ECTS

Lecturer Prof. Dr. Paolo Favaro
Teaching assistants Aram Davtyan
Sepehr Sameni
Alp Sari
Location Hörsaal B006 ExWi Building, Sidlerstrasse 5
Time Wednesdays 13.15-15.00 (lecture) and 15.15-16.00 (tutorials)
Exam 10th of January 2024 from 10:00-12:00 at ExWi A6
ILIAS KSL

*** GENERAL INFORMATION *** 

We will try to stream the lectures live via ILIAS (if resources allow).
We will also record the lectures and make the recordings available as
podcasts in ILIAS. Attendance is not mandatory but is strongly
recommended, as the classes will be interactive.
++++++++++++++++++++ 

 

Course description

This course covers fundamental topics in machine learning and pattern recognition. It provides an introduction to supervised learning, unsupervised learning, and reinforcement learning. The approach used throughout the course is mostly based on convex optimization theory; however, a background in optimization is not necessary, as the methods presented will be self-contained.

Learning outcomes

On satisfying the requirements of this course, students will have the knowledge and skills to: 

  1. Understand a range of models for supervised, unsupervised, and reinforcement learning
  2. Describe the strengths and weaknesses of each of these models
  3. Understand the mathematical background from linear algebra, statistics, and probability theory used in these machine learning models
  4. Implement efficient machine learning algorithms on a computer
  5. Design test procedures to evaluate a model
  6. Combine several models to obtain better results
  7. Choose a suitable model for new machine learning tasks based on reasoned argument

Prerequisites

The course requires students to be familiar with the basics of linear algebra, probability theory and MATLAB programming. A brief review of these subjects will be carried out during the exercise sessions.

Resources

The handouts are the reference material. There is no required textbook for this course. The following books are recommended as additional reading:

  • Richard Duda, Peter Hart and David Stork, Pattern Classification, 2nd ed. John Wiley & Sons, 2001.
  • Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, The MIT Press, 2012.
  • Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical Learning. Springer, 2009.

Course handouts and other materials can be found in ILIAS.

Exercises

Completing the exercises is a prerequisite for registering for the exam. There will be homework assignments, and their deadlines will be announced in the first lecture (see ILIAS).

Schedule and material

The following table provides an overview of the lecture content during the semester. Please check it periodically, as it might be updated.

Week Lecture Reading
1 Intro and application of ML. Supervised learning: Least mean squares Handout 0, 1
2 Supervised learning: Probabilistic interpretation Handout 1
3 Supervised learning: Generalized linear models Handout 1
4 Supervised learning: Generative learning, Naïve Bayes Handout 2
5 Supervised learning: Support vector machines Handout 3
6 Supervised learning: Support vector machines Handout 3
7 Decision Trees and Ensembles Handout 4a
8 Ensemble Boosting. Regularization and model selection Handout 4b, 5
9 Unsupervised learning: Clustering and K-means Handout 6
10 Unsupervised learning: EM and Factor analysis Handout 7, 8, 9
11 Unsupervised learning: PCA and ICA Handout 10, 11
12 Reinforcement learning Handout 12
13 Reinforcement learning: TD and Q-learning Handout 13
14 Revision