Statistical methods for machine learning

A.Y. 2020/2021
Learning objectives
The course describes and analyzes, in a rigorous statistical framework, some of the most important machine learning techniques. This will provide the student with a rich set of conceptual and methodological tools for understanding the general phenomenon of learning in machines.
Expected learning outcomes
Upon completion of the course students will be able to:
1. understand the notion of overfitting and its role in controlling the statistical risk
2. describe some of the most important machine learning algorithms and explain how they avoid overfitting
3. run machine learning experiments using the correct statistical methodology
These objectives are measured via a combination of two components: the project report and the oral discussion. The final grade is obtained by assessing the project report and then using the oral discussion for fine-tuning.
Course syllabus and organization

Single session

Lesson period
Second semester
Should the Covid emergency prevent lectures from being held in class, they will be delivered live via the Zoom platform according to the regular schedule. Each live lecture will be video recorded and immediately made available to all students via a link. The teaching modality (in class vs. online) and the instructions for attending classes will be advertised on the course web page (see the Reference Materials section).

The syllabus and the reference material will not change.

The methods of assessment and the evaluation criteria will not change. However, exams may take place via Zoom depending on the rules being enforced at the time of the exam session.
Course syllabus
The goal of this course is to provide a methodological foundation to machine learning. The emphasis is on the design and analysis of learning algorithms with theoretical performance guarantees.

The Nearest Neighbour algorithm
Tree predictors
Statistical learning
Hyperparameter tuning and risk estimates
Risk analysis of Nearest Neighbour
Risk analysis of tree predictors
Consistency, surrogate functions, and nonparametric algorithms
Linear predictors
Online gradient descent
From sequential risk to statistical risk
Kernel functions
Support Vector Machines
Stability bounds and risk control for SVM
Boosting and ensemble methods
Neural networks and deep learning
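As a taste of the first topic above, the Nearest Neighbour algorithm can be sketched in a few lines. This is a minimal illustration with made-up toy data, not material from the course itself:

```python
import numpy as np

def nn_predict(X_train, y_train, x):
    """1-Nearest-Neighbour prediction: copy the label of the closest
    training point under Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

# Hypothetical toy data: two well-separated clusters with labels 0 and 1.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [3.2, 2.9]])
y_train = np.array([0, 0, 1, 1])

print(nn_predict(X_train, y_train, np.array([0.2, 0.1])))  # prints 0
print(nn_predict(X_train, y_train, np.array([2.9, 3.1])))  # prints 1
```

The course studies this predictor in a rigorous framework: for instance, its statistical risk and the conditions under which it is consistent.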
Prerequisites for admission
The course requires basic knowledge in calculus, linear algebra, and statistics.

Before attending this course, students are strongly advised to take the following exams: Calculus, Discrete mathematics, Statistics and data analysis.
Teaching Resources
The main reference material is the set of lecture notes available through the link

A further reference is the textbook: Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
Assessment methods and Criteria
The exam consists of writing a paper of about 10-15 pages containing either a report describing experimental results (an experimental project) or an in-depth analysis of a theoretical topic (a theory project). The paper is discussed in an oral examination, in which students are also asked questions on the rest of the syllabus. The final grade is computed by combining the project evaluation and the oral discussion. Depending on the number of students attending the course, the oral discussion may be replaced by a written test.
INF/01 - INFORMATICS - University credits: 6
Lessons: 48 hours
Wednesday 9:30AM-12:30PM
Room P101, via Comelico 39