Studying at the University of Verona
Here you can find information on the organisational aspects of the Programme, lecture timetables, learning activities and useful contact details for your time at the University, from enrolment to graduation.
Study Plan
This information is intended exclusively for students already enrolled in this course. If you are a new student interested in enrolling, you can find information about the course of study on the course page:
Laurea magistrale in Mathematics - Enrollment from 2025/2026
The Study Plan includes all modules, teaching and learning activities that each student will need to undertake during their time at the University.
Please select your Study Plan based on your enrollment year.
Statistical learning - PART I (2020/2021)
Teaching code
4S008279
Teacher
Credits
3
Language
English
Scientific Disciplinary Sector (SSD)
MAT/06 - PROBABILITY AND STATISTICS
Period
1st semester, from Oct 1, 2020 to Jan 29, 2021.
Learning outcomes
The objective is to introduce statistical modelling and exploratory data analysis. The mathematical foundations of Statistical Learning (supervised and unsupervised learning, deep learning) are developed with emphasis on the underlying abstract mathematical framework, aiming to provide a rigorous, self-contained derivation and theoretical analysis of the main models currently used in applications.
Program
The entire course will be available online. In addition, some or all of the lessons (see the course schedule) will also be held in class.
1. Linear regression
Normal random vectors and their properties. Linear regression models. Least squares and projections. Parameter estimators and their optimality (Gauss-Markov Theorem, with proof). Distribution of the estimators. Testing predictors' significance. Best subset selection and its formulation as a Mixed Integer Optimization problem. Ridge regression. Interpretation of ridge regression via the singular value decomposition (with proof). LASSO.
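As an informal aside (not part of the course material), the SVD interpretation mentioned above can be checked numerically: with X = U D V^T, the ridge estimator is V diag(d_i/(d_i^2 + λ)) U^T y, so each singular direction is shrunk by the factor d_i^2/(d_i^2 + λ). The sketch below uses made-up data and an arbitrary penalty value.

```python
# Illustrative sketch only: compare the direct ridge solution with the SVD form.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                 # toy design matrix (assumed centred)
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=50)
lam = 1.0                                    # arbitrary ridge penalty

# Direct solution: (X^T X + lam * I)^{-1} X^T y
beta_direct = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# SVD-based solution: X = U D V^T  =>  beta = V diag(d / (d^2 + lam)) U^T y
U, d, Vt = np.linalg.svd(X, full_matrices=False)
beta_svd = Vt.T @ ((d / (d**2 + lam)) * (U.T @ y))

assert np.allclose(beta_direct, beta_svd)
```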
2. Linear methods for classification
Bayes classifier and its optimality (with proof). Linear regression after binary coding. Linear discriminant analysis. Separating hyperplanes. The perceptron algorithm (with proof of termination).
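For concreteness, here is a minimal sketch of the perceptron update rule for labels in {-1, +1} (illustrative code, not the course's reference implementation; an intercept, if needed, is assumed to be included as a constant input feature):

```python
import numpy as np

def perceptron(X, y, max_iter=1000):
    """Cycle through the data, updating w on each misclassified point.

    X: (n, p) inputs; y: labels in {-1, +1}. If the data are linearly
    separable, a pass with no mistakes means sign(X @ w) matches y.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w) <= 0:   # misclassified (or on the boundary)
                w = w + yi * xi      # perceptron update step
                mistakes += 1
        if mistakes == 0:            # no errors in a full pass: stop
            return w
    return w
```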
3. Model selection and assessment
Loss function; training and prediction error. Cross validation. Explicit expression of cross validation for linear regression (with proof). Bootstrap and application to model assessment.
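The closed-form expression referred to above is the standard leave-one-out identity for least squares, LOOCV = (1/n) Σ_i ((y_i - ŷ_i)/(1 - h_ii))², where h_ii are the diagonal entries of the hat matrix. A quick numerical check on made-up data (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.normal(size=(30, 2))])   # toy design matrix
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.3, size=30)

# Hat matrix H = X (X^T X)^{-1} X^T and the shortcut formula
H = X @ np.linalg.solve(X.T @ X, X.T)
resid = y - H @ y
loocv_shortcut = np.mean((resid / (1 - np.diag(H))) ** 2)

# Brute-force leave-one-out for comparison
errors = []
for i in range(len(y)):
    keep = np.arange(len(y)) != i
    beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
    errors.append((y[i] - X[i] @ beta) ** 2)

assert np.isclose(loocv_shortcut, np.mean(errors))
```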
4. Clustering
Center based clustering. K-center clustering; K-median clustering; K-means clustering. Lloyd's algorithm for K-means. Ward's algorithm. Spectral clustering: graph Laplacian. The multiplicity of the eigenvalue 0 of the graph Laplacian equals the number of connected components (with proof). Unnormalized and normalized spectral clustering algorithms. Relation of spectral clustering with graph-cut.
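The statement about the kernel of the graph Laplacian is easy to verify numerically on a toy graph. The sketch below (adjacency matrix chosen only for illustration) builds the unnormalized Laplacian L = D - W for a graph with two connected components and counts its zero eigenvalues:

```python
import numpy as np

# Toy graph with two connected components: {0, 1, 2} and {3, 4}
W = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

D = np.diag(W.sum(axis=1))        # degree matrix
L = D - W                         # unnormalized graph Laplacian

eigvals = np.linalg.eigvalsh(L)   # L is symmetric positive semi-definite
n_zero = int(np.sum(np.isclose(eigvals, 0.0)))
print(n_zero)                     # 2, matching the number of connected components
```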
5. Introduction to Neural Networks
Single layer neural networks. Cybenko's density theorem (with proof). Multilayer neural networks. Training a neural network: the gradient descent algorithm.
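As a rough sketch of the last two points (made-up architecture, data and hyperparameters), the code below trains a one-hidden-layer sigmoid network, the class of networks covered by Cybenko's density theorem, with plain gradient descent on a 1-D regression task:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-3, 3, 200).reshape(-1, 1)      # 1-D inputs
y = np.sin(x)                                   # target function to approximate

h = 20                                          # number of hidden units
W1 = rng.normal(scale=1.0, size=(1, h))         # input-to-hidden weights
b1 = np.zeros(h)
w2 = rng.normal(scale=0.1, size=(h, 1))         # hidden-to-output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.05                                       # gradient descent step size
for _ in range(5000):
    a = sigmoid(x @ W1 + b1)                    # hidden activations, shape (n, h)
    yhat = a @ w2                               # network output, shape (n, 1)
    err = yhat - y
    # Gradients of the (half) mean squared error via the chain rule
    grad_w2 = a.T @ err / len(x)
    delta = (err @ w2.T) * a * (1 - a)          # backpropagate through the sigmoid
    grad_W1 = x.T @ delta / len(x)
    grad_b1 = delta.mean(axis=0)
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
```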
Author | Title | Publishing house | Year | ISBN | Notes |
---|---|---|---|---|---|
T. Hastie, R. Tibshirani, J. Friedman | The elements of statistical learning. Data mining, inference, and prediction (2nd edition) | Springer | 2009 | | |
Examination Methods
Oral exam