Studying at the University of Verona
Here you can find information on the organisational aspects of the Programme, lecture timetables, learning activities and useful contact details for your time at the University, from enrolment to graduation.
Type D and Type F activities
years | Modules | TAF | Teacher |
---|---|---|---|
1° 2° | Algorithms | D | Roberto Segala (Coordinator) |
1° 2° | Scientific knowledge and active learning strategies | F | Francesca Monti (Coordinator) |
1° 2° | Genetics | D | Massimo Delledonne (Coordinator) |
1° 2° | History and Didactics of Geology | D | Guido Gonzato (Coordinator) |
years | Modules | TAF | Teacher |
---|---|---|---|
1° 2° | Advanced topics in financial engineering | F | Luca Di Persio (Coordinator) |
1° 2° | Algorithms | D | Roberto Segala (Coordinator) |
1° 2° | Python programming language | D | Vittoria Cozza (Coordinator) |
1° 2° | Organization Studies | D | Giuseppe Favretto (Coordinator) |
years | Modules | TAF | Teacher |
---|---|---|---|
1° 2° | ECMI modelling week | F | Not yet assigned |
1° 2° | ESA Summer of code in space (SOCIS) | F | Not yet assigned |
1° 2° | Google summer of code (GSOC) | F | Not yet assigned |
1° 2° | Introduzione all'analisi non standard | F | Sisto Baldo |
1° 2° | C Programming Language | D | Pietro Sala (Coordinator) |
1° 2° | LaTeX Language | D | Enrico Gregorio (Coordinator) |
1° 2° | Mathematics mini courses | F | Marco Caliari (Coordinator) |
Statistical learning - PART I (2020/2021)
Teaching code
4S008279
Teacher
Credits
3
Language
English
Scientific Disciplinary Sector (SSD)
MAT/06 - PROBABILITY AND STATISTICS
Period
Semester 1, from Oct 1, 2020 to Jan 29, 2021.
Learning outcomes
The objective is to introduce statistical modelling and exploratory data analysis. The mathematical foundations of Statistical Learning (supervised and unsupervised learning, deep learning) are developed with emphasis on the underlying abstract mathematical framework, aiming to provide a rigorous, self-contained derivation and theoretical analysis of the main models currently used in applications.
Program
The entire course will be available online. In addition, some or all of the lessons (see the course schedule) will be held in class.
1. Linear regression
Normal random vectors and their properties. Linear regression models. Least squares and
projections. Parameter estimators and their optimality (Gauss-Markov Theorem, with
proof). Distribution of the estimators. Testing predictors’ significance. Best subset selection
and its formulation as a Mixed Integer Optimization problem. Ridge regression.
Interpretation of ridge regression with the singular value decomposition (with proof).
LASSO.
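As an informal illustration of this first topic, the following NumPy sketch (not part of the official course material) fits ordinary least squares and ridge regression on synthetic data; the sample size, the true coefficients and the penalty value `lam` are arbitrary choices made only for the example.

```python
# Minimal sketch: ordinary least squares and ridge regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, 0.0, -2.0, 0.5, 0.0])   # example coefficients only
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Least squares: argmin ||y - X beta||^2, solved via the normal equations.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: argmin ||y - X beta||^2 + lam * ||beta||^2.
# In the SVD view, ridge shrinks each singular direction by d_i^2 / (d_i^2 + lam).
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

print("OLS estimate:  ", np.round(beta_ols, 3))
print("Ridge estimate:", np.round(beta_ridge, 3))
```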
2. Linear methods for classification
Bayes classifier and its optimality (with proof). Linear regression after binary coding. Linear
discriminant analysis. Separating hyperplanes. The perceptron algorithm (with proof of
termination).
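The perceptron algorithm itself can be illustrated with an unofficial NumPy sketch on synthetic, linearly separable data; the separating direction and the margin threshold below are assumptions made for the example so that the mistake-driven updates terminate quickly.

```python
# Minimal sketch: the perceptron algorithm on linearly separable synthetic data.
import numpy as np

rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])                 # example separating direction
X = rng.normal(size=(400, 2))
margin = X @ w_true + 0.5
keep = np.abs(margin) > 0.5                    # enforce a positive margin
X, y = X[keep], np.sign(margin[keep])          # labels in {-1, +1}

Xb = np.hstack([X, np.ones((len(X), 1))])      # absorb the intercept into the weights
w = np.zeros(3)
mistakes = True
while mistakes:                                # a pass with no mistakes => separating hyperplane found
    mistakes = False
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:                 # misclassified point
            w += yi * xi                       # perceptron update
            mistakes = True

print("learned hyperplane:", w)
print("training accuracy:", np.mean(np.sign(Xb @ w) == y))
```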
3. Model selection and assessment.
Loss function; training and prediction error. Cross validation. Explicit expression of cross
validation for linear regression (with proof). Bootstrap and application to model
assessment.
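The cross-validation procedure can be sketched in a few lines; the function below is an illustrative, unofficial implementation of K-fold cross validation of the prediction error of least squares, with the fold count and the synthetic data chosen only for the example.

```python
# Minimal sketch: K-fold cross validation of the test MSE of least-squares regression.
import numpy as np

def kfold_cv_mse(X, y, K=5, seed=0):
    """Estimate the expected squared prediction error of OLS via K-fold CV."""
    idx = np.random.default_rng(seed).permutation(len(y))
    folds = np.array_split(idx, K)
    errors = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        beta = np.linalg.lstsq(X[train], y[train], rcond=None)[0]  # fit on K-1 folds
        errors.append(np.mean((y[test] - X[test] @ beta) ** 2))    # error on held-out fold
    return np.mean(errors)

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 4))
y = X @ np.array([1.0, -1.0, 0.0, 2.0]) + rng.normal(scale=0.5, size=120)
print("5-fold CV estimate of the test MSE:", kfold_cv_mse(X, y))
```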
4. Clustering.
Center based clustering. K-center clustering; K-median clustering; K-means clustering.
Lloyd’s algorithm for K-means. Ward’s algorithm. Spectral clustering: graph Laplacian. The
multiplicity of the eigenvalue 0 of the graph Laplacian equals the number of connected
components (with proof). Unnormalized and normalized spectral clustering algorithms.
Relation of spectral clustering with graph-cut.
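Lloyd's algorithm alternates between assigning points to the nearest centre and recomputing the centres; the following unofficial NumPy sketch runs it on three synthetic Gaussian blobs, where the initialisation at K random data points and the empty-cluster guard are choices made for this example.

```python
# Minimal sketch: Lloyd's algorithm for K-means on synthetic data.
import numpy as np

def lloyd_kmeans(X, K, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=K, replace=False)]   # initialise at K data points
    for _ in range(n_iter):
        # Assignment step: each point goes to its closest centre.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centre becomes the mean of its cluster (kept if the cluster is empty).
        new_centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                                for k in range(K)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=m, size=(50, 2)) for m in (-3.0, 0.0, 3.0)])
labels, centers = lloyd_kmeans(X, K=3)
print("cluster sizes:", np.bincount(labels), "\ncenters:\n", centers)
```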
5. Introduction to Neural Networks.
Single layer neural networks. Cybenko’s density theorem (with proof). Multilayer neural
networks. Training a neural network: the gradient descent algorithm.
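As an unofficial illustration of this last topic, the sketch below trains a single-hidden-layer network with sigmoid activations by full-batch gradient descent to approximate a one-dimensional function, in the spirit of Cybenko's density theorem; the target function, hidden-layer width, learning rate and number of steps are arbitrary choices for the example.

```python
# Minimal sketch: a single-hidden-layer network trained by full-batch gradient descent.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x)                                    # target function to approximate

H = 20                                           # number of hidden units
W1, b1 = rng.normal(size=(1, H)), np.zeros(H)    # input -> hidden
W2, b2 = rng.normal(size=(H, 1)), np.zeros(1)    # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.05
for step in range(5000):
    # Forward pass.
    a = sigmoid(x @ W1 + b1)                     # hidden activations, shape (n, H)
    y_hat = a @ W2 + b2                          # network output, shape (n, 1)
    # Backward pass for the mean squared error loss.
    err = y_hat - y
    grad_W2 = a.T @ err / len(x)
    grad_b2 = err.mean(axis=0)
    delta = (err @ W2.T) * a * (1 - a)           # backpropagated error at the hidden layer
    grad_W1 = x.T @ delta / len(x)
    grad_b1 = delta.mean(axis=0)
    # Gradient descent update.
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print("final training MSE:", float(np.mean((sigmoid(x @ W1 + b1) @ W2 + b2 - y) ** 2)))
```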
Author | Title | Publishing house | Year | ISBN | Notes |
---|---|---|---|---|---|
T. Hastie, R. Tibshirani, J. Friedman | The elements of statistical learning. Data mining, inference, and prediction (2nd edition) | Springer | 2009 | | |
Examination Methods
Oral exam