Subject: FONDAMENTI DI MACHINE LEARNING (A.A. 2020/2021)
Unit: Fondamenti di Machine Learning
Related or Additional Studies (lesson)
Introduction to basic mathematical concepts for using Machine Learning methodologies.
Introduction to the main numerical methods for deterministic and stochastic optimization; application of these methods to large-scale machine learning problems.
Basic knowledge of computer programming, mathematical analysis, and linear algebra.
Introduction to Machine Learning
Machine Learning problems (supervised, semi-supervised, online/incremental).
Supervised learning: loss functions, empirical risk, hypothesis space.
Empirical risk minimization: overfitting and generalization properties.
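The link between empirical risk minimization and overfitting can be illustrated with a small sketch (synthetic data and NumPy polynomial fitting; the noise level and the polynomial degrees are arbitrary illustrative choices):

```python
import numpy as np

# Noisy samples of a smooth target function (illustrative choices).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(20)

def empirical_risk(degree):
    # Least-squares polynomial fitting minimizes the empirical risk
    # (1/n) * sum_i (f(x_i) - y_i)^2 over polynomials of the given degree.
    coeffs = np.polyfit(x, y, degree)
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

# Enlarging the hypothesis space can only decrease the empirical risk,
# which is why minimizing it alone may overfit the noise in the data.
assert empirical_risk(10) <= empirical_risk(3) <= empirical_risk(1)
```

The final assertion holds because the polynomial spaces are nested; the generalization error, unlike the empirical risk, eventually grows with the degree.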
Regularization theory for machine learning: regularization functionals and complexity of the solution. Hypothesis spaces defined by kernels. Linear and nonlinear kernels. The Representer Theorem.
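A minimal kernel ridge regression sketch of these ideas, assuming a Gaussian (nonlinear) kernel and hand-picked hyperparameters `lam` and `gamma`:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian (nonlinear) kernel: k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam, gamma):
    # By the Representer Theorem, the minimizer of the regularized
    # empirical risk has the finite expansion f(.) = sum_i alpha_i k(x_i, .),
    # where alpha solves the linear system (K + lam * n * I) alpha = y.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

# Illustrative data and hand-chosen hyperparameters.
X = np.linspace(0.0, 1.0, 30)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
alpha = kernel_ridge_fit(X, y, lam=1e-3, gamma=50.0)
pred = rbf_kernel(X, X, 50.0) @ alpha
```

The regularization parameter `lam` trades data fit against the complexity (RKHS norm) of the solution.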
Mathematical background: differentiable functions and convex analysis.
Optimality conditions, gradient methods (convergence and convergence rate).
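A minimal sketch of a fixed-steplength gradient method on a strongly convex quadratic (the matrix and the step rule are illustrative; the classical choice step = 1/L guarantees linear convergence):

```python
import numpy as np

# Gradient method for the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of the gradient
step = 1.0 / L                    # fixed steplength ensuring convergence
x = np.zeros(2)
for _ in range(500):
    x = x - step * (A @ x - b)    # x_{k+1} = x_k - step * grad f(x_k)
```

The convergence rate is linear, with a factor governed by the condition number of A (ratio of largest to smallest eigenvalue).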
Acceleration techniques: adaptive steplength selection and quasi-Newton strategies. Conjugate direction methods and the conjugate gradient method.
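A sketch of the (linear) conjugate gradient method for a symmetric positive definite system, using the standard residual-based updates:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    # CG minimizes the quadratic 0.5 x^T A x - b^T x along mutually
    # A-conjugate directions; in exact arithmetic it terminates in at
    # most n iterations for an n x n system.
    x = np.zeros_like(b)
    r = b - A @ x          # residual = negative gradient
    p = r.copy()           # first search direction
    for _ in range(len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact line search along p
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)  # conjugacy-preserving update
        p = r_new + beta * p
        r = r_new
    return x
```

A usage example: `conjugate_gradient(np.array([[4.0, 1.0], [1.0, 3.0]]), np.array([1.0, 2.0]))` agrees with `np.linalg.solve` on the same system.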
Constrained optimization: optimality conditions, KKT conditions and duality results. Gradient projection methods: descent direction, convergence properties and acceleration techniques.
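For simple feasible sets the projection is cheap, which is what makes gradient projection methods attractive. A minimal sketch on a box-constrained quadratic (objective and bounds are illustrative):

```python
import numpy as np

# Gradient projection for min 0.5 * ||x - c||^2 subject to lo <= x <= hi.
# Projecting onto a box is a componentwise clip, so each iteration is a
# gradient step followed by an inexpensive projection.
c = np.array([2.0, -3.0, 0.5])
lo, hi = -np.ones(3), np.ones(3)

x = np.zeros(3)
for _ in range(100):
    grad = x - c                         # gradient of the objective
    x = np.clip(x - 0.5 * grad, lo, hi)  # step, then project onto the box

# For this objective, the solution is simply the projection of the
# unconstrained minimizer c onto the feasible box.
```

At the solution the KKT conditions hold: components at a bound have the corresponding multiplier active, interior components have zero gradient.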
Optimization problems in training Support Vector Machines: the SVM learning methodology for binary classification, the SVM dual problem, decomposition techniques and software for SVM training.
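A sketch of training a linear SVM by solving its dual problem with a generic SLSQP solver from SciPy; dedicated SVM software uses decomposition techniques instead, and the tolerance 1e-6 for identifying free support vectors is an illustrative choice (the sketch assumes at least one free support vector exists):

```python
import numpy as np
from scipy.optimize import minimize

def svm_dual_train(X, y, C=1.0):
    # SVM dual for binary classification (y_i in {-1, +1}):
    #   max_a  sum(a) - 0.5 a^T Q a,  Q_ij = y_i y_j <x_i, x_j>,
    #   s.t.   0 <= a_i <= C,  sum_i a_i y_i = 0.
    n = len(y)
    Yx = y[:, None] * X
    Q = Yx @ Yx.T
    fun = lambda a: 0.5 * a @ Q @ a - a.sum()   # minimize the negated dual
    jac = lambda a: Q @ a - np.ones(n)
    cons = {"type": "eq", "fun": lambda a: a @ y}
    res = minimize(fun, np.zeros(n), jac=jac, method="SLSQP",
                   bounds=[(0.0, C)] * n, constraints=[cons])
    a = res.x
    w = (a * y) @ X                         # primal weights from KKT conditions
    free = (a > 1e-6) & (a < C - 1e-6)      # free support vectors
    b = np.mean(y[free] - X[free] @ w)      # bias from free support vectors
    return w, b
```

For a linearly separable toy set such as `X = [[2,2],[3,3],[-2,-2],[-3,-3]]`, `y = [1,1,-1,-1]`, the returned `(w, b)` classifies all points correctly via `sign(X @ w + b)`.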
Stochastic gradient methods and their convergence properties. Acceleration strategies: dynamic sample size methods, gradient aggregation, inexact Newton methods, diagonal scaling methods. Stochastic optimization in machine learning: empirical risk minimization, regularized models, stochastic gradient methods for large-scale problems.
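A minimal stochastic gradient sketch for regularized empirical risk minimization (least-squares loss on synthetic data; the diminishing-steplength schedule and iteration count are illustrative choices):

```python
import numpy as np

# Synthetic regularized least-squares problem: minimize over w
#   (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.01 * rng.standard_normal(n)

lam = 1e-3
w = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)                        # sample one training example
    g = (X[i] @ w - y[i]) * X[i] + lam * w     # unbiased gradient estimate
    w -= 0.1 / (1.0 + 0.01 * t) * g            # diminishing steplength
```

Each iteration touches a single example, so the per-iteration cost is independent of n; this is what makes stochastic gradient methods the workhorse for large-scale problems.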
Lectures and practical exercises in the computer laboratory. Working students who cannot attend classes regularly should contact the teacher to verify the teaching material required for the final examination. Student consultation hours: Friday, 15:00-18:00, at the Department of Physics, Informatics and Mathematics, or by appointment requested by e-mail.
The evaluation consists of an oral examination at the end of the course, in which the machine learning methodologies and the numerical methods presented during the course are discussed.
Knowledge and understanding:
At the end of the course, the student will know the mathematical notions underlying machine learning methodologies and the numerical optimization methods used to train them.
Applying knowledge and understanding:
At the end of the course, the student will have adequate knowledge to use machine learning methodologies and numerical optimization methods.
At the end of the course, the student must be able to choose the appropriate approaches for solving machine learning problems and the numerical optimization methods suitable for training a machine learning methodology.
At the end of the course, the student must be able to explain the machine learning concepts and the numerical optimization methods presented in the course.
At the end of the course, the student must be able to independently explore in depth the main aspects of the subjects proposed in the course.
Lecture notes provided by the teacher.
Schölkopf B., Smola A.J., Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA, USA, 2001.
Bishop C.M., Pattern Recognition and Machine Learning, Springer, 2006
Bertsekas D.P., Nonlinear Programming, Athena Scientific, 1999.
Nocedal J., Wright S.J., Numerical Optimization, Springer-Verlag, 2000.
Bottou, L., Curtis, F.E., Nocedal, J., Optimization methods for large-scale machine learning, SIAM Review, 2018.