
Petyo Bonev

Course location: University of St.Gallen

Home university: University of St.Gallen
Petyo Bonev completed his PhD dissertation in econometrics at the University of Mannheim. During this time, he was also a visiting scholar at University College London and at CREST in Paris. After finishing his PhD, he took up a position as an assistant professor at the Ecole des Mines, an elite engineering school in Paris, where he remained until 2017. Since 2018, Petyo has been an assistant professor at the Swiss Institute for Empirical Economic Research (SEW) at the University of St.Gallen. He is also a research fellow at the Ratio research institute in Stockholm and a Vice President of AI and Analytics at the data analytics firm Proof Analytics.

Courses taught by this instructor

Course | Description | Instructor | Level (B = Basic, M = Intermediate, A = Advanced) | Next course | Location


The Mathematical Foundations of Machine Learning

Learning objectives: The goal of this course is to provide a comprehensive overview of the mathematical theory behind machine learning. How can we characterize a good prediction? How can we construct good predictions with machine learning methods? What is the relationship between (1) estimation error, (2) sample size and (3) model complexity? How do these abstract concepts apply to particular machine learning methods such as Boosting, Support Vector Machines, Ridge and LASSO? The course aims to give detailed and intuitively clear answers to these questions. As a result, participants will be well prepared for theoretical and empirical work with machine learning methods.

Course content:
1. Principles of statistical theory (loss function and risk, approximation vs. estimation error, no-free-lunch theorems)
2. Concentration inequalities for bounded loss functions (Hoeffding's lemma, the Azuma-Hoeffding inequality, the bounded difference inequality, Bernstein's inequality, McDiarmid's inequality)
3. Classification (the binary case and its loss function, the Bayesian classifier and its optimality, oracle inequalities for the Bayesian classifier, the finite dictionary learning case, the impact of noise on convergence rates, the infinite dictionary case)
4. The general case (general loss functions, symmetrization, Rademacher complexity, covering numbers, chaining)
5. Applications, part 1: Support Vector Machines, boosting
6. The mathematics and statistics of regularization methods (LASSO, Ridge, elastic net)
7. Applications, part 2: applying LASSO and Ridge (see the illustrative sketch after this entry)
...

...

Level: A (Advanced)

Next course: 2023
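
Item 7 of the course content applies LASSO and Ridge in practice. The following is a minimal illustrative sketch, not part of the official course materials: it assumes Python with scikit-learn and synthetic data, fits both estimators over a grid of penalty strengths, and compares training and test error, loosely mirroring the relationship between estimation error and model complexity raised in the learning objectives.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic data: many candidate regressors, only a few truly informative.
X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

# A larger alpha means a stronger penalty and hence a simpler fitted model;
# the gap between training and test error is an empirical stand-in for the
# estimation error discussed in the learning objectives.
for alpha in [0.01, 0.1, 1.0, 10.0]:
    for name, model in [("LASSO", Lasso(alpha=alpha)),
                        ("Ridge", Ridge(alpha=alpha))]:
        model.fit(X_train, y_train)
        train_mse = mean_squared_error(y_train, model.predict(X_train))
        test_mse = mean_squared_error(y_test, model.predict(X_test))
        n_nonzero = np.sum(model.coef_ != 0)  # LASSO zeroes out coefficients
        print(f"{name:5s} alpha={alpha:5.2f}  train MSE={train_mse:8.1f}  "
              f"test MSE={test_mse:8.1f}  nonzero coefs={n_nonzero}")

In this sketch, LASSO sets some coefficients exactly to zero as the penalty grows, while Ridge only shrinks them toward zero; the data, parameter grid and variable names are illustrative choices, not taken from the course.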
