
Fadoua Balabdaoui: Catalogue data in Autumn Semester 2019

Name: Prof. Dr. Fadoua Balabdaoui
Address:
Mathematik, Bühlmann
ETH Zürich, HG G 12
Rämistrasse 101
8092 Zürich
SWITZERLAND
E-mail: fadoua.balabdaoui@stat.math.ethz.ch
URL: http://stat.ethz.ch/~fadouab/
Department: Mathematics
Relationship: Adjunct Professor

Number: 401-3619-69L
Title: Mathematics Tools in Machine Learning
ECTS: 4 credits
Hours: 2G
Lecturers: F. Balabdaoui
Abstract: The course reviews many essential mathematical tools used in statistical learning. The lectures will cover the notions of hypothesis classes, sample complexity, PAC learnability, model validation and selection, as well as results on several well-known algorithms and their convergence.
Objective: In the rapidly expanding world of artificial intelligence and automated learning, there is an urgent need to go back to the foundations of the well-established methods in statistical learning. Students attending the lectures will become acquainted with the main theoretical results needed to establish the theory of statistical learning. We start by defining what is meant by learning a task and a training sample, and by the trade-off between choosing a large class of functions (hypotheses) to learn the task and the difficulty of estimating the unknown function (generating the observed sample). The course will also cover the notion of learnability and the conditions under which it is possible to learn a task. In a second part, the lectures will cover algorithmic aspects, where some well-known algorithms will be described and their convergence proved.

Through the exercise classes, the students will deepen their understanding by applying the learned theory to new situations, examples and counterexamples.
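The trade-off described above is often stated as a decomposition of the risk of the learned hypothesis. As a sketch (following the standard presentation in the Shalev-Shwartz and Ben-David text used in the course), with $\mathcal{H}$ the hypothesis class, $h_S$ the hypothesis learned from the sample $S$, and $L_{\mathcal{D}}$ the risk under the data distribution $\mathcal{D}$:

```latex
L_{\mathcal{D}}(h_S)
= \underbrace{\min_{h \in \mathcal{H}} L_{\mathcal{D}}(h)}_{\text{approximation error}}
+ \underbrace{\Big( L_{\mathcal{D}}(h_S) - \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) \Big)}_{\text{estimation error}}
```

Enlarging $\mathcal{H}$ can only decrease the approximation error, but it typically increases the estimation error for a fixed sample size; this is the bias-complexity trade-off listed among the course topics.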
Content: The course will cover the following subjects:

(*) Definition of Learning and Formal Learning Models

(*) Uniform Convergence

(*) Linear Predictors

(*) The Bias-Complexity Trade-off

(*) VC-classes and the VC dimension

(*) Model Selection and Validation

(*) Convex Learning Problems

(*) Regularization and Stability

(*) Stochastic Gradient Descent

(*) Support Vector Machines

(*) Kernels
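As a small illustration of one of the listed topics, the following is a minimal sketch of stochastic gradient descent applied to a least-squares problem. The toy data, step size and iteration count are illustrative choices, not part of the course material:

```python
import numpy as np

# Generate a small synthetic regression problem (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

# Stochastic gradient descent on the squared loss:
# at each step, use the gradient of the loss on ONE random sample.
w = np.zeros(d)
eta = 0.05  # constant step size
for t in range(2000):
    i = rng.integers(n)                  # sample an index uniformly
    grad = (X[i] @ w - y[i]) * X[i]      # gradient of 0.5 * (x_i . w - y_i)^2
    w -= eta * grad
```

After enough iterations, `w` is close to `w_true`; the course topics above (convex learning problems, stability, convergence of SGD) give the conditions under which such convergence can be proved.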
Literature: The course will be based on the book

"Understanding Machine Learning: From Theory to Algorithms"
by S. Shalev-Shwartz and S. Ben-David, which is available online through the ETH electronic library.

Other good sources include:

(*) the book "Neural Network Learning: Theoretical Foundations" by Martin Anthony and Peter L. Bartlett. This book can be borrowed from the ETH library.

(*) the lecture notes for the course "Mathematics of Machine Learning" taught by Philippe Rigollet, available through the OpenCourseWare website of MIT
Prerequisites / Notice: Being able to follow the lectures requires a solid background in Probability Theory and Mathematical Statistics. Notions of computation and convergence of algorithms can be helpful but are not required.
Number: 406-2604-AAL
Title: Probability and Statistics
Enrolment ONLY for MSc students with a decree declaring this course unit as an additional admission requirement.

Any other students (e.g. incoming exchange students, doctoral students) CANNOT enrol for this course unit.

ECTS: 7 credits
Hours: 15R
Lecturers: F. Balabdaoui
Abstract: Introduction to probability and statistics with many examples, based on chapters from the books "Probability and Random Processes" by G. Grimmett and D. Stirzaker and "Mathematical Statistics and Data Analysis" by J. Rice.
Objective: The goal of this course is to provide an introduction to the basic ideas and concepts from probability theory and mathematical statistics. In addition to a mathematically rigorous treatment, an intuitive understanding of and familiarity with the ideas behind the definitions are also emphasized. Measure theory is not used systematically, but it should become clear why and where measure theory is needed.
Content:

Probability:
Chapters 1-5 (Probabilities and events, Discrete and continuous random variables, Generating functions) and Sections 7.1-7.5 (Convergence of random variables) from the book "Probability and Random Processes". Most of this material is also covered in Chap. 1-5 of "Mathematical Statistics and Data Analysis", on a slightly easier level.

Statistics:
Sections 8.1 - 8.5 (Estimation of parameters), 9.1 - 9.4 (Testing Hypotheses), 11.1 - 11.3 (Comparing two samples) from "Mathematical Statistics and Data Analysis".
Literature: Geoffrey Grimmett and David Stirzaker, Probability and Random Processes. 3rd Edition. Oxford University Press, 2001.

John A. Rice, Mathematical Statistics and Data Analysis, 3rd edition.
Duxbury Press, 2006.