Shay Moran (Math, Technion)
Wednesday, 15.12.2021, 12:15
Recent years have witnessed tremendous progress in the field of Machine Learning (ML).
However, many of the recent breakthroughs demonstrate phenomena that lack explanations, and sometimes even contradict conventional wisdom.
One main reason is that classical ML theory adopts a worst-case perspective, which seems too pessimistic to explain practical ML: in reality, data is rarely worst-case, and experiments indicate that far less data is often needed than traditional theory predicts.
In this talk we will discuss two variations of classical PAC learning theory. These variants are based on a distribution- and data-dependent perspective that complements the distribution-free, worst-case perspective of classical theory and is suitable for exploiting specific properties of a given learning task.
Based on two joint works with Noga Alon, Olivier Bousquet, Steve Hanneke, Ron Holzman, Ramon van Handel, and Amir Yehudayoff.