Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science
Shay Moran (Ph.D. Thesis Seminar)
Thursday, 19.05.2016, 12:00
Advisors: Prof. A. Shpilka and Prof. A. Yehudayoff
[Note the change of time and date]
Generalization and simplification are deeply related to each other:
simpler explanations often reveal principles that apply more generally,
and predictive scientific theories often boil down
to a few simple (and deep) laws.
This talk is about a manifestation of the
"generalization--simplification" link
within computational learning theory.
We will consider the following formal
frameworks for `generalization' and `simplification':
(i) Probably Approximately Correct (PAC) learning --
a statistical framework that formalizes generalization.
(ii) Sample compression schemes --
a combinatorial framework that formalizes simplification.
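For concreteness, here is one standard way these notions are
formalized (a sketch; the talk's exact definitions may differ
in details). A class $H$ of binary classifiers is PAC learnable
if there is a learner $A$ and a sample size $m(\epsilon,\delta)$
such that for every distribution realizable by $H$, with
probability at least $1-\delta$ over an i.i.d. sample $S$ of
size $m$, the output $A(S)$ has error at most $\epsilon$.
A sample compression scheme of size $k$ is a pair $(\kappa,\rho)$:
the compressor $\kappa$ maps every realizable sample $S$ to a
subsample $\kappa(S) \subseteq S$ of at most $k$ examples, and
the reconstructor $\rho$ recovers from $\kappa(S)$ a hypothesis
that correctly labels all of $S$.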
Littlestone and Warmuth showed that,
for binary classification problems,
the ability to simplify implies the ability to generalize,
and asked whether the other direction holds.
In the first part of this talk we will give
an affirmative answer to this question.
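Quantitatively, the Littlestone--Warmuth direction reads as
follows (a standard form of the bound, up to constants): if a
class admits a sample compression scheme of size $k$, then with
probability at least $1-\delta$ over a realizable i.i.d. sample
$S$ of size $m$, the reconstructed hypothesis $h = \rho(\kappa(S))$
satisfies
$$\mathrm{err}(h) \le \frac{k \log m + \log(1/\delta)}{m-k},$$
so any fixed compression size forces the error to vanish as $m$
grows. The converse question asks whether every PAC learnable
class admits a compression scheme whose size is bounded
independently of $m$.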
Apart from being a formal manifestation of the
"generalization--simplification" link,
the connection between compression and generalization
has found applications in binary-classification theory.
A notable example is Freund and Schapire's boosting theory,
which is based on a weak type of compression.
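Roughly speaking, the compression hidden in boosting is this:
after $T$ rounds, the boosted classifier is a weighted majority
vote of $T$ weak hypotheses, each of which can be reconstructed
from the small subsample it was trained on; since, for a weak
learner with a fixed edge, $T = O(\log m)$ rounds suffice to fit
a sample of size $m$, the final predictor is encoded by roughly
$O(\log m)$ examples. The compression size thus grows (slowly)
with $m$ rather than being constant, which is the sense in which
it is `weak'.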
It is therefore interesting to extend the notion of compression
beyond binary classification problems.
In the second part of the talk we will discuss such extensions,
and establish their equivalence with learnability.
No knowledge of machine learning will be assumed in the talk.