Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science
Yair Wiener (Ph.D. Thesis Seminar)
Wednesday, 12.06.2013, 13:00
In selective prediction, a predictor (e.g., a classifier or regressor) is allowed to abstain from predicting on part of the domain. The objective is to improve prediction accuracy by compromising coverage. This talk will focus on the theoretical foundations of selective prediction and its applications to selective classification, selective regression, and active learning.
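For concreteness, the standard formalization from the selective prediction literature (recalled here; the abstract itself does not spell it out) pairs a predictor h with a selection function g : X -> {0,1} that decides where to predict. Coverage and selective risk are then

    \Phi(h,g) = \mathbb{E}_P[g(x)],
    \qquad
    R(h,g) = \frac{\mathbb{E}_P\bigl[\ell(h(x),y)\,g(x)\bigr]}{\Phi(h,g)},

so compromising coverage means accepting a smaller \Phi(h,g) in exchange for a smaller selective risk R(h,g); the risk-coverage curve quantifies this trade-off.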
A classical result in statistical learning theory is that the excess risk of a classifier can be written as a sum of estimation and approximation errors. In this talk we will show how the estimation error can be reduced to any desired level by sufficiently compromising coverage. We will present a new family of selective prediction strategies and analyze their coverage rates. These strategies can ensure zero estimation error, thus achieving the same performance as the best predictor in hindsight over the accepted domain.
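One way such a strategy can achieve zero estimation error in the realizable case is to predict only where every hypothesis consistent with the training sample agrees, and to abstain on the disagreement region. The sketch below illustrates this idea over a small finite hypothesis class; the names and the brute-force enumeration are illustrative assumptions, not the talk's actual construction.

    # Sketch: agreement-based selective classification over a finite
    # hypothesis class, assuming a realizable (noise-free) setting.
    # A practical implementation would not enumerate hypotheses explicitly.
    from typing import Callable, List, Optional, Tuple

    Hypothesis = Callable[[float], int]  # maps an instance to a label

    def version_space(hypotheses: List[Hypothesis],
                      sample: List[Tuple[float, int]]) -> List[Hypothesis]:
        """Keep only the hypotheses that fit every training example."""
        return [h for h in hypotheses
                if all(h(x) == y for x, y in sample)]

    def selective_predict(vs: List[Hypothesis], x: float) -> Optional[int]:
        """Predict only where the whole version space agrees; else abstain."""
        labels = {h(x) for h in vs}
        return labels.pop() if len(labels) == 1 else None  # None = abstain

    # Example: threshold classifiers h_t(x) = 1 iff x >= t on the real line.
    hypotheses = [lambda x, t=t: int(x >= t) for t in [0.0, 0.5, 1.0, 1.5]]
    vs = version_space(hypotheses, [(0.2, 0), (1.2, 1)])
    print(selective_predict(vs, 1.8))  # 1: all consistent thresholds agree
    print(selective_predict(vs, 0.7))  # None: inside the disagreement region

On the accepted region the target hypothesis is itself among the agreeing hypotheses, so every prediction made coincides with the best hypothesis in hindsight; that is the sense in which estimation error vanishes at the price of coverage.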
These developments rely on a new complexity measure termed the characterizing set complexity. This measure characterizes coverage rates in realizable and agnostic selective classification, as well as label complexity speedups in active learning. Using classical results from probabilistic geometry, we will bound this complexity measure for a number of interesting settings. The resulting technique can be used to upper bound Hanneke's disagreement coefficient, and it yields an exponential label complexity speedup for actively learning general (non-homogeneous) linear classifiers when the data distribution is an arbitrary high-dimensional mixture of Gaussians.
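For reference, Hanneke's disagreement coefficient (the standard definition, recalled here rather than quoted from the talk) measures how quickly the probability mass of the disagreement region of a ball of hypotheses around the target h^* shrinks with the ball's radius:

    \theta(h^*) = \sup_{r > 0} \frac{P\bigl(\mathrm{DIS}(B(h^*, r))\bigr)}{r},
    \qquad
    B(h^*, r) = \{h \in \mathcal{H} : P(h(x) \neq h^*(x)) \le r\},
    \qquad
    \mathrm{DIS}(V) = \{x : \exists\, h_1, h_2 \in V,\ h_1(x) \neq h_2(x)\}.

A bounded coefficient is exactly what disagreement-based active learners need for label complexity that scales polylogarithmically in 1/\epsilon, which is the sense in which an upper bound on \theta translates into an exponential speedup over passive learning.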