Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science
Margarita Osadchy (University of Haifa)
Tuesday, 18.12.2012, 11:30
Room 1061, Meyer Building, Faculty of Electrical Engineering
The majority of current methods in object classification use the
one-against-rest training scheme. We argue that when applied to a
large number of classes, this strategy is problematic: as the
number of classes increases, the negative class becomes a very
large and complicated collection of images. The resulting
classification problem then becomes extremely unbalanced, and
kernel SVM classifiers trained on such sets require long training
times and are slow at prediction. To address these problems, we
propose to consider the negative class as a background and
characterize it by a prior distribution. Further, we propose
to construct "hybrid" classifiers, which are trained to separate
this distribution from the samples of the positive class. A
typical classifier first projects (by a function which may be
non-linear) the inputs to a one-dimensional space, and then
thresholds this projection. Theoretical results and empirical
evaluation suggest that, after projection, the background has a
relatively simple distribution, which is much easier to
parameterize and work with. Our results show that hybrid
classifiers offer an advantage over SVM classifiers, both in
performance and complexity, especially when the negative
(background) class is large.
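The idea above can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal toy example, assuming a linear projection to 1-D and a single-Gaussian model for the projected background (the abstract only claims the projected background has a "relatively simple distribution"). The data, projection direction, and 3-sigma threshold are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: a small positive class and a large,
# heterogeneous "background" negative class.
pos = rng.normal(loc=3.0, scale=1.0, size=(50, 20))
neg = rng.normal(loc=0.0, scale=2.0, size=(5000, 20))

# Project inputs to one dimension (the abstract allows non-linear
# projections; here we use the difference of class means).
w = pos.mean(axis=0) - neg.mean(axis=0)
w /= np.linalg.norm(w)

proj_neg = neg @ w

# Characterize the projected background by a parametric prior:
# two numbers (mean, std) instead of thousands of negative samples.
mu, sigma = proj_neg.mean(), proj_neg.std()

def hybrid_score(x):
    """How far (in background standard deviations) a sample's
    1-D projection lies from the background mean."""
    return (x @ w - mu) / sigma

threshold = 3.0  # flag inputs more than 3 sigma from the background
pred_pos = hybrid_score(pos) > threshold
pred_neg = hybrid_score(neg) > threshold
print(pred_pos.mean())  # fraction of positives detected
print(pred_neg.mean())  # false-positive rate on the background
```

Note the prediction cost: scoring a sample is a single dot product plus a threshold, independent of the number of negative training samples, whereas a kernel SVM's cost grows with its support-vector count.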