The Taub Faculty of Computer Science Events and Talks
Mohammed Dabbah (M.Sc. Thesis Seminar)
Monday, 15.11.2021, 12:00
Advisor: Prof. Ran El-Yaniv
Humans do not learn all the classes they encounter at once; rather, they learn them gradually. Moreover, they can learn new classes from few or no examples. This ability has inspired new branches of machine learning research, such as zero-shot learning and lifelong learning, which aim to replicate it in machines.
In zero-shot learning, the model is required to recognize classes it has never seen in the training data.
This is usually achieved by defining each class with a set of attributes and then learning a general discriminative mapping from images to attributes. In testing, we ask the model which combination of attributes best describes the presented image, regardless of whether the combination has appeared in the training data (seen classes) or not (unseen classes).
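To make the attribute-based recipe above concrete, here is a minimal sketch (not the thesis's exact model) in which a small network maps image features to attribute scores and a test image is assigned the class whose attribute signature best matches the prediction. All dimensions, layer sizes, and variable names are illustrative assumptions.

```python
# Sketch of attribute-based zero-shot classification: predict attributes,
# then match against per-class attribute signatures (seen and unseen).
import torch
import torch.nn as nn

NUM_ATTRIBUTES = 85          # e.g., AWA2 describes classes with 85 attributes
IMAGE_FEATURE_DIM = 2048     # e.g., features from a pretrained CNN backbone
NUM_CLASSES = 50             # seen + unseen classes

# Discriminative mapping from image features to attribute scores.
attribute_predictor = nn.Sequential(
    nn.Linear(IMAGE_FEATURE_DIM, 512),
    nn.ReLU(),
    nn.Linear(512, NUM_ATTRIBUTES),
)

# Per-class attribute signatures; rows for unseen classes are available
# even though no images of those classes appear in training.
class_attributes = torch.rand(NUM_CLASSES, NUM_ATTRIBUTES)

def classify(image_features: torch.Tensor) -> torch.Tensor:
    """Return, for each image, the class whose attributes best match."""
    predicted = attribute_predictor(image_features)   # (batch, attributes)
    scores = predicted @ class_attributes.t()         # (batch, classes)
    return scores.argmax(dim=1)

# Usage with dummy features standing in for a real image encoder.
dummy_features = torch.randn(4, IMAGE_FEATURE_DIM)
print(classify(dummy_features))
```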
Focusing on discriminative zero-shot learning, in this work we introduce a novel mechanism that dynamically augments the set of seen classes during training to produce additional fictitious classes. These fictitious classes diminish the model's tendency to fixate on attribute correlations that appear in the training set but will not hold for newly exposed classes.
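The abstract does not specify how the fictitious classes are constructed, so the following is a purely hypothetical sketch, not the author's mechanism: it illustrates the general idea by recombining attribute values from pairs of seen classes, which breaks attribute co-occurrences that are specific to the seen set.

```python
# Illustrative (hypothetical) construction of fictitious classes by
# recombining attributes of seen classes; the thesis's actual mechanism
# may differ from this sketch.
import torch

def make_fictitious_classes(seen_attributes: torch.Tensor,
                            num_fictitious: int) -> torch.Tensor:
    """seen_attributes: (classes, attributes) matrix of seen-class signatures."""
    num_seen, num_attrs = seen_attributes.shape
    idx_a = torch.randint(num_seen, (num_fictitious,))
    idx_b = torch.randint(num_seen, (num_fictitious,))
    # For each attribute, take its value from one of the two parent classes,
    # producing attribute combinations that never occur among seen classes.
    mask = torch.rand(num_fictitious, num_attrs) < 0.5
    fictitious = torch.where(mask,
                             seen_attributes[idx_a],
                             seen_attributes[idx_b])
    return torch.cat([seen_attributes, fictitious], dim=0)

# Usage: extend 40 seen-class signatures with 20 fictitious ones.
seen = torch.rand(40, 85)
augmented = make_fictitious_classes(seen, 20)
print(augmented.shape)  # torch.Size([60, 85])
```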
The proposed model is evaluated under the two formulations of the zero-shot learning framework, namely generalized zero-shot learning (GZSL) and classical zero-shot learning (CZSL). Our model improves on the state-of-the-art performance on the CUB dataset and reaches comparable results on the other common datasets, AWA2 and SUN. We investigate the strengths and weaknesses of our method, including the effects of catastrophic forgetting when training an end-to-end zero-shot model.