Boosting Selective Regression with Ensembles

Speaker:
Amit Gross, M.Sc. Thesis Seminar
Date:
Sunday, 25.2.2018, 08:30
Place:
Taub 601
Advisor:
Prof. R. El-Yaniv

Using selective regression, it is possible to increase prediction accuracy by abstaining from answering when there is insufficient knowledge. This work is about increasing the accuracy of selective regression even further by combining simple selective models into a more complex one: an ensemble of selective regressors. We demonstrate how to achieve improved accuracy using two methods of building the ensemble. In the first approach, we split the samples in the input dataset into several clusters and use each cluster to train a regressor; when given a new instance, we choose the result of a regressor that did not reject it. In the second approach, we train several regressors, where each regressor uses only a subset of the data's original features. This allows us to create several lower-dimensionality regressors that are less prone to overfitting, especially when the training set is fairly small. We then choose which regressor to use by discarding those that reject the example given for labeling. We empirically tested the two approaches on various datasets and saw that, depending on the distribution of the actual data, they can indeed boost accuracy compared to a single regressor or to non-selective ensembles. Finally, we present conclusions drawn from our findings and raise some follow-up research questions that arise from this work.
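To make the second (feature-subset) approach concrete, here is a minimal illustrative sketch, not the thesis implementation: each ensemble member is a linear regressor trained on a random feature subset, and a member "rejects" an instance whose sub-features fall far outside its training distribution (a simple z-score rule, chosen here only for illustration). The ensemble averages the accepting members and abstains when all members reject. The class and parameter names are hypothetical.

```python
import numpy as np

class SelectiveSubspaceEnsemble:
    """Illustrative sketch: an ensemble of selective linear regressors,
    each trained on a random feature subset. A member rejects instances
    whose sub-features are z-score outliers w.r.t. its training data."""

    def __init__(self, n_members=5, subset_size=2, z_threshold=2.5, seed=0):
        self.n_members = n_members        # number of ensemble members
        self.subset_size = subset_size    # features per member
        self.z_threshold = z_threshold    # rejection threshold (assumed rule)
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.members = []
        for _ in range(self.n_members):
            # each member sees only a random subset of the features
            idx = self.rng.choice(n_features, self.subset_size, replace=False)
            Xs = X[:, idx]
            # least-squares fit with a bias column
            A = np.hstack([Xs, np.ones((Xs.shape[0], 1))])
            w, *_ = np.linalg.lstsq(A, y, rcond=None)
            # store feature statistics for the rejection rule
            self.members.append((idx, w, Xs.mean(axis=0), Xs.std(axis=0) + 1e-12))
        return self

    def predict(self, x):
        """Return (prediction, n_accepting); prediction is None if all abstain."""
        preds = []
        for idx, w, mu, sigma in self.members:
            xs = x[idx]
            # reject if any sub-feature is far from the training distribution
            if np.any(np.abs((xs - mu) / sigma) > self.z_threshold):
                continue
            preds.append(np.append(xs, 1.0) @ w)
        if not preds:
            return None, 0
        return float(np.mean(preds)), len(preds)

# Usage: synthetic linear data; near the training distribution the ensemble
# answers, while a far-away instance is rejected by every member.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, 2.0, 0.0, -1.0]) + 0.5
model = SelectiveSubspaceEnsemble(n_members=6, subset_size=3, seed=2).fit(X, y)
pred_in, n_in = model.predict(np.zeros(4))       # typical instance: accepted
pred_out, n_out = model.predict(np.full(4, 10.0))  # outlier: all members abstain
```

The rejection rule here is deliberately crude; any confidence measure a selective regressor exposes (e.g., prediction variance or distance to the training set) could play the same role of discarding members that should abstain.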