On the Generalization of Gaussian Dropout Using PAC-Bayesian Bounds and Log-Sobolev Inequalities

Speaker:
Yaniv Nemcovsky, M.Sc. Thesis Seminar
Date:
Monday, 29.7.2019, 10:00
Place:
Room 601 Taub Bld.
Advisor:
Dr. Tamir Hazan

The omnipresence of increasingly large deep networks derives primarily from their empirical success. Indeed, practice places a strong emphasis on operating at scale, with more parameters and more layers, since this helps both optimization and generalization. This counter-intuitive trend has so far eluded theoretical analysis. In this work, we present a PAC-Bayesian generalization bound for Gaussian dropout that requires only an on-average bound on the loss function and on-average bounds on its gradient norms, relying on log-Sobolev inequalities for Gaussian measures. Our preliminary experimental evaluation shows that our bounds \emph{decrease} when more layers or more parameters are added to the network.
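For background, the two standard tools named in the title can be stated as follows (in their classical forms, not necessarily the exact statements used in the thesis; the notation $L$, $\hat{L}$, $m$, $P$, $Q$, $\delta$, $\gamma$, $f$ is the usual one and is introduced here only for illustration). Writing $L(h)$ for the true risk of a hypothesis $h$, $\hat{L}(h)$ for its empirical risk over $m$ i.i.d. samples, $P$ for a prior fixed before seeing the data, and $Q$ for any posterior, McAllester's PAC-Bayesian bound (in Maurer's form) states that with probability at least $1-\delta$, simultaneously for all $Q$,
\[
\mathbb{E}_{h\sim Q}\big[L(h)\big] \;\le\; \mathbb{E}_{h\sim Q}\big[\hat{L}(h)\big] + \sqrt{\frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}},
\]
and the log-Sobolev inequality of Gross for the standard Gaussian measure $\gamma$ on $\mathbb{R}^n$ bounds the entropy of $f^2$ by an on-average gradient norm: for any smooth $f$,
\[
\mathbb{E}_{\gamma}\big[f^2 \ln f^2\big] - \mathbb{E}_{\gamma}\big[f^2\big]\ln \mathbb{E}_{\gamma}\big[f^2\big] \;\le\; 2\,\mathbb{E}_{\gamma}\big[\|\nabla f\|^2\big].
\]
The second inequality is what lets gradient-norm bounds control the Gaussian (dropout) posterior terms entering the first.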
