
The Taub Faculty of Computer Science Events and Talks

On the Generalization of Gaussian Dropout using PAC-Bayesian bounds and Log-Sobolev Inequalities
Yaniv Nemcovsky (M.Sc. Thesis Seminar)
Monday, 29.07.2019, 10:00
Room 601 Taub Bld.
Advisor: Dr. Tamir Hazan
The omnipresence of increasingly large deep networks derives primarily from their empirical success. Indeed, empirical evidence places a strong emphasis on operating at scale, with more parameters and layers, to help both optimization and generalization. This counter-intuitive trend has eluded theoretical analysis. In this work, we present a PAC-Bayesian generalization bound for Gaussian dropout that requires only an on-average bound on the loss function and on-average bounds on the gradient norms, obtained by relying on log-Sobolev inequalities for Gaussian measures. Our preliminary experimental evaluation shows that our bounds *decrease* when more layers or more parameters are added to the network.
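For readers unfamiliar with the mechanism the bound is stated for: Gaussian dropout replaces the Bernoulli mask of standard dropout with multiplicative Gaussian noise of mean 1 and variance p/(1-p), matching the first two moments of rescaled Bernoulli dropout. The sketch below is a minimal NumPy illustration of that noise model; it is not the authors' implementation, and the function name and parameters are chosen here for illustration only.

```python
import numpy as np

def gaussian_dropout(x, p=0.5, rng=None, training=True):
    """Apply Gaussian dropout to activations x.

    Illustrative sketch: each activation is multiplied by noise drawn
    from N(1, p/(1-p)), so the layer output is unbiased in expectation.
    At evaluation time (training=False) the input passes through unchanged.
    """
    if not training:
        return x
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.sqrt(p / (1.0 - p))  # std. dev. matching Bernoulli dropout's variance
    noise = rng.normal(loc=1.0, scale=sigma, size=x.shape)
    return x * noise
```

Because the noise is Gaussian, the induced distribution over network weights (or activations) is a Gaussian measure, which is what makes log-Sobolev inequalities for Gaussian measures applicable in the analysis described above.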