Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science
Omer Leibovitch (M.Sc. Thesis Seminar)
Monday, 12.04.2021, 12:30
Zoom Lecture: 98712430421
For the lecture password, please contact: mayasidis@campus.technion.ac.il
A butterfly network consists of logarithmically many layers, each with a linear number of pre-specified nonzero weights. We propose to replace a dense linear layer in any neural network with an architecture based on the butterfly network. The proposed architecture reduces the quadratic number of weights required by a standard dense layer to nearly linear, with little compromise in the expressibility of the resulting operator. In a wide variety of experiments, including supervised prediction on both NLP and vision data, we show that this not only matches and often outperforms existing well-known architectures, but also offers faster training and prediction in deployment.
Theoretical results presented in the paper explain why training speed and outcome are not compromised by our proposed approach.
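To illustrate the parameter-count argument in the abstract, here is a minimal PyTorch sketch of a butterfly-structured layer. It is not the speaker's implementation: the class name ButterflyLayer, the restriction to a square power-of-two dimension, the 2x2-block parameterization per stage, and the initialization are all illustrative assumptions.

```python
import math

import torch
import torch.nn as nn


class ButterflyLayer(nn.Module):
    """Toy butterfly-structured replacement for a square dense linear layer.

    For input dimension n = 2**k, the layer is a product of log2(n) sparse
    factors; factor j mixes pairs of coordinates whose indices differ in
    bit j via a learned 2x2 block per pair, so the parameter count is
    2 * n * log2(n) instead of the n**2 of a dense layer.
    """

    def __init__(self, n: int):
        super().__init__()
        assert n > 0 and (n & (n - 1)) == 0, "n must be a power of two"
        self.n = n
        self.num_stages = int(math.log2(n))
        # One set of n/2 learned 2x2 blocks per stage (illustrative init).
        self.blocks = nn.Parameter(
            torch.randn(self.num_stages, n // 2, 2, 2) / math.sqrt(2.0)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n)
        batch = x.shape[0]
        for j in range(self.num_stages):
            stride = 1 << j
            # Group coordinates so the two indices paired at this stage
            # (differing only in bit j) share an axis of size 2.
            y = x.reshape(batch, self.n // (2 * stride), 2, stride)
            w = self.blocks[j].reshape(self.n // (2 * stride), stride, 2, 2)
            # out[b, g, q, r] = sum_p w[g, r, q, p] * y[b, g, p, r]
            x = torch.einsum("bgpr,grqp->bgqr", y, w).reshape(batch, self.n)
        return x


# Usage: a butterfly "dense" layer on 8-dimensional inputs.
layer = ButterflyLayer(8)
out = layer(torch.randn(4, 8))                      # shape (4, 8)
print(sum(p.numel() for p in layer.parameters()))   # 48 = 2 * 8 * log2(8)
```

In this sketch an 8-dimensional layer uses 48 weights rather than the 64 of a dense layer; the gap grows with dimension, since 2n log2(n) is nearly linear while a dense layer needs n^2 weights.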