Events

Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science

Yochai Zur (M.Sc. seminar lecture)
Thursday, 14.03.2019, 11:30
Taub 601
Advisor: Prof. Alexander M. Bronstein
Neural Architecture Search (NAS) aims to facilitate the design of deep networks for a given task. This is part of a larger trend, automated machine learning (AutoML), which promises to solve, or at least alleviate, the scarcity of ML experts needed to design custom architectures. Unlike conventional approaches that apply evolution or reinforcement learning, we are working on a differentiable search method. Recent works show that it is possible to reduce the arithmetic complexity of a convolutional neural network (CNN) by quantization or by reducing the number of filters in each layer, with negligible impact on network accuracy. These works use homogeneous architectures, i.e., all layers are quantized with the same bitwidth or use the same number of filters. Our goal is to search for superior heterogeneous architectures, i.e., architectures whose layers can be quantized with different bitwidths or have different numbers of filters.
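To illustrate what a differentiable search over per-layer bitwidths can look like, here is a minimal sketch in the spirit of DARTS-style relaxation: each layer mixes several quantized copies of itself, weighted by a softmax over learnable architecture parameters. This is not the speaker's implementation; the class and function names (MixedBitwidthConv, fake_quantize), the candidate bitwidths, and the quantizer are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quantize(x, bits):
    # Uniform quantization to [-1, 1] with a straight-through estimator (assumed quantizer).
    levels = 2 ** bits - 1
    x_clamped = torch.clamp(x, -1.0, 1.0)
    x_q = torch.round((x_clamped + 1.0) / 2.0 * levels) / levels * 2.0 - 1.0
    # Forward uses the quantized value; gradients flow through the clamped input.
    return x_clamped + (x_q - x_clamped).detach()

class MixedBitwidthConv(nn.Module):
    """One conv layer whose effective weight bitwidth is searched rather than fixed."""
    def __init__(self, in_ch, out_ch, candidate_bits=(2, 4, 8)):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.candidate_bits = candidate_bits
        # One architecture parameter per candidate bitwidth.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_bits)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        out = 0.0
        for w, bits in zip(weights, self.candidate_bits):
            q_weight = fake_quantize(self.conv.weight, bits)
            out = out + w * F.conv2d(x, q_weight, self.conv.bias, padding=1)
        return out

# After the search (alternating updates of model weights and alphas on separate data splits),
# each layer keeps only its argmax bitwidth, yielding a heterogeneous quantized network.
layer = MixedBitwidthConv(3, 16)
y = layer(torch.randn(1, 3, 32, 32))
print(y.shape, F.softmax(layer.alpha, dim=0))

Because the softmax mixture is differentiable, the per-layer bitwidth choice can be trained with gradient descent, avoiding the evolutionary or reinforcement-learning search loops mentioned above; the same relaxation idea extends to searching over the number of filters per layer.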