The Taub Faculty of Computer Science Events and Talks

Differentiable Neural Architecture Search with Arithmetic Complexity Constraint
Yochai Zur (M.Sc. Thesis Seminar)
Thursday, 14.03.2019, 11:30
Taub 601
Advisor: Prof. Alexander M. Bronstein
Neural Architecture Search (NAS) aims to facilitate the design of deep networks for a given task. This is part of a larger trend – automated machine learning (AutoML) – that promises to solve, or at least alleviate, the scarcity of ML experts needed to design custom architectures. Unlike conventional approaches that apply evolutionary or reinforcement learning methods, we are working on a differentiable search method. Recent works show that it is possible to reduce the arithmetic complexity of a convolutional neural network (CNN) by quantization or by reducing the number of filters in each layer, with negligible impact on network accuracy. These works use homogeneous architectures, i.e., all layers are quantized with the same bitwidth or have the same number of filters. Our goal is to search for superior heterogeneous architectures, i.e., layers can be quantized with different bitwidths or have different numbers of filters.
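To illustrate the general idea of a differentiable search over per-layer bitwidths (not the speaker's actual method), the sketch below mixes several fake-quantized versions of one convolution with softmax-weighted architecture parameters, DARTS-style, so the preferred bitwidth of each layer can be learned by gradient descent. All names (MixedQuantConv, fake_quantize, the candidate bitwidths) are illustrative assumptions.

```python
# Minimal sketch: differentiable per-layer bitwidth selection (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quantize(w, bits):
    """Uniform fake quantization with a straight-through estimator."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    w_q = torch.round(w / scale).clamp(-qmax - 1, qmax) * scale
    # Forward pass uses w_q; gradients flow through w unchanged.
    return w + (w_q - w).detach()

class MixedQuantConv(nn.Module):
    """Conv layer whose output is a softmax-weighted mixture over candidate
    bitwidths, so the mixing weights (alpha) are learnable by backprop."""
    def __init__(self, in_ch, out_ch, candidate_bits=(2, 4, 8)):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.candidate_bits = candidate_bits
        self.alpha = nn.Parameter(torch.zeros(len(candidate_bits)))

    def forward(self, x):
        probs = F.softmax(self.alpha, dim=0)
        out = 0
        for p, bits in zip(probs, self.candidate_bits):
            w_q = fake_quantize(self.conv.weight, bits)
            out = out + p * F.conv2d(x, w_q, self.conv.bias, padding=1)
        return out

# Usage: stacking such layers lets each one settle on its own bitwidth,
# yielding a heterogeneous architecture; an arithmetic-complexity penalty
# (e.g., the expected bitwidth sum(p * bits)) can be added to the loss.
layer = MixedQuantConv(3, 16)
y = layer(torch.randn(1, 3, 32, 32))
print(y.shape)  # torch.Size([1, 16, 32, 32])
```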