Events

Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science

Speaker: Boaz Nadler (Weizmann Institute of Science)
Date: Tuesday, 22.06.2021, 11:30
Location: Zoom Lecture: 95693258864
Consider the sparse approximation, or best subset selection, problem: given a vector y and a matrix A, find a k-sparse vector x that minimizes the residual ||Ax - y||. This sparse linear regression problem, and its related variants, play a key role in high-dimensional statistics, compressed sensing, machine learning, and more. In this talk we focus on the trimmed lasso penalty, defined as the L_1 norm of x minus the L_1 norm of its top k entries in absolute value. We advocate using this penalty by deriving sparse recovery guarantees for it and by presenting a practical approach to optimizing it. Our computational approach is based on the generalized soft-min penalty, a smooth surrogate that takes into account all possible k-sparse patterns. We derive a polynomial-time algorithm to compute it, which in turn yields a novel method for the best subset selection problem. Numerical simulations illustrate its competitive performance compared to the current state of the art.

Joint work with Tal Amir and Ronen Basri.

Short Bio: Boaz Nadler is a professor in the Department of Computer Science at the Weizmann Institute of Science. His research interests include mathematical statistics, machine learning, and signal and image processing.
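
To make the penalty definition concrete, here is a minimal Python/NumPy sketch (not taken from the speakers' code; the function name trimmed_lasso and the toy vector are our own illustration). It computes the trimmed lasso of a vector as the sum of all but the k largest absolute entries, which vanishes exactly when x is k-sparse.

import numpy as np

def trimmed_lasso(x, k):
    # Trimmed lasso penalty: the L1 norm of x minus the L1 norm of its
    # k largest-magnitude entries, i.e. the sum of the n - k smallest |x_i|.
    # The penalty is zero if and only if x has at most k nonzero entries.
    abs_x = np.abs(np.asarray(x, dtype=float))
    if k >= abs_x.size:
        return 0.0
    return float(np.sort(abs_x)[:abs_x.size - k].sum())

# Toy example: x has exactly two nonzero entries, so the penalty is 0 for k = 2.
x = np.array([3.0, 0.0, -1.5, 0.0, 0.0])
print(trimmed_lasso(x, k=2))  # 0.0
print(trimmed_lasso(x, k=1))  # 1.5 (the smaller nonzero entry remains)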