Events

Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science

Tao Hong (Ph.D. Thesis Seminar)
Wednesday, 16.06.2021, 11:00
Zoom Lecture: 96123577236
Advisors: Prof. Irad Yavneh and Dr. Michael Zibulevsky
Work 1: We introduce a way to adapt Nesterov's well-known scheme to accelerate stationary iterative solvers for linear systems. Compared with classical Krylov subspace acceleration methods, the proposed scheme requires more iterations, but it is trivial to implement and retains essentially the same computational cost as the unaccelerated method. An explicit formula for a fixed optimal parameter is derived for the case where the stationary iteration matrix has only real eigenvalues, based only on the smallest and largest eigenvalues. The fixed parameter, and the corresponding convergence factor, are shown to remain optimal when the iteration matrix also has complex eigenvalues that are contained within an explicitly defined disk in the complex plane. A comparison with Chebyshev acceleration based on the same information of the smallest and largest real eigenvalues (dubbed Restricted Information Chebyshev acceleration) demonstrates that Nesterov's scheme is more robust, in the sense that it remains optimal over a larger domain when the iteration matrix does have some complex eigenvalues.

Work 2: We introduce a general framework called weighted proximal methods (WPMs) for the regularization by denoising (RED) model, which uses abstract image denoising algorithms to build the prior. In this work, we first show that two recently introduced RED solvers (using the fixed-point and accelerated proximal gradient methods) are particular cases of WPMs. We then show by numerical experiments that slightly more sophisticated variants of WPM can lead to reduced run times for RED by requiring a significantly smaller number of calls to the denoiser.
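To make the scheme of Work 1 concrete, here is a minimal sketch (not the speaker's code): a stationary iteration, weighted Jacobi on a 1D Poisson model problem chosen so the iteration matrix has only real eigenvalues, wrapped with Nesterov-style momentum. The fixed parameter beta = 0.9 is hand-picked for illustration; the talk derives an explicit optimal value from the smallest and largest eigenvalues of the iteration matrix.

    import numpy as np

    def poisson_1d(n):
        # Standard tridiagonal [-1, 2, -1] Poisson matrix.
        return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

    def jacobi_step(A, x, b, omega=0.5):
        # One weighted-Jacobi sweep: x <- x + omega * D^{-1} (b - A x).
        # With omega = 0.5 the iteration-matrix eigenvalues lie in (0, 1) here,
        # matching the real-eigenvalue setting of the talk.
        return x + omega * (b - A @ x) / np.diag(A)

    def accelerated_solve(A, b, beta, num_iters=200):
        # Nesterov-type momentum wrapped around the stationary iteration:
        #   y_k     = x_k + beta * (x_k - x_{k-1})
        #   x_{k+1} = jacobi_step(y_k)
        # Same per-iteration cost as plain Jacobi up to one extra vector update.
        x_prev = x = np.zeros_like(b)
        for _ in range(num_iters):
            y = x + beta * (x - x_prev)
            x_prev, x = x, jacobi_step(A, y, b)
        return x

    A, b = poisson_1d(32), np.ones(32)
    for beta in (0.0, 0.9):   # beta = 0 recovers the unaccelerated method
        r = b - A @ accelerated_solve(A, b, beta)
        print(f"beta = {beta}: final residual norm = {np.linalg.norm(r):.3e}")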
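For Work 2, the following sketch illustrates the kind of equivalence claimed there: RED's fixed-point solver can be read as a weighted gradient step x <- x - W^{-1} grad E(x) with weighting W = H^T H + lam*I. The forward operator (a 0/1 sampling mask), the 3-tap moving-average "denoiser", and all parameter values are hypothetical stand-ins for illustration, not the paper's experimental setup.

    import numpy as np

    def denoiser(x):
        # Toy denoiser: symmetric 3-tap moving average. Because it is linear
        # and symmetric, the RED prior gradient below is exact for it.
        return np.convolve(x, np.ones(3) / 3.0, mode="same")

    def red_gradient(x, mask, y, lam):
        # Gradient of the RED objective
        #   E(x) = 0.5 ||H x - y||^2 + (lam/2) x^T (x - D(x)),
        # with H = diag(mask):  grad = H^T (H x - y) + lam * (x - D(x)).
        return mask * (mask * x - y) + lam * (x - denoiser(x))

    def wpm_step(x, mask, y, lam):
        # Weighted step x <- x - W^{-1} grad E(x) with W = H^T H + lam*I.
        # W is diagonal here, so "inverting" it is elementwise division.
        # This particular W reproduces RED's fixed-point iteration
        #   x <- (H^T H + lam*I)^{-1} (H^T y + lam * D(x)).
        W = mask * mask + lam
        return x - red_gradient(x, mask, y, lam) / W

    rng = np.random.default_rng(0)
    n = 200
    clean = np.sin(np.linspace(0, 4 * np.pi, n))
    mask = (rng.random(n) < 0.7).astype(float)   # keep ~70% of the samples
    y = mask * (clean + 0.05 * rng.standard_normal(n))
    x = y.copy()
    for _ in range(50):
        x = wpm_step(x, mask, y, lam=0.5)
    print("RMSE vs. clean signal:", np.sqrt(np.mean((x - clean) ** 2)))

The WPM variants discussed in the talk replace this particular W with more sophisticated weightings, which is what reduces the number of denoiser calls needed.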