The Taub Faculty of Computer Science Events and Talks
Monday, 13.09.2021, 14:00
Mechanical image stabilization using actuated gimbals enables capturing long-exposure shots without suffering from blur due to camera motion. These devices can be externally attached to any camera with no need for specialized optics, making them the most common stabilization solution; however, they are often physically cumbersome and draw substantial power, limiting their widespread use. In particular, an alternative solution for light airborne imaging systems, which are inherently prone to motion blur due to their long focal lengths, is still lacking.
In this work, we propose to digitally emulate a mechanically stabilized system from the input of a fast unstabilized camera. To exploit the trade-off between motion blur at long exposures and low SNR at short exposures, we train a CNN that estimates a sharp high-SNR image by aggregating a burst of noisy short-exposure frames related by unknown motion. We further suggest learning the burst's optimal exposure times under specific illumination and motion ranges in an end-to-end manner, thus balancing the noise and blur across the frames. To achieve this, we train our model using a novel network layer that models raw frame acquisition as a differentiable function of the exposure time. We demonstrate this method's advantage over the traditional approach of deblurring a single long-exposure image or denoising a fixed short-exposure burst. We further show that although training is done on synthetically generated bursts, our network adapts well to real scenarios, obtaining state-of-the-art results in both indoor and outdoor scenes.
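The core idea of the differentiable acquisition layer can be illustrated with a minimal sketch: raw frame capture is modeled as a function of the exposure time, with signal-dependent shot noise expressed via the reparameterization trick so that gradients with respect to the exposure time can flow during training. All names, parameters, and the specific noise model below are illustrative assumptions, not the talk's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire_frame(radiance, t, sigma_read=0.01, full_well=1.0):
    """Simulate raw frame acquisition as a function of exposure time t.

    Shot noise is approximated as Gaussian with variance equal to the
    collected signal (a standard Poisson approximation); writing the noise
    as sqrt(signal) * eps with eps ~ N(0, 1) is the reparameterization
    trick that keeps the output differentiable with respect to t.
    """
    signal = radiance * t                           # photons collected over the exposure
    shot = np.sqrt(np.maximum(signal, 0.0)) * rng.standard_normal(radiance.shape)
    read = sigma_read * rng.standard_normal(radiance.shape)
    return np.clip(signal + shot + read, 0.0, full_well)  # sensor saturation

# Longer exposures trade motion blur for higher SNR; shorter ones do the
# opposite -- this is the trade-off the learned exposure times balance.
radiance = np.full((64, 64), 0.5)
for t in (0.01, 0.1, 0.5):
    frame = acquire_frame(radiance, t)
    snr = (radiance * t).mean() / frame.std()
    print(f"t={t}: SNR ~ {snr:.2f}")
```

In a real training pipeline this layer would sit between a scene/motion simulator and the aggregation CNN, with the exposure times exposed as trainable parameters; the sketch only shows why expressing the noise this way keeps the exposure time inside the gradient path.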