From Convolutional Sparse Coding to Deep Sparsity

Speaker:
Jeremias Sulam, Ph.D. Thesis Seminar
Date:
Wednesday, 8.11.2017, 10:30
Place:
Taub 201
Advisor:
Prof. M. Elad

Sparse approximation and dictionary learning have been applied with great success to several image processing tasks, often achieving state-of-the-art results. Yet these methods have traditionally been restricted to small signal dimensions due to the computational constraints that the underlying problems entail. This paradigm, however, leads to a series of inconsistencies with both practical and theoretical implications.

I will first review a series of algorithmic solutions to this local-global dichotomy and then focus on the Convolutional Sparse Coding (CSC) model, which imposes a global structure built from a shift-invariant local model. While several works have addressed the practical aspects of this model, a systematic theoretical understanding of CSC has largely been left aside. In this talk I will present a novel analysis of the CSC problem based on the observation that, while global, this model can be characterized and analyzed locally. By imposing only local sparsity conditions, we show that uniqueness of solutions, stability to noise contamination, and success of pursuit algorithms are globally guaranteed, resulting in much stronger and more informative bounds.

I will then present a recent extension of this model, the Multi-Layer CSC (ML-CSC), and show its close relation to Convolutional Neural Networks (CNNs). We will further develop a sound pursuit algorithm for signals in this model by adopting a projection approach, providing bounds on the stability of its solutions and analyzing different ways to implement it in practice. Last, but not least, we will derive a learning algorithm for the ML-CSC model and demonstrate its applicability to several tasks in an unsupervised setting.
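As a rough illustration of the model discussed in the abstract (a sketch of the general idea, not code from the thesis): CSC represents a global signal as a sum of convolutions of small local filters with sparse feature maps, and a single thresholding-pursuit step over this model already has the structure of a convolutional layer followed by a nonlinearity. A minimal NumPy sketch with made-up filters and dimensions:

```python
import numpy as np

def csc_synthesize(filters, codes):
    """CSC synthesis: x = sum_i (d_i convolved with gamma_i), i.e. a global
    signal built from a shift-invariant local dictionary."""
    x = np.zeros_like(codes[0])
    for d, g in zip(filters, codes):
        x += np.convolve(g, d, mode="same")
    return x

def soft_threshold(v, lam):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def thresholding_pursuit(x, filters, lam):
    """One soft-thresholding pursuit step: correlate the signal with each
    filter (the analysis direction, D^T x) and shrink. Structurally this is
    the same computation as a convolutional layer plus a nonlinearity,
    which is the kind of connection to CNNs the talk explores."""
    return [soft_threshold(np.correlate(x, d, mode="same"), lam)
            for d in filters]

# Toy example (hypothetical filters and values): two local filters and
# locally sparse code maps over a signal of length 64.
filters = [np.array([1.0, -1.0, 0.5]), np.array([0.2, 1.0, 0.2])]
codes = [np.zeros(64), np.zeros(64)]
codes[0][[5, 40]] = [1.5, -2.0]   # only a few nonzeros: locally sparse
codes[1][20] = 3.0

x = csc_synthesize(filters, codes)               # global signal
estimates = thresholding_pursuit(x, filters, 0.5)
```

Stacking such thresholding steps layer by layer gives the multi-layer (ML-CSC) picture, where each layer's sparse code serves as the signal for the next.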
