Adi Shraibman (Math & CS, Weizmann Inst.)
We will describe the results from a series of papers, starting with papers of Forster, Ben-David and others, through more recent works of Sherstov and Chattopadhyay. These papers deal with learning, communication, and computational complexity. Some examples of their contributions are
1. The first lower bound on unbounded-error probabilistic communication complexity.
2. The definition and study of learning-theoretic quantities such as margin complexity, which eventually led to the solution of several open questions.
3. New lower bounds on multiparty communication complexity.
Interestingly, there is an underlying theory connecting all of these results, one that essentially reduces to questions in matrix analysis. We will try to describe this line of research, its relation to computational complexity, the progress made in recent years, and perhaps some nice open questions.
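As a small illustration (not part of the talk itself) of how such questions reduce to matrix analysis, the sketch below evaluates Forster's well-known spectral-norm lower bound on sign rank, which underlies the first unbounded-error communication lower bounds: for an n x n sign matrix A, the sign rank is at least n divided by the spectral norm of A. For a Hadamard matrix this gives a bound of sqrt(n). The function name below is our own choice, not from the papers.

```python
import numpy as np

def sylvester_hadamard(k):
    """Sylvester construction of a 2^k x 2^k Hadamard (+-1) matrix."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

n = 2 ** 5
H = sylvester_hadamard(5)

# Spectral norm (largest singular value); for a Hadamard matrix this is sqrt(n),
# since H H^T = n * I.
spectral_norm = np.linalg.norm(H, 2)

# Forster's bound: sign-rank(H) >= n / ||H|| = sqrt(n).
# Since unbounded-error communication complexity is (up to a constant)
# the log of the sign rank, this yields an Omega(log n) lower bound.
forster_bound = n / spectral_norm
print(spectral_norm, forster_bound)
```

The point of the example is only that a purely spectral quantity of the matrix controls a communication-complexity measure, which is the flavor of reduction the abstract refers to.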