Theory Seminar: On Identity Testing of Tensors, Low-rank Recovery and Compressed Sensing

Michael Forbes (MIT)

Wednesday, 06.06.2012, 12:30

Taub 201

Let M be a low-rank matrix. For vectors x, y, define the bilinear form f(x,y) = x^T M y. We study the question of reconstructing M from evaluations of f. Much previous work allowed randomized evaluations, a stronger query model, or both. We show how to reconstruct M efficiently from deterministically chosen queries to f, using an optimal 4nr measurements. This can be seen as a (noiseless) generalization of compressed sensing, and we make this connection formal by reducing (in the noiseless case) the task of recovering a low-rank matrix to the task of recovering a sparse vector.
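To make the query model concrete, here is a minimal sketch (not the talk's algorithm): the algorithm sees M only through a black box for f(x,y) = x^T M y. The trivial scheme below recovers M entrywise from all n^2 standard-basis queries, since f(e_i, e_j) = M[i,j]; the point of the result above is that a rank-r matrix can instead be recovered from only 4nr deterministically chosen queries. The function names here are illustrative, not from the paper.

```python
import numpy as np

def make_oracle(M):
    """Return a black-box oracle for the bilinear form f(x, y) = x^T M y."""
    return lambda x, y: x @ M @ y

def recover_by_basis_queries(f, n):
    """Trivial recovery: n^2 queries f(e_i, e_j) = M[i, j]."""
    I = np.eye(n)
    return np.array([[f(I[i], I[j]) for j in range(n)] for i in range(n)])

n, r = 5, 2
rng = np.random.default_rng(0)
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r matrix
f = make_oracle(M)                       # algorithm only gets this black box
M_hat = recover_by_basis_queries(f, n)   # n^2 queries, vs. 4nr in the talk
assert np.allclose(M, M_hat)
```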

We also generalize the above ideas to tensors, the higher-dimensional analogues of matrices. This generalization can be viewed as a question in black-box polynomial identity testing (PIT): the task of deterministically deciding whether a given algebraic circuit computes the zero polynomial. For a certain circuit model (depth-3 set-multilinear circuits), we give the first quasipolynomial-time algorithm for black-box PIT.
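For context, here is a hedged sketch of the standard randomized baseline that the deterministic result improves on: by the Schwartz-Zippel lemma, a nonzero polynomial of total degree d vanishes at a random point drawn from S^n with probability at most d/|S|, so evaluating at a few random points detects nonzeroness with high probability. A black-box PIT algorithm replaces these random points with a deterministically chosen set. All names below are illustrative.

```python
import random

def randomized_pit(poly, n_vars, trials=20, field_size=10**9):
    """Declare `poly` zero iff it vanishes at every random sample point.

    By Schwartz-Zippel, a nonzero polynomial of degree d survives all
    trials with probability at most (d / field_size) ** trials.
    """
    for _ in range(trials):
        point = [random.randrange(1, field_size) for _ in range(n_vars)]
        if poly(*point) != 0:
            return False  # witness found: the polynomial is nonzero
    return True  # zero, with high probability

# A small identity that really is the zero polynomial:
# (x1 + x2)(y1 + y2) - x1*y1 - x1*y2 - x2*y1 - x2*y2 == 0
zero_poly = lambda x1, x2, y1, y2: \
    (x1 + x2) * (y1 + y2) - x1*y1 - x1*y2 - x2*y1 - x2*y2
assert randomized_pit(zero_poly, 4)

# x1*y1 is nonzero, and any sample point with nonzero coordinates witnesses it.
nonzero_poly = lambda x1, x2, y1, y2: x1 * y1
assert not randomized_pit(nonzero_poly, 4)
```

Note that the sample points avoid zero coordinates, so the nonzero monomial above is detected on the first trial; the deterministic algorithms in the talk achieve the same guarantee with no randomness at all.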

Joint work with Amir Shpilka; appeared at STOC 2012.
