Low-rank matrices play a vital role in many important problems in statistics, machine learning, and applied mathematics. For many such problems, the desired low-rank matrix cannot be directly measured or fully sampled. “Matrix recovery” tackles the problem of recovering this matrix from indirect, linear measurements, whereas “matrix completion” studies the subproblem of completing it from a partial observation of its entries. In this talk, I will present a new information-theoretic method, called MaxEnt, for designing the data-collection procedure in matrix recovery and completion. The idea is to choose measurements or samples which maximize information on the unknown matrix, thereby allowing for improved recovery with limited data. MaxEnt makes use of two key ingredients: a low-rank stochastic model for the matrix, and the maximum entropy sampling principle. With this, we develop an effective initial and active sampling scheme for both matrix recovery and completion, and reveal novel connections between (a) information-theoretic sampling and (b) related work in compressive sensing, coding theory, and uncertainty quantification. We then illustrate the usefulness of MaxEnt in applications to image recovery, database compression, and collaborative filtering.
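The active-sampling idea described above — repeatedly observe the entry about which the current low-rank fit is most uncertain — can be illustrated with a toy sketch. This is not the MaxEnt method from the talk: the rank, noise level, and the bootstrap-variance uncertainty proxy below are all illustrative assumptions standing in for the actual entropy criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a small rank-2 ground-truth matrix (not from the talk).
n, m, r = 8, 8, 2
X = rng.normal(size=(n, r)) @ rng.normal(size=(r, m))

# Initial design: a few entries observed uniformly at random.
observed = np.zeros((n, m), dtype=bool)
idx = rng.choice(n * m, size=16, replace=False)
observed.flat[idx] = True

def uncertainty(observed, n_boot=30):
    """Crude uncertainty proxy: entrywise variance across rank-r SVD
    completions fit to noise-perturbed copies of the observed entries.
    An illustrative stand-in for an entropy/information criterion."""
    fits = []
    for _ in range(n_boot):
        Y = np.where(observed, X + 0.1 * rng.normal(size=X.shape), 0.0)
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        fits.append(U[:, :r] * s[:r] @ Vt[:r])  # truncated rank-r fit
    return np.var(np.stack(fits), axis=0)

# Active sampling loop: observe the most uncertain unobserved entry.
for _ in range(10):
    var = uncertainty(observed)
    var[observed] = -np.inf  # restrict the search to unobserved entries
    i, j = np.unravel_index(np.argmax(var), var.shape)
    observed[i, j] = True
```

In this sketch the next sample is chosen where bootstrap completions disagree most, which mimics (very loosely) the principle of sampling where information gain is largest.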
Georgia Institute of Technology, USA
Tue, 26/06/2018 - 11:05am
RC-4082, The Red Centre, UNSW