The Power of Convex Relaxation: The Surprising Stories of Matrix Completion and Compressed Sensing

In many important problems, only partial information about an object of interest is available. In signal processing, we may have far fewer measurements of an image we wish to reconstruct than the number of pixels in the image, as in common biomedical applications. In data analysis, we may be interested in users' preferences for a collection of items, but observe their preferences for only a few (this is the famous Netflix Prize problem). For these problems to make sense, the object under study needs to depend on a smaller number of degrees of freedom; frequently discussed approaches assume that the signal we wish to reconstruct is sparse or approximately sparse, or that the matrix we wish to infer is exactly or approximately low rank.

However, finding a vector of minimum cardinality, or a matrix of minimum rank, in a convex set is a combinatorially hard problem. In this talk, we will explain that, surprisingly, simple convex relaxations have the same solutions as these combinatorially hard problems. Further, these relaxations give good approximations when the data are noisy, which is why these techniques are broadly applicable and have already impacted several application areas, such as signal processing, optics, and machine learning.
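To make the relaxation concrete, here is a minimal numpy sketch of the sparse-recovery case: instead of minimizing cardinality, we minimize an ℓ1-penalized least-squares objective (LASSO), solved here by iterative soft-thresholding (ISTA). The problem sizes, penalty weight, and solver are illustrative assumptions, not details from the talk.

```python
import numpy as np

# Sketch: recover a sparse vector from few random linear measurements
# via the l1 convex relaxation (LASSO), solved by iterative
# soft-thresholding (ISTA). Sizes and parameters are illustrative.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 5                          # ambient dim, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                # noiseless measurements, m << n

lam = 1e-3                                    # l1 penalty weight
L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(10000):
    z = x - A.T @ (A @ x - y) / L             # gradient step on the data-fit term
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # small relative error
```

In the matrix case, the analogous relaxation replaces rank with the nuclear norm, and the soft-thresholding step acts on the singular values of the iterate rather than on its entries.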

Emmanuel Candes, Stanford University

