Bilinear Compressed Sensing Under Known Signs via Convex Programming

We consider the bilinear inverse problem of recovering two vectors, x \in \mathbb{R}^{L} and w \in \mathbb{R}^{L}, from their entrywise product. We consider the case where x and w have known signs and are sparse with respect to known dictionaries of size K and N, respectively. Here, K and N may be larger than, smaller than, or equal to L. We introduce \ell_{1}-BranchHull, a convex program posed in the natural parameter space that does not require an approximate solution or initialization in order to be stated or solved. Under the assumptions that x and w satisfy a comparable-effective-sparsity condition and are S_{1}- and S_{2}-sparse with respect to a random dictionary, we present a recovery guarantee in the noisy case. We show that \ell_{1}-BranchHull is robust to small dense noise with high probability if the number of measurements satisfies L \ge \Omega((S_{1}+S_{2}) \log^{2}(K+N)). Numerical experiments show that the scaling constant in the theorem is not too large. We also introduce variants of \ell_{1}-BranchHull for tolerating noise and outliers, and for recovering piecewise constant signals. We provide an ADMM implementation of these variants and show that they can extract piecewise constant behavior from real images.