Problem 24: “Ultimate” Deterministic Sparse Recovery
| Proposed by | Piotr Indyk |
| --- | --- |
| Source | Kanpur 2009 |
| Short link | http://sublinear.info/24 |
We say that a vector v \in \mathbb{R}^n is k-sparse, for some k \in \{0,\ldots,n\}, if v has at most k non-zero coordinates. The goal is to design an m \times n matrix A such that, for any x \in \mathbb{R}^n, one can recover from Ax a vector x^* \in \mathbb{R}^n satisfying the following “L_2/L_1” approximation guarantee:

\left\|x^*-x\right\|_2 \leq \min_{k\text{-sparse}\,x'\in\mathbb{R}^n} \frac{C}{\sqrt{k}} \left\|x'-x\right\|_1,

where C>0 is an absolute constant.
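To make the guarantee concrete, here is a minimal Python sketch (the function name and the default C are ours, purely for illustration) that checks it for given x^* and x: the minimizing k-sparse x' keeps the k largest-magnitude entries of x, so the right-hand side reduces to the L_1 norm of the remaining “tail” of x.

```python
import numpy as np

def l2_l1_guarantee_holds(x_star, x, k, C=1.0):
    """Check ||x* - x||_2 <= (C / sqrt(k)) * min over k-sparse x' of ||x' - x||_1.

    The minimizing k-sparse x' keeps the k largest-magnitude entries of x,
    so the minimum equals the L1 norm of the remaining n - k entries (the tail).
    """
    tail_l1 = np.sort(np.abs(x))[: len(x) - k].sum()  # sum of the n - k smallest magnitudes
    return np.linalg.norm(x_star - x) <= (C / np.sqrt(k)) * tail_l1
```

In particular, if x is itself k-sparse the tail is zero, so the guarantee forces exact recovery x^* = x.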
We conjecture that there is a solution that (a) uses m = O(k \log (n/k)) measurements (rows of A) and (b) supports a recovery algorithm running in time O(n \operatorname{polylog} n).
Background
It is known that one can achieve either (a) or (b) (see, e.g., [NeedellT-10]). It is also possible to achieve both (a) and (b), but with a different “L_1/L_1” approximation guarantee, where \|x^*-x\|_1 \leq \min_{k\text{-sparse}\,x'} C \|x'-x\|_1 [IndykR-08, BerindeIR-08].
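As a point of reference for (a) alone (this is not the construction the problem asks for), a dense random matrix with m = O(k \log(n/k)) rows combined with \ell_1-minimization (basis pursuit) is known to give the L_2/L_1 guarantee, but its LP-based decoding is far slower than the O(n \operatorname{polylog} n) bound in (b). Below is a minimal sketch under assumed parameters (the constant 4 in m and the scaling of A are illustrative choices):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to Ax = b as a linear program.

    Split x = u - v with u, v >= 0; at the optimum ||x||_1 = sum(u) + sum(v).
    """
    m, n = A.shape
    c = np.ones(2 * n)                # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])         # A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

# Tiny demo with an exactly k-sparse x, so the tail (and hence the error) should be ~0.
rng = np.random.default_rng(0)
n, k = 200, 5
m = int(4 * k * np.log(n / k))        # m ~ k log(n/k); the constant 4 is an assumption
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_star = basis_pursuit(A, A @ x)
print(np.linalg.norm(x_star - x))     # close to zero for exactly k-sparse x
```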