Open Problems:51
{{Header
|title="For all" guarantee for computationally bounded adversaries
|source=dortmund12
|who=Martin Strauss
}}
using two players:
* ''For all'': Charlie constructs the sensing matrix $\phi$, and then Mallory constructs the signal $x=x(\phi)$ as a function of $\phi$. The Compressed Sensing question is to recover the approximate signal $\tilde x$ from the measurement $\phi x$. The best guarantee possible is the following $\ell_2/\ell_1$ guarantee: $$||\tilde x - x||_2 \le (\epsilon/\sqrt{k})\, ||x_{opt} - x||_1.$$
* ''For each'': Charlie constructs a distribution $D$ over sensing matrices $\phi$. Then Mallory constructs a vector $x=x(D)$ that depends on the distribution only. Finally, a sensing matrix $\phi$ is sampled from the distribution $D$. The goal is again to recover $\tilde x$, with good probability over the choice of $\phi$. It turns out a stronger guarantee, termed $\ell_2/\ell_2$, is possible: $$||\tilde x - x||_2 \le (1+\epsilon)||x_{opt} - x||_2.$$
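As a concrete (hypothetical, not from the original problem statement) numerical sketch of the "for each" model: a Count-Sketch style random linear map, drawn after the signal is fixed, recovers the top-$k$ entries with error on the order of the tail $||x_{opt} - x||_2$. All parameter choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, reps, buckets = 1000, 5, 9, 200

# Signal: k large entries plus a small dense tail.
x = np.zeros(n)
x[:k] = 10.0
x += 0.1 * rng.standard_normal(n)

# Count-Sketch: 'reps' independent (hash, sign) pairs. The measurement
# Phi x is just the bucket sums -- a linear map drawn at random *after*
# x is fixed, which is exactly the "for each" setting.
h = rng.integers(0, buckets, size=(reps, n))
s = rng.choice([-1.0, 1.0], size=(reps, n))
tables = np.zeros((reps, buckets))
for r in range(reps):
    np.add.at(tables[r], h[r], s[r] * x)

# Estimate each coordinate as the median over repetitions; keep top k.
est = np.median(s * tables[np.arange(reps)[:, None], h], axis=0)
xt = np.zeros(n)
top = np.argsort(-np.abs(est))[:k]
xt[top] = est[top]

# Compare recovery error to the best k-sparse approximation error.
heavy = np.argsort(-np.abs(x))[:k]
x_opt = np.zeros(n)
x_opt[heavy] = x[heavy]
err = np.linalg.norm(xt - x)       # recovery error
tail = np.linalg.norm(x_opt - x)   # ||x_opt - x||_2
```

With these parameters the recovery error stays within a small constant factor of the tail, as the $\ell_2/\ell_2$ guarantee predicts for random sketches.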
In some sense the two "worlds" are incomparable: the first one works for all $x$ but obtains a weaker error guarantee, and the second one works for each $x$ with some probability but gets a better error guarantee.
'''Question''': How can we get the best of both worlds ("for all" with $\ell_2/\ell_2$ error)?
Once we require "for all", it is provably impossible to obtain the $\ell_2/\ell_2$ guarantee. But what if Mallory has bounded computational resources to construct a "bad" $x$?
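A toy illustration of the "for all" obstruction (for one specific scheme, not the general lower bound over all fixed matrices): once a single Count-Sketch block is fixed and fully revealed, Mallory can place two colliding coordinates with cancelling signs, so the measurement is identically zero and any recovery fails the $\ell_2/\ell_2$ guarantee for small $\epsilon$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, buckets = 100, 50

# One Count-Sketch repetition with its randomness fixed and revealed:
# once Mallory sees (h, s), the matrix is no longer random to her.
h = rng.integers(0, buckets, size=n)
s = rng.choice([-1.0, 1.0], size=n)

# By pigeonhole (n > buckets) two coordinates share a bucket; Mallory
# signs them so their contributions cancel: Phi x = 0 identically.
i, j = next((a, b) for a in range(n) for b in range(a + 1, n) if h[a] == h[b])
x = np.zeros(n)
x[i], x[j] = 1.0, -s[i] * s[j]

tables = np.zeros(buckets)
np.add.at(tables, h, s * x)        # the entire measurement of x

# The sketch is all zeros, so recovery must output xt = 0. For k = 1:
xt = np.zeros(n)
x_opt = np.zeros(n)
x_opt[i] = x[i]                    # best 1-sparse approximation of x
err = np.linalg.norm(xt - x)       # = sqrt(2) * ||x_opt - x||_2
tail = np.linalg.norm(x_opt - x)
```

Here $||\tilde x - x||_2 = \sqrt{2}\,||x_{opt} - x||_2$, which violates $(1+\epsilon)||x_{opt} - x||_2$ for any $\epsilon < \sqrt{2}-1$.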
A preliminary result considers the following setting. Mallory sees $\phi$ and writes down a sketch of $\phi$ (in bounded space). Then Mallory produces $x$ from this sketch only. Then the $\ell_2/\ell_2$ guarantee is achievable.