# Item

Released

Paper

#### For-all Sparse Recovery in Near-optimal Time

##### Fulltext (public)

arXiv:1402.1726.pdf

(Preprint), 236KB

##### Citation

Gilbert, A. C., Li, Y., Porat, E., & Strauss, M. J. (2014). For-all Sparse Recovery in Near-optimal Time. Retrieved from http://arxiv.org/abs/1402.1726.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0024-40E7-E

##### Abstract

An approximate sparse recovery system in $\ell_1$ norm consists of parameters
$k$, $\epsilon$, $N$, an $m$-by-$N$ measurement matrix $\Phi$, and a recovery
algorithm, $\mathcal{R}$. Given a vector, $\mathbf{x}$, the system approximates
$\mathbf{x}$ by $\widehat{\mathbf{x}} = \mathcal{R}(\Phi\mathbf{x})$, which must satisfy
$\|\widehat{\mathbf{x}}-\mathbf{x}\|_1 \leq
(1+\epsilon)\|\mathbf{x}-\mathbf{x}_k\|_1$. We consider the 'for all' model, in
which a single matrix $\Phi$, possibly 'constructed' non-explicitly using the
probabilistic method, is used for all signals $\mathbf{x}$. The best existing
sublinear algorithm by Porat and Strauss (SODA'12) uses $O(\epsilon^{-3}
k\log(N/k))$ measurements and runs in time $O(k^{1-\alpha}N^\alpha)$ for any
constant $\alpha > 0$.
In this paper, we improve the number of measurements to $O(\epsilon^{-2} k
\log(N/k))$, matching the best existing upper bound (attained by super-linear
algorithms), and the runtime to $O(k^{1+\beta}\textrm{poly}(\log
N,1/\epsilon))$, with a modest restriction that $\epsilon \leq (\log k/\log
N)^{\gamma}$, for any constants $\beta,\gamma > 0$. When $k\leq \log^c N$ for
some $c>0$, the runtime is reduced to $O(k\,\textrm{poly}(\log N,1/\epsilon))$. With
no restrictions on $\epsilon$, we have an approximation recovery system with $m
= O(k/\epsilon \log(N/k)((\log N/\log k)^\gamma + 1/\epsilon))$ measurements.
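The $\ell_1/\ell_1$ guarantee above can be made concrete with a short numerical sketch. The snippet below (an illustration of the definition, not the paper's algorithm; the function names and the toy signal are invented for this example) builds the best $k$-term approximation $\mathbf{x}_k$ and checks whether a candidate $\widehat{\mathbf{x}}$ satisfies $\|\widehat{\mathbf{x}}-\mathbf{x}\|_1 \leq (1+\epsilon)\|\mathbf{x}-\mathbf{x}_k\|_1$.

```python
import numpy as np

def best_k_term(x, k):
    # x_k: best k-term approximation, keeping the k largest-magnitude entries.
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    xk[idx] = x[idx]
    return xk

def satisfies_l1_guarantee(x, x_hat, k, eps):
    # Check the ell_1/ell_1 bound: ||x_hat - x||_1 <= (1 + eps) * ||x - x_k||_1.
    tail = np.linalg.norm(x - best_k_term(x, k), 1)
    return np.linalg.norm(x_hat - x, 1) <= (1 + eps) * tail

# Toy signal: nearly 2-sparse plus small noise (hypothetical example data).
rng = np.random.default_rng(0)
x = np.zeros(100)
x[[3, 57]] = [10.0, -7.0]
x += 0.01 * rng.standard_normal(100)

# The best 2-term approximation trivially meets the guarantee, since its
# ell_1 error equals the tail norm ||x - x_k||_1 exactly.
x_hat = best_k_term(x, 2)
print(satisfies_l1_guarantee(x, x_hat, k=2, eps=0.1))  # True
```

In the "for all" model the same $\Phi$ must yield such an $\widehat{\mathbf{x}}$ for every $\mathbf{x}$ simultaneously, which is what distinguishes it from the easier "for each" (per-signal, randomized) setting.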