Abstract
We study the problem of exact support recovery: given an unknown vector θ* ∈ {−1, 0, 1}^D with known sparsity k = ∥θ*∥₀, we have access to the noisy measurement y = Xθ* + ω, where X ∈ ℝ^{N×D} is a known Gaussian matrix and the noise ω ∈ ℝ^N is an unknown Gaussian vector. How small can N be for reliable recovery of the support of θ*? We present RAWLS (Randomly Aggregated unWeighted Least Squares Support Recovery): the main idea is to take random subsets of the N equations, perform least squares over this reduced set of equations, and average the resulting solutions over many random subsets. We prove that the proposed procedure recovers an approximation of θ*, and we demonstrate through numerical simulations that it is beneficial for the task of support recovery. Finally, we observe that RAWLS is on par with several strong baselines in the low-information regime (i.e., when N is small or k is large).
Original language | English |
---|---|
Article number | 107858 |
Journal | Signal Processing |
Volume | 180 |
DOIs | |
State | Published - Mar 2021 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2020 Elsevier B.V.
Funding
Funders | Funder number |
---|---|
Directorate for Mathematical and Physical Sciences | 1763179 |
Keywords
- Compressed sensing
- Least squares
- Support recovery