Randomly aggregated least squares for support recovery

Ofir Lindenbaum, Stefan Steinerberger

Research output: Contribution to journal › Article › peer-review


Abstract

We study the problem of exact support recovery: given an (unknown) vector θ* ∈ {−1, 0, 1}^D with known sparsity k = ‖θ*‖₀, we have access to the noisy measurement y = Xθ* + ω, where X ∈ R^(N×D) is a (known) Gaussian matrix and the noise ω ∈ R^N is an (unknown) Gaussian vector. How small can N be for reliable recovery of the support of θ*? We present RAWLS (Randomly Aggregated unWeighted Least Squares Support Recovery): the main idea is to take random subsets of the N equations, perform least squares over each reduced set of equations, and average the solutions over many random subsets. We show that the proposed procedure provably recovers an approximation of θ*, and we demonstrate through numerical simulations that it is beneficial for the task of support recovery. Finally, we observe that RAWLS is on par with several strong baselines in the low-information regime (i.e., when N is small or k is large).
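The aggregation idea described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function name, the subset size, and the number of subsets are assumptions chosen for the toy example, and the support estimate is simply the k largest-magnitude coordinates of the averaged least-squares solutions.

```python
import numpy as np

def rawls_support(X, y, k, num_subsets=200, subset_size=None, seed=None):
    """Sketch of the randomly-aggregated-least-squares idea:
    solve least squares on random row subsets, average the solutions,
    and keep the k largest-magnitude coordinates as the support estimate.
    (Subset size and number of subsets are illustrative choices.)"""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    m = subset_size if subset_size is not None else max(1, N // 2)
    acc = np.zeros(D)
    for _ in range(num_subsets):
        idx = rng.choice(N, size=m, replace=False)          # random subset of equations
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        acc += beta
    avg = acc / num_subsets                                  # aggregated estimate of theta*
    return np.sort(np.argsort(np.abs(avg))[-k:])             # indices of k largest entries

# Toy example: a {-1, 0, 1}-valued signal with k = 3, Gaussian X and noise.
rng = np.random.default_rng(0)
D, N, k = 50, 120, 3
theta = np.zeros(D)
theta[[3, 17, 41]] = [1.0, -1.0, 1.0]
X = rng.standard_normal((N, D))
y = X @ theta + 0.05 * rng.standard_normal(N)
print(rawls_support(X, y, k, seed=1))  # expected to recover the true support {3, 17, 41}
```

In this overdetermined toy regime each subset solve is already well-posed; the paper's interest is in the harder low-information regime, where averaging over many subsets is what makes the estimate usable.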

Original language: English
Article number: 107858
Journal: Signal Processing
Volume: 180
DOIs
State: Published - Mar 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2020 Elsevier B.V.

Funding

Funder: Directorate for Mathematical and Physical Sciences
Funder number: 1763179

Keywords

• Compressed sensing
• Least squares
• Support recovery
