Refined least squares for support recovery

Ofir Lindenbaum, Stefan Steinerberger

Research output: Contribution to journal › Article › peer-review



Sparse linear regression is a key step in many applications such as communication, image denoising, and speech recognition. In these applications, the unknown signal of interest is often modeled by a set of noisy linear equations. If the vector of regression coefficients (the signal) is sparse, its values can be identified even when the system of equations is underdetermined: recovery becomes possible with fewer observations than variables. In this work, we study the problem of exact support recovery from noisy observations and present Refined Least Squares (ReLS). Given a set of noisy measurements, our goal is to recover the support of the (unknown) sparse signal. To do so, we average multiple least squares solutions, each computed on a subset of the full set of equations, and estimate the support by identifying the most significant coefficients of the averaged solution. We demonstrate that in a wide variety of settings our method outperforms state-of-the-art support recovery algorithms.
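The averaging scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the subset size, the number of subsets, the uniform random row sampling, and the assumption that the sparsity level k is known are all illustrative choices made here.

```python
import numpy as np

def rels_support(A, y, k, n_subsets=100, subset_frac=0.7, seed=0):
    """Sketch of ReLS-style support recovery: average least-squares
    solutions over random row subsets of (A, y), then return the indices
    of the k largest-magnitude averaged coefficients.
    (Parameter names and sampling scheme are illustrative assumptions.)
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    s = max(1, int(subset_frac * m))  # rows per subset (assumed fraction)
    avg = np.zeros(n)
    for _ in range(n_subsets):
        rows = rng.choice(m, size=s, replace=False)
        # Minimum-norm least-squares solution on the row subset
        x_hat, *_ = np.linalg.lstsq(A[rows], y[rows], rcond=None)
        avg += x_hat
    avg /= n_subsets
    # Support estimate: the k most significant averaged coefficients
    return set(np.argsort(np.abs(avg))[-k:].tolist())

# Toy usage: an underdetermined system with a 3-sparse signal
rng = np.random.default_rng(1)
m, n, k = 40, 100, 3
A = rng.standard_normal((m, n))
x = np.zeros(n)
x[[5, 20, 77]] = [2.0, -3.0, 1.5]
y = A @ x + 0.01 * rng.standard_normal(m)
est = rels_support(A, y, k)
```

Averaging over subsets reduces the variance of the off-support coefficients of each individual least-squares solution, which is what makes thresholding the averaged solution a reasonable support estimate.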

Original language: English
Article number: 108493
Journal: Signal Processing
State: Published - Jun 2022

Bibliographical note

Publisher Copyright:
© 2022 Elsevier B.V.


Keywords

  • Compressed sensing
  • Least squares
  • Support recovery


