Multi-view kernel consensus for data analysis

Moshe Salhov, Ofir Lindenbaum, Yariv Aizenbud, Avi Silberschatz, Yoel Shkolnisky, Amir Averbuch

Research output: Contribution to journal › Article › peer-review



Input data is often high-dimensional, while the intrinsic dimension of the data may be low. Data analysis methods aim to uncover the underlying low-dimensional structure imposed by a small number of hidden parameters. In general, uncovering these hidden parameters is achieved by utilizing distance metrics that consider the set of attributes as a single monolithic set. However, the transformation of a low-dimensional phenomenon into high-dimensional observations can distort the distance metric. This distortion can degrade the quality of the estimated low-dimensional geometric structure. In this paper, we propose to exploit the redundancy in the feature domain by analyzing multiple subsets of features, called views. The proposed methods utilize the consensus between different views to extract valuable geometric information that unifies what the views reveal about the intrinsic relationships among the observations. This unification provides richer information than a single view or a simple concatenation of views.
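The consensus idea described above can be illustrated with a minimal NumPy sketch. The construction below is an assumption for illustration, not the paper's exact method: each view (a subset of features) gets its own Gaussian kernel, the kernels are combined by an element-wise product so that only affinities on which all views agree survive, and the data is embedded using the leading eigenvectors of the resulting row-normalized Markov matrix. The function names, the product-of-kernels rule, and the median-distance bandwidth are all hypothetical choices.

```python
import numpy as np

def view_kernel(X, eps):
    # Gaussian kernel on pairwise squared Euclidean distances within one view.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / eps)

def consensus_embedding(views, n_components=2):
    # Element-wise product of per-view kernels: an affinity survives only if
    # every view supports it (a simple "consensus" rule, assumed here).
    n = views[0].shape[0]
    K = np.ones((n, n))
    for X in views:
        sq = np.sum(X ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
        eps = np.median(d2)  # heuristic bandwidth: median squared distance
        K *= view_kernel(X, eps)
    # Row-normalize to a Markov matrix and embed with its leading
    # non-trivial eigenvectors (as in diffusion-map-style embeddings).
    P = K / K.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1:n_components + 1]  # skip trivial eigvec
    return vecs.real[:, order] * vals.real[order]

# Toy example: one hidden 1-D parameter t, observed through two noisy views.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
view1 = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((200, 2))
view2 = np.c_[t, t ** 2] + 0.05 * rng.standard_normal((200, 2))
emb = consensus_embedding([view1, view2])
print(emb.shape)  # (200, 2)
```

In this sketch each view alone distorts the geometry of the hidden parameter differently, while the product kernel keeps only the neighborhood relations common to both, which is the role the abstract assigns to consensus between views.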

Original language: English
Pages (from-to): 208-228
Number of pages: 21
Journal: Applied and Computational Harmonic Analysis
Issue number: 1
State: Published - Jul 2020
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2019 Elsevier Inc.


Keywords

  • Itô Lemma
  • Kernel
  • Multi-view
  • Uncovering underlying low dimensional space
  • View as subsets of features


