In this study we consider learning a reduced-dimensionality representation from datasets obtained under multiple views. Such multiple views can arise, for example, when the same underlying process is observed through several different modalities or measured with different instrumentation. Our goal is to effectively exploit the availability of such multiple views for various purposes, such as nonlinear embedding, manifold learning, spectral clustering, anomaly detection, and nonlinear system identification. Our proposed method exploits the intrinsic relations within each view, as well as the mutual relations between views. We do this by defining a cross-view model in which an implied random-walk process between objects is constrained to hop between the different views. Our method is robust to scaling of each dataset and is insensitive to small structural changes in the data. Within this framework, we define new diffusion distances and analyze the spectra of the implied kernels.
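The cross-view construction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes two aligned views with Gaussian affinities, places the products of the per-view kernels on the off-diagonal blocks so that every step of the random walk hops between views, and embeds via the leading nontrivial eigenvectors of the row-normalized transition matrix. All function names and parameter choices (`eps`, `n_components`) are illustrative.

```python
import numpy as np

def gaussian_kernel(X, eps):
    # Gaussian affinity within a single view from pairwise squared distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / eps)

def multiview_diffusion_embedding(X, Y, eps=1.0, n_components=2):
    """Hedged sketch of a cross-view diffusion embedding for two views
    X, Y whose rows describe the same objects. The off-diagonal block
    structure forces the implied random walk to alternate between views."""
    Kx = gaussian_kernel(X, eps)
    Ky = gaussian_kernel(Y, eps)
    n = X.shape[0]
    # Zero diagonal blocks: a step must cross from one view to the other.
    K = np.zeros((2 * n, 2 * n))
    K[:n, n:] = Kx @ Ky
    K[n:, :n] = Ky @ Kx
    # Row-normalize to obtain a random-walk transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Spectral embedding from the leading nontrivial eigenvectors.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]  # skip the trivial eigenvalue 1
    return vecs[:, idx].real * vals[idx].real
```

The resulting coordinates give a joint embedding of both copies of each object; diffusion distances in this embedding then reflect connectivity through both views rather than through either view alone.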
Title of host publication: Latent Variable Analysis and Signal Separation - 12th International Conference, LVA/ICA 2015, Proceedings
Editors: Zbynĕk Koldovský, Emmanuel Vincent, Arie Yeredor, Petr Tichavský
Number of pages: 8
State: Published - 2015
Event: 12th International Conference on Latent Variable Analysis and Signal Separation, LVA/ICA 2015 - Liberec, Czech Republic
Duration: 25 Aug 2015 → 28 Aug 2015
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Bibliographical note: Publisher Copyright © Springer International Publishing Switzerland 2015.
Keywords:
- Diffusion maps
- Dimensionality reduction
- Manifold learning