Mending the big-data missing information

Hadassa Daltrophe, Shlomi Dolev, Zvi Lotker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citations

Abstract

Consider a high-dimensional data set in which every data point carries incomplete information. Each object in the data set represents a real entity described by a point in high-dimensional space. We model the missing information of a given object as an affine subspace in R^d whose dimension k equals the number of missing features. Our goal in this study is to find clusters of objects, where the main difficulty is coping with partial information and high dimension. Assuming the data set is separable, namely, that it emerges from clusters that can be modeled as a set of disjoint balls in R^d, we develop a simple data clustering algorithm. The suggested algorithm uses the minimum distance between affine subspaces and computes pairwise projections of the data, achieving poly-logarithmic time complexity. We use probabilistic considerations to prove the algorithm's correctness. These probabilistic results are of independent interest and can serve to better understand the geometry of high-dimensional objects.
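The abstract's core idea can be illustrated with a minimal sketch (not the authors' implementation): if each missing feature leaves its coordinate free, an incomplete point corresponds to an axis-aligned affine subspace, and the minimum distance between two such subspaces is the Euclidean distance restricted to the coordinates known in both points. The naive pairwise clustering below, under the disjoint-balls separability assumption, does not achieve the paper's poly-logarithmic complexity; the function names, the NaN encoding of missing features, and the `radius` threshold are illustrative assumptions.

```python
import numpy as np

def min_affine_distance(a, b):
    """Minimum Euclidean distance between the axis-aligned affine subspaces
    of two partially observed points (missing coordinates encoded as NaN).
    A coordinate missing in either point contributes nothing, since the
    free direction can always be matched."""
    both_known = ~np.isnan(a) & ~np.isnan(b)
    diff = a[both_known] - b[both_known]
    return np.sqrt(np.sum(diff ** 2))

def cluster_by_threshold(points, radius):
    """Toy clustering under the separability assumption: if clusters come
    from disjoint balls, points whose subspaces pass within `radius` of
    each other land in the same connected component (union-find)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if min_affine_distance(points[i], points[j]) <= radius:
                parent[find(i)] = find(j)

    return [find(i) for i in range(n)]

# Example: two well-separated groups in R^3, some coordinates missing.
data = [
    np.array([0.0, 0.1, np.nan]),
    np.array([0.2, np.nan, 0.0]),
    np.array([5.0, 5.1, 4.9]),
    np.array([np.nan, 5.0, 5.2]),
]
print(cluster_by_threshold(data, radius=1.0))  # two cluster labels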

Original language: English
Title of host publication: 2016 IEEE International Conference on the Science of Electrical Engineering, ICSEE 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509021529
DOIs
State: Published - 4 Jan 2017
Externally published: Yes
Event: 2016 IEEE International Conference on the Science of Electrical Engineering, ICSEE 2016 - Eilat, Israel
Duration: 16 Nov 2016 - 18 Nov 2016

Publication series

Name: 2016 IEEE International Conference on the Science of Electrical Engineering, ICSEE 2016

Conference

Conference: 2016 IEEE International Conference on the Science of Electrical Engineering, ICSEE 2016
Country/Territory: Israel
City: Eilat
Period: 16/11/16 - 18/11/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.
