Term Set Expansion based NLP Architect by Intel AI Lab

Jonathan Mamou, Oren Pereg, Moshe Wasserblat, Alon Eirew, Yael Green, Shira Guskin, Peter Izsak, Daniel Korat

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Scopus citations

Abstract

We present SetExpander, a term set expansion system based on NLP Architect by Intel AI Lab. SetExpander is a corpus-based system for expanding a seed set of terms into a more complete set of terms that belong to the same semantic class. It implements an iterative end-to-end workflow that enables users to easily select a seed set of terms, expand it, view the expanded set, validate it, re-expand the validated set and store it, thus simplifying the extraction of domain-specific fine-grained semantic classes. SetExpander has been used successfully in real-life use cases, including integration into an automated recruitment system and an issues and defects resolution system.
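The sketch below illustrates the kind of iterative expand-validate-re-expand loop described in the abstract. It is not the actual SetExpander implementation from NLP Architect; it assumes gensim word2vec term embeddings, a toy corpus, and arbitrary hyperparameters purely for illustration.

```python
# Illustrative sketch of corpus-based term set expansion in the spirit of
# SetExpander: embed terms from a corpus, then rank candidate terms by their
# similarity to the seed set. Corpus, term tokenization, and hyperparameters
# are placeholder assumptions, not taken from the paper.
from gensim.models import Word2Vec

# Toy corpus: each "sentence" is a list of terms. In practice, multi-word
# noun phrases would be merged into single terms during preprocessing.
corpus = [
    ["python", "java", "c++", "programming"],
    ["java", "scala", "kotlin", "jvm"],
    ["python", "machine_learning", "data_science"],
    ["c++", "rust", "systems_programming"],
]

# Train small term embeddings (parameters chosen only for this demo).
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=50, seed=0)

def expand(seed_terms, topn=5):
    """Return candidate terms ranked by similarity to the seed terms."""
    return model.wv.most_similar(positive=seed_terms, topn=topn)

# Iterative workflow: expand the seed set, let a user validate the
# candidates, then re-expand with the validated set.
candidates = expand(["python", "java"])
validated = ["python", "java"] + [term for term, _ in candidates[:2]]
print(expand(validated))
```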

Original language: English
Title of host publication: EMNLP 2018 - Conference on Empirical Methods in Natural Language Processing
Subtitle of host publication: System Demonstrations, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 19-24
Number of pages: 6
ISBN (Electronic): 9781948087858
DOIs
State: Published - 2018
Externally published: Yes
Event: 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, EMNLP 2018 - Brussels, Belgium
Duration: 31 Oct 2018 - 4 Nov 2018

Publication series

Name: EMNLP 2018 - Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Proceedings

Conference

Conference: 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, EMNLP 2018
Country/Territory: Belgium
City: Brussels
Period: 31/10/18 - 4/11/18

Bibliographical note

Publisher Copyright:
© 2018 Association for Computational Linguistics.
