A Novel Deep Learning Attention Based Sequence to Sequence Model for Automatic Abstractive Text Summarization

Yousef Methkal Abd Algani

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Abstractive text summarization is one of the trending topics in the field of natural language processing (NLP). In this type of text summarization, new sentences are generated from the original text, irrespective of whether these sentences exist in the original corpus. Several sequence-to-sequence models exist for abstractive text summarization, but they suffer from challenges such as redundancy, poor vocabulary distribution, and irrelevant results. To overcome these, this paper introduces a novel attention-based sequence-to-sequence model for automatic abstractive text summarization. The proposed model comprises a Bi-LSTM (bidirectional long short-term memory) encoder, a unidirectional LSTM decoder, an attention mechanism, and a word embedding layer that computes the probability distribution of each word in the original document. This enables semantic feature extraction of words, which yields more relevant summaries. The performance of the proposed model is validated on the CNN/Daily Mail dataset and assessed using Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metrics. The results show that the proposed model achieves higher ROUGE-1, ROUGE-2, and ROUGE-L scores than the existing baseline models.
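The abstract describes an encoder-decoder architecture in which an attention mechanism weights the Bi-LSTM encoder states for each decoder step. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of Bahdanau-style additive attention under assumed dimensions; the function name, weight matrices (`W_e`, `W_d`, `v`), and sizes are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(enc_states, dec_state, W_e, W_d, v):
    """Bahdanau-style additive attention (illustrative sketch).

    enc_states: (T, 2h) Bi-LSTM encoder outputs (forward + backward concatenated)
    dec_state:  (h,)    current decoder hidden state
    Returns (weights, context): attention weights (T,) and context vector (2h,).
    """
    # Alignment scores: score_t = v^T tanh(W_e h_t + W_d s)
    scores = np.tanh(enc_states @ W_e + dec_state @ W_d) @ v  # shape (T,)
    weights = softmax(scores)          # normalized attention distribution
    context = weights @ enc_states     # weighted sum of encoder states
    return weights, context

# Toy example with hypothetical sizes: 5 source tokens, hidden size 4
rng = np.random.default_rng(0)
T, h = 5, 4
enc = rng.normal(size=(T, 2 * h))   # Bi-LSTM states are 2h-dimensional
dec = rng.normal(size=(h,))
W_e = rng.normal(size=(2 * h, h))
W_d = rng.normal(size=(h, h))
v = rng.normal(size=(h,))
w, ctx = additive_attention(enc, dec, W_e, W_d, v)
```

At each decoding step, the context vector `ctx` would be concatenated with the decoder input so that generation is conditioned on the most relevant parts of the source document.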

Original language: English
Pages (from-to): 3597-3603
Number of pages: 7
Journal: International Journal of Information Technology (Singapore)
Volume: 16
Issue number: 6
DOIs
State: Published - Aug 2024
Externally published: Yes

Bibliographical note

Publisher Copyright:
© Bharati Vidyapeeth's Institute of Computer Applications and Management 2024.

Keywords

  • Abstractive text summarization
  • Attention mechanism
  • LSTM
  • Sequence to sequence model
  • Word embedding layer

