Authors
Gelbukh, Alexander
Title An Abstractive Text Summarization Using Recurrent Neural Network
Type Journal
Sub-type Proceedings
Description 19th International Conference on Computational Linguistics and Intelligent Text Processing, CICLing 2018
Abstract With the accelerated advancement of technology and the massive volume of content surging across the Internet, efficiently distilling information has become an arduous task. Automatic text summarization, however, provides an acceptable means of quickly procuring such information in the form of a summary through compression and refinement. Abstractive text summarization, in particular, builds an internal semantic representation of the text and uses natural language generation techniques to create summaries closer to human-written ones. This paper uses a Long Short-Term Memory (LSTM) based Recurrent Neural Network to generate comprehensive abstractive summaries. Training an LSTM-based model requires a corpus with a significant number of parallel article-summary pairs. For this purpose, we used several news corpora, namely DUC 2003, DUC 2004, and the Gigaword corpus, after eliminating noise and other irrelevant data. The experiments and analyses in this work are performed on a subset of these corpora and evaluated using the ROUGE metric. The experimental results verify the accuracy and validity of the proposed system. © 2023, Springer Nature Switzerland AG.
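The abstract reports evaluation with the ROUGE metric, which scores a generated summary by its n-gram overlap with a human reference. As a minimal illustrative sketch (not the paper's evaluation code; the example sentences and function name are hypothetical), ROUGE-1 recall can be computed as the clipped unigram overlap divided by the reference length:

```python
from collections import Counter

def rouge_1_recall(candidate: str, reference: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams that also
    appear in the candidate summary, with counts clipped so a
    repeated candidate word cannot be credited more times than
    it occurs in the reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    if not ref:
        return 0.0
    overlap = sum(min(count, cand[token]) for token, count in ref.items())
    return overlap / sum(ref.values())

# Hypothetical example pair (not drawn from the paper's corpora):
reference = "the cat sat on the mat"
candidate = "the cat lay on the mat"
print(round(rouge_1_recall(candidate, reference), 2))  # → 0.83 (5 of 6 reference unigrams matched)
```

Published evaluations typically also report ROUGE-2 (bigrams) and ROUGE-L (longest common subsequence), which follow the same overlap-counting idea.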
Notes DOI 10.1007/978-3-031-23804-8_29, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 13397
Location Hanoi
Country Vietnam
Pages 364-378
Vol. / Chap. 13397 LNCS
Start date 2018-03-18
End date 2018-03-24
ISBN/ISSN