Smart Initialization Yields Better Convergence Properties in Deep Abstractive Summarization
Authors
Abstract
Abstractive text summarization has been proposed as an alternative to the inherently limited extractive methods, but extant work is plagued with high training times. In this work, we introduce a set of extensions, including novel initialization techniques, that allow contemporary models to achieve 10x faster training time and comparable results. Our work also provides substantial evidence against the accepted evaluation metric for abstractive summarization, and establishes a speed benchmark for further research.
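The abstract does not spell out the initialization techniques themselves, so the following is only a minimal sketch of one common "smart" starting point for a neural summarizer: seeding the embedding matrix from pretrained word vectors instead of random weights. The toy vocabulary, vector file path, and dimensions are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: initialize an embedding layer from pretrained word vectors.
# The vector file, vocabulary, and 300-d size are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn

vocab = {"<pad>": 0, "<unk>": 1, "summarize": 2, "text": 3}  # toy vocabulary
emb_dim = 300

def load_pretrained_vectors(path):
    """Parse a word2vec/GloVe-style text file into a word -> vector dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def init_embedding(vocab, vectors, emb_dim):
    # Start from small random weights, then copy pretrained rows where available.
    weight = np.random.normal(0.0, 0.1, (len(vocab), emb_dim)).astype(np.float32)
    for word, idx in vocab.items():
        if word in vectors:
            weight[idx] = vectors[word]
    emb = nn.Embedding(len(vocab), emb_dim, padding_idx=0)
    emb.weight.data.copy_(torch.from_numpy(weight))
    return emb

# Hypothetical usage (the vector file name is an assumption):
# embedding = init_embedding(vocab, load_pretrained_vectors("glove.6B.300d.txt"), emb_dim)
```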
Similar resources
Neural Abstractive Text Summarization
Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, while still preserving the meaning and the key contents. We address this issue by modeling the problem as sequence-to-sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...
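For reference, the sequence-to-sequence setup described here can be sketched as a small encoder-decoder RNN; the single-layer GRUs, layer sizes, and teacher-forced decoding below are assumptions, not the authors' exact architecture.

```python
# Minimal encoder-decoder RNN for summarization (illustrative sizes only).
import torch
import torch.nn as nn

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.embed(src_ids))            # final encoder state
        dec_out, _ = self.decoder(self.embed(tgt_ids), h)   # teacher forcing
        return self.out(dec_out)                            # logits per target step

model = Seq2SeqSummarizer(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 40)), torch.randint(0, 10000, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 10000])
```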
Deep Recurrent Generative Decoder for Abstractive Text Summarization
We propose a new framework for abstractive text summarization based on a sequence-to-sequence oriented encoder-decoder model equipped with a deep recurrent generative decoder (DRGN). Latent structure information implied in the target summaries is learned based on a recurrent latent random model for improving the summarization quality. Neural variational inference is employed to address the intra...
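A rough sketch of the variational ingredient such a decoder relies on: at each step the decoder state parameterizes a Gaussian over a latent vector z, which is sampled with the reparameterization trick and regularized with a KL term. The sizes and standard-normal prior are assumptions; this is not the paper's exact formulation.

```python
# Reparameterized latent sampling for one decoding step (illustrative only).
import torch
import torch.nn as nn

class LatentStep(nn.Module):
    def __init__(self, hid_dim=256, z_dim=64):
        super().__init__()
        self.mu = nn.Linear(hid_dim, z_dim)
        self.logvar = nn.Linear(hid_dim, z_dim)

    def forward(self, dec_state):
        mu, logvar = self.mu(dec_state), self.logvar(dec_state)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        # KL divergence against a standard normal prior, added to the training loss
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return z, kl

z, kl = LatentStep()(torch.randn(2, 256))
print(z.shape, kl.shape)  # torch.Size([2, 64]) torch.Size([2])
```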
A Pilot Study of Domain Adaptation Effect for Neural Abstractive Summarization
We study the problem of domain adaptation for neural abstractive summarization. We make initial efforts in investigating what information can be transferred to a new domain. Experimental results on news stories and opinion articles indicate that the neural summarization model benefits from pre-training based on extractive summaries. We also find that the combination of in-domain and out-of-domain s...
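The transfer recipe these findings point to can be sketched as a two-stage loop: pre-train the summarizer on out-of-domain extractive summaries, then fine-tune on in-domain abstractive pairs. The loader names below are hypothetical, and reusing one optimizer across both stages is an assumption.

```python
# Two-stage training sketch: extractive pre-training, then in-domain fine-tuning.
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss(ignore_index=0)  # 0 assumed to be the pad id

def run_stage(model, loader, optimizer, epochs):
    for _ in range(epochs):
        for src, tgt in loader:                  # (batch, src_len), (batch, tgt_len)
            optimizer.zero_grad()
            logits = model(src, tgt[:, :-1])     # teacher-forced decoding
            loss = loss_fn(logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1))
            loss.backward()
            optimizer.step()

# Hypothetical loaders yielding (source_ids, summary_ids) tensor pairs:
# run_stage(model, extractive_pretrain_loader, optimizer, epochs=5)   # stage 1
# run_stage(model, in_domain_abstractive_loader, optimizer, epochs=2) # stage 2
```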
Génération de résumés par abstraction complète
This Ph.D. thesis is the result of several years of research on automatic text summarization. Three major contributions are presented in the form of published and yet to be published papers. They follow a path that moves away from extractive summarization and toward abstractive summarization. The first article describes the HexTac experiment, which was conducted to evaluate the performance of h...
TL;DR: Mining Reddit to Learn Automatic Summarization
Recent advances in automatic text summarization have used deep neural networks to generate high-quality abstractive summaries, but the performance of these models strongly depends on large amounts of suitable training data. We propose a new method for mining social media for author-provided summaries, taking advantage of the common practice of appending a “TL;DR” to long posts. A case study usi...
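The core mining step can be sketched as splitting each post at its "TL;DR" marker, treating the body as the source document and the trailing text as the summary. The regex and length thresholds below are illustrative assumptions, not the paper's exact filters.

```python
# Extract (document, summary) pairs from posts that end with a TL;DR.
import re

TLDR = re.compile(r"\btl\s*;?\s*dr\b[:\s]*", re.IGNORECASE)

def extract_pair(post: str):
    parts = TLDR.split(post, maxsplit=1)
    if len(parts) != 2:
        return None                                   # no TL;DR marker found
    body, summary = parts[0].strip(), parts[1].strip()
    if len(summary.split()) < 3 or len(body.split()) < 30:
        return None                                   # discard degenerate pairs
    return body, summary

print(extract_pair("Long write-up about the training setup. " * 10
                   + "TL;DR: pretraining helps a lot."))
```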
Journal:
Volume / Issue:
Pages: -
Publication date: 2017