Recurrent Attentional Topic Model

Authors

  • Shuangyin Li
  • Yu Zhang
  • Rong Pan
  • Mingzhi Mao
  • Yang Yang
Abstract

In a document, the topic distribution of a sentence depends on both the topics of preceding sentences and its own content, and it is usually affected by the topics of the preceding sentences with different weights. It is natural that a document can be treated as a sequence of sentences. Most existing works on Bayesian document modeling do not take these points into consideration. To fill this gap, we propose a Recurrent Attentional Topic Model (RATM) for document embedding. The RATM not only takes advantage of the sequential order among sentences but also uses the attention mechanism to model the relations among successive sentences. In RATM, we propose a Recurrent Attentional Bayesian Process (RABP) to handle the sequences. Based on the RABP, RATM fully utilizes the sequential information of the sentences in a document. Experiments on two corpora show that our model outperforms state-of-the-art methods on document modeling and classification.
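The abstract's central idea is that a sentence's topic distribution is an attention-weighted combination of the topics of preceding sentences and the sentence's own content. The following is a minimal illustrative sketch of that idea only, not the authors' actual RABP: it assumes simple dot-product attention and a hypothetical mixing weight `lam`, with topic distributions drawn from a Dirichlet for demonstration.

```python
import numpy as np

def attention_weights(prev_topics, query):
    # Dot-product scores between the current sentence's content vector
    # and each preceding sentence's topic vector, softmax-normalized.
    scores = prev_topics @ query
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

def sentence_topic_prior(prev_topics, content_topic, lam=0.5):
    # Attention-weighted mixture of preceding sentences' topics,
    # blended with the sentence's own content-based distribution.
    # `lam` is a hypothetical mixing weight, not a parameter from the paper.
    w = attention_weights(prev_topics, content_topic)
    context = w @ prev_topics          # weighted sum over preceding sentences
    return lam * context + (1 - lam) * content_topic

rng = np.random.default_rng(0)
K = 4                                           # number of topics
prev = rng.dirichlet(np.ones(K), size=3)        # topics of 3 preceding sentences
content = rng.dirichlet(np.ones(K))             # current sentence's own topics
prior = sentence_topic_prior(prev, content)     # still a valid distribution
```

Because both mixture components are probability distributions and the attention weights sum to one, the result remains a valid topic distribution that leans toward whichever preceding sentences are most similar to the current content.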


Similar articles

A Hierarchical Generative Model of Recurrent Object-Based Attention in the Visual Cortex

In line with recent work exploring Deep Boltzmann Machines (DBMs) as models of cortical processing, we demonstrate the potential of DBMs as models of object-based attention, combining generative principles with attentional ones. We show: (1) How inference in DBMs can be related qualitatively to theories of attentional recurrent processing in the visual cortex; (2) that deepness and topographic ...


The SOCKEYE Neural Machine Translation Toolkit at AMTA 2018

We describe SOCKEYE, an open-source sequence-to-sequence toolkit for Neural Machine Translation (NMT). SOCKEYE is a production-ready framework for training and applying models as well as an experimental platform for researchers. Written in Python and built on MXNET, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurren...


Attentional and Semantic Anticipations in Recurrent Neural Networks

Why are attentional processes important in the driving of anticipations? Anticipatory processes are fundamental cognitive abilities of living systems, in order to rapidly and accurately perceive new events in the environment, and to trigger adapted behaviors to the newly perceived events. To process anticipations adapted to sequences of various events in complex environments, the cognitive syst...


Investigating the reliability and validity of the Persian version of the Attentional Style Questionnaire (ASQ)

Objective: In spite of the increasing importance of attentional control in the conceptualization of psychopathology, there are few scales to measure it. It is necessary to use a valid and reliable scale to study this construct in Iranian studies. This study aimed to provide and investigate the psychometric characteristics of the Persian version of the Attentional Style Questionnaire. Methods: The sample of 426 ...


A Generative Attentional Neural Network Model for Dialogue Act Classification

We propose a novel generative neural network architecture for Dialogue Act classification. Building upon the Recurrent Neural Network framework, our model incorporates a new attentional technique and a label-to-label connection for sequence learning, akin to Hidden Markov Models. Our experiments show that both of these innovations enable our model to outperform strong baselines for dialogue-act...



Journal:

Volume   Issue 

Pages  -

Publication date 2017