Search results for: recurrent neural network

Number of results: 942,527

Journal: :CoRR 2016
Yangfeng Ji Gholamreza Haffari Jacob Eisenstein

This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations that link adjacent sentences. A recurrent neural network generates individual words, thus reaping the benefits of discriminatively-trained vector representations. The discourse relations are represented with a latent variable, which ...
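
As a rough illustration of that kind of conditioning, here is a minimal PyTorch sketch (not the authors' code; the class and sizes such as LatentRelationRNNLM, num_relations, and hid_dim are assumptions): a GRU generates word logits while an embedding of the discourse relation shifts the hidden state at every step.

```python
import torch
import torch.nn as nn

class LatentRelationRNNLM(nn.Module):
    def __init__(self, vocab_size=10000, num_relations=20, emb_dim=128, hid_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.rel_emb = nn.Embedding(num_relations, hid_dim)   # one vector per discourse relation
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, words, relation):
        # words: (batch, seq_len) token ids; relation: (batch,) relation ids
        h, _ = self.rnn(self.word_emb(words))
        h = h + self.rel_emb(relation).unsqueeze(1)            # condition every step on the relation
        return self.out(h)                                     # next-word logits

model = LatentRelationRNNLM()
logits = model(torch.randint(0, 10000, (2, 12)), torch.randint(0, 20, (2,)))
print(logits.shape)  # torch.Size([2, 12, 10000])
```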

1997
Guo-Zheng Sun C. Lee Giles Hsing-Hen Chen

Recurrent neural networks are dynamical network structures capable of processing and generating temporal information. To our knowledge, the earliest neural network model that processed temporal information was that of McCulloch and Pitts [McCulloch43]. Kleene [Kleene56] extended this work to show the equivalence of finite automata and McCulloch and Pitts' representation of ne...

2003
Wang Xiangrui Narendra S. Chaudhari

Structure identification has been used widely in many contexts. Grammatical Learning methods are used to find structural information in sequences. Due to negative results, alternative representations have to be used for Grammatical Learning. One such representation is the recurrent neural network. Recurrent neural networks are proposed as extended automata. In this chapter, we first summarize r...

Journal: :Neurocomputing 2011
Louiza Dehyadegary Seyyed Ali Seyyedsalehi Isar Nejadgholi

Here, formation of continuous attractor dynamics in a nonlinear recurrent neural network is used to achieve a nonlinear speech denoising method, in order to implement robust phoneme recognition and information retrieval. Formation of attractor dynamics in the recurrent neural network is first carried out by training the clean speech subspace as the continuous attractor. Then, it is used to recogniz...
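
A minimal numpy sketch of the attractor intuition, assuming the clean-speech subspace is linear and known (the paper itself trains a nonlinear recurrent network, so this only shows the underlying idea): repeatedly relaxing the state toward a subspace removes the noise component orthogonal to it.

```python
import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((20, 3)))   # orthonormal basis of a "clean" 3-D subspace
P = U @ U.T                                         # projector onto that subspace

clean = U @ rng.standard_normal(3)                  # a point on the attractor (clean signal)
noisy = clean + 0.3 * rng.standard_normal(20)       # noisy observation

x = noisy.copy()
for _ in range(50):                                 # recurrent relaxation toward the subspace
    x = 0.5 * x + 0.5 * (P @ x)

print(np.linalg.norm(noisy - clean))                # distortion before relaxation
print(np.linalg.norm(x - clean))                    # typically smaller: off-subspace noise removed
```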

Journal: :Mathematics and Computers in Simulation 2012
Yan Zhao Qingshan Liu

In this paper, a generalized recurrent neural network is proposed for solving ε-insensitive support vector regression (ε-ISVR). The ε-ISVR is first formulated as a convex non-smooth programming problem, and then a generalized recurrent neural network with lower model complexity is designed for training the support vector machine. Furthermore, simulation results are given to demonstrate the effecti...
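
For reference, the ε-insensitive loss that defines ε-ISVR can be written in a few lines; this is only the objective the proposed network is meant to solve (via its convex formulation), not the network itself, and the function name and tolerance below are illustrative.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Zero inside the eps-tube around the target, linear outside it."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

print(eps_insensitive_loss(np.array([1.0, 2.0]), np.array([1.05, 2.5]), eps=0.1))
# [0.  0.4]
```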

2014
Ngoc Thang Vu Tanja Schultz

This paper presents our latest investigations of the jointly trained maximum entropy and recurrent neural network language models for Code-Switching speech. First, we explore extensively the integration of part-of-speech tags and language identifier information in recurrent neural network language models for Code-Switching. Second, the importance of the maximum entropy model is demonstrated alon...
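
A hedged sketch of this kind of feature integration (names such as FactoredRNNLM and all sizes are assumptions, and a plain LSTM stands in for the authors' jointly trained maximum entropy model): word, part-of-speech, and language-identifier embeddings are concatenated before the recurrent layer.

```python
import torch
import torch.nn as nn

class FactoredRNNLM(nn.Module):
    def __init__(self, vocab=5000, pos_tags=40, lang_ids=2, emb=100, hid=200):
        super().__init__()
        self.w = nn.Embedding(vocab, emb)        # word embeddings
        self.p = nn.Embedding(pos_tags, 16)      # part-of-speech tag embeddings
        self.l = nn.Embedding(lang_ids, 4)       # language identifier embeddings
        self.rnn = nn.LSTM(emb + 16 + 4, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)

    def forward(self, words, pos, lid):
        x = torch.cat([self.w(words), self.p(pos), self.l(lid)], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)                       # next-word logits per position

lm = FactoredRNNLM()
w, p, l = (torch.randint(0, n, (2, 10)) for n in (5000, 40, 2))
print(lm(w, p, l).shape)  # torch.Size([2, 10, 5000])
```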

2004
Jinmiao Chen Narendra S. Chaudhari

Bidirectional recurrent neural network (BRNN) is a noncausal system that captures both upstream and downstream information for protein secondary structure prediction. Due to the problem of vanishing gradients, the BRNN cannot learn remote information efficiently. To mitigate this problem, we propose the segmented memory recurrent neural network (SMRNN) and use SMRNNs to replace the standard RNNs in BRN...
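
A minimal PyTorch sketch of a bidirectional recurrent tagger for per-residue labels (illustrative only; a standard LSTM stands in here for the segmented-memory units proposed in the paper, and all sizes are assumed).

```python
import torch
import torch.nn as nn

class BRNNTagger(nn.Module):
    def __init__(self, num_aa=20, emb=32, hid=64, num_classes=3):
        super().__init__()
        self.emb = nn.Embedding(num_aa, emb)
        self.brnn = nn.LSTM(emb, hid, batch_first=True, bidirectional=True)
        self.cls = nn.Linear(2 * hid, num_classes)   # forward + backward state per position

    def forward(self, residues):
        h, _ = self.brnn(self.emb(residues))          # each position sees upstream and downstream context
        return self.cls(h)                            # per-residue class logits (e.g. helix/sheet/coil)

tagger = BRNNTagger()
print(tagger(torch.randint(0, 20, (1, 30))).shape)    # torch.Size([1, 30, 3])
```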

2002
P. H. Kirkegaard S. R. K. Nielsen H. I. Hansen

Two different partially recurrent neural networks structured as Multi Layer Perceptrons (MLP) are investigated for time domain identification of a nonlinear structure. The one partially recurrent neural network has feedback of a displacement component from the output layer to a tapped-delay-line (TDL) input layer. The other recurrent neural network based on the Innovation State Space model (I...
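
A small sketch of output feedback through a tapped-delay line, in the spirit of the first architecture described above (untrained, with assumed sizes; the number of delays and the layer widths are illustrative).

```python
import torch
import torch.nn as nn

delays = 3
mlp = nn.Sequential(nn.Linear(1 + delays, 16), nn.Tanh(), nn.Linear(16, 1))

tdl = torch.zeros(delays)                    # tapped-delay line of past predicted displacements
excitation = torch.randn(50)                 # external input sequence (e.g. load history)
preds = []
for u in excitation:
    x = torch.cat([u.view(1), tdl])          # current excitation + delayed output feedback
    y = mlp(x)                               # predicted displacement at this step
    tdl = torch.cat([y.detach(), tdl[:-1]])  # shift the delay line with the network's own output
    preds.append(y.item())
print(preds[:5])
```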

2013
N. Sivasankari M. Malleswaran

Integration of Global Positioning System (GPS) and Inertial Navigation System (INS) has been extensively used in aircraft applications like autopilot, to provide better navigation, even in the absence of GPS. Even though Kalman Filter (KF) based GPS/INS integration provides a robust solution to navigation, it requires prior knowledge of the error model of INS, which increases the complexity of ...

2013
Philemon Brakel Dirk Stroobandt Benjamin Schrauwen

We propose a bidirectional truncated recurrent neural network architecture for speech denoising. Recent work showed that deep recurrent neural networks perform well at speech denoising tasks and outperform feed-forward architectures [1]. However, recurrent neural networks are difficult to train and their simulation does not allow for much parallelization. Given the increasing availability of pa...
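
A hedged sketch of the truncation idea, assuming the utterance is cut into equal-length chunks that a bidirectional GRU processes as one batch (names, sizes, and the reshaping scheme are assumptions, not the paper's implementation).

```python
import torch
import torch.nn as nn

class TruncatedBiGRUDenoiser(nn.Module):
    def __init__(self, feat_dim=40, hid=64, chunk_len=25):
        super().__init__()
        self.chunk_len = chunk_len
        self.gru = nn.GRU(feat_dim, hid, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hid, feat_dim)      # predict clean features per frame

    def forward(self, noisy):                        # noisy: (frames, feat_dim), frames divisible by chunk_len
        chunks = noisy.view(-1, self.chunk_len, noisy.size(-1))  # treat chunks as a batch
        h, _ = self.gru(chunks)                      # all chunks processed in parallel
        return self.out(h).reshape(noisy.shape)      # stitch chunks back into one sequence

model = TruncatedBiGRUDenoiser()
print(model(torch.randn(100, 40)).shape)             # torch.Size([100, 40])
```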

Chart of the number of search results per year
