Search results for: recurrent neural network

Number of results: 942527

1996
Davor Pavisic Jean-Philippe Draye Roberto Teran Gustavo Calderon Guy Cheron Gaetan Libert

Journal: CoRR 2018
Sovan Biswas Juergen Gall

A group of persons can be analyzed at various semantic levels such as individual actions, their interactions, and the activity of the entire group. In this paper, we propose a structural recurrent neural network (SRNN) that uses a series of interconnected RNNs to jointly capture the actions of individuals, their interactions, as well as the group activity. While previous structural recurrent ne...
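The snippet below is a minimal sketch of the general idea rather than the authors' exact SRNN: shared per-person GRUs summarize individual tracks, a pooled summary feeds a group-level GRU, and separate heads predict individual actions and the group activity. All layer names, dimensions, and the pooling scheme are illustrative assumptions.

```python
# Illustrative sketch of a structural RNN for group activity (not the paper's
# exact architecture; sizes, pooling, and heads are assumptions).
import torch
import torch.nn as nn

class GroupSRNN(nn.Module):
    def __init__(self, feat_dim=32, person_hidden=64, group_hidden=64,
                 n_actions=5, n_activities=4):
        super().__init__()
        self.person_rnn = nn.GRU(feat_dim, person_hidden, batch_first=True)    # shared across person tracks
        self.group_rnn = nn.GRU(person_hidden, group_hidden, batch_first=True)  # consumes pooled person states
        self.action_head = nn.Linear(person_hidden, n_actions)      # individual action per person
        self.activity_head = nn.Linear(group_hidden, n_activities)  # activity of the whole group

    def forward(self, tracks):
        # tracks: (batch, n_persons, time, feat_dim) per-person feature sequences
        b, p, t, d = tracks.shape
        person_out, _ = self.person_rnn(tracks.reshape(b * p, t, d))
        person_out = person_out.reshape(b, p, t, -1)
        actions = self.action_head(person_out)          # (b, p, t, n_actions)
        pooled = person_out.max(dim=1).values           # pool over persons -> (b, t, person_hidden)
        group_out, _ = self.group_rnn(pooled)
        activity = self.activity_head(group_out[:, -1]) # classify from the final time step
        return actions, activity

tracks = torch.randn(2, 6, 10, 32)            # 2 clips, 6 persons, 10 frames
actions, activity = GroupSRNN()(tracks)
```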

2000
Alessio Micheli Diego Sona Alessandro Sperduti

Recurrent neural networks fail to deal with prediction tasks which do not satisfy the causality assumption. We propose to exploit bi-causality to extend the Recurrent Cascade Correlation model in order to deal with contextual prediction tasks. Preliminary results on artificial data show the ability of the model to preserve the prediction capability of Recurrent Cascade Correlation on strict cau...
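Recurrent Cascade Correlation grows hidden units constructively, which is not reproduced here; the short sketch below only illustrates the underlying "bi-causal" idea that a prediction at step t may depend on context from both directions of the sequence. Dimensions and the readout are assumptions.

```python
# Bidirectional pass as an illustration of contextual (bi-causal) prediction;
# this is not the Recurrent Cascade Correlation algorithm itself.
import torch
import torch.nn as nn

rnn = nn.GRU(input_size=8, hidden_size=16, batch_first=True, bidirectional=True)
readout = nn.Linear(2 * 16, 3)            # 3 output classes, arbitrary choice

x = torch.randn(4, 20, 8)                 # (batch, time, features)
states, _ = rnn(x)                        # (4, 20, 32): forward + backward context
y = readout(states)                       # each step's prediction sees both directions
```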

2017
Shun Hasegawa Yuta Kikuchi Hiroya Takamura Manabu Okumura

In English, high-quality sentence compression models that delete words have been trained on automatically created large training datasets. We work on Japanese sentence compression with a similar approach. To create a large Japanese training dataset, a method for creating an English training dataset is modified based on the characteristics of the Japanese language. The created dataset is used to train...
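Deletion-based compression can be framed as per-token binary tagging (keep vs. delete). The sketch below shows that framing only; the vocabulary size, model layout, and dimensions are assumptions and do not reflect the paper's actual model or dataset construction.

```python
# Minimal sketch of deletion-based sentence compression as keep/delete tagging.
import torch
import torch.nn as nn

class DeleteTagger(nn.Module):
    def __init__(self, vocab=10000, emb=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)     # 0 = delete token, 1 = keep token

    def forward(self, token_ids):                # (batch, seq_len) integer ids
        h, _ = self.rnn(self.emb(token_ids))
        return self.head(h)                      # (batch, seq_len, 2) logits

tagger = DeleteTagger()
logits = tagger(torch.randint(0, 10000, (2, 15)))
keep_mask = logits.argmax(dim=-1)                # 1 where the word is kept
```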

Journal: CoRR 2014
Christopher S. Kirk

This paper describes recent development and test implementation of a continuous time recurrent neural network that has been configured to predict rates of change in securities. It presents outcomes in the context of popular technical analysis indicators and highlights the potential impact of continuous predictive capability on securities market trading operations.
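A continuous-time RNN evolves its hidden state by a differential equation rather than discrete updates. The paper's specific configuration for securities prediction is not described in the snippet above, so the weights, time constant, inputs, and readout below are purely illustrative; only the standard CTRNN dynamics with Euler integration are shown.

```python
# Minimal CTRNN sketch: tau * dh/dt = -h + tanh(W h + W_in x), Euler-integrated.
import numpy as np

rng = np.random.default_rng(0)
n_hidden, n_in = 16, 4
W = rng.normal(scale=0.5, size=(n_hidden, n_hidden))   # recurrent weights
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))    # input weights
w_out = rng.normal(scale=0.5, size=n_hidden)           # linear readout
tau, dt = 1.0, 0.1                                      # time constant, step size

def step(h, x):
    # one Euler step of the continuous-time dynamics
    return h + (dt / tau) * (-h + np.tanh(W @ h + W_in @ x))

h = np.zeros(n_hidden)
for t in range(100):
    x = rng.normal(size=n_in)                 # stand-in for price-derived features
    h = step(h, x)
    predicted_rate_of_change = w_out @ h      # illustrative scalar prediction
```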

1999
Barbara Hammer

In this paper we show several approximation results for folding networks, a generalization of partial recurrent neural networks in which not only time sequences but arbitrary trees can serve as input: Any measurable function can be approximated in probability. Any continuous function can be approximated in the maximum norm on inputs with restricted height, but the resources necessarily increa...
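The core mechanism of a folding network is a shared cell applied recursively over a tree, so that trees of arbitrary shape map to a fixed-size code. The sketch below shows only that recursion; the cell form, label encoding, and dimensions are illustrative assumptions, not the construction used in the approximation proofs.

```python
# Sketch of the folding-network idea: recursive encoding of labeled binary trees.
import numpy as np

rng = np.random.default_rng(0)
dim_label, dim_code = 3, 8
W_label = rng.normal(scale=0.3, size=(dim_code, dim_label))
W_left = rng.normal(scale=0.3, size=(dim_code, dim_code))
W_right = rng.normal(scale=0.3, size=(dim_code, dim_code))

def encode(tree):
    """tree = (label, left_subtree_or_None, right_subtree_or_None)."""
    label, left, right = tree
    code = W_label @ np.asarray(label, dtype=float)
    if left is not None:
        code = code + W_left @ encode(left)
    if right is not None:
        code = code + W_right @ encode(right)
    return np.tanh(code)

leaf = ([1.0, 0.0, 0.0], None, None)
tree = ([0.0, 1.0, 0.0], leaf, ([0.0, 0.0, 1.0], leaf, None))
fixed_size_code = encode(tree)     # same-size vector for any tree shape
```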

1995
Davor Pavisic Laurent Blondel Jean-Philippe Draye Gaetan Libert Pierre Chapelle

Journal: Neurocomputing 2005
Patrick D. Roberts

This study investigates the temporal dynamics of recurrent layers and their relevance to the storage of temporal information. A recurrent layer is shown to generate a dynamical basis that allows cancellation of predictable sensory images via an adaptive mechanism based on spike-timing dependent plasticity.
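A loose, rate-based sketch of the cancellation idea follows; it is not the paper's spiking STDP model. A fixed recurrent layer serves as a temporal basis, and an adaptive readout learns, via a simple delta-rule stand-in for the plasticity mechanism, to predict and subtract the predictable part of a sensory signal. The learning rule, sizes, and signals are assumptions.

```python
# Rate-based sketch: adaptive readout over a fixed recurrent layer learns to
# cancel a predictable (periodic) sensory image.
import numpy as np

rng = np.random.default_rng(1)
n = 50
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))   # fixed recurrent weights
w_out = np.zeros(n)                                    # adaptive readout
lr = 0.01

h = np.zeros(n)
for t in range(2000):
    drive = np.zeros(n)
    drive[t % 20] = 1.0                  # periodic, hence predictable, drive
    h = np.tanh(W @ h + drive)           # recurrent layer as a dynamical basis
    sensory = np.sin(2 * np.pi * (t % 20) / 20)   # predictable sensory image
    prediction = w_out @ h
    residual = sensory - prediction      # what remains after cancellation
    w_out += lr * residual * h           # delta-rule update shrinks the residual
```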

Chart of the number of search results per year
