Search results for: backpropagation

Number of results: 7478

2004
Erik Hulthén, Mattias Wahde

Some results from a method for generating recurrent neural networks (RNNs) for prediction of financial and macroeconomic time series are presented. In the presented method, a feedforward neural network (FFNN) is first obtained using backpropagation. While backpropagation is usually able to find a fairly good predictor, all FFNNs are limited by their lack of short-term dynamic memory. RNNs, by con...
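
As a concrete reference point for the first stage the abstract describes, here is a minimal sketch of training a one-hidden-layer FFNN by backpropagation for one-step-ahead time-series prediction. The toy sine series, window length, layer sizes, and learning rate are all illustrative assumptions, not the authors' setup.

    import numpy as np

    # Toy series and supervised windows (all sizes are illustrative).
    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20, 500))
    window = 5
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:].reshape(-1, 1)

    # One hidden layer: window -> 16 -> 1.
    W1 = rng.normal(0, 0.1, (window, 16)); b1 = np.zeros(16)
    W2 = rng.normal(0, 0.1, (16, 1));      b2 = np.zeros(1)
    lr = 0.1
    for epoch in range(500):
        h = np.tanh(X @ W1 + b1)              # forward pass
        pred = h @ W2 + b2
        err = pred - y                        # dLoss/dpred for mean squared error
        dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h ** 2)      # chain rule through tanh
        dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1        # gradient descent step
        W2 -= lr * dW2; b2 -= lr * db2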

Journal: CoRR 2017
Nan Rosemary Ke, Anirudh Goyal, Olexa Bilaniuk, Jonathan Binas, Laurent Charlin, Christopher Joseph Pal, Yoshua Bengio

A major drawback of backpropagation through time (BPTT) is the difficulty of learning long-term dependencies, coming from having to propagate credit information backwards through every single step of the forward computation. This makes BPTT both computationally impractical and biologically implausible. For this reason, full backpropagation through time is rarely used on long sequences, and trun...
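
For readers unfamiliar with the truncation the abstract refers to, a minimal PyTorch sketch of truncated BPTT follows: gradients flow only within each chunk, and the hidden state is detached between chunks. The toy model, random data, and truncation length k are assumptions for illustration.

    import torch
    import torch.nn as nn

    # Toy RNN, data, and truncation length (all assumptions).
    rnn = nn.RNN(input_size=8, hidden_size=32, batch_first=True)
    head = nn.Linear(32, 1)
    opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.01)

    seq = torch.randn(1, 1000, 8)             # one long sequence
    target = torch.randn(1, 1000, 1)
    k = 50
    h = None
    for t0 in range(0, seq.size(1), k):
        out, h = rnn(seq[:, t0:t0 + k], h)
        loss = ((head(out) - target[:, t0:t0 + k]) ** 2).mean()
        opt.zero_grad()
        loss.backward()                       # credit flows back at most k steps
        opt.step()
        h = h.detach()                        # truncation: cut the graph here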

2004
Z. Zainuddin, N. Mahat, Y. Abu Hassan

Since the presentation of the backpropagation algorithm, a vast variety of improvements of the technique for training feedforward neural networks has been proposed. This article focuses on two classes of acceleration techniques; one is known as local adaptive techniques, which are based on weight-specific information only, such as the temporal behavior of the partial derivative of the current weight. The ...
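
As one well-known instance of such a local adaptive technique, here is a sketch of an Rprop-style update rule, in which each weight keeps its own step size that grows when successive partial derivatives agree in sign and shrinks when they disagree. The function and its hyperparameter values are illustrative, not taken from the article.

    import numpy as np

    # Rprop-style local adaptive step; eta/step bounds are common defaults.
    def rprop_step(w, grad, prev_grad, step,
                   eta_plus=1.2, eta_minus=0.5, step_min=1e-6, step_max=1.0):
        agree = grad * prev_grad               # sign agreement of successive gradients
        step = np.where(agree > 0, np.minimum(step * eta_plus, step_max), step)
        step = np.where(agree < 0, np.maximum(step * eta_minus, step_min), step)
        return w - np.sign(grad) * step, step  # only the gradient's sign is used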

1999
Nathalie Japkowicz, Stephen Jose Hanson

Possible paradigms for concept learning by feedforward neural networks include discrimination and recognition. An interesting aspect of this dichotomy is that the recognition-based implementation can learn certain domains much more efficiently than the discrimination-based one, despite the close structural relationship between the two systems. The purpose of this paper is to explain this differ...
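
A recognition-based learner is often realized as an autoencoder trained only on positive examples of the concept, with membership decided by reconstruction error; the sketch below illustrates that idea in PyTorch. The layer sizes, stand-in data, and threshold are assumptions, and the paper's actual architecture may differ.

    import torch
    import torch.nn as nn

    # Autoencoder trained on positive examples only (sizes/data are stand-ins).
    ae = nn.Sequential(nn.Linear(20, 8), nn.Tanh(), nn.Linear(8, 20))
    opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
    pos = torch.randn(256, 20)                # stand-in for concept examples
    for _ in range(500):
        loss = ((ae(pos) - pos) ** 2).mean()  # reconstruction error
        opt.zero_grad(); loss.backward(); opt.step()

    def is_member(x, threshold=0.5):          # hypothetical decision rule
        with torch.no_grad():
            return ((ae(x) - x) ** 2).mean(dim=1) < threshold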

1995
Barak A. Pearlmutter

We survey learning algorithms for recurrent neural networks with hidden units, and put the various techniques into a common framework. We discuss fixed-point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixed-point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propaga...

2014
Danilo Jimenez Rezende, Shakir Mohamed, Daan Wierstra

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound. We develop stochastic backpropa...
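
The stochastic backpropagation the abstract introduces relies on rewriting sampling as a differentiable transformation of noise (the reparameterization trick); a minimal PyTorch sketch of that pathwise gradient follows. The objective f is a stand-in chosen for brevity, not the paper's variational lower bound.

    import torch

    # Reparameterization: z = mu + sigma * eps makes the sample differentiable.
    mu = torch.zeros(4, requires_grad=True)
    log_sigma = torch.zeros(4, requires_grad=True)

    eps = torch.randn(128, 4)                 # noise drawn outside the graph
    z = mu + log_sigma.exp() * eps            # differentiable sampling path
    f = (z ** 2).sum(dim=1).mean()            # stand-in Monte Carlo objective
    f.backward()                              # gradients reach mu and log_sigma
    print(mu.grad, log_sigma.grad)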

2004
James N. Etheredge

For many applications random access to data is critical to providing users with the level of efficiency necessary to make applications usable. It is also common to maintain data files in sequential order to allow batch processing of the data. This paper presents a method that uses a modified backpropagation neural network to locate records in a file randomly. The modifications necessary to the ...

2010
Frauke Günther

Artificial neural networks are applied in many situations. neuralnet is built to train multi-layer perceptrons in the context of regression analyses, i.e. to approximate functional relationships between covariates and response variables. Thus, neural networks are used as extensions of generalized linear models. neuralnet is a very flexible package. The backpropagation algorithm and three versio...
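
neuralnet itself is an R package, so as a rough illustration of the same idea (a backpropagation-trained multi-layer perceptron used for regression) here is a scikit-learn analogue in Python. The data, functional relationship, and hyperparameters are made up for the example and do not reflect neuralnet's defaults.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Toy regression problem (made-up functional relationship).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 2))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

    # Backpropagation-trained MLP with one hidden layer of 10 tanh units.
    mlp = MLPRegressor(hidden_layer_sizes=(10,), activation='tanh',
                       solver='sgd', learning_rate_init=0.01, max_iter=2000)
    mlp.fit(X, y)
    print(mlp.predict(X[:5]))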

1998
Steven Walczak, Walter E. Pofahl, Ronald J. Scorpio

Critical care providers are faced with resource shortages and must find ways to effectively plan their resource utilization. Neural networks provide a new method for evaluating trauma patient (and other medical patient) level of illness and accurately predicting a patient’s length of stay at the critical care facility. Backpropagation, radial-basis-function, and fuzzy ARTMAP neural networks are...

2016
Audrunas Gruslys, Rémi Munos, Ivo Danihelka, Marc Lanctot, Alex Graves

We propose a novel approach to reduce memory consumption of the backpropagation through time (BPTT) algorithm when training recurrent neural networks (RNNs). Our approach uses dynamic programming to balance a trade-off between caching of intermediate results and recomputation. The algorithm is capable of tightly fitting within almost any user-set memory budget while finding an optimal execution...
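
The underlying trade-off (store fewer activations, recompute them during the backward pass) can be sketched with PyTorch's generic checkpoint utility, as below. The toy cell, sequence length, and fixed segment size are assumptions; the paper's dynamic-programming policy for choosing what to cache under a memory budget is not reproduced here.

    import torch
    from torch.utils.checkpoint import checkpoint

    # Toy cell and a long sequence; only segment-boundary states are cached.
    cell = torch.nn.RNNCell(8, 32)
    x = torch.randn(1000, 1, 8)               # T = 1000 steps
    h = torch.zeros(1, 32, requires_grad=True)

    def run_segment(h, xs):                   # interior states are not stored
        for t in range(xs.size(0)):
            h = cell(xs[t], h)
        return h

    segment = 50  # fixed segment size; the paper's DP would choose the schedule
    for t0 in range(0, x.size(0), segment):
        # only h at segment boundaries is kept for the backward pass
        h = checkpoint(run_segment, h, x[t0:t0 + segment], use_reentrant=False)
    torch.sum(h).backward()                   # triggers segment recomputation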

[Chart: number of search results per year]