Search results for: backpropagation

Number of results: 7478

Journal: Proxies: Jurnal Informatika 2021

In this modern era, many algorithms can be used to classify the weather; one of them is Backpropagation. Using Backpropagation, the research took temperature, pressure, humidity, wind speed, rain, and clouds as input parameters, and clear or rain as the output classes. Backpropagation consists of a learning process and a testing process. To get the optimal weights and test the classification, the data was split into 1600 training samples (80%...
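
As a companion to this abstract, here is a minimal, self-contained sketch of the kind of classifier it describes: a small network trained by backpropagation on the six named weather features, with an 80/20 split. The data, layer sizes, and learning rate are illustrative assumptions, not the paper's.

```python
# A minimal sketch (not the paper's code) of a backpropagation classifier
# for weather data: six input features -> hidden layer -> clear/rain output.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 2000 samples of the six features named in the
# abstract (temperature, pressure, humidity, wind speed, rain, clouds).
X = rng.normal(size=(2000, 6))
y = (X[:, 2] + X[:, 5] > 0).astype(float).reshape(-1, 1)  # toy "rain" label

# 80/20 split, mirroring the 1600-sample training set in the abstract.
X_train, X_test = X[:1600], X[1600:]
y_train, y_test = y[:1600], y[1600:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units; sizes are illustrative choices.
W1 = rng.normal(scale=0.5, size=(6, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(500):
    # Forward pass (the "learning process" in the abstract).
    h = sigmoid(X_train @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of (half) mean squared error.
    dp = (p - y_train) * p * (1 - p) / len(X_train)
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * h * (1 - h)
    dW1 = X_train.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Testing process: classify the held-out 20%.
pred = sigmoid(sigmoid(X_test @ W1 + b1) @ W2 + b2) > 0.5
print("test accuracy:", (pred == (y_test > 0.5)).mean())
```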

2002
Deniz Erdogmus, José Carlos Príncipe, Luis Vielva, David Luengo

Adaptive systems research is mainly concentrated on optimizing cost functions suited to the problem at hand. Recently, Principe et al. proposed a particle interaction model for information theoretical learning. In this paper, inspired by this idea, we propose a generalization of the particle interaction model for learning and system adaptation. In addition, for the special case of supervised multi-l...
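
For reference, the particle-interaction idea in information theoretical learning can be made concrete with Principe et al.'s information potential: the Parzen estimate of Renyi's quadratic entropy of the errors, in which pairs of error samples interact through a kernel like particles. The kernel width and data below are illustrative assumptions.

```python
# A minimal sketch of the "information potential" cost from information
# theoretic learning: each error sample acts like a particle interacting
# with the others through a Gaussian kernel.
import numpy as np

def information_potential(errors, sigma=1.0):
    """V(e) = (1/N^2) sum_ij G(e_i - e_j; sqrt(2)*sigma) for scalar errors."""
    e = np.asarray(errors).reshape(-1, 1)
    diff = e - e.T                       # pairwise "particle" distances
    var = 2.0 * sigma**2                 # variance of the convolved kernel
    g = np.exp(-diff**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return g.mean()

# Renyi's quadratic entropy estimate; minimizing it (i.e. maximizing V)
# drives the errors to concentrate, which serves as the adaptation criterion.
errors = np.random.default_rng(0).normal(size=100)
print("entropy estimate:", -np.log(information_potential(errors)))
```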

Journal: IEEE Transactions on Neural Networks 1996
Marco Gori, Marco Maggini

Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the feedforward-network companion of Rosenblatt's PC (perceptron convergence) theorem (1960), stating that ...
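
For context, the pattern-mode update the convergence result concerns, contrasted with batch mode (notation ours, not the paper's):

```latex
% Pattern (online) mode updates the weights after each presented example
% p(t), while batch mode sums the gradient over all P patterns per update:
w_{t+1} = w_t - \eta \, \nabla_w E_{p(t)}(w_t)
\qquad \text{vs.} \qquad
w_{t+1} = w_t - \eta \, \nabla_w \sum_{p=1}^{P} E_p(w_t)
```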

2008
Abdallah El Ali, Loes Bazen, Iris Groen, Elisa Hermanides, Wouter Kool, David Neville, Kendall Rattner, Jaap Murre

Various methods to overcome the catastrophic interference effect in backpropagation networks are directly compared on a simple learning task. Interleaved learning delivered the best results: in a backpropagation network the pattern “McClelland” was retained after learning the pattern “soup”. Neither the implementation of a sharpening function, nor adjustment of the activation function improved ...
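
A minimal sketch of the interleaved-learning remedy the abstract reports as most effective: each training batch mixes the new pattern with replayed old ones, so gradients for the new item never act alone. The encodings and batch scheme here are hypothetical.

```python
# Interleaved learning: instead of training on the new pattern alone (which
# lets it overwrite "McClelland"), each update mixes the new pattern with
# rehearsed old ones.
import numpy as np

def interleaved_batches(old_patterns, new_pattern, rng):
    """Yield training batches mixing the new item with replayed old items."""
    while True:
        replay = old_patterns[rng.integers(len(old_patterns))]
        yield np.stack([new_pattern, replay])   # one new + one old per batch

# Usage sketch: feed these batches to any backpropagation trainer so that
# gradients for "soup" are always balanced by gradients for "McClelland".
rng = np.random.default_rng(0)
old = np.eye(8)[:4]          # toy encodings of previously learned patterns
new = np.eye(8)[4]
batches = interleaved_batches(old, new, rng)
print(next(batches).shape)   # (2, 8)
```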

2013
Gunjan Mehta, Sonia Vatta

Face recognition is a system that identifies human faces in an image database or in a video frame. The paper presents a literature review of face recognition approaches. It then explains two different algorithms for feature extraction: Principal Component Analysis and the Fisherfaces algorithm. It also explains how images can be recognized using a Backpropagation algorithm on a Feedf...
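
A sketch of the pipeline this abstract outlines, under illustrative assumptions about image size and count: PCA-based feature extraction (eigenfaces) producing compact vectors that a backpropagation-trained feedforward network would then classify.

```python
# PCA feature extraction for face recognition (eigenfaces); image shapes
# and counts are made-up stand-ins.
import numpy as np

rng = np.random.default_rng(0)
faces = rng.random(size=(40, 64 * 64))       # 40 flattened face images

# PCA: project mean-centered images onto the top-k right singular vectors
# (principal components), yielding compact features for the classifier.
mean = faces.mean(axis=0)
centered = faces - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 10
eigenfaces = vt[:k]                          # top-k principal components
features = centered @ eigenfaces.T           # (40, k) feature vectors

# These k-dimensional features would then be fed to a small feedforward
# network trained by backpropagation (as in the sketch after the first
# abstract above) instead of the raw 4096-pixel images.
print(features.shape)
```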

1998
Tomasz J. Cholewo, Jacek M. Zurada

FIR neural networks are feedforward neural networks with regular scalar synapses replaced by linear finite impulse response filters. This paper introduces the Second Order Temporal Backpropagation algorithm which enables the exact calculation of the second order error derivatives for a FIR neural network. This method is based on the error gradient calculation method first proposed by Wan and re...
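
To make the architecture concrete: in an FIR network, each scalar weight becomes a finite impulse response filter, so a synapse's output depends on current and past inputs. A minimal sketch, with an assumed 3-tap filter and toy signal:

```python
# An FIR synapse: the usual scalar weight is replaced by a finite impulse
# response filter, so the connection's output is a weighted sum of the
# current and past inputs.
import numpy as np

def fir_synapse(x, taps):
    """y[t] = sum_k taps[k] * x[t - k], with x[t] = 0 for t < 0."""
    T, K = len(x), len(taps)
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            if t - k >= 0:
                y[t] += taps[k] * x[t - k]
    return y

x = np.sin(np.linspace(0.0, 6.28, 20))   # example input signal
taps = np.array([0.5, 0.3, 0.2])         # a 3-tap FIR "weight"
print(fir_synapse(x, taps)[:5])
# Temporal backpropagation unfolds these filters in time to compute exact
# gradients; the paper extends that to exact second-order derivatives.
```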

2009
Christopher Altman, Roman R. Zapatrin

We introduce a robust, error-tolerant adaptive training algorithm for generalized learning paradigms in high-dimensional superposed quantum networks, or adaptive quantum networks. The formalized procedure applies standard backpropagation training across a coherent ensemble of discrete topological configurations of individual neural networks, each of which is formally merged into appropriate lin...

1992
William Finnoff

In this paper we discuss the asymptotic properties of the most commonly used variant of the backpropagation algorithm, in which network weights are trained by means of a local gradient descent on examples drawn randomly from a fixed training set, and the learning rate η of the gradient updates is held constant (simple backpropagation). Using stochastic approximation results, we show that for η...
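
A minimal sketch of "simple backpropagation" as defined here, with a toy scalar least-squares model standing in for the network so the constant-step loop stays visible; the data and step size are illustrative assumptions.

```python
# Simple backpropagation per the abstract: examples drawn at random from a
# fixed training set, constant learning rate eta, one local gradient step
# per example.
import numpy as np

rng = np.random.default_rng(0)
xs = rng.normal(size=200)
ys = 3.0 * xs + rng.normal(scale=0.1, size=200)   # fixed training set

w, eta = 0.0, 0.05
trace = []
for step in range(2000):
    i = rng.integers(len(xs))                 # example drawn at random
    grad = (w * xs[i] - ys[i]) * xs[i]        # local gradient on one example
    w -= eta * grad                           # constant-step update
    trace.append(w)

# With eta held constant the iterates do not settle to a point; they hover
# in a noise ball around the minimizer, which is what the asymptotic
# (stochastic approximation) analysis characterizes.
print("last weights:", np.round(trace[-5:], 3))
```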

Journal: CoRR 2017
Benjamin Scellier, Yoshua Bengio

Recurrent Backpropagation and Equilibrium Propagation are algorithms for fixed point recurrent neural networks which differ in their second phase. In the first phase, both algorithms converge to a fixed point which corresponds to the configuration where the prediction is made. In the second phase, Recurrent Backpropagation computes error derivatives whereas Equilibrium Propagation relaxes to an...
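
For reference, the gradient estimate that Equilibrium Propagation's second phase implements, in our paraphrase of the paper's result:

```latex
% With energy E, cost C, total energy F = E + \beta C, free fixed point
% s^0, and weakly clamped ("nudged") fixed point s^\beta, the loss
% gradient is recovered in the limit of small nudging:
\frac{\partial J}{\partial \theta}
  = \lim_{\beta \to 0} \frac{1}{\beta}
    \left( \frac{\partial F}{\partial \theta}\bigl(\theta, \beta, s^{\beta}\bigr)
         - \frac{\partial F}{\partial \theta}\bigl(\theta, 0, s^{0}\bigr) \right)
```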

Journal: CoRR 2015
Shixiang Gu, Sergey Levine, Ilya Sutskever, Andriy Mnih

Deep neural networks are powerful parametric models that can be trained efficiently using the backpropagation algorithm. Stochastic neural networks combine the power of large parametric functions with that of graphical models, which makes it possible to learn very complex distributions. However, as backpropagation is not directly applicable to stochastic networks that include discrete sampling ...
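
For background on why discrete sampling blocks backpropagation, here is the standard score-function (REINFORCE) estimator for a single Bernoulli unit; it is a common baseline in this literature, not the specific estimator this paper proposes.

```python
# The sample b has no derivative in the logit, so instead of backprop we
# use the score-function identity: grad = E[f(b) * d log P(b)/d logit].
import numpy as np

rng = np.random.default_rng(0)

def grad_logit_reinforce(logit, f, n=10000):
    """Estimate d/d logit of E_{b ~ Bernoulli(sigmoid(logit))}[f(b)]."""
    p = 1.0 / (1.0 + np.exp(-logit))
    b = (rng.random(n) < p).astype(float)   # discrete samples: no gradient path
    dlogp = b - p                           # d log P(b)/d logit for a Bernoulli
    return np.mean(f(b) * dlogp)

f = lambda b: (b - 0.3) ** 2                # some downstream loss
est = grad_logit_reinforce(0.5, f)

# Exact gradient for comparison: E[f] = p f(1) + (1-p) f(0), so
# dE/d logit = p (1-p) (f(1) - f(0)).
p = 1.0 / (1.0 + np.exp(-0.5))
exact = p * (1 - p) * (f(1.0) - f(0.0))
print("estimate:", round(est, 4), "exact:", round(exact, 4))
```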

[Chart: number of search results per year]