Search results for: back propagation algorithm

Number of results: 984050

2017
Tian Han Yang Lu Song-Chun Zhu Ying Nian Wu

This paper proposes an alternating back-propagation algorithm for learning the generator network model. The model is a nonlinear generalization of factor analysis. In this model, the mapping from the continuous latent factors to the observed signal is parametrized by a convolutional neural network. The alternating back-propagation algorithm iterates the following two steps: (1) Inferential back...
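
As a rough illustration of the two alternating steps, the minimal PyTorch sketch below infers the latent factors by gradient descent on the reconstruction error (the paper actually uses Langevin dynamics, i.e. gradient steps plus noise) and then updates the generator weights with the inferred factors held fixed; the network size, step counts and learning rates are illustrative assumptions, not the authors' settings.

import torch
import torch.nn as nn

latent_dim, data_dim = 8, 64
generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.Tanh(), nn.Linear(128, data_dim))
opt = torch.optim.SGD(generator.parameters(), lr=1e-3)

x = torch.randn(32, data_dim)                        # stand-in for a batch of observed signals
z = torch.zeros(32, latent_dim, requires_grad=True)  # continuous latent factors, one row per signal

for it in range(100):
    # (1) inferential back-propagation: move z toward the posterior mode for the current generator
    for _ in range(10):
        loss_z = ((generator(z) - x) ** 2).sum() + 0.5 * (z ** 2).sum()  # reconstruction + Gaussian prior
        grad_z, = torch.autograd.grad(loss_z, z)
        with torch.no_grad():
            z -= 0.01 * grad_z
    # (2) learning back-propagation: update the generator weights with the inferred z held fixed
    opt.zero_grad()
    loss_w = ((generator(z.detach()) - x) ** 2).mean()
    loss_w.backward()
    opt.step()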

2009
Masashi Nakagawa Takashi Inoue Yoshifumi Nishio

Cellular neural networks (CNN) were introduced by Chua and Yang in 1988 [1]. The idea of the CNN was inspired by the architectures of cellular automata and neural networks. Unlike conventional neural networks, the CNN has a local connectivity property. Since the structure of the CNN resembles the structure of the animal retina, the CNN can be used for various image processing applicatio...
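
For reference, each CNN cell evolves as dx_ij/dt = -x_ij + Σ A(k,l) y_{i+k,j+l} + Σ B(k,l) u_{i+k,j+l} + I, so a cell is coupled only to its small neighbourhood. The numpy sketch below applies forward-Euler steps of this dynamic over a 3x3 neighbourhood; the templates shown are the commonly quoted edge-detection templates and are illustrative, not taken from the abstract.

import numpy as np
from scipy.signal import convolve2d

def cnn_step(x, u, A, B, bias, dt=0.1):
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))        # piecewise-linear cell output
    dx = -x + convolve2d(y, A, mode="same") + convolve2d(u, B, mode="same") + bias
    return x + dt * dx                               # one forward-Euler step of the cell dynamics

u = np.random.rand(32, 32) * 2 - 1                   # input image scaled to [-1, 1]
x = u.copy()                                         # initial cell states
A = np.array([[0, 0, 0], [0, 2.0, 0], [0, 0, 0]])    # feedback template (local coupling of outputs)
B = np.array([[-1, -1, -1], [-1, 8.0, -1], [-1, -1, -1]])  # control template acting on the input
for _ in range(50):
    x = cnn_step(x, u, A, B, bias=-0.5)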

Thesis: Ministry of Science, Research and Technology - Shahid Bahonar University of Kerman - Faculty of Engineering, 1392

In this research, the group method of data handling (GMDH) is used to estimate the scour depth around hydraulic structures; the structures studied include bridge abutments, fluid transmission pipelines, and bridge piers. The GMDH network is trained by the back propagation, Levenberg-Marquardt, genetic programming, particle swarm optimization, and gravitational search algo...
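
For context on the GMDH building block itself, the numpy sketch below fits one GMDH neuron, i.e. a quadratic polynomial of two inputs, by ordinary least squares on synthetic data; this only illustrates the method's basic unit and is not the thesis' implementation, which trains the network with back propagation, Levenberg-Marquardt and the listed metaheuristics.

import numpy as np

def gmdh_neuron(x1, x2, y):
    # fit y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2 by least squares
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs, X @ coeffs                        # coefficients and fitted values

rng = np.random.default_rng(0)
x1, x2 = rng.random(200), rng.random(200)            # two candidate input variables
y = 1.5 + 0.8 * x1 - 0.3 * x1 * x2 + rng.normal(0, 0.05, 200)  # synthetic target
coeffs, y_hat = gmdh_neuron(x1, x2, y)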

2014
J.-R. VIALA S. Makram-Ebeid J.-A. Sirat

We propose a method for learning in multilayer perceptrons (MLPs). It includes new self-adapting features that make it suitable for dealing with a variety of problems without the need for parameter re-adjustments. The validity of our approach is benchmarked for two types of problems. The first benchmark is performed for the topologically complex parity problem with a number of binary inputs rang...
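
The parity benchmark referred to above labels an input vector 1 exactly when an odd number of its binary inputs are 1. The short PyTorch sketch below builds the full N-bit parity dataset and fits a small MLP to it; the self-adapting features of the proposed method are not reproduced, and plain Adam with illustrative hyperparameters is used just to make the example run.

import itertools
import torch
import torch.nn as nn

N = 4                                                # number of binary inputs
X = torch.tensor(list(itertools.product([0.0, 1.0], repeat=N)))
y = (X.sum(dim=1) % 2).unsqueeze(1)                  # parity target: 1 for an odd number of ones

mlp = nn.Sequential(nn.Linear(N, 8), nn.Tanh(), nn.Linear(8, 1), nn.Sigmoid())
opt = torch.optim.Adam(mlp.parameters(), lr=0.01)
loss_fn = nn.BCELoss()
for _ in range(2000):
    opt.zero_grad()
    loss = loss_fn(mlp(X), y)
    loss.backward()
    opt.step()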

2009
John E. W. Mayhew Neil A. Thacker

The original back-propagation methods were plagued with variable parameters which affected both the convergence properties of the training and the generalisation abilities of the resulting network. These parameters presented many difficulties when attempting to use these networks to solve particular mapping problems. A combination of established numerical minimisation methods (Polak-Ribiere Con...
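
As an illustration of handing the weight update over to an established minimiser instead of hand-tuned learning rate and momentum, the sketch below flattens the weights of a tiny one-hidden-layer network and passes the error function to SciPy's nonlinear conjugate-gradient routine (a Polak-Ribiere variant); the XOR task and network size are illustrative assumptions, not the authors' setup.

import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])                   # XOR targets
sizes = [(4, 2), (4,), (1, 4), (1,)]                 # shapes of W1, b1, W2, b2

def unpack(w):
    parts, i = [], 0
    for shape in sizes:
        n = int(np.prod(shape))
        parts.append(w[i:i + n].reshape(shape))
        i += n
    return parts

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1.T + b1)                       # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2.T + b2)))     # sigmoid output
    return np.mean((out.ravel() - y) ** 2)

w0 = np.random.default_rng(0).normal(0, 0.5, sum(int(np.prod(s)) for s in sizes))
result = minimize(loss, w0, method="CG")             # SciPy's conjugate gradient (Polak-Ribiere variant)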

1994
Shin-ichiro Mori Hiroshi Nakashima Shinji Tomita Olav Landsverk

This paper describes several algorithms mapping the back propagation learning algorithm onto a large 2-D torus architecture. To obtain high speedup, we have suggested an approach to combine the possible parallel aspects (training set parallelism, node parallelism and pipelining of training patterns) of the algorithm. Several algorithms were implemented on a 512-processor Fujitsu AP1000 and c...
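
Of the three parallel aspects mentioned, training set parallelism is the easiest to show in a few lines: each processor computes the gradient on its own slice of the training set and the gradients are combined before a single weight update, which is what a reduction over the torus network achieves in hardware. The numpy sketch below simulates this with a loop over workers and a plain linear model; the worker count and model are illustrative simplifications.

import numpy as np

rng = np.random.default_rng(1)
X, y = rng.normal(size=(512, 10)), rng.normal(size=512)   # synthetic training set
w = np.zeros(10)                                          # model replicated on every processor
n_workers, lr = 8, 0.01
slices = np.array_split(np.arange(len(X)), n_workers)     # one slice of patterns per processor

for epoch in range(100):
    grads = []
    for idx in slices:                                    # each iteration plays the role of one processor
        err = X[idx] @ w - y[idx]
        grads.append(X[idx].T @ err / len(idx))           # local gradient on this slice
    w -= lr * np.mean(grads, axis=0)                      # combine gradients, then one global update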

Maryam Dehbashian Seyed-Hamid Zahiri

Image compression is one of the important research fields in image processing. Up to now, different methods have been presented for image compression. The neural network is one of these methods and has demonstrated good performance in many applications. The usual method for training neural networks is the error back propagation method, whose drawbacks are slow convergence and stopping at points of lo...
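
A common neural-network compression scheme of this kind is a bottleneck autoencoder trained with error back propagation, where the bottleneck activations form the compressed code. The PyTorch sketch below shows that setup on random stand-in data; the block size, bottleneck width and optimiser settings are illustrative assumptions, not those of the paper.

import torch
import torch.nn as nn

blocks = torch.rand(1000, 64)                              # stand-in for flattened 8x8 image blocks
encoder = nn.Sequential(nn.Linear(64, 16), nn.Tanh())      # 4:1 compression into the bottleneck
decoder = nn.Sequential(nn.Linear(16, 64), nn.Sigmoid())
opt = torch.optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), lr=0.1)

for _ in range(500):                                 # plain error back propagation of the reconstruction error
    opt.zero_grad()
    loss = ((decoder(encoder(blocks)) - blocks) ** 2).mean()
    loss.backward()
    opt.step()

codes = encoder(blocks).detach()                     # the compressed representation of each block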

1996
Paolo Campolucci Aurelio Uncini Francesco Piazza

This paper concerns dynamic neural networks for signal processing: architectural issues are considered but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLP with IIR synapses and generalization of Local Feedback MultiLayered Networks (LF MLN), are compared to more traditional neural networks, i.e. static MLP with input and/or output buffer...
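
The IIR-synapse idea can be illustrated in isolation: instead of a single multiplicative weight, each connection filters its input sequence with an infinite impulse response filter, giving the otherwise static MLP a memory of past samples. The sketch below runs one such synapse with scipy.signal.lfilter; the filter orders and coefficients are illustrative assumptions.

import numpy as np
from scipy.signal import lfilter

def iir_synapse(x, b, a):
    # y[n] = b0*x[n] + b1*x[n-1] + ... - a1*y[n-1] - ... (a0 normalised to 1)
    return lfilter(b, a, x)

t = np.arange(200)
x = np.sin(0.1 * t)                                  # input signal arriving at one synapse
b = np.array([0.5, 0.2])                             # moving-average (feed-forward) coefficients
a = np.array([1.0, -0.3])                            # autoregressive (feedback) coefficients
neuron_input = iir_synapse(x, b, a)                  # filtered signal replaces the usual weighted input
neuron_output = np.tanh(neuron_input)                # neuron activation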

Chart of the number of search results per year