Search results for: multilayer perceptrons

Number of results: 20,290

2010
Frauke Günther

Artificial neural networks are applied in many situations. neuralnet is built to train multi-layer perceptrons in the context of regression analyses, i.e. to approximate functional relationships between covariates and response variables. Thus, neural networks are used as extensions of generalized linear models. neuralnet is a very flexible package. The backpropagation algorithm and three versio...
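The regression setting described here can be illustrated with a minimal hand-rolled sketch: one tanh hidden layer trained by batch backpropagation on a toy curve-fitting task. Note that neuralnet itself is an R package; this Python/numpy version only illustrates the underlying algorithm, and all sizes and rates below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=(200, 1))

# One hidden layer of 10 tanh units, linear output
W1 = rng.normal(scale=0.5, size=(1, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagate the mean-squared-error gradient
    dW2 = h.T @ err / len(X); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    # Gradient-descent updates
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

After training, the fitted MSE should fall well below the raw variance of the targets, which is what "approximating the functional relationship between covariates and response" amounts to here.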

Journal: Processes, 2022

The derivation of minimal bioreaction models is of primary importance to develop monitoring and control strategies for cell/microorganism culture production. These models can be obtained based on the selection of a basis of elementary flux modes (EFMs), using an algorithm starting from a relatively large set of EFMs and progressively reducing their number with geometric and least-squares residual criteria. The reaction rates associated...

2000
Rafał Adamczak, Geerd H. F. Diercksen

A framework for Similarity-Based Methods (SBMs) includes many classification models as special cases: neural networks of the Radial Basis Function type, Feature Space Mapping neurofuzzy networks based on separable transfer functions, Learning Vector Quantization, variants of the k-nearest-neighbor method, and several new models that may be presented in network form. Multilayer Percept...

1999
M. Skurichina

Training. M. Skurichina (1), Raudys (2) and R.P.W. Duin (1). (1) Pattern Recognition Group, Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft, The Netherlands; e-mail: [email protected], [email protected]. (2) Department of Data Analysis, Institute of Mathematics and Informatics, Akademijos 4, Vilnius 2600, Lithuania; e-mail: [email protected]. Abstract: T...

2008
Shun-ichi Amari

When a parameter space has a certain underlying structure, the ordinary gradient of a function does not represent its steepest direction but the natural gradient does. Information geometry is used for calculating the natural gradients in the parameter space of perceptrons, the space of matrices (for blind source separation) and the space of linear dynamical systems (for blind source deconvoluti...
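The claim can be illustrated on a toy ill-conditioned quadratic. The sketch below assumes the metric F is simply known and equal to the curvature; in practice F would be the Fisher information matrix estimated from data, and the loss, step sizes, and dimensions here are all invented for illustration.

```python
import numpy as np

# Quadratic loss L(theta) = 0.5 * theta^T A theta with ill-conditioned A.
# The natural gradient preconditions the ordinary gradient by F^{-1}.
A = np.diag([100.0, 1.0])
F = A.copy()  # assumed metric; in practice the estimated Fisher matrix

theta_ord = np.array([1.0, 1.0])
theta_nat = np.array([1.0, 1.0])

for _ in range(50):
    # Ordinary gradient: step limited by the largest curvature (100)
    theta_ord = theta_ord - 0.015 * (A @ theta_ord)
    # Natural gradient: the preconditioned problem is perfectly
    # conditioned, so a much larger step remains stable
    theta_nat = theta_nat - 0.5 * np.linalg.solve(F, A @ theta_nat)
```

With the same number of steps, the naturally preconditioned iterate converges many orders of magnitude closer to the optimum, which is the sense in which the natural gradient is the "steepest direction" in the given geometry.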

2012
Anindya Roy, Mathew Magimai-Doss, Sébastien Marcel

In a recent work, the framework of Boosted Binary Features (BBF) was proposed for ASR. In this framework, a small set of localized binary-valued features are selected using the Discrete Adaboost algorithm. These features are then integrated into a standard HMM-based system using either single layer perceptrons (SLP) or multilayer perceptrons (MLP). The features were found to perform significant...
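The selection step can be illustrated with a few rounds of Discrete AdaBoost over toy binary features. The data, feature count, and number of rounds below are invented for illustration; in the paper the selected binary features would then feed an SLP or MLP rather than be used directly.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: 6 binary features, labels in {-1, +1}; feature 0 is informative
n = 200
X = rng.integers(0, 2, size=(n, 6))
y = np.where(X[:, 0] == 1, 1, -1)
y[rng.random(n) < 0.1] *= -1  # 10% label noise

w = np.ones(n) / n  # sample weights
selected = []
for _ in range(3):  # three boosting rounds, one feature selected per round
    # Each binary feature acts as a weak classifier: predict +1 iff it fires
    errs = [np.sum(w * (np.where(X[:, j] == 1, 1, -1) != y)) for j in range(6)]
    j = int(np.argmin(errs))          # pick the lowest weighted error
    eps = max(errs[j], 1e-10)
    alpha = 0.5 * np.log((1 - eps) / eps)
    pred = np.where(X[:, j] == 1, 1, -1)
    w *= np.exp(-alpha * y * pred)    # upweight the misclassified samples
    w /= w.sum()
    selected.append(j)
```

Because the first round sees uniform weights, the informative feature is picked immediately; later rounds concentrate on the residual errors.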

2013
Arindam Sarkar, J. K. Mandal

In this paper, simulated annealing guided triangularized encryption using a multilayer perceptron generated session key (SATMLP) has been proposed for secured wireless communication. Both the sender and receiver stations use identical multilayer perceptrons, and depending on the final outputs of the two multilayer perceptrons, the weight vectors of the hidden layers are tuned at both ends. After this tunnin...
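The mutual weight tuning described here resembles neural key exchange with tree parity machines, where two networks synchronize their weights by updating only when their outputs agree. The sketch below shows that related scheme under a Hebbian rule; the parameters K, N, L are illustrative, and this is not the paper's exact SATMLP construction.

```python
import numpy as np

rng = np.random.default_rng(3)

K, N, L = 3, 4, 3  # hidden units, inputs per unit, weight bound

# Two parties start from independent random integer weights
wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

def output(w, x):
    """Hidden-unit signs and the parity-product output."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

steps = 0
while not np.array_equal(wA, wB) and steps < 100000:
    x = rng.choice([-1, 1], size=(K, N))  # public random input
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:  # Hebbian update only when the public outputs agree
        for w, s, t in ((wA, sA, tA), (wB, sB, tB)):
            for k in range(K):
                if s[k] == t:
                    w[k] = np.clip(w[k] + t * x[k], -L, L)
    steps += 1

synced = np.array_equal(wA, wB)
```

Once the two weight matrices are identical they can serve as a shared session key, which is the role the tuned multilayer perceptron weights play in the abstract above.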

2004
Walter H. Delashmit, Michael T. Manry

Due to the chaotic nature of multilayer perceptron training, training error usually fails to be a monotonically nonincreasing function of the number of hidden units. New training algorithms are developed where weights and thresholds from a well-trained smaller network are used to initialize a larger network. Methods are also developed to reduce the total amount of training required. It is shown...
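The initialization idea, seeding a larger network with the weights of a well-trained smaller one, might be sketched as below. The function name, the small-random-noise scale, and the near-zero output weights for the new units are illustrative choices, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def grow(W1_small, b1_small, W2_small, extra, scale=0.01):
    """Initialize a larger hidden layer from a trained smaller one:
    copy the trained weights, give the extra units small random ones."""
    n_in, _ = W1_small.shape
    n_out = W2_small.shape[1]
    W1 = np.concatenate([W1_small, scale * rng.normal(size=(n_in, extra))], axis=1)
    b1 = np.concatenate([b1_small, np.zeros(extra)])
    # Near-zero output weights for the new units, so the grown network
    # initially computes (almost) the same function as the small one.
    W2 = np.concatenate([W2_small, scale * rng.normal(size=(extra, n_out))], axis=0)
    return W1, b1, W2

def forward(X, W1, b1, W2):
    return np.tanh(X @ W1 + b1) @ W2

# Demo: grow a 4-unit hidden layer to 8 units
W1s = rng.normal(size=(3, 4)); b1s = np.zeros(4)
W2s = rng.normal(size=(4, 2))
W1l, b1l, W2l = grow(W1s, b1s, W2s, extra=4)

X = rng.normal(size=(5, 3))
delta = np.max(np.abs(forward(X, W1l, b1l, W2l) - forward(X, W1s, b1s, W2s)))
```

Because the grown network starts from the smaller network's function value, its training error cannot start worse than where the smaller network finished, which restores the monotone behavior the abstract is after.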

2007
V. Rivas, G. Romero

This paper proposes a new version of a method (G-Prop-III, genetic backpropagation) that attempts to solve the problem of finding appropriate initial weights and learning parameters for a single-hidden-layer Multilayer Perceptron (MLP) by combining a genetic algorithm (GA) and backpropagation (BP). The GA selects the initial weights and the learning rate of the network, and changes the number o...
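A minimal sketch of the GA + BP combination: a mutate-and-select loop over the learning rate only, with a short gradient-descent run on a toy quadratic standing in for a few epochs of backpropagation. The encoding, population size, and mutation rule are invented for illustration; G-Prop-III also evolves the initial weights and the hidden-layer size.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(lr):
    """Loss after a short gradient-descent run on a toy quadratic
    (a stand-in for a few epochs of backpropagation)."""
    theta = np.array([1.0, 1.0])
    A = np.diag([10.0, 1.0])
    for _ in range(20):
        theta = theta - lr * (A @ theta)
        if not np.all(np.isfinite(theta)):
            return np.inf  # diverged: worst possible fitness
    return float(theta @ A @ theta)

# Tiny GA: keep the best half, mutate it multiplicatively to refill
pop = list(rng.uniform(0.001, 0.5, size=8))
for _ in range(15):
    pop.sort(key=fitness)
    parents = pop[:4]
    children = [max(1e-4, p * np.exp(0.3 * rng.normal())) for p in parents]
    pop = parents + children

best = min(pop, key=fitness)
```

The GA never needs gradients of the learning rate itself; it only ranks candidate configurations by the loss that backpropagation reaches from them, which is the division of labor the abstract describes.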
