Search results for: multilayer perceptrons

Number of results: 20,290

Journal: :Frontiers in computer science 2021

Human emotion recognition is an important issue in human–computer interaction, and electroencephalography (EEG) has been widely applied due to its high reliability. In recent years, methods based on deep learning have reached state-of-the-art performance in EEG-based recognition. However, there exist singularities in the parameter space of neural networks, which may dramatically slow down ...

Journal: :Neural computation 2006
Shun-ichi Amari Hyeyoung Park Tomoko Ozeki

The parameter spaces of hierarchical systems such as multilayer perceptrons include singularities due to the symmetry and degeneration of hidden units. A parameter space forms a geometrical manifold, called the neuromanifold in the case of neural networks. Such a model is identified with a statistical model, and a Riemannian metric is given by the Fisher information matrix. However, the matrix ...
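The degeneracy this abstract describes can be observed numerically: when two hidden units of a small MLP coincide, their score directions become linearly dependent and the Fisher information matrix loses rank. A minimal sketch under Gaussian-noise regression (the 1-2-1 tanh network and the parameter names are my own illustration, not the paper's notation):

```python
import numpy as np

rng = np.random.default_rng(1)

def tanh_mlp(x, w, v):
    # f(x) = v1*tanh(w1*x) + v2*tanh(w2*x): a 1-2-1 MLP without biases.
    return v[0] * np.tanh(w[0] * x) + v[1] * np.tanh(w[1] * x)

def fisher(w, v, xs):
    # Under Gaussian output noise, the Fisher information matrix is
    # proportional to E[g g^T], where g is the gradient of the network
    # output with respect to the parameters (w1, w2, v1, v2).
    G = np.zeros((4, 4))
    for x in xs:
        g = np.array([
            v[0] * x * (1 - np.tanh(w[0] * x) ** 2),  # d f / d w1
            v[1] * x * (1 - np.tanh(w[1] * x) ** 2),  # d f / d w2
            np.tanh(w[0] * x),                        # d f / d v1
            np.tanh(w[1] * x),                        # d f / d v2
        ])
        G += np.outer(g, g)
    return G / len(xs)

xs = rng.normal(size=200)
# Generic point: the metric has full rank 4.
F_generic = fisher(np.array([1.0, -0.5]), np.array([0.7, 0.3]), xs)
# Singular point: w1 == w2, the two hidden units coincide and the
# gradient components become linearly dependent, so rank drops to 2.
F_singular = fisher(np.array([1.0, 1.0]), np.array([0.7, 0.3]), xs)
```

At the singular point the Fisher metric is degenerate, which is exactly why standard asymptotic theory (and natural-gradient preconditioning) breaks down there.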

2001
Kenji Fukumizu

This paper discusses maximum likelihood estimation in a statistical model with unidentifiability, using the framework of conic singularity. The likelihood ratio may diverge in unidentifiable cases, though in regular cases it converges to a χ² distribution. A useful sufficient condition for such divergence is obtained and applied to neural networks. The exact order for multilayer perceptrons...

Journal: :Neural computation 2010
Dan C. Ciresan Ueli Meier Luca Maria Gambardella Jürgen Schmidhuber

Good old online backpropagation for plain multilayer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images to avoid overfitting, and graphics cards to greatly speed up learning.
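The "plain online backpropagation" recipe mentioned above can be sketched in a few lines. The toy below trains a tiny 2-8-1 sigmoid network on XOR with per-example updates; the architecture, learning rate, and task are illustrative stand-ins for the paper's far larger MNIST setup (many wide hidden layers, deformed training images, GPUs):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: the classic toy problem a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# A 2-8-1 MLP; weights drawn at random, biases start at zero.
W1 = rng.normal(0, 1.0, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1.0, 8)
b2 = 0.0
lr = 0.5

for epoch in range(5000):
    for i in rng.permutation(4):           # online: update after each example
        h = sigmoid(X[i] @ W1 + b1)        # hidden activations
        out = sigmoid(h @ W2 + b2)         # network output
        # Backpropagate the squared error through both sigmoid layers.
        d_out = (out - y[i]) * out * (1 - out)
        d_h = d_out * W2 * h * (1 - h)
        W2 -= lr * d_out * h
        b2 -= lr * d_out
        W1 -= lr * np.outer(X[i], d_h)
        b1 -= lr * d_h

preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
mse = float(np.mean((preds - y) ** 2))
```

The same loop, scaled up in width, depth, and data (and run on a GPU), is essentially the recipe the abstract credits with 0.35% error on MNIST.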

Journal: :CoRR 2017
Raúl Rojas

This paper shows that a long chain of perceptrons (that is, a multilayer perceptron, or MLP, with many hidden layers of width one) can be a universal classifier. The classification procedure is not necessarily computationally efficient, but the technique throws some light on the kind of computations possible with narrow and deep MLPs.
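One way to see the power of narrow, deep chains: a single perceptron cannot recognise a bounded interval on [0, 1], but a chain of two width-one threshold units can, provided the raw input is forwarded along the chain. The concrete weights below are my own illustration, not the paper's construction:

```python
def step(z):
    # Heaviside threshold: the classic perceptron activation.
    return 1 if z > 0 else 0

def interval_chain(x, a=0.3, b=0.7):
    """Width-one, depth-two chain recognising a < x < b for x in [0, 1]."""
    h1 = step(x - a)                       # layer 1: past the left edge?
    # Layer 2 sees the raw input again plus h1. When h1 = 0 the unit
    # stays off for every x in [0, 1]; when h1 = 1 it fires iff x < b.
    h2 = step(-x + (b + 1.0) * h1 - 1.0)
    return h2
```

No single threshold unit can output 1 on (a, b) and 0 on both sides, so even this two-unit chain computes something strictly beyond one layer of width one.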

2004
Bao-Liang Lu Yan Bai Yoshikazu Nishikawa

Abstract: We propose an architecture of a multilayer quadratic perceptron (MLQP) that combines the advantages of multilayer perceptrons (MLPs) and higher-order feedforward neural networks. The features of MLQP are its simple structure, practical number of adjustable connection weights, and powerful learning ability. In this paper, the architecture of MLQP is described, a backpropagation lear...
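The quadratic unit idea can be sketched directly: the output depends on weighted inputs and weighted squared inputs. The symbols u, v, b below are my labels for the abstract's "adjustable connection weights", not notation from the paper:

```python
import numpy as np

def quadratic_unit(x, u, v, b):
    # MLQP-style unit: y = sigmoid(u . x^2 + v . x + b), i.e. a separate
    # weight for each input and for its square (no cross terms).
    x = np.asarray(x, dtype=float)
    return 1.0 / (1.0 + np.exp(-(u @ x**2 + v @ x + b)))

# With hand-set weights, one quadratic unit recognises the disc
# x1^2 + x2^2 < 1 -- a region no single *linear* perceptron can separate.
u = np.array([-1.0, -1.0])
v = np.zeros(2)
b = 1.0
inside = lambda x: quadratic_unit(x, u, v, b) > 0.5
```

That a single unit can carve out a curved decision boundary is the "powerful learning ability" the abstract attributes to higher-order terms.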

1998
Alexandre R. S. Romariz P. U. A. Ferreira J. V. Campêlo M. L. Graciano O. R. Maia J. C. da Costa

A hybrid architecture for neural coprocessing is presented. A fixed set of analog multipliers and capacitors (analog memory) emulates Multilayer Perceptrons through digitally-controlled multiplexing. Parallelism is partially preserved, then, without direct analog implementation of the whole structure. Details of system VLSI implementation are given, along with simulation results that validate s...

1995
Gerhard Hirzinger

The paper extends a previously proposed method for improving the path accuracy of robots. Especially during high-speed movements, nonlinear couplings between the joints degrade the robot's accuracy. Such couplings cannot be compensated by linear feedforward control; they additionally require a general function approximator, e.g. a multilayer perceptron. The learning system tra...

Chart of the number of search results per year

Click on the chart to filter the results by publication year