Search results for: multilayer perceptrons

Number of results: 20290

1996
Gary William Flake

Consider a multilayer perceptron (MLP) with d inputs, a single hidden sigmoidal layer and a linear output. By adding an additional d inputs to the network with values set to the square of the first d inputs, properties reminiscent of higher-order neural networks and radial basis function networks (RBFN) are added to the architecture with little added expense in terms of weight requirements. Of pa...
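
A minimal numpy sketch of the input-augmentation idea in this abstract: the d original inputs are duplicated as their element-wise squares, giving a 2d-input MLP with one sigmoidal hidden layer and a linear output. The weight values and layer sizes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def augment_with_squares(X):
    """Append the element-wise square of each of the d inputs,
    so the network receives 2*d inputs as described in the abstract."""
    return np.hstack([X, X ** 2])

def mlp_forward(X, W1, b1, w2, b2):
    """Single sigmoidal hidden layer, linear output."""
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # sigmoidal hidden units
    return H @ w2 + b2                          # linear output unit

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))                     # 5 samples, d = 3 original inputs
Xa = augment_with_squares(X)                    # now 2*d = 6 inputs
W1 = rng.normal(size=(6, 4)); b1 = np.zeros(4)  # 4 hidden units (arbitrary choice)
w2 = rng.normal(size=4); b2 = 0.0
print(mlp_forward(Xa, W1, b1, w2, b2))
```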

2008
Pitoyo Hartono

In this study we propose a new ensemble model composed of several linear perceptrons. The objective of this study is to build a piecewise-linear classifier that is not only competitive with Multilayer Perceptrons (MLP) in generalization performance but also interpretable in the form of human-comprehensible rules. We present a simple competitive training method that allows the ensemble to effective...
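
An illustrative sketch (not the paper's training method) of how an ensemble of linear perceptrons can realize a piecewise-linear decision function: each member responds linearly, and the member with the strongest response decides the label. The member weights are random placeholders, and the competitive training rule itself is omitted.

```python
import numpy as np

class LinearPerceptronEnsemble:
    """Illustrative ensemble of k linear perceptrons: each member responds
    linearly, the member with the largest-magnitude response decides the
    label, so the overall decision boundary is piecewise linear."""

    def __init__(self, k, d, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(k, d))   # placeholder member weights
        self.b = np.zeros(k)

    def predict(self, X):
        scores = X @ self.W.T + self.b                 # (n, k) member responses
        winner = np.argmax(np.abs(scores), axis=1)     # competitive member choice
        chosen = scores[np.arange(len(X)), winner]
        return np.sign(chosen)                         # label in {-1, +1}

X = np.random.default_rng(1).normal(size=(4, 2))
print(LinearPerceptronEnsemble(k=3, d=2).predict(X))
```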

Journal: IEEE Transactions on Neural Networks, 2000
Guang-Bin Huang, Yan Qiu Chen, Haroon Atique Babri

Multilayer perceptrons with hard-limiting (signum) activation functions can form complex decision regions. It is well known that a three-layer perceptron (two hidden layers) can form arbitrary disjoint decision regions and a two-layer perceptron (one hidden layer) can form single convex decision regions. This paper further proves that single hidden layer feedforward neural networks (SLFN's) wit...
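
A small hand-built example of the kind of network discussed here: a single-hidden-layer feedforward net with hard-limiting (signum) activations whose hand-chosen weights realize the non-linearly-separable XOR region. The weights are illustrative and not from the paper.

```python
import numpy as np

def signum(z):
    return np.where(z >= 0, 1.0, -1.0)

def slfn_signum(X, W1, b1, w2, b2):
    """Single-hidden-layer feedforward net with hard-limiting units."""
    H = signum(X @ W1 + b1)      # signum hidden layer
    return signum(H @ w2 + b2)   # signum output unit

# Hand-chosen weights realizing XOR, a non-linearly-separable decision region.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
w2 = np.array([1.0, -1.0])
b2 = -0.5

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(slfn_signum(X, W1, b1, w2, b2))   # [-1.  1.  1. -1.]
```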

2002
Brieuc Conan-Guez, Fabrice Rossi

In this paper, we propose a new way to use Functional MultiLayer Perceptrons (FMLP). In our previous work, we introduced a natural extension of Multi Layer Perceptrons (MLP) to functional inputs based on direct manipulation of input functions. We propose here to rely on a representation of input and weight functions thanks to projection on a truncated base. We show that the proposed model has t...
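
A hedged sketch of the projection idea described in the abstract: sampled input functions are projected onto a truncated basis by least squares, and the resulting coefficient vectors are fed to an ordinary MLP. The cosine basis, sampling grid, and network sizes are assumptions made for illustration, not the paper's choices.

```python
import numpy as np

def project_on_basis(F, t, K):
    """Least-squares projection of sampled input functions (rows of F)
    onto a truncated cosine basis; returns one coefficient vector per
    function. The basis choice is an illustrative assumption."""
    B = np.stack([np.cos(k * np.pi * t) for k in range(K)], axis=1)  # (m, K)
    coeffs, *_ = np.linalg.lstsq(B, F.T, rcond=None)
    return coeffs.T                                                  # (n, K)

def mlp_forward(C, W1, b1, w2, b2):
    H = np.tanh(C @ W1 + b1)
    return H @ w2 + b2

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)                                   # sampling grid
F = np.sin(2 * np.pi * np.outer(rng.uniform(1, 3, size=4), t))  # 4 input functions
C = project_on_basis(F, t, K=6)          # functional inputs -> coefficients
W1 = rng.normal(size=(6, 5)); b1 = np.zeros(5)
w2 = rng.normal(size=5); b2 = 0.0
print(mlp_forward(C, W1, b1, w2, b2))
```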

2005
Ana M. González, Iván Cantador, José R. Dorronsoro

Parallel perceptrons (PPs), a novel approach to committee machine training requiring minimal communication between outputs and hidden units, allow the construction of efficient and stable nonlinear classifiers. In this work we shall explore how to improve their performance by allowing their output weights to take real values, computed by applying Fisher’s linear discriminant analysis to the commi...
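
An illustrative sketch of the output-weight idea mentioned here: the hard (±1) outputs of a committee of perceptrons are combined with real-valued weights obtained from Fisher's linear discriminant. The committee weights below are random placeholders on a toy problem; the parallel-perceptron training rule itself is not reproduced.

```python
import numpy as np

def fisher_output_weights(H, y):
    """Fisher's linear discriminant on the committee outputs H (n, k):
    w = Sw^{-1} (m1 - m0), giving real-valued output weights."""
    H0, H1 = H[y == 0], H[y == 1]
    m0, m1 = H0.mean(axis=0), H1.mean(axis=0)
    Sw = np.cov(H0, rowvar=False) * (len(H0) - 1) \
       + np.cov(H1, rowvar=False) * (len(H1) - 1)
    return np.linalg.solve(Sw + 1e-6 * np.eye(H.shape[1]), m1 - m0)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)           # toy two-class problem

# Committee of 5 perceptrons with placeholder random weights.
W = rng.normal(size=(5, 2))
H = np.sign(X @ W.T)                              # hard +/-1 committee outputs

w_out = fisher_output_weights(H, y)               # real-valued output weights
proj = H @ w_out
thresh = 0.5 * (proj[y == 0].mean() + proj[y == 1].mean())
print("training accuracy:", ((proj > thresh).astype(int) == y).mean())
```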

2009
Violeta Sandu, Florin Leon

Neural networks are often used for pattern recognition. They prove to be a popular choice for OCR (Optical Character Recognition) systems, especially when dealing with the recognition of printed text. In this paper, multilayer perceptrons are used for the recognition of handwritten digits. The accuracy achieved proves that this application is a working prototype that can be further extended int...
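
A minimal, self-contained sketch of an MLP digit classifier of the kind described: one sigmoidal hidden layer trained by backpropagation with a softmax/cross-entropy output. Random stand-in data with 28x28 = 784 inputs and 10 classes is used so the snippet runs on its own; real digit images (e.g. MNIST) would be plugged in the same way.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 784))         # stand-in "digit images"
y = rng.integers(0, 10, size=256)       # stand-in labels
Y = np.eye(10)[y]                       # one-hot targets

W1 = rng.normal(scale=0.01, size=(784, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.01, size=(64, 10));  b2 = np.zeros(10)
lr = 0.1

for epoch in range(20):
    # forward pass: sigmoid hidden layer, softmax output
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)

    # backward pass for the cross-entropy loss
    dZ = (P - Y) / len(X)
    dW2 = H.T @ dZ; db2 = dZ.sum(axis=0)
    dZ1 = (dZ @ W2.T) * H * (1 - H)          # sigmoid derivative
    dW1 = X.T @ dZ1; db1 = dZ1.sum(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# accuracy on the stand-in training data (last forward pass)
print("training accuracy:", (P.argmax(axis=1) == y).mean())
```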

2000
Fadzilah Siraj, Derek Partridge

This paper discusses the empirical evaluation of improving generalization performance of neural networks by systematic treatment of training and test failures. As a result of systematic treatment of failures, multilayer perceptron (MLP) discriminants were developed as discrimination techniques. The experiments presented in this paper illustrate the application of discrimination techniques using...

2012
Dan C. Ciresan, Ueli Meier, Luca Maria Gambardella, Jürgen Schmidhuber

The competitive MNIST handwritten digit recognition benchmark has a long history of broken records since 1998. The most recent advancement by others dates back 8 years (error rate 0.4%). Good old on-line back-propagation for plain multi-layer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark with a single MLP and 0.31% with a committee of seven MLPs. All we...
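
A sketch of the committee scheme referred to in the abstract (not the paper's GPU-trained deep MLPs): several independently initialized plain MLPs produce class probabilities, and the committee averages them before taking the arg max. The weights here are random placeholders; in practice each member would be trained separately.

```python
import numpy as np

def softmax(Z):
    E = np.exp(Z - Z.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

def mlp_predict(X, params):
    """One committee member: plain MLP, tanh hidden layer, softmax output."""
    W1, b1, W2, b2 = params
    H = np.tanh(X @ W1 + b1)
    return softmax(H @ W2 + b2)

def committee_predict(X, members):
    """Average the class-probability outputs of all members, then take
    the arg max -- the committee idea referred to in the abstract."""
    P = np.mean([mlp_predict(X, m) for m in members], axis=0)
    return P.argmax(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 784))                      # 8 digit-sized inputs
members = []
for seed in range(7):                              # committee of seven MLPs
    r = np.random.default_rng(seed)
    members.append((r.normal(scale=0.01, size=(784, 100)), np.zeros(100),
                    r.normal(scale=0.01, size=(100, 10)), np.zeros(10)))
print(committee_predict(X, members))
```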

2006
Bumghi Choi, Ju-Hong Lee, Tae-Su Park

Multilayer perceptrons have been applied successfully to solve some difficult and diverse problems with the backpropagation learning algorithm. However, the algorithm is known to suffer from slow and false convergence arising from flat surfaces and local minima of the cost function. Many algorithms announced so far to accelerate convergence speed and avoid local minima appear to pay some trade-off for ...

2012
Rita Lovassy, László T. Kóczy, László Gál

The concept of the fuzzy flip-flop was introduced in the mid-1980s by Hirota (with his students). The Hirota Lab recognized the essential importance of the concept of a fuzzy extension of a sequential circuit and the notion of fuzzy memory. From this point of view they proposed alternatives for “fuzzifying” digital flip-flops. The starting elementary digital units were the binary J-K flipflo...

Chart of the number of search results per year
