Search results for: layer perceptron

Number of results: 288,007

2001
Fabien Langlet Hasan Abdulkader Daniel Roviras L. Lapierre Francis Castanie

In this paper, we present a neural network architecture that belongs to the multi-layer perceptron family, associated with two different algorithms: the ordinary gradient and the natural gradient, and we compare the performance of these algorithms. The identification of a non-normalized power amplifier led to the introduction of an additional weight in the classical multilayer perceptron structure...
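
The abstract above contrasts ordinary-gradient and natural-gradient training of an MLP for power-amplifier identification. Below is a minimal numpy sketch of the ordinary-gradient (plain back-propagation) baseline on toy data; the natural-gradient variant, the paper's additional weight for the non-normalized amplifier, and all network sizes and data here are assumptions for illustration only.

import numpy as np

# One-hidden-layer MLP fitted by the ordinary (Euclidean) gradient.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))                    # toy amplifier input
y = np.tanh(2.0 * X) + 0.05 * rng.standard_normal(X.shape)   # toy nonlinearity

n_hidden, lr = 8, 0.05
W1 = rng.standard_normal((1, n_hidden)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.5
b2 = np.zeros(1)

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)        # hidden layer
    y_hat = h @ W2 + b2             # linear output
    err = y_hat - y
    # ordinary-gradient (steepest-descent) updates
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))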

2013
Taranjeet Kaur Rupinder Kaur

In software engineering there are many applications aimed at reducing complexity and improving fault prediction. In this paper we study various metrics, not all of which are well suited to finding faulty classes in software. The basic idea is to use these metrics to identify faulty classes and to reduce class complexity. Various techniques such as linear regression, logistic regression, on...
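
As an illustration of the kind of model named in the abstract, here is a small sketch of logistic regression over per-class software metrics to predict fault-proneness; the metric names (WMC, CBO, RFC, LCOM), the values, and the labels are invented for the example and do not come from the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy per-class metric table: [WMC, CBO, RFC, LCOM] (made-up values).
metrics = np.array([
    [5, 2, 10, 1], [30, 12, 55, 9], [8, 3, 14, 2],
    [42, 15, 70, 11], [6, 1, 9, 0], [25, 10, 40, 8],
])
faulty = np.array([0, 1, 0, 1, 0, 1])               # 1 = fault-prone class

model = LogisticRegression().fit(metrics, faulty)
new_class = np.array([[20, 8, 35, 6]])
print("fault-proneness probability:", model.predict_proba(new_class)[0, 1])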

1996
S. Mertens

We investigate the VC-dimension of the perceptron and simple two-layer networks like the committee- and the parity-machine with weights restricted to values ±1. For binary inputs, the VC-dimension is determined by atypical pattern sets, i.e. it cannot be found by replica analysis or numerical Monte Carlo sampling. For small systems, exhaustive enumerations yield exact results. For systems that ar...
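
The exhaustive enumeration mentioned for small systems can be illustrated directly: enumerate every ±1 weight vector of an Ising perceptron and check whether a given set of binary patterns is shattered (every labeling realizable). The threshold-at-zero convention and the tiny example patterns below are illustrative assumptions, not details taken from the paper.

from itertools import product
import numpy as np

def shattered(patterns):
    """Check whether a +/-1-weight perceptron realizes all labelings of the patterns."""
    patterns = np.asarray(patterns)
    n = patterns.shape[1]
    realizable = set()
    for w in product((-1, 1), repeat=n):            # exhaustive enumeration of weights
        outputs = np.sign(patterns @ np.array(w))
        outputs[outputs == 0] = 1                   # convention: sign(0) = +1
        realizable.add(tuple(outputs))
    return len(realizable) == 2 ** len(patterns)

# Two patterns in N = 3 binary inputs: is this particular set shattered?
print(shattered([[1, 1, -1],
                 [1, -1, 1]]))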

Journal: Kybernetika, 1998
Igor Vajda Belomír Lonek Viktor Nikolov Arnost Veselý

For general Bayes decision rules, we consider perceptron approximations based on sufficient-statistic inputs. Particular attention is paid to Bayes discrimination and classification. In the case of exponentially distributed data with a known model, it is shown that a perceptron with one hidden layer is sufficient and that learning is restricted to the synaptic weights of the output neuron. I...

1998
Barbara Hammer

The loading problem is the problem of deciding whether a neural architecture can map a training set correctly with an appropriate choice of weights. The following results are shown: the loading problem is NP-complete for any feedforward perceptron architecture with at least two neurons in the first hidden layer and varying input dimension. Further, it is NP-complete if the input dimension is fixed...

1992
John M. Zelle

The ideas presented here are based on two observations about perceptrons: (1) when the perceptron learning algorithm cycles among hyperplanes, the hyperplanes may be compared to select the one that gives the best split of the examples, and (2) it is always possible for the perceptron to build a hyperplane that separates at least one example from all the rest. We describe the Extentron, which grows multi...
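
A small sketch of observation (1) only: run the perceptron rule on non-separable data, record every hyperplane it visits while cycling, and keep the one that gives the best split. This is not the Extentron growth procedure itself; the XOR-style data and the epoch count are illustrative choices.

import numpy as np

def best_split_perceptron(X, y, epochs=50):
    """Perceptron rule that remembers the best hyperplane seen while cycling."""
    X1 = np.hstack([X, np.ones((len(X), 1))])       # absorb the bias term
    w = np.zeros(X1.shape[1])
    best_w, best_correct = w.copy(), 0
    for _ in range(epochs):
        for xi, yi in zip(X1, y):
            if yi * (xi @ w) <= 0:                  # misclassified -> update
                w = w + yi * xi
                correct = int(np.sum(y * (X1 @ w) > 0))
                if correct > best_correct:          # compare hyperplanes, keep best split
                    best_correct, best_w = correct, w.copy()
    return best_w, best_correct

# XOR-style data is not linearly separable, so the rule cycles forever,
# but we still recover the best hyperplane visited during the cycling.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])
w, correct = best_split_perceptron(X, y)
print("best hyperplane:", w, "correctly split examples:", correct)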

Journal: Int. Arab J. Inf. Technol., 2016
Tarig Almehmadi Zaw Zaw Htike

Automatic vehicle classification has emerged as an important field of study in image processing and machine vision because of its wide variety of applications. Despite many alternative solutions to the classification problem, vision-based approaches remain dominant due to their ability to provide a larger number of parameters than other approa...

1996
Youngjoo Suh Youngjik Lee

In this paper, we propose a new method of phoneme segmentation using an MLP (multi-layer perceptron). The proposed segmenter consists of three parts: a preprocessor, an MLP-based phoneme segmenter, and a postprocessor. The preprocessor utilizes a sequence of 44-order feature parameters for each frame of speech, based on acoustic-phonetic knowledge. The MLP has one hidden layer and an ...
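
A sketch of the MLP stage only, under the stated dimensions (44 feature parameters per frame, one hidden layer): the hidden-layer size, the sigmoid boundary score, the random stand-in frames, and the thresholding step are assumptions for illustration; the paper's actual pre- and post-processors are not modeled.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 44, 16                           # 44 features per frame; hidden size assumed
W1 = rng.standard_normal((n_in, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.1
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def boundary_score(frame_features):
    """Return the MLP's phoneme-boundary score for one 44-dimensional frame."""
    h = np.tanh(frame_features @ W1 + b1)
    return float(sigmoid(h @ W2 + b2))

frames = rng.standard_normal((100, n_in))         # stand-in for preprocessed speech frames
scores = np.array([boundary_score(f) for f in frames])
boundaries = np.flatnonzero(scores > 0.5)         # crude post-processing threshold
print("candidate boundary frames:", boundaries[:10])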

2014
Arindam Sarkar J. K. Mandal

In this paper, a multilayer-perceptron-guided encryption/decryption scheme (STMLP) for wireless communication is proposed for the exchange of data and information. Multilayer perceptron transmitting systems at both ends generate an identical output bit; the networks are trained on this output, which is used to synchronize the networks at both ends and thus forms a secret key at the end of synchronizatio...
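
The synchronization idea can be sketched with the standard tree-parity-machine construction from neural cryptography: two networks see the same public random inputs and update only when their output bits agree, until their weights coincide and can serve as a shared key. The parameters (K, N, L) and the Hebbian update rule below are textbook choices, not details taken from this paper, whose STMLP scheme may differ.

import numpy as np

rng = np.random.default_rng(1)
K, N, L = 3, 8, 3                                 # hidden units, inputs per unit, weight bound

def tpm_output(w, x):
    """Hidden signs and parity output bit of one tree parity machine."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    for k in range(K):
        if sigma[k] == tau:                       # update only agreeing hidden units
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))          # public random input seen by both ends
    sA, tauA = tpm_output(wA, x)
    sB, tauB = tpm_output(wB, x)
    if tauA == tauB:                              # output bits agree -> both sides learn
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    steps += 1

print("synchronized after", steps, "steps; shared key:", wA.flatten())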

2008
G. S. Gill

Neural networks are increasingly used to solve highly nonlinear control problems. The current paper addresses the problem of forecasting the result of general elections in India. The neural network is first made to learn, and the trained network is then made to forecast the result of the election. While training the network, the minimal disturbance principle was followed, which suggests that during ...
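
The minimal disturbance principle mentioned above is usually associated with Widrow's alpha-LMS rule: each correction is the smallest weight change, along the current input, that reduces that example's error, so responses to previously learned patterns are disturbed as little as possible. The sketch below uses made-up regression data, not the election data from the paper.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))                  # toy input patterns
true_w = np.array([0.5, -1.0, 2.0, 0.3])
y = X @ true_w + 0.01 * rng.standard_normal(50)   # toy targets

w = np.zeros(4)
alpha = 0.5                                        # 0 < alpha < 2 for stability
for epoch in range(20):
    for xi, yi in zip(X, y):
        err = yi - xi @ w
        # alpha-LMS: step along xi scaled by ||xi||^2, the minimum-norm weight
        # change that removes a fraction alpha of the current example's error
        w = w + alpha * err * xi / (xi @ xi)

print("learned weights:", np.round(w, 3))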

Chart: number of search results per year
