Search results for: single layer perceptron
Number of results: 1,125,882
We have proposed a glial network inspired by features of the brain. In the glial network, glias generate independent oscillations, and these oscillations propagate to neurons and to other glias. We confirmed that the glial network improves the learning performance of the Multi-Layer Perceptron (MLP). In this article, we investigate the MLP with the impulse glial network. The glias have on...
Abstract—A glia is a nervous cell in the brain. The glia is currently known to be an important cell for human cerebration, because it transmits signals to neurons and to other glias. We focus on these features of the glia and consider applying them to an artificial neural network. In this paper, we propose a Multi-Layer Perceptron (MLP) with a pulse glial chain. The pulse glial chain is inspired fro...
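The two snippets above describe the mechanism only loosely, so the following Python sketch is an assumption-laden illustration of the general idea: each hidden unit is paired with a glia that fires a decaying pulse when the unit's output crosses a threshold, and the pulse propagates along the chain to the neighbouring glia after a delay. The thresholds, decay, delay and pulse height are illustrative placeholders, not the authors' parameters.

    import numpy as np

    # Hypothetical sketch of a "pulse glial chain" coupled to MLP hidden units.
    # All constants below are assumptions, not values from the papers.
    rng = np.random.default_rng(0)
    n_hidden, n_steps = 5, 30
    theta, decay, pulse = 0.7, 0.9, 0.5   # firing threshold, pulse decay, pulse height
    delay = 2                              # steps for a pulse to reach the next glia

    glia = np.zeros(n_hidden)              # current glial pulse value per hidden unit
    timers = np.full(n_hidden, -1)         # countdown until a propagated pulse arrives

    for t in range(n_steps):
        h = np.tanh(rng.normal(size=n_hidden))   # stand-in for hidden-unit outputs
        h_out = h + glia                          # glial pulse added to neuron output
        fired = np.abs(h_out) > theta             # neurons whose glia fire this step
        glia[fired] = pulse                       # firing glia emit a pulse
        # schedule propagation to the next glia in the chain
        nxt = (np.where(fired)[0] + 1) % n_hidden
        timers[nxt] = delay
        # deliver propagated pulses whose delay has elapsed, then decay everything
        arrived = timers == 0
        glia[arrived] = np.maximum(glia[arrived], pulse)
        timers[timers >= 0] -= 1
        glia *= decay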
In this paper, we introduce a method that allows one to efficiently evaluate the “importance” of each coordinate of the input vector of a neural network. This measurement can be used to obtain information about the studied data. It can also be used to suppress irrelevant inputs in order to speed up the classification process conducted by the network.
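The abstract does not say how the “importance” is computed; a common proxy, assumed here, is the mean absolute derivative of the network output with respect to each input coordinate. The sketch below estimates it by central finite differences on a small random MLP (the weights and sizes are placeholders).

    import numpy as np

    rng = np.random.default_rng(1)

    # A tiny fixed MLP (random weights) standing in for a trained network.
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def mlp(x):
        return np.tanh(x @ W1 + b1) @ W2 + b2

    def input_importance(f, X, eps=1e-4):
        """Mean |df/dx_i| over the dataset, estimated by central differences."""
        n, d = X.shape
        imp = np.zeros(d)
        for i in range(d):
            e = np.zeros(d); e[i] = eps
            imp[i] = np.mean(np.abs(f(X + e) - f(X - e)) / (2 * eps))
        return imp

    X = rng.normal(size=(200, 4))
    print(input_importance(mlp, X))   # larger value -> coordinate matters more

Coordinates with near-zero importance would be candidates for suppression, which is how the abstract proposes to speed up classification.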
Many neural network architectures have been developed over the past several years. One of the most popular and most powerful architectures is the multilayer perceptron. This architecture will be described in detail and recent advances in training the multilayer perceptron will be presented. Multilayer perceptrons are trained using various techniques. For years the most widely used training metho...
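For reference, a minimal multilayer perceptron trained with plain backpropagation and gradient descent — for years the standard training technique the abstract alludes to — is sketched below on the XOR problem; the layer sizes, learning rate and squared-error loss are illustrative choices, not the paper's.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR data: not linearly separable, so a single-layer perceptron fails,
    # while one hidden layer suffices.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(scale=2.0, size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(scale=2.0, size=(4, 1)), np.zeros(1)
    lr = 1.0

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(10000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # backward pass (squared-error loss)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # gradient-descent updates
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    print(out.round(3).ravel())   # should approach [0, 1, 1, 0] after training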
Many extensions have been made to the classic multi-layer perceptron (MLP) model. A notable number of them have been designed to speed up the learning process without considering the quality of generalization. This paper proposes a new MLP extension based on exploiting the topology of the network's input layer. Experimental results show the extended model to improve upon generalization capa...
Statistical mechanics is applied to estimate the maximal capacity per weight (α_c) of a two-layer feed-forward network with discrete weights of depth L, functioning as a parity machine of K hidden units. For each K and L ≤ L_0(K), the maximal theoretical capacity α_c = log_2(2L) is achieved, the capacity per bit is 1, the average overlap between different solutions is zero, and L_0(K) ∝ log K for lar...
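Under the reconstruction above (a weight of depth L takes 2L discrete values, and α_c = log_2(2L)), the statement "the capacity per bit is 1" is information-theoretic bookkeeping; the symbols N and p_max in the short LaTeX derivation below are introduced here for illustration only and do not appear in the snippet.

    \[
      \text{capacity per bit}
        \;=\; \frac{p_{\max}}{N \log_2(2L)}
        \;=\; \frac{\alpha_c \, N}{N \log_2(2L)}
        \;=\; \frac{\log_2(2L)}{\log_2(2L)} \;=\; 1,
    \]
    where $N$ is the number of weights, each weight of depth $L$ takes $2L$
    values and hence carries $\log_2(2L)$ bits, and each of the
    $p_{\max} = \alpha_c N$ stored patterns contributes one output bit.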
This report examines the fault tolerance of multi-layer perceptron networks. First, the operation of a single perceptron unit is analysed, and such units are found to be highly fault tolerant. This suggests that neural networks composed of these units could in theory be extremely reliable. The multi-layer perceptron network was then examined, but surprisingly was found to be non-fault tolerant...
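The snippet does not state the report's fault model; the sketch below assumes a simple stuck-at-zero model, zeroing a growing fraction of a single perceptron unit's weights on a linearly separable task and reporting the accuracy that remains.

    import numpy as np

    rng = np.random.default_rng(2)

    # Linearly separable data and a matching perceptron weight vector.
    d, n = 20, 1000
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = np.sign(X @ w_true)

    def accuracy(w):
        return np.mean(np.sign(X @ w) == y)

    # Stuck-at-zero faults: zero out a growing fraction of the unit's weights.
    for frac in (0.0, 0.1, 0.3, 0.5):
        w = w_true.copy()
        dead = rng.choice(d, size=int(frac * d), replace=False)
        w[dead] = 0.0
        print(f"{int(frac * 100):3d}% faulty weights -> accuracy {accuracy(w):.2f}")

The gradual accuracy loss of the single unit illustrates the first finding; the report's surprising result is that the trained multi-layer network does not inherit this robustness.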
This work explores the Multi-layer Perceptron's inference capabilities to detect texture relationships among pixels belonging to a square neighbourhood. Although hidden in the neuron connections, these relationships lend the neural network the necessary discriminant power to classify patterns. Results similar to those obtained with the combination of co-occurrence matrices and an MLP have been obtained for su...
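As a concrete (assumed) input encoding, the sketch below flattens every k×k pixel neighbourhood of two synthetic textures into a row vector; these rows are what an MLP classifier would receive. The window size and the synthetic textures are placeholders, not the paper's data.

    import numpy as np

    rng = np.random.default_rng(3)

    def patches(img, k):
        """Flatten every k x k neighbourhood of a 2-D image into a row vector."""
        h, w = img.shape
        return np.array([img[i:i + k, j:j + k].ravel()
                         for i in range(h - k + 1)
                         for j in range(w - k + 1)])

    # Two synthetic "textures": smooth noise vs. a vertical stripe pattern.
    smooth  = rng.normal(size=(32, 32)).cumsum(axis=1) / 10
    stripes = np.tile([0.0, 1.0], (32, 16)) + 0.1 * rng.normal(size=(32, 32))

    k = 5
    X = np.vstack([patches(smooth, k), patches(stripes, k)])   # (n_patches, k*k)
    y = np.array([0] * (28 * 28) + [1] * (28 * 28))             # class label per patch
    print(X.shape, y.shape)   # these rows would be the MLP's input vectors

An MLP trained on such raw neighbourhoods learns the texture relationships directly, in place of hand-built co-occurrence features.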
In this paper we study several methods that allow one to use interval data as inputs for Multi-layer Perceptrons. We show that interesting results can be obtained by using two methods together: the extremal values method, which is based on a complete description of the intervals, and the simulation method, which is based on a probabilistic understanding of the intervals. Both methods can be easily implem...
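The two methods can be sketched roughly as follows, with the encodings assumed rather than taken from the paper: the extremal values method describes each interval by its two endpoints, while the simulation method replaces each interval-valued example with random draws from a uniform distribution over the interval.

    import numpy as np

    rng = np.random.default_rng(4)

    # Interval-valued data: each feature is a (lower, upper) pair.
    lo = rng.normal(size=(100, 3))
    hi = lo + rng.uniform(0.1, 1.0, size=(100, 3))

    # Extremal values method: describe every interval by its two endpoints,
    # doubling the number of MLP inputs.
    X_extremal = np.concatenate([lo, hi], axis=1)               # shape (100, 6)

    # Simulation method: draw point samples uniformly inside each interval,
    # turning one interval-valued example into several ordinary ones.
    n_draws = 5
    X_simulated = np.concatenate(
        [rng.uniform(lo, hi) for _ in range(n_draws)], axis=0)  # shape (500, 3)

    print(X_extremal.shape, X_simulated.shape)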