Search results for: perceptron
Number of results: 8752
Recurrent perceptron classifiers generalize the classical perceptron model. They take into account correlations and dependencies among input coordinates that arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with fitting such models to experimental data.
A linearly separable Boolean function is learned by a diluted perceptron with optimal stability. A different level of dilution is allowed for the teacher and student perceptrons. The learning algorithms used were optimal annealed dilution and Hebbian dilution. The generalisation ability, i.e. the probability of recognizing a pattern that has not been learned before, is calculated within replica symmetry.
A hybrid neural network model is considered. The model consists of an ART-2 network for clustering and a perceptron for image preprocessing; the perceptron provides invariant recognition of objects. The model can be used in mobile robots to recognize new objects or scenes within the robot's field of view during its movement.
In this paper, we consider the problem of classifying electroencephalogram (EEG) signals of normal subjects and of subjects suffering from psychiatric disorders, e.g., obsessive-compulsive disorder or schizophrenia, using a class of artificial neural networks, viz., the multi-layer perceptron. It is shown that the multi-layer perceptron is capable of classifying unseen test EEG signals to a high degree...
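The abstract above does not include the authors' network configuration or EEG features, so the following is only a rough, hedged sketch of how a generic multi-layer perceptron classifier might be trained on pre-extracted feature vectors with scikit-learn. The feature dimension, hidden-layer size, and the synthetic data are assumptions, not details from the paper.

```python
# Illustrative sketch only: a generic multi-layer perceptron classifier for
# two-class signals. All shapes, hyperparameters, and the synthetic data are
# assumptions; the paper's actual EEG features and architecture are not given here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples, n_features = 200, 64          # placeholder feature vectors per recording
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)   # 0 = "normal", 1 = "disorder" (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```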
A new perceptron learning rule that works with multilayer neural networks made of multi-state units is obtained, and the corresponding convergence theorem is proved. The definition of the perceptron of maximal stability is enlarged to include these new multi-state perceptrons, and a proof of the existence and uniqueness of such optimal solutions is outlined.
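Several of the results above (the recurrent, diluted, and multi-state variants) generalize the classical two-class perceptron learning rule. As a point of reference only, here is a minimal sketch of that baseline rule on a toy linearly separable dataset; it is the standard algorithm, not the multi-state or maximal-stability extensions described in these abstracts.

```python
# Minimal sketch of the classical perceptron learning rule (the baseline the
# abstracts above generalize). Weights are updated only on misclassified samples.
import numpy as np

def perceptron_train(X, y, epochs=100, lr=1.0):
    """X: (n_samples, n_features); y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            if y_i * (np.dot(w, x_i) + b) <= 0:   # misclassified (or on the boundary)
                w += lr * y_i * x_i               # classical perceptron update
                b += lr * y_i
                errors += 1
        if errors == 0:                           # converged: all samples separated
            break
    return w, b

# Toy linearly separable example (assumed data, for illustration only)
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print("weights:", w, "bias:", b)
```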