Search results for: perceptron

Number of results: 8752

2013
Wei-Chen Cheng

This work presents a constructive method to train the multilayer perceptron layer by layer, successively, and to construct the kernel used in the support vector machine. Data in different classes are trained to map to distant points in each layer, which eases the mapping of the next layer. A perfect mapping kernel can be accomplished successively. Those distantly mapped points can be dis...

1993
Marco Budinich

We extend the geometrical approach to the Perceptron and show that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n = 5d. We then present a new Perceptron algorithm that takes advantage of the peculiarities of the cost function. In our tests it is more than two times faster than the standard algorithm. More importantly, it does not have fixed paramet...

2012
Shivani Agarwal

In this lecture we will start to study the online learning setting that was discussed briefly in the first lecture. Unlike the batch setting we have studied so far, where one is given a sample or ‘batch’ of training data and the goal is to learn from this data a model that can make accurate predictions in the future, in the online setting, learning takes place in a sequence of trials: on each t...
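The trial-by-trial protocol this abstract describes can be illustrated with the classic mistake-driven online Perceptron. A minimal sketch, assuming a linearly separable stream; the data, dimensions, and seed below are invented for illustration:

```python
import numpy as np

def online_perceptron(stream, dim):
    """Run the online Perceptron over (x, y) trials, with y in {-1, +1}."""
    w = np.zeros(dim)
    mistakes = 0
    for x, y in stream:
        y_hat = 1.0 if w @ x >= 0 else -1.0  # predict on the current trial
        if y_hat != y:                       # feedback arrives; update only on a mistake
            w += y * x
            mistakes += 1
    return w, mistakes

# Illustrative separable stream: labels come from a hidden linear rule.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(200, 3))
y = np.sign(X @ w_true)
w, mistakes = online_perceptron(zip(X, y), dim=3)
```

The contrast with the batch setting is in the loop: no training sample is stored, and the model is evaluated on every example before its label is revealed.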

Journal: :SIAM J. Comput. 2002
Rocco A. Servedio

We analyze the performance of the widely studied Perceptron and Winnow algorithms for learning linear threshold functions under Valiant’s probably approximately correct (PAC) model of concept learning. We show that under the uniform distribution on boolean examples, the Perceptron algorithm can efficiently PAC learn nested functions (a class of linear threshold functions known to be hard for Per...

Journal: :J. Low Power Electronics 2006
Kaveh Aasaraai Amirali Baniasadi

Perceptron-based predictors are highly accurate. This high accuracy is the result of exploiting long history lengths [1] and is achieved at the expense of high complexity. The dot product of two vectors is used as the prediction. The first vector is the branch outcome history, while the second vector is composed of per-branch weights, which represent the correlation between branch outcome and previo...

2006
Xavier Carreras Mihai Surdeanu Lluís Màrquez i Villodre

We describe an online learning dependency parser for the CoNLL-X Shared Task, based on the bottom-up projective algorithm of Eisner (2000). We experiment with a large feature set that models: the tokens involved in dependencies and their immediate context, the surface-text distance between tokens, and the syntactic context dominated by each dependency. In experiments, the treatment of multilingu...

2010
Constantinos Panagiotakopoulos Petroula Tsampouka

We introduce into the classical Perceptron algorithm with margin a mechanism of unlearning which in the course of the regular update allows for a reduction of possible contributions from “very well classified” patterns to the weight vector. The resulting incremental classification algorithm, called Margin Perceptron with Unlearning (MPU), provably converges in a finite number of updates to any ...
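The classical Perceptron algorithm with margin that MPU modifies can be sketched as below; the unlearning mechanism itself is not reproduced here, and the margin parameter `b`, the data, and the epoch cap are illustrative assumptions:

```python
import numpy as np

def margin_perceptron(X, y, b=1.0, epochs=100):
    """Perceptron with margin: update on any margin violation,
    not only on outright mistakes."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= b:   # margin violation triggers the regular update
                w += yi * xi
                updated = True
        if not updated:              # every point now clears the target margin b
            break
    return w

# Tiny separable example for illustration.
X = np.array([[2.0, 0.0], [-2.0, 0.0], [1.5, 1.0], [-1.5, -1.0]])
y = np.array([1.0, -1.0, 1.0, -1.0])
w = margin_perceptron(X, y)
```

MPU's contribution, per the abstract, is to let "very well classified" patterns withdraw part of their contribution to `w` during these regular updates.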

2012
Geraldina Ribeiro Wouter Duivesteijn Carlos Soares Arno J. Knobbe

Label Ranking problems are receiving increasing attention in machine learning. The goal is to predict not just a single value from a finite set of labels, but rather the permutation of that set that applies to a new example (e.g., the ranking of a set of financial analysts in terms of the quality of their recommendations). In this paper, we adapt a multilayer perceptron algorithm for label rank...

Journal: :CoRR 2008
Fabrice Rossi Brieuc Conan-Guez

In some real-world situations, linear models are not sufficient to represent accurately complex relations between input variables and output variables of a studied system. Multilayer Perceptrons are one of the most successful non-linear regression tools, but they are unfortunately restricted to inputs and outputs that belong to a normed vector space. In this chapter, we propose a general recoding...

[Chart: number of search results per year]