Sample Complexity for Learning Recurrent Perceptron Mappings
Authors
Abstract
Recurrent perceptron classifiers generalize the classical perceptron model. They take into account those correlations and dependences among input coordinates which arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with the fitting of such models to experimental data.
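For a concrete picture of the kind of classifier the abstract describes, here is a minimal sketch, not the paper's formal definition: an input sequence is passed through a linear IIR ("autoregressive moving-average") filter and the sign of the final filter output is taken as the label. The function name, the coefficient vectors a and b, and the decision rule on the last output are illustrative assumptions.

```python
import numpy as np

def recurrent_perceptron(u, a, b):
    """Classify an input sequence u with a recurrent perceptron (illustrative sketch).

    The sequence is run through a linear IIR filter with feedback
    coefficients a and feedforward coefficients b; the sign of the
    final filter output is returned as the class label.
    """
    y = np.zeros(len(u))
    for t in range(len(u)):
        # moving-average part: weighted sum of recent inputs
        ma = sum(b[j] * u[t - j] for j in range(len(b)) if t - j >= 0)
        # autoregressive part: feedback from previous filter outputs
        ar = sum(a[i] * y[t - 1 - i] for i in range(len(a)) if t - 1 - i >= 0)
        y[t] = ma + ar
    return 1 if y[-1] >= 0 else -1

# Toy usage: a short input sequence classified by a small filter
label = recurrent_perceptron(
    u=np.array([0.5, -1.0, 2.0, 0.3]),
    a=np.array([0.4]),          # one feedback tap
    b=np.array([1.0, -0.5]),    # two feedforward taps
)
print(label)
```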
Similar references
Sample Complexity for Learning Recurrent Perceptron Mappings
Recurrent perceptron classifiers generalize the usual perceptron model. They correspond to linear transformations of input vectors obtained by means of “autoregressive moving-average schemes”, or infinite impulse response filters, and allow taking into account those correlations and dependences among input coordinates which arise from linear digital filtering. This paper provides tight bounds on...
Sample Complexity for Learning Recurrent Perceptron Mappings
Recurrent perceptron classifiers generalize the classical perceptron model. They take into account those correlations and dependences among input coordinates which arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with the fitting of such models to experimental data.
Perceptron Learning with Discrete Weights
Perceptron learning bounds with real weights have been presented by several authors. In the present paper we study the perceptron learning task when using integer weights in [−k, k]. We present a sample complexity formula based on an exact counting result of the finite class of functions implemented by the perceptron, and show that this bound is less pessimistic than existing bounds for the dis...
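To see the flavor of a finite-class argument (though not the exact counting result this abstract refers to), note that a perceptron over n inputs with integer weights and bias in [−k, k] realizes at most (2k+1)^(n+1) distinct parameter settings, so the standard realizable-case bound m ≥ (ln|H| + ln(1/δ))/ε applies. The sketch below computes this admittedly crude bound; the function name and parameters are illustrative.

```python
import math

def finite_class_bound(n, k, eps, delta):
    """Crude sample-size bound for a perceptron with integer weights in [-k, k].

    Counts weight assignments (n weights plus a bias, each in {-k, ..., k}),
    which upper-bounds the number of distinct classifiers, then applies the
    standard realizable-case bound m >= (ln|H| + ln(1/delta)) / eps.
    This is far looser than the exact counting result mentioned above.
    """
    log_H = (n + 1) * math.log(2 * k + 1)        # |H| <= (2k+1)^(n+1)
    return math.ceil((log_H + math.log(1 / delta)) / eps)

# e.g. 20 inputs, weights in [-3, 3], target error 0.1, confidence 0.95
print(finite_class_bound(n=20, k=3, eps=0.1, delta=0.05))
```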
Analysis of Perceptron-Based Active Learning
We start by showing that in an active learning setting, the Perceptron algorithm needs Ω(1/ε²) labels to learn linear separators within generalization error ε. We then present a simple selective sampling algorithm for this problem, which combines a modification of the perceptron update with an adaptive filtering rule for deciding which points to query. For data distributed uniformly over the un...
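As a very rough sketch of the flavor of such a selective sampling scheme (the threshold schedule, the patience constant, and the starting hypothesis below are illustrative assumptions, not the algorithm analyzed in that paper): labels are requested only for points that fall close to the current hyperplane, mistakes trigger a norm-preserving "modified" perceptron update, and the query region shrinks when queried points stop producing mistakes.

```python
import numpy as np

def selective_sampling_perceptron(stream, oracle, dim, threshold=1.0, patience=20):
    """Illustrative perceptron-based selective sampling (not the paper's exact rule).

    Queries a label only when the point lies close to the current hyperplane
    (|w.x| <= threshold); on a mistake applies the update w <- w - 2(w.x)x,
    which preserves ||w|| for unit-norm x; the threshold is halved after
    `patience` consecutive queried points without a mistake.
    """
    w = np.zeros(dim)
    w[0] = 1.0                      # arbitrary unit-norm starting hypothesis
    since_mistake = 0
    labels_used = 0
    for x in stream:                # x assumed normalized to unit length
        margin = np.dot(w, x)
        if abs(margin) > threshold:
            continue                # confident prediction: skip the label query
        y = oracle(x)               # query the label (+1 / -1)
        labels_used += 1
        if np.sign(margin) != y:
            w = w - 2 * margin * x  # modified (reflection) perceptron update
            since_mistake = 0
        else:
            since_mistake += 1
            if since_mistake >= patience:
                threshold /= 2      # adaptive filtering: tighten the query region
                since_mistake = 0
    return w, labels_used

# Toy usage: true separator w* = e1, points drawn uniformly on the unit sphere
rng = np.random.default_rng(0)
w_star = np.zeros(5)
w_star[0] = 1.0
pts = rng.normal(size=(2000, 5))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
w_hat, used = selective_sampling_perceptron(pts, lambda x: np.sign(np.dot(w_star, x)), dim=5)
print(used, np.dot(w_hat, w_star))
```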
A New Threshold Unit Learning Algorithm
A new algorithm for learning a threshold unit is proposed. The Barycentric Correction Procedure (BCP) is an efficient substitute for the Perceptron and its enhanced versions such as the Thermal Perceptron or the Pocket algorithm. Based on geometrical concepts, the BCP is much more efficient than the Perceptron for learning linearly separable mappings. To deal with linearly nonseparable mappings, ex...
Journal: IEEE Trans. Information Theory
Volume 42, Issue -
Pages: -
Publication date: 1995