Boosting in Probabilistic Neural Networks

Authors

  • Jirí Grim
  • Pavel Pudil
  • Petr Somol
Abstract

The basic idea of boosting is to increase pattern-recognition accuracy by combining classifiers derived from differently weighted versions of the original training data. Practical experiments have verified that the resulting classification performance can be improved by increasing the weights of misclassified training samples. In statistical pattern recognition, however, the weighted data may influence the form of the estimated conditional distributions, and the theoretically achievable classification error could therefore increase. We prove that, in the case of maximum-likelihood estimation, weighting discrete data vectors is asymptotically equivalent to multiplying the estimated discrete conditional distributions by a positive bounded function. Consequently, Bayesian decision-making is shown to be asymptotically invariant with respect to arbitrary weighting of the data, provided that (a) the weighting function is defined identically for all classes and (b) the prior probabilities are properly modified.
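The invariance claim can be illustrated numerically. The following sketch is our own illustration rather than the paper's derivation: it uses a single discrete variable with made-up class-conditional distributions, applies an arbitrary positive weighting w(x) shared by both classes, forms the asymptotic weighted maximum-likelihood estimates (proportional to w(x)·p(x|c)), and checks that the Bayes decision is unchanged once the priors absorb the per-class normalizing constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes over a discrete feature taking K values; all numbers are illustrative.
K = 8
priors = np.array([0.4, 0.6])                 # p(c)
cond = rng.dirichlet(np.ones(K), size=2)      # p(x|c), one row per class

# Arbitrary positive weighting of data vectors, identical for both classes (condition a).
w = rng.uniform(0.2, 2.0, size=K)

# Asymptotic weighted ML estimate: proportional to w(x) * p(x|c), renormalized.
cond_w = w * cond
z = cond_w.sum(axis=1, keepdims=True)         # per-class normalizing constants
cond_w /= z

# Modified priors absorb the per-class normalization (condition b).
priors_w = priors * z.ravel()
priors_w /= priors_w.sum()

# Bayes decisions agree for every possible value of x, since w(x) > 0 cancels across classes.
decide = (priors[:, None] * cond).argmax(axis=0)
decide_w = (priors_w[:, None] * cond_w).argmax(axis=0)
assert np.array_equal(decide, decide_w)
print("Bayes decisions identical for all", K, "values of x")
```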


Related articles

Probabilistic Contaminant Source Identification in Water Distribution Infrastructure Systems

Large water distribution systems can be highly vulnerable to the penetration of contaminants by various means, including deliberate contamination injections. Because contaminants spread quickly through a water distribution network, rapid characterization of the pollution source is critically important for early warning assessment and disaster management. In this paper, a methodology...


Novel Intrusion Detection using Probabilistic Neural Network and Adaptive Boosting

This article applies machine learning techniques to intrusion detection problems within computer networks. Owing to the complex and dynamic nature of computer networks and hacking techniques, detecting malicious activities remains a challenging task for security experts: currently available defense systems suffer from low detection capability and a high number of false alarms. To overcom...
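The detector itself is not reproduced here. As a rough sketch of the adaptive-boosting component only, the snippet below runs scikit-learn's AdaBoost (decision stumps as base learners, standing in for the article's PNN base classifier) on synthetic, class-imbalanced data that plays the role of network-traffic features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for traffic features: rare "attack" class (about 10%).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# AdaBoost re-weights misclassified samples at each round and combines the stumps.
clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```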


Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection

In this study, two novel learning algorithms are applied to a Radial Basis Function Neural Network (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns the various strategies for optimizing the procedure of Gradient ...
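The PE/GMM training procedure is cut off above, so the sketch below shows only a common related construction, assumed rather than taken from the article: a Gaussian mixture fitted to the inputs supplies the RBF centres and widths, and the hidden-to-output weights are then solved by linear least squares on a highly non-linear 1-D target.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0] ** 2)                  # highly non-linear target

# 1) place RBF centres/widths with a Gaussian mixture fitted to the inputs
gmm = GaussianMixture(n_components=12, random_state=0).fit(X)
centres = gmm.means_                      # (12, 1)
widths = np.sqrt(gmm.covariances_.reshape(-1))  # component std devs (1-D inputs)

def design(X):
    d2 = (X - centres.T) ** 2             # squared distances, shape (n, 12)
    return np.exp(-d2 / (2 * widths ** 2))

# 2) hidden-to-output weights by linear least squares
W, *_ = np.linalg.lstsq(design(X), y, rcond=None)
print("train RMSE:", np.sqrt(np.mean((design(X) @ W - y) ** 2)))
```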


Application of Radial Basis Neural Networks in Fault Diagnosis of Synchronous Generator

This paper presents the application of radial basis neural networks to the development of a novel method for the condition monitoring and fault diagnosis of synchronous generators. In the proposed scheme, flux linkage analysis is used to reach a decision. A probabilistic neural network (PNN) and the discrete wavelet transform (DWT) are used in the design of the fault diagnosis system. PNN, as the main part of thi...
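As background for the PNN component, here is a minimal Parzen-window PNN in the style of Specht. The flux-linkage/DWT feature extraction is omitted, and the two-dimensional "features" in the toy usage are made up; real inputs would come from a wavelet decomposition (e.g. pywt.wavedec).

```python
import numpy as np

class PNN:
    """Minimal Parzen-window PNN (Specht-style); a sketch, not the article's system."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.X_ = [X[y == c] for c in self.classes_]
        return self

    def predict(self, X):
        # Class score = average Gaussian kernel over that class's training points.
        scores = []
        for Xc in self.X_:
            d2 = ((X[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
            scores.append(np.exp(-d2 / (2 * self.sigma ** 2)).mean(axis=1))
        return self.classes_[np.argmax(scores, axis=0)]

# Toy usage with made-up 2-D "wavelet features": healthy vs. faulty.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (50, 2))
X1 = rng.normal(3.0, 1.0, (50, 2))
X, y = np.vstack([X0, X1]), np.repeat([0, 1], 50)
print(PNN(sigma=1.0).fit(X, y).predict(X[:5]))
```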


Prediction of methanol loss by hydrocarbon gas phase in hydrate inhibition unit by back propagation neural networks

Gas hydrates often form in natural gas pipelines and process equipment at high pressure and low temperature. Methanol, injected as a hydrate inhibitor into potential hydrate systems, is then recovered from the gas phase and re-injected into the system. Since methanol loss imposes an extra cost on gas processing plants, designing a process for its reduction is necessary. In this study, an accur...


Empirical Margin Distributions and Bounding the Generalization Error of Combined Classifiers

We prove new probabilistic upper bounds on the generalization error of complex classifiers that are combinations of simple classifiers. Such combinations can be implemented by neural networks or by voting methods of combining classifiers, such as boosting and bagging. The bounds are stated in terms of the empirical distribution of the margin of the combined classifier. They are based on the methods ...
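The margin statistic in which these bounds are stated can be computed directly. The sketch below is an assumed illustration, not the paper's construction: it trains a bagging-style forest as the "combined classifier" on synthetic binary data and reports quantiles of the empirical margin, i.e. the share of base classifiers voting for the true label minus the share voting against it.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# votes[i, t]: label predicted for sample i by base classifier t
votes = np.stack([t.predict(X) for t in forest.estimators_], axis=1)
frac_correct = (votes == y[:, None]).mean(axis=1)
margins = 2 * frac_correct - 1   # binary case: vote share for true label minus the rest

for q in (0.05, 0.25, 0.50):
    print(f"{q:.0%} quantile of the empirical margin: {np.quantile(margins, q):+.2f}")
```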



Publication year: 2002