Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees
Authors
Abstract
We compare two probabilistic approaches to neural networks: the first based on mixtures of product components, the second using mixtures of dependence-tree distributions. Product mixture models can be efficiently estimated from data by means of the EM algorithm and have several practically important properties. However, in some cases the simplicity of product components may be too restrictive, and a natural idea is to use a more complex mixture of dependence-tree distributions. By adopting the concept of a dependence tree, we can explicitly describe the statistical relationships between pairs of variables at the level of individual components, so the approximation power of the resulting mixture may increase substantially. Nonetheless, in an application to the classification of numerals we found that both models perform comparably and that the contribution of the dependence-tree structures decreases in the course of the EM iterations. Thus the optimal estimate of the dependence-tree mixture tends to converge to a simple product mixture model. Computational aspects aside, dependence-tree mixtures could help clarify the role of dendritic branching in the highly selective excitability of neurons.
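The first of the two models, a mixture of product (conditionally independent) components estimated by EM, can be sketched as follows. This is an illustrative reconstruction for binary data, not the authors' implementation; all function and variable names (`em_product_mixture`, `w`, `P`) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_product_mixture(X, M, iters=50, eps=1e-6):
    """EM for a mixture of Bernoulli product components:
    p(x) = sum_m w_m * prod_n P[m,n]^x_n * (1 - P[m,n])^(1 - x_n).
    Returns mixture weights w (M,) and component parameters P (M, N)."""
    K, N = X.shape
    w = np.full(M, 1.0 / M)
    P = rng.uniform(0.25, 0.75, size=(M, N))  # random start away from {0, 1}
    for _ in range(iters):
        # E-step: posterior responsibilities q(m | x_k), computed in log space
        logp = X @ np.log(P.T + eps) + (1 - X) @ np.log(1 - P.T + eps)
        logp += np.log(w + eps)
        logp -= logp.max(axis=1, keepdims=True)  # stabilize before exp
        q = np.exp(logp)
        q /= q.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and product-component parameters
        w = q.mean(axis=0)
        P = (q.T @ X) / (q.sum(axis=0)[:, None] + eps)
    return w, P

# Toy binary data drawn from two distinct product components
labels = rng.integers(0, 2, 200)[:, None]
X = (rng.random((200, 8)) < np.where(labels, 0.8, 0.2)).astype(float)
w, P = em_product_mixture(X, M=2)
```

A dependence-tree mixture would replace each product component with a Chow-Liu tree distribution, re-estimating the tree structure (a maximum-weight spanning tree over pairwise mutual information) in every M-step.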
Related papers
Neuromorphic features of probabilistic neural networks
We summarize the main results on probabilistic neural networks recently published in a series of papers. Considering the framework of statistical pattern recognition we assume approximation of class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to mixture components and can be interpreted in neurophysiological terms. In this way we can ...
Iterative Principles of Recognition in Probabilistic Neural Networks
When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description contradicts the ...
Recurrent Bayesian Reasoning in Probabilistic Neural Networks
Considering the probabilistic approach to neural networks in the framework of statistical pattern recognition we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description becomes conflicting w...
Extraction of Binary Features by Probabilistic Neural Networks
In order to design probabilistic neural networks in the framework of pattern recognition we estimate class-conditional probability distributions in the form of finite mixtures of product components. As the mixture components correspond to neurons we specify the properties of neurons in terms of component parameters. The probabilistic features defined by neuron outputs can be used to transform t...
The Prediction of Surface Tension of Ternary Mixtures at Different Temperatures Using Artificial Neural Networks
In this work, artificial neural network (ANN) has been employed to propose a practical model for predicting the surface tension of multi-component mixtures. In order to develop a reliable model based on the ANN, a comprehensive experimental data set including 15 ternary liquid mixtures at different temperatures was employed. These systems consist of 777 data points generally containing hydrocar...
Journal title:
Volume Issue
Pages -
Publication date: 2014