Search results for: fisher
Number of results: 18991
Fisher criterion has achieved great success in dimensionality reduction. Two representative methods based on the Fisher criterion are Fisher Score and Linear Discriminant Analysis (LDA). The former is developed for feature selection while the latter is designed for subspace learning. In the past decade, these two approaches have often been studied independently. In this paper, based on the observation th...
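For readers unfamiliar with the two methods, the sketch below contrasts them on synthetic data; the Fisher Score implementation, the toy data, and the smoothing constant are illustrative assumptions, not the paper's joint framework.

```python
# A minimal illustration (not the paper's unified framework): Fisher Score for
# feature selection and LDA for subspace learning. Toy data are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fisher_score(X, y):
    """Per-feature ratio of between-class to within-class variance."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += Xc.shape[0] * (Xc.mean(axis=0) - overall_mean) ** 2
        within += Xc.shape[0] * Xc.var(axis=0)
    return between / (within + 1e-12)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # only two features carry class information

scores = fisher_score(X, y)                                          # feature selection
Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)   # subspace learning
print(np.argsort(scores)[::-1][:2], Z.shape)
```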
In this paper, imprecise data classification is considered using a new version of the Fisher discriminant, namely the interval Fisher. In the conventional formulation of Fisher, elements of the within-class scatter matrix (related to the covariance matrices within clusters) and of the between-class scatter matrix (related to the covariance matrix of the cluster centers) have single values; but in the interval Fisher, th...
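For reference, the conventional (single-valued) Fisher discriminant that the interval version generalizes maximizes the ratio below; as described in the abstract, the interval variant replaces each scatter-matrix entry by an interval of values rather than a single number (the exact interval construction is not reproduced here).

$$
J(\mathbf{w}) \;=\; \frac{\mathbf{w}^{\top} S_B\, \mathbf{w}}{\mathbf{w}^{\top} S_W\, \mathbf{w}},
\qquad
S_W \;=\; \sum_{c}\sum_{\mathbf{x} \in \mathcal{C}_c} (\mathbf{x}-\boldsymbol{\mu}_c)(\mathbf{x}-\boldsymbol{\mu}_c)^{\top},
\qquad
S_B \;=\; \sum_{c} n_c\, (\boldsymbol{\mu}_c-\boldsymbol{\mu})(\boldsymbol{\mu}_c-\boldsymbol{\mu})^{\top}.
$$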
We present some optimal criteria to evaluate the model-robustness of non-regular two-level fractional factorial designs. Our method is based on minimizing the sum of squares of all the off-diagonal elements of the information matrix, and on taking the expectation under appropriate distribution functions for the unknown contamination of the interaction effects. By considering uniform distributions on symm...
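A rough sketch of the kind of quantity involved (not the paper's exact criterion, which also averages over contamination of interaction effects): for a two-level design matrix with ±1 entries, sum the squares of the off-diagonal entries of the information matrix. The example design is an illustrative regular fraction.

```python
# Sum of squared off-diagonal entries of the information matrix X'X
# for a two-level design; the design below is a 2^(3-1) fraction for illustration.
import numpy as np

def off_diagonal_ss(X):
    info = X.T @ X                                   # information matrix
    mask = ~np.eye(info.shape[0], dtype=bool)
    return float((info[mask] ** 2).sum())

D = np.array([[ 1,  1,  1],
              [ 1, -1, -1],
              [-1,  1, -1],
              [-1, -1,  1]])                         # third column = product of first two
print(off_diagonal_ss(D))                            # 0.0: columns are mutually orthogonal
```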
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in ce...
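For context, the classical single-pair inequalities that such families extend are Stam's Fisher information inequality and Shannon's entropy power inequality, stated here for independent random variables $X$ and $Y$ with differential entropy $h$:

$$
\frac{1}{I(X+Y)} \;\ge\; \frac{1}{I(X)} + \frac{1}{I(Y)},
\qquad
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)}.
$$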
We present two extended forms of Fisher information that fit well in the context of nonextensive thermostatistics. We show that there exists an interplay between these generalized Fisher information measures, the generalized q-Gaussian distributions, and the q-entropies. The minimum of the generalized Fisher information among distributions with a fixed moment, or with a fixed q-entropy, is attained, in bot...
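The paper's extended functionals are not reproduced here; for orientation, the standard objects they deform are the classical Fisher information of a density $f$ and the Tsallis q-entropy, which recovers the Shannon entropy as $q \to 1$:

$$
I[f] \;=\; \int \frac{\bigl(f'(x)\bigr)^{2}}{f(x)}\, dx,
\qquad
S_q[f] \;=\; \frac{1 - \int f(x)^{q}\, dx}{q-1}
\;\xrightarrow[\,q \to 1\,]{}\; -\int f(x)\,\log f(x)\, dx .
$$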
The basic idea behind the Fisher kernel method is to train a (generative) hidden Markov model (HMM) on data to derive a Fisher kernel for a (discriminative) support vector machine (SVM). The Fisher kernel gives a ‘natural’ similarity measure that takes into account the underlying probability distribution. If each data item is a (possibly varying-length) sequence, each may be used to train an HMM...
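The following is a minimal sketch of the Fisher-kernel recipe on sequences. It substitutes a first-order Markov chain for the HMM, uses numerical gradients for the Fisher scores, and approximates the Fisher information matrix empirically; the toy sequences and transition matrix are assumptions, not the paper's construction.

```python
# Fisher-kernel sketch: scores U_x = grad_theta log p(x | theta), kernel K = U F^-1 U'.
import numpy as np

def log_likelihood(seq, trans):
    """Log-likelihood of a symbol sequence under a first-order Markov chain."""
    return sum(np.log(trans[a, b]) for a, b in zip(seq[:-1], seq[1:]))

def fisher_score(seq, trans, eps=1e-5):
    """Forward-difference gradient of the log-likelihood w.r.t. the transition parameters."""
    grad = np.zeros_like(trans)
    for i in range(trans.shape[0]):
        for j in range(trans.shape[1]):
            bumped = trans.copy()
            bumped[i, j] += eps
            grad[i, j] = (log_likelihood(seq, bumped) - log_likelihood(seq, trans)) / eps
    return grad.ravel()

trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])                             # toy 2-symbol transition matrix
seqs = [[0, 0, 1, 0], [1, 1, 0, 1, 1], [0, 1, 1, 0]]

U = np.array([fisher_score(s, trans) for s in seqs])       # Fisher scores per sequence
F = np.cov(U.T) + 1e-6 * np.eye(U.shape[1])                # empirical Fisher information
K = U @ np.linalg.inv(F) @ U.T                             # Gram matrix usable by an SVM
print(K.round(2))
```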
In this paper, the geometric distribution is considered. The means, variances, and covariances of its order statistics are derived. The Fisher information in any set of order statistics in any distribution can be represented as a sum of Fisher information in at most two order statistics. It is shown that, for the geometric distribution, it can be further simplified to a sum of Fisher informatio...
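As a point of reference (for a full observation, not for the order statistics the paper treats), the Fisher information about the success probability $p$ carried by a single geometric observation with $\Pr(X=k) = (1-p)^{k-1}p$, $k = 1, 2, \dots$, is

$$
I(p) \;=\; \mathbb{E}\!\left[-\frac{\partial^{2}}{\partial p^{2}} \log f(X;p)\right]
\;=\; \frac{1}{p^{2}(1-p)} .
$$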
In this paper we show how the generation of documents can be thought of as a k-stage Markov process, which leads to a Fisher kernel from which the n-gram and string kernels can be reconstructed. The Fisher kernel view gives a more flexible insight into the string kernel and suggests how it can be parametrised in a way that reflects the statistics of the training corpus. Furthermore, the probabili...
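A quick sketch of the idea for the simplest case only, a unigram (multinomial) model rather than a k-stage Markov process; the corpus, tokenisation, and smoothing are illustrative choices. The Fisher score of such a model is $n_w(d)/\theta_w$, so the plain inner product of scores is a 1-gram count kernel reweighted by corpus statistics.

```python
# Unigram Fisher-kernel sketch: rare words receive larger weight in the kernel.
from collections import Counter
import numpy as np

corpus = ["the cat sat on the mat", "the dog sat", "a cat and a dog"]
vocab = sorted({w for doc in corpus for w in doc.split()})

counts = np.array([[Counter(doc.split())[w] for w in vocab] for doc in corpus], dtype=float)
theta = (counts.sum(axis=0) + 1.0) / (counts.sum() + len(vocab))   # smoothed unigram estimates

U = counts / theta        # Fisher scores: d/d theta_w log p(d | theta) = n_w(d) / theta_w
K = U @ U.T               # Fisher-kernel Gram matrix over the corpus
print(np.round(K, 1))
```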