Search results for: principal component analyses
Number of results: 1,055,377
Recently, many l1-norm based PCA methods have been developed for dimensionality reduction, but they do not explicitly consider the reconstruction error. Moreover, they do not take into account the relationship between the reconstruction error and the variance of the projected data. This reduces the robustness of the algorithms. To handle this problem, a novel formulation for PCA, namely angle PCA, is proposed....
When modeling multivariate data, one might have an extra parameter of contextual information that could be used to treat some observations as more similar to others. For example, images of faces can vary by age, and one would expect the face of a 40-year-old to be more similar to the face of a 30-year-old than to a baby face. We introduce a novel manifold approximation method, parameterized pri...
Principal components analysis (PCA) is a dimensionality reduction technique that can be used to give a compact representation of data while minimising information loss. Suppose we are given a set of data, represented as vectors in a high-dimensional space. It may be that many of the variables are correlated and that the data closely fits a lower dimensional linear manifold. In this case, PCA fi...
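The compact-representation view in the abstract above can be sketched in a few lines of NumPy. This is an illustrative example, not code from the paper: synthetic data lying near a 2-D linear manifold is generated, and a rank-k reconstruction via the SVD shows how little information is lost.

```python
import numpy as np

# Hedged sketch of PCA as a compact, low-loss representation.
# All names and data here are illustrative assumptions.
rng = np.random.default_rng(1)

# Correlated high-dimensional data that closely fits a 2-D linear manifold
latent = rng.normal(size=(200, 2))
A = rng.normal(size=(2, 10))
X = latent @ A + 0.01 * rng.normal(size=(200, 10))

Xc = X - X.mean(axis=0)                 # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                   # dimensionality of the manifold
X_hat = (U[:, :k] * S[:k]) @ Vt[:k]     # best rank-k reconstruction
loss = np.linalg.norm(Xc - X_hat) ** 2 / np.linalg.norm(Xc) ** 2
print(loss)                             # relative reconstruction error is tiny
```

Because the data were built to lie near a 2-D subspace, two components capture almost all of the variance, matching the "minimal information loss" claim.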
We propose two new principal component analysis methods in this paper utilizing a semiparametric model. The corresponding methods are named Copula Component Analysis (COCA) and Copula PCA. The semiparametric model assumes that, after unspecified marginally monotone transformations, the distributions are multivariate Gaussian. COCA and Copula PCA accordingly estimate the leading eigenvectors of ...
In this note we introduce a method for robust principal component regression. Robust principal components are computed from the predictor variables and are then used to estimate a response variable by robust linear multiple regression. The performance of the method is evaluated on a test data set from geochemistry. It is then used for the prediction of censored values ...
Principal Component Analysis (PCA) aims to learn compact and informative representations of data and has wide applications in machine learning, text mining and computer vision. Classical PCA, based on a Gaussian noise model, is fragile to noise of large magnitude. PCA methods based on a Laplace noise assumption cannot deal with dense noise effectively. In this paper, we propose Cauchy Principal Compo...
The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of the eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the direction of maximum variance of X in R^N carry the most relevant information about X. These eigenvectors are called principal components [8]. Ass...
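The covariance-eigendecomposition mechanism described above can be sketched directly in NumPy. The data and variable names below are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

# Minimal sketch of PCA via eigenvectors of the covariance matrix,
# as described above. Data here are random and purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # 100 samples, N = 5 features

Xc = X - X.mean(axis=0)                # center the data
C = np.cov(Xc, rowvar=False)           # N x N covariance matrix of X
eigvals, eigvecs = np.linalg.eigh(C)   # eigh: for symmetric matrices

# Sort the eigenvectors e_i by descending variance (eigenvalue)
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]         # columns are principal components

# Project onto the leading k components for a reduced representation
k = 2
Z = Xc @ components[:, :k]
print(Z.shape)                         # (100, 2)
```

The leading columns of `components` are the directions of maximum variance; projecting onto them performs the dimensionality reduction the snippet describes.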
When the number of training samples is limited, feature reduction plays an important role in the classification of hyperspectral images. In this paper, we propose a supervised feature extraction method based on discriminant analysis (DA) which uses the first principal component (PC1) to weight the scatter matrices. The proposed method, called DA-PC1, copes with the small sample size problem and has...
It is the purpose of this paper to contribute to the discussion initiated by Wachter about the parallelism between principal component (PC) and a typological grade of membership (GoM) analysis. The author tested empirically the close relationship between both analyses in a low-dimensional framework comprising up to nine dichotomous variables and two typologies. Our contribution to the subject is also...