Search results for: principal component analysis

Number of results: 3,331,272

Journal: The Annals of Applied Statistics, 2009

A. K. Wadhwani, Manish Dubey, Monika Saraswat

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of the eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the directions of maximum variance of X in R^N carry the most relevant information about X. These eigenvectors are called principal components [8]. Ass...
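
To make this construction concrete, here is a minimal sketch of PCA via eigendecomposition of the covariance matrix, assuming NumPy; the function name and the number of retained components k are illustrative and not taken from the cited paper.

```python
# Minimal sketch of PCA as eigendecomposition of the covariance matrix.
# Names (pca_eig, k, n_samples, n_features) are illustrative only.
import numpy as np

def pca_eig(X, k):
    """Return the top-k principal components (eigenvectors) and the projected data."""
    # Center the data so the covariance matrix is computed about the mean.
    Xc = X - X.mean(axis=0)
    # Covariance matrix of the features (N x N for X with N columns).
    C = np.cov(Xc, rowvar=False)
    # Eigendecomposition; eigh is appropriate because C is symmetric.
    eigvals, eigvecs = np.linalg.eigh(C)
    # Sort eigenvectors by decreasing eigenvalue (variance along that direction).
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order[:k]]   # e_i in R^N, the principal components
    scores = Xc @ components             # representation of X in the new basis
    return components, scores

# Example usage on random data: 100 samples, 10 features, 3 components kept.
X = np.random.default_rng(0).normal(size=(100, 10))
components, scores = pca_eig(X, k=3)
```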

Journal: ACM/IMS Transactions on Data Science, 2020

1997
Michael E. Tipping, Christopher M. Bishop, Peter Dayan, Bernhard Schölkopf, Alexander Smola, Klaus-Robert Müller

2005
Jan de Leeuw

A. Two quite different forms of nonlinear principal component analysis have been proposed in the literature. The first one is associated with the names of Guttman, Burt, Hayashi, Benzécri, McDonald, De Leeuw, Hill, Nishisato. We call itmultiple correspondence analysis. The second form has been discussed by Kruskal, Shepard, Roskam, Takane, Young, De Leeuw, Winsberg, Ramsay. We call it no...

2004
Jun Liu, Songcan Chen, Zhi-Hua Zhou

Principal Component Analysis (PCA) is a feature extraction approach based directly on a whole vector pattern; it acquires a set of projections that realize the best reconstruction of the original data in the mean-squared-error sense. In this paper, progressive PCA (PrPCA) is proposed, which progressively extracts features from a set of given data with large dimensionality, and the e...
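
The reconstruction property mentioned in this snippet can be illustrated with a minimal sketch, assuming NumPy; it shows plain PCA reconstruction from the top-k components and how the mean squared error shrinks as k grows, and does not reproduce the paper's PrPCA procedure.

```python
# Sketch of PCA's best-reconstruction-in-MSE property, NOT the paper's PrPCA.
import numpy as np

def pca_reconstruct(X, k):
    """Reconstruct X from its top-k principal components."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Thin SVD: the right singular vectors are the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    V_k = Vt[:k].T                    # top-k components
    X_hat = (Xc @ V_k) @ V_k.T + mu   # project onto the subspace and map back
    return X_hat

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
for k in (5, 20, 50):
    mse = np.mean((X - pca_reconstruct(X, k)) ** 2)
    print(f"k={k:2d}  reconstruction MSE={mse:.4f}")  # MSE shrinks as k grows
```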

2002
Mohamed N. Nounou, Bhavik R. Bakshi, Prem K. Goel, Xiaotong Shen

Principal component analysis (PCA) is a dimensionality reduction modeling technique that transforms a set of process variables by rotating their axes of representation. Maximum Likelihood PCA (MLPCA) is an extension that accounts for different noise contributions in each variable. Neither PCA nor its extensions utilize external information about the model or data such as the range or distributi...
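
As a rough illustration of the axis-rotation view of PCA described in this snippet, a minimal sketch assuming NumPy is given below; the simple per-variable noise rescaling is a hypothetical stand-in for intuition only and is not the MLPCA algorithm of the paper.

```python
# Sketch of PCA as a rotation of the axes of representation, plus a crude
# noise-aware variant that rescales each variable by an assumed noise standard
# deviation before rotating. This is NOT MLPCA (maximum likelihood with
# per-measurement error covariances); noise_std values here are hypothetical.
import numpy as np

def rotate_to_principal_axes(X):
    """Rotate centered data into the coordinate system of its principal axes."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt.T                         # same data, rotated axes

def noise_weighted_rotation(X, noise_std):
    """Downweight noisier variables before rotating (illustrative only)."""
    Xw = (X - X.mean(axis=0)) / noise_std    # per-variable scaling
    _, _, Vt = np.linalg.svd(Xw, full_matrices=False)
    return Xw @ Vt.T

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 4)) * np.array([1.0, 1.0, 1.0, 5.0])  # last variable noisier
scores_plain = rotate_to_principal_axes(X)
scores_weighted = noise_weighted_rotation(X, noise_std=np.array([1.0, 1.0, 1.0, 5.0]))
```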

2011
Wieland Brendel, Ranulfo Romo, Christian K. Machens

In many experiments, the data points collected live in high-dimensional observation spaces, yet can be assigned a set of labels or parameters. In electrophysiological recordings, for instance, the responses of populations of neurons generally depend on mixtures of experimentally controlled parameters. The heterogeneity and diversity of these parameter dependencies can make visualization and int...

Chart: number of search results per year