Search results for: lossless dimensionality reduction

Number of results: 510869

2002
Kristin M. Branson

Many tasks, such as face recognition, require learning a classifier from a small number of high dimensional training samples. These tasks suffer from the curse of dimensionality: the number of training samples required to accurately learn a classifier increases exponentially with the dimensionality of the data. One solution to this problem is dimensionality reduction. Common methods for dimensi...
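The abstract above points to dimensionality reduction as a remedy for the curse of dimensionality. As a minimal sketch of one common method it alludes to, here is PCA via the SVD in NumPy; the helper name `pca_reduce` is ours, not the paper's:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components.

    X: (n_samples, n_features) data matrix.
    Returns the (n_samples, k) reduced representation.
    """
    # Center the data so the components capture variance, not the mean.
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Toy example: 100 samples in 5 dimensions reduced to 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca_reduce(X, 2)
print(Z.shape)  # (100, 2)
```

Because the data are centered before projection, each reduced coordinate has zero mean by construction.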

2007
Daoqiang Zhang, Zhi-Hua Zhou, Songcan Chen

Dimensionality reduction is among the keys in mining high-dimensional data. This paper studies semi-supervised dimensionality reduction. In this setting, besides abundant unlabeled examples, domain knowledge in the form of pairwise constraints is available, specifying whether a pair of instances belongs to the same class (must-link constraints) or different classes (cannot-link constraints)...
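The must-link/cannot-link setting described above can be illustrated with a simple scatter-based projection that pushes cannot-link pairs apart and pulls must-link pairs together. This is a simplified sketch of the general idea, not the paper's exact SSDR algorithm, and the names here (`constrained_projection`) are our own:

```python
import numpy as np

def constrained_projection(X, must_link, cannot_link, k=1):
    """Find a linear projection that separates cannot-link pairs
    while keeping must-link pairs close (simplified sketch).

    must_link / cannot_link: lists of index pairs (i, j).
    Returns a (n_features, k) projection matrix with orthonormal columns.
    """
    d = X.shape[1]
    S = np.zeros((d, d))  # scatter of cannot-link differences
    D = np.zeros((d, d))  # scatter of must-link differences
    for i, j in cannot_link:
        diff = (X[i] - X[j])[:, None]
        S += diff @ diff.T
    for i, j in must_link:
        diff = (X[i] - X[j])[:, None]
        D += diff @ diff.T
    # Maximize cannot-link spread minus must-link spread:
    # take the top eigenvectors of the symmetric matrix S - D.
    vals, vecs = np.linalg.eigh(S - D)
    return vecs[:, ::-1][:, :k]  # columns sorted by descending eigenvalue

# Toy usage: six points in three dimensions, one constraint of each kind.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 3))
W = constrained_projection(X, must_link=[(0, 1)], cannot_link=[(0, 5)], k=1)
print(W.shape)  # (3, 1)
```

Solving an eigenproblem over a difference of scatter matrices is a common pattern in constraint-guided subspace learning; the full method in the paper also incorporates the unlabeled data's variance.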

Journal: Journal of Machine Learning Research, 2003
Amir Globerson, Naftali Tishby

Dimensionality reduction of empirical co-occurrence data is a fundamental problem in unsupervised learning. It is also a well studied problem in statistics known as the analysis of cross-classified data. One principled approach to this problem is to represent the data in low dimension with minimal loss of (mutual) information contained in the original data. In this paper we introduce an informa...
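The mutual information that the abstract above seeks to preserve can be computed directly from a co-occurrence count matrix. A small sketch (our own helper, assuming NumPy; not the paper's algorithm):

```python
import numpy as np

def mutual_information(counts):
    """Mutual information (in nats) of the joint distribution
    implied by a co-occurrence count matrix."""
    P = counts / counts.sum()                 # joint distribution
    px = P.sum(axis=1, keepdims=True)         # row marginal
    py = P.sum(axis=0, keepdims=True)         # column marginal
    mask = P > 0                              # avoid log(0) terms
    return (P[mask] * np.log(P[mask] / (px @ py)[mask])).sum()

# A rank-one count matrix has independent rows and columns,
# so it carries (approximately) zero mutual information.
counts = np.outer([1, 2], [3, 4])
print(mutual_information(counts))  # ≈ 0.0
```

Information-preserving dimensionality reduction of such tables amounts to finding a low-dimensional representation whose implied joint distribution keeps this quantity as large as possible.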

2008
Raviv Raich

Dimensionality reduction is a topic of recent interest. In this paper, we present the classification constrained dimensionality reduction (CCDR) algorithm to account for label information. The algorithm can account for multiple classes as well as the semi-supervised setting. We present out-of-sample expressions for both labeled and unlabeled data. For unlabeled data, we introduce a method of...

1997
Stefan Schaal, Sethu Vijayakumar, Christopher G. Atkeson

If globally high-dimensional data has locally only low-dimensional distributions, it is advantageous to perform a local dimensionality reduction before further processing the data. In this paper we examine several techniques for local dimensionality reduction in the context of locally weighted linear regression. As possible candidates, we derive local versions of factor analysis regression, pri...

2015
Jelani Nelson

where the inf is taken over all admissible sequences. We also let $d_X(T)$ denote the diameter of $T$ with respect to the norm $\|\cdot\|_X$. For the remainder of this section we make the definitions $\pi_r x = \operatorname{argmin}_{y \in T_r} \|y - x\|_X$ and $\Delta_r x = \pi_r x - \pi_{r-1} x$. Throughout this section we let $\|\cdot\|$ denote the $\ell_{2 \to 2}$ operator norm in the case of matrix arguments, and the $\ell_2$ norm in the case of vector arguments. Krahmer, Mendelso...

2008
Christian Walder, Bernhard Schölkopf

This paper introduces a new approach to constructing meaningful lower dimensional representations of sets of data points. We argue that constraining the mapping between the high and low dimensional spaces to be a diffeomorphism is a natural way of ensuring that pairwise distances are approximately preserved. Accordingly we develop an algorithm which diffeomorphically maps the data near to a low...

2010
Yu-Yin Sun, Michael K. Ng, Zhi-Hua Zhou

Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet dimensionality reduction in this setting has not been studied before. Direct application of exi...

2015
Xiaoqian Wang, Yun Liu, Feiping Nie, Heng Huang

As an important machine learning topic, dimensionality reduction has been widely studied and utilized in various kinds of areas. A multitude of dimensionality reduction methods have been developed, among which unsupervised dimensionality reduction is more desirable when obtaining label information requires onerous work. However, most previous unsupervised dimensionality reduction methods call f...

2011
Quanquan Gu, Zhenhui Li, Jiawei Han

The Fisher criterion has achieved great success in dimensionality reduction. Two representative methods based on the Fisher criterion are Fisher Score and Linear Discriminant Analysis (LDA). The former is developed for feature selection while the latter is designed for subspace learning. In the past decade, these two approaches have often been studied independently. In this paper, based on the observation th...
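The Fisher Score mentioned above scores each feature by its between-class variance relative to its within-class variance. A toy sketch of that per-feature computation (our own helper `fisher_score`, not the paper's code):

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher score: between-class variance over
    within-class variance, computed independently per feature."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal([5, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
scores = fisher_score(X, y)
print(scores[0] > scores[1])  # True: feature 0 is more discriminative
```

LDA optimizes the same between/within ratio, but over linear combinations of features rather than one feature at a time, which is the connection the paper builds on.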

[Chart: number of search results per year; clicking the chart filters results by publication year]