Search results for: lossless dimensionality reduction
Number of results: 510,869
The theme of these two lectures is that L2 methods need not work in infinite-dimensional spaces. In particular, we can non-adaptively find and work in a low-dimensional space and achieve nearly as good results. These results question the need to work explicitly in infinite- (or high-) dimensional spaces for L2 methods. In contrast, for sparsity-based methods (including L1 regularization),...
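One standard way to realize such a non-adaptive low-dimensional embedding is a random projection in the spirit of the Johnson-Lindenstrauss lemma. The sketch below is an illustration of that general idea, not the lectures' specific construction: it projects high-dimensional points through a random Gaussian matrix and checks that a pairwise L2 distance is approximately preserved.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 100, 10_000, 1_000          # samples, original dim, target dim
X = rng.standard_normal((n, d))

# Random Gaussian projection, scaled so squared L2 norms are preserved
# in expectation (Johnson-Lindenstrauss-style embedding). The matrix is
# chosen without looking at the data, i.e. non-adaptively.
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

# Compare a pairwise distance before and after projection.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
distortion = proj / orig
print(f"distance ratio after projection: {distortion:.3f}")
```

With k = 1,000 target dimensions the distance ratio concentrates tightly around 1, which is the sense in which an L2 method can run in the projected space and lose almost nothing.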
Dimensionality reduction is one of the most widely used techniques for data analysis. However, it is often hard to obtain a suitable low-dimensional representation from unlabeled data alone, especially for discriminative tasks. In this paper, we put forward a novel problem, Transferred Dimensionality Reduction, which performs unsupervised discriminative dimensionality reduction with the help of...
A method for creating a non-linear encoder-decoder for multidimensional data with compact representations is presented. The commonly used technique of autoassociation is extended to allow non-linear representations, and an objective function that penalizes activations of individual hidden units is shown to yield minimum-dimensional encodings with respect to an allowable reconstruction error.
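The ingredients of such an objective can be sketched in a few lines: a non-linear encoder, a reconstruction term, and a penalty on hidden-unit activations that pushes unneeded units toward zero. This is a minimal NumPy illustration of that kind of penalized autoassociator, with toy data and hyperparameters chosen for the example, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 2-D structure linearly embedded in 5 dimensions.
Z = rng.standard_normal((200, 2))
X = Z @ rng.standard_normal((2, 5))
mse0 = float(np.mean(X**2))             # error of the trivial all-zero decoder

d, h, lam, lr = 5, 4, 0.01, 0.05        # input dim, hidden units, penalty, step
W1 = rng.standard_normal((d, h)) * 0.1
W2 = rng.standard_normal((h, d)) * 0.1

for _ in range(2000):
    H = np.tanh(X @ W1)                 # non-linear encoder
    X_hat = H @ W2                      # linear decoder
    err = X_hat - X
    # Gradient of reconstruction error plus activation penalty lam*sum(H**2);
    # the penalty drives activations of superfluous hidden units toward zero.
    dH = err @ W2.T + 2 * lam * H
    dW2 = H.T @ err / len(X)
    dW1 = X.T @ (dH * (1 - H**2)) / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2

mse = float(np.mean(err**2))
active = int(np.sum(np.mean(np.abs(np.tanh(X @ W1)), axis=0) > 0.1))
print(f"reconstruction MSE: {mse:.4f}, active hidden units: {active}/{h}")
```

Because the data occupy a 2-D subspace, the activation penalty tends to leave fewer than all four hidden units strongly active, which is the "minimum dimensional encoding" effect the abstract describes.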
The problem of learning from both labeled and unlabeled data is considered. In this paper, we present a novel semisupervised multimodal dimensionality reduction (SSMDR) algorithm for feature reduction and extraction. SSMDR can preserve the local and multimodal structures of labeled and unlabeled samples. As a result, data pairs in the close vicinity of the original space are projected in the ne...
In this paper, new methodologies for clustering and dimensionality reduction of large data sets are presented, using both least-squares and maximum-likelihood approaches. The methodologies are illustrated with both real applications and Monte Carlo simulations.
We formulate linear dimensionality reduction as a semi-parametric estimation problem, enabling us to study its asymptotic behavior. We generalize the problem beyond additive Gaussian noise to (unknown) non-Gaussian additive noise, and to unbiased non-additive models.
Epistasis, the interaction among genes, is ubiquitous in common, complex, multifactorial diseases. Detecting epistasis therefore requires dedicated methods; one such method is multifactor dimensionality reduction (MDR). We introduce the MDR algorithm, discuss its strengths and weaknesses, and illustrate the results of applying MDR to alcoholism. We com...
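At its core, MDR pools multilocus genotype combinations into "high-risk" and "low-risk" groups, collapsing several genotype dimensions into a single binary attribute. A minimal sketch of that pooling step follows, on illustrative toy data rather than the paper's alcoholism study:

```python
from collections import Counter

# Toy case/control data: each tuple is (genotype at locus A, genotype at locus B),
# with genotypes coded 0/1/2.
cases    = [(0, 1), (0, 1), (1, 2), (0, 1), (1, 2), (2, 2)]
controls = [(0, 0), (1, 1), (0, 0), (2, 0), (1, 1), (0, 0)]

case_counts = Counter(cases)
ctrl_counts = Counter(controls)
threshold = len(cases) / len(controls)   # overall case/control ratio

# A genotype combination is "high-risk" if its case/control ratio meets or
# exceeds the overall ratio; max(..., 1) avoids division by zero for combos
# seen only in cases. This collapses the two-locus space to one attribute.
high_risk = {
    g for g in set(case_counts) | set(ctrl_counts)
    if case_counts[g] / max(ctrl_counts[g], 1) >= threshold
}

def reduced(genotype):
    """The one-dimensional MDR attribute: 1 = high-risk, 0 = low-risk."""
    return 1 if genotype in high_risk else 0

print(sorted(high_risk))                 # [(0, 1), (1, 2), (2, 2)]
print([reduced(g) for g in cases])       # [1, 1, 1, 1, 1, 1]
```

The resulting binary attribute can then be fed to a simple classifier and evaluated by cross-validation, which is how MDR searches over candidate locus combinations.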
Sequence segmentation and dimensionality reduction have both been used to study high-dimensional sequences; each reduces the complexity of the representation of the original data. In this paper we study the interplay of these two techniques. We formulate the problem of segmenting a sequence while modeling it with a basis of small size, thus essentially reducing the dimension of t...
The visual interpretation of data is an essential step in guiding any further processing or decision making. Dimensionality reduction (or manifold learning) tools may be used for visualization if the resulting dimension is constrained to be 2 or 3. The field of machine learning has developed numerous nonlinear dimensionality reduction tools in recent decades. However, the diversity of methods r...
A fundamental problem in machine learning is to extract compact but relevant representations of empirical data. Relevance can be measured by the ability to make good decisions based on the representations, for example in terms of classification accuracy. Compact representations can lead to more human-interpretable models, as well as improve scalability. Furthermore, in multi-class and multi-tas...