Search results for: lossless dimensionality reduction

Number of results: 510869

Journal: Stat 2023

With the recent surge in big data analytics for hyperdimensional data, there is a renewed interest in dimensionality reduction techniques. For these methods to yield performance gains and an understanding of the underlying data, a proper metric needs to be identified. This step is often overlooked, and metrics are typically chosen without consideration of the geometry of the data. In this paper, we present a method incorporating elas...

Journal: ITM Web of Conferences 2022

The use of dimensionality reduction techniques is a keystone for analyzing and interpreting high-dimensional data. These techniques gather several data features of interest, such as dynamical structure, input-output relationships, the correlation between sets, covariance, etc. Dimensionality reduction entails mapping a data set onto a low-dimensional one. Motivated by the lack of learning models' performance due to high-dimensional data, this study encounters five di...

Journal: Informatics 2017
Boris Kovalerchuk Dmytro Dovhalets

The exploration of multidimensional datasets of all possible sizes and dimensions is a long-standing challenge in knowledge discovery, machine learning, and visualization. While multiple efficient visualization methods for n-D data analysis exist, the loss of information, occlusion, and clutter continue to be a challenge. This paper proposes and explores a new interactive method for visual disc...

2003
Giovanni Motta Francesco Rizzo James A. Storer

A novel design for a vector quantizer that uses multiple codebooks of variable dimensionality is proposed. High-dimensional source vectors are first partitioned into two or more subvectors of (possibly) different lengths, and then each subvector is individually encoded with an appropriate codebook. Further redundancy is exploited by conditional entropy coding of the subvector indices. This sche...
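The split-codebook scheme described in this abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' design: the split point, codebook sizes, and the plain k-means training below are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(data, k, iters=20):
    # Simple Lloyd's algorithm: returns a (k, dim) codebook.
    codebook = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        dist = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for j in range(k):
            members = data[labels == j]
            if len(members):
                codebook[j] = members.mean(0)
    return codebook

# High-dimensional source vectors (D = 8), partitioned into two
# subvectors of different lengths (5 and 3), each with its own codebook.
X = rng.normal(size=(500, 8))
split = 5
parts = [X[:, :split], X[:, split:]]
codebooks = [kmeans(p, k=16) for p in parts]

def encode(x):
    # Quantize each subvector independently with its own codebook;
    # the resulting index pair is what would then be entropy coded.
    return tuple(
        int(((cb - sub) ** 2).sum(1).argmin())
        for sub, cb in zip((x[:split], x[split:]), codebooks)
    )

codes = [encode(x) for x in X]
```

The pair of indices per vector is what a conditional entropy coder would then compress, exploiting statistical dependence between the two subvector streams.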

Journal: نشریه دانشکده فنی 0
Barat Mojaradi (K.N. Toosi University of Technology), Mohammad Javad Valadan Zoj (K.N. Toosi University of Technology), Hamid Abrishami Moghaddam (Shahid Rajaee University)


2015
Shrinu Kushagra Shai Ben-David

Dimensionality reduction is a very common preprocessing approach in many machine learning tasks. The goal is to design data representations that on one hand reduce the dimension of the data (therefore allowing faster processing), and on the other hand aim to retain as much task-relevant information as possible. We look at generic dimensionality reduction approaches that do not rely on much task...

2010
Andreas Krause Matt Faulkner

Previously in the course, we have discussed algorithms suited for a large number of data points. This lecture discusses the case in which the dimensionality of the data points becomes large. We denote the data set as x_1, x_2, ..., x_n ∈ R^D for D ≫ n, and will consider dimensionality reductions f : R^D → R^d for d ≪ D. We would like the function f to preserve some properties of the original data set, such a...
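One standard choice of such a map f, sketched here as a minimal example (the dimensions, seed, and tolerance bounds are arbitrary assumptions): a Gaussian random projection, which approximately preserves pairwise distances in the sense of the Johnson-Lindenstrauss lemma.

```python
import numpy as np

rng = np.random.default_rng(1)

D, d, n = 1000, 50, 10          # D >> n, and we project down to d << D
X = rng.normal(size=(n, D))

# f(x) = Ax with Gaussian entries scaled by 1/sqrt(d), so that
# E[||f(x) - f(y)||^2] = ||x - y||^2 for every pair x, y.
A = rng.normal(size=(d, D)) / np.sqrt(d)
Y = X @ A.T

def pdist(Z):
    # Full pairwise Euclidean distance matrix.
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

iu = np.triu_indices(n, 1)
ratio = pdist(Y)[iu] / pdist(X)[iu]   # distortion per pair, ideally near 1
```

With d = 50 the per-pair distortion concentrates close to 1, illustrating the distance-preservation property the lecture goes on to formalize.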

2015
Jelani Nelson

1 Optimality theorems for JL Yesterday we saw for MJL that we could achieve target dimension m = O(ε⁻² log N), and for DJL we could achieve m = O(ε⁻² log(1/δ)). The following theorems tell us that not much improvement is possible for MJL, and for DJL we have the optimal bound. Theorem 1 ([Alo03]). For any N > 1 and ε < 1/2, there exist N + 1 points in R^N such that achieving the MJL guarantee...
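The DJL bound m = O(ε⁻² log(1/δ)) can be turned into a concrete target dimension once a constant is fixed; the constant 8 below is purely illustrative (an assumption for the sketch, not the optimal constant from the theorems cited in these notes).

```python
import math

def jl_dim(eps, delta, c=8):
    # Target dimension m = ceil(c * log(1/delta) / eps^2) for the
    # distributional JL lemma; c = 8 is an assumed illustrative constant.
    return math.ceil(c * math.log(1 / delta) / eps ** 2)

print(jl_dim(0.1, 1e-6))   # dimension for 10% distortion, failure prob 1e-6
```

Note the quadratic blow-up in 1/ε: halving the distortion quadruples the target dimension, which is exactly the dependence the lower bounds show cannot be improved.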

Journal: Statistical Methods and Applications 2013
Pietro Giorgio Lovaglio Giorgio Vittadini

When data sets are multilevel (group nesting or repeated measures), different sources of variations must be identified. In the framework of unsupervised analyses, multilevel simultaneous component analysis (MSCA) has recently been proposed as the most satisfactory option for analyzing multilevel data. MSCA estimates submodels for the different levels in data and thereby separates the “within”-s...

[Chart: number of search results per year]