Search results for: low rank representation

Number of results: 1,475,339

Journal: :CoRR 2016
Yubao Sun Renlong Hang Qingshan Liu Fuping Zhu Hucheng Pei

In this paper, we propose a novel data-driven regression model for aerosol optical depth (AOD) retrieval. First, we adopt a low rank representation (LRR) model to learn a powerful representation of the spectral response. Then, graph regularization is incorporated into the LRR model to capture the local structure information and the nonlinear property of the remote-sensing data. Since it is easy...
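The abstract does not show the objective, but a common way to combine LRR with a graph regularizer (a sketch only; the paper's exact formulation may differ) is

    \min_{Z,E}\; \|Z\|_* + \lambda\|E\|_{2,1} + \beta\,\mathrm{tr}\!\left(Z L Z^{\top}\right)
    \quad \text{s.t.}\quad X = XZ + E,

where X holds the spectral responses as columns, Z is the low-rank coefficient matrix, E absorbs noise, L is the graph Laplacian of a nearest-neighbour graph over the samples, and λ, β balance the three terms.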

Journal: :CoRR 2015
Yifan Fu Junbin Gao Xia Hong David Tien

In this paper, we present a novel low rank representation (LRR) algorithm for data lying on the manifold of square root densities. Unlike traditional LRR methods, which rely on the assumption that the data points are vectors in Euclidean space, our new algorithm is designed to incorporate the intrinsic geometric structure and geodesic distance of the manifold. Experiments on several computer...

2013
Yong Peng Suhang Wang Shen Wang Bao-Liang Lu

Constructing an informative and discriminative graph plays an important role in graph-based semi-supervised learning methods. Among these graph construction methods, the low-rank representation (LRR) based graph, which takes the LRR coefficients of both labeled and unlabeled samples as edge weights, has shown excellent performance in semi-supervised learning. In thi...
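Once the LRR coefficients are available, turning them into graph edge weights is typically a one-line symmetrization. A minimal Python sketch, assuming a coefficient matrix Z has already been computed by some LRR solver (the |Z|-symmetrization below is the common construction, not necessarily the exact rule used in this paper):

    import numpy as np

    def lrr_graph_weights(Z):
        """Turn LRR coefficients into symmetric, non-negative edge weights.

        W = (|Z| + |Z|^T) / 2 gives an affinity matrix over all samples,
        labeled and unlabeled alike, that can feed standard graph-based
        label propagation.
        """
        A = np.abs(Z)
        return 0.5 * (A + A.T)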

Journal: :CoRR 2016
Stephen Tierney Junbin Gao Yi Guo Zhengwu Zhang

In machine learning it is common to interpret each data point as a vector in Euclidean space. However, the data may actually be functional, i.e., each data point is a function of some variable, such as time, that is discretely sampled. The naive treatment of functional data as traditional multivariate data can lead to poor performance, since the algorithms ignore the correlation in ...

2012
Guangcan Liu Huan Xu Shuicheng Yan

In this work, we address the following matrix recovery problem: suppose we are given a set of data points containing two parts, one part consisting of samples drawn from a union of multiple subspaces and the other part consisting of outliers. We do not know which data points are outliers or how many outliers there are; the rank and number of the subspaces are also unknown. Can we detect the outl...
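This is the classical LRR recovery problem, min ||Z||_* + λ||E||_{2,1} subject to X = XZ + E, where the ℓ2,1 norm on E encourages whole columns (i.e. outlying samples) to be flagged. Below is a Python sketch of the standard inexact-ALM/ADMM solver for this objective; the variable splitting Z = J, the stopping rule, and the hyperparameter values are assumptions, not the authors' exact settings.

    import numpy as np

    def svt(M, tau):
        """Singular value thresholding: prox of tau * nuclear norm."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s = np.maximum(s - tau, 0.0)
        return (U * s) @ Vt

    def col_shrink(M, tau):
        """Column-wise shrinkage: prox of tau * l2,1 norm."""
        norms = np.linalg.norm(M, axis=0)
        scale = np.maximum(norms - tau, 0.0) / (norms + 1e-12)
        return M * scale

    def lrr(X, lam=0.1, mu=1e-2, rho=1.5, mu_max=1e6, tol=1e-6, max_iter=500):
        """Inexact-ALM sketch of  min ||Z||_* + lam*||E||_{2,1}  s.t.  X = XZ + E,
        using the splitting Z = J (hyperparameter values are assumptions)."""
        d, n = X.shape
        Z = np.zeros((n, n))
        E = np.zeros((d, n))
        Y1 = np.zeros((d, n))
        Y2 = np.zeros((n, n))
        XtX = X.T @ X
        I = np.eye(n)
        for _ in range(max_iter):
            # J-update: singular value thresholding
            J = svt(Z + Y2 / mu, 1.0 / mu)
            # Z-update: closed-form least-squares step
            Z = np.linalg.solve(I + XtX,
                                XtX - X.T @ E + J + (X.T @ Y1 - Y2) / mu)
            # E-update: column-wise shrinkage (l2,1 prox)
            E = col_shrink(X - X @ Z + Y1 / mu, lam / mu)
            # dual updates and penalty growth
            r1 = X - X @ Z - E
            r2 = Z - J
            Y1 = Y1 + mu * r1
            Y2 = Y2 + mu * r2
            mu = min(rho * mu, mu_max)
            if max(np.abs(r1).max(), np.abs(r2).max()) < tol:
                break
        return Z, E

Columns of E with large ℓ2 norm indicate detected outliers, while an affinity built from Z (for example the symmetrization sketched earlier) segments the remaining samples into subspaces.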

2011
Siming Wei Zhouchen Lin

We analyze and improve low rank representation (LRR), the state-of-the-art algorithm for subspace segmentation of data. We prove that for the noiseless case, the optimization model of LRR has a unique solution, which is the shape interaction matrix (SIM) of the data matrix. So in essence LRR is equivalent to factorization methods. We also prove that the minimum value of the optimization model o...
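The shape interaction matrix referred to in the abstract has a simple closed form: for noiseless data, the unique minimizer of min ||Z||_* s.t. X = XZ is V_r V_rᵀ, where X = U_r Σ_r V_rᵀ is the skinny SVD of X. A small Python sketch (the rank threshold below is an assumption, not taken from the paper):

    import numpy as np

    def shape_interaction_matrix(X, rank=None):
        """Closed-form noiseless LRR solution: the shape interaction matrix.

        With the skinny SVD X = U_r S_r V_r^T, the unique minimizer of
        min ||Z||_*  s.t.  X = X Z  is  V_r V_r^T.
        """
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        if rank is None:
            # crude numerical-rank estimate (an assumption)
            rank = int(np.sum(s > 1e-10 * s[0]))
        Vr = Vt[:rank].T
        return Vr @ Vr.T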

2016
Yuqi Pan Mingyan Jiang

Classification based on Low-Rank Representation (LRR) has been a hot topic in the field of pattern classification. However, LRR may not fully fuse the local and global information of the data and can fail to represent nonlinear samples. In this paper, we propose a kernel locality preserving low-rank representation with Tikhonov regularization (KLP-LRR) for face recognition. KLP-LRR is a...

2017
Wenjia Niu Kewen Xia Baokai Zu Jianchuan Bai

Unlike the Support Vector Machine (SVM), Multiple Kernel Learning (MKL) allows a dataset to select useful kernels according to its distribution characteristics rather than relying on a single prescribed kernel. It has been shown in the literature that MKL achieves superior recognition accuracy compared with SVM, though at the expense of time-consuming computations. This creates analytical and computational diff...

Journal: :Neurocomputing 2014
Hongyang Zhang Zhouchen Lin Chao Zhang Junbin Gao

Subspace clustering has found wide applications in machine learning, data mining, and computer vision. Latent Low Rank Representation (LatLRR) is one of the state-of-the-art methods for subspace clustering. However, its effectiveness is undermined by a recent discovery that the solution to the noiseless LatLRR model is non-unique. To remedy this issue, we propose choosing the sparsest solution i...
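For reference, the noiseless LatLRR model the abstract refers to jointly recovers a column-space and a row-space representation (a sketch of the standard formulation, not the authors' modified model):

    \min_{Z,L}\; \|Z\|_* + \|L\|_* \quad \text{s.t.}\quad X = XZ + LX,

where Z encodes subspace membership among the samples and L extracts salient features; the non-uniqueness result concerns the solution set (Z, L) of this problem.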

Journal: :Knowl.-Based Syst. 2017
Jie Chen Hua Mao Yongsheng Sang Zhang Yi

In this paper, we propose a low-rank representation with symmetric constraint (LRRSC) method for robust subspace clustering. Given a collection of data points approximately drawn from multiple subspaces, the proposed technique can simultaneously recover the dimension and members of each subspace. LRRSC extends the original low-rank representation algorithm by integrating a symmetric constraint ...
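A natural reading of the symmetric constraint is to add Z = Zᵀ directly to the LRR program (a sketch; the exact noise norm used by LRRSC may differ):

    \min_{Z,E}\; \|Z\|_* + \lambda\|E\|_{2,1} \quad \text{s.t.}\quad X = XZ + E,\;\; Z = Z^{\top},

so that the learned coefficient matrix is itself symmetric and can serve directly as an affinity for spectral clustering.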

[Chart: number of search results per year]