Search results for: schatten p norm

Number of results: 1308327

Journal: :CoRR 2017
Petros Drineas, Michael W. Mahoney

2 Linear Algebra
2.1 Basics
2.2 Norms
2.3 Vector norms
2.4 Induced matrix norms ...
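
For quick reference alongside this entry (an illustrative sketch, not code from the survey itself): the Schatten p-norm of a matrix is the entrywise l_p norm of its singular values, with p = 1 giving the nuclear norm and p = 2 the Frobenius norm. All names below are placeholders.

import numpy as np

def schatten_norm(A, p):
    # Schatten p-norm: the l_p norm of the singular-value vector of A
    s = np.linalg.svd(A, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

A = np.random.default_rng(0).standard_normal((5, 3))
print(schatten_norm(A, 1), np.linalg.norm(A, 'nuc'))   # p = 1: nuclear norm
print(schatten_norm(A, 2), np.linalg.norm(A, 'fro'))   # p = 2: Frobenius norm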

2016
Feiping Nie, Heng Huang

We propose a new subspace clustering model to segment data drawn from multiple linear or affine subspaces. Unlike the well-known sparse subspace clustering (SSC) and low-rank representation (LRR), which turn the subspace clustering problem into a two-step algorithm of building the affinity matrix and then applying spectral clustering, our proposed model directly learns the different subspa...
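
For contrast with the one-step model sketched in this abstract, here is a minimal toy version of the conventional two-step pipeline it refers to (affinity matrix, then spectral clustering); the cosine-based affinity is a stand-in assumption, not the SSC/LRR self-representation step.

import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
# toy data: points drawn from two 1-D linear subspaces in R^3, plus small noise
X = np.vstack([np.outer(rng.standard_normal(50), [1.0, 0.0, 0.0]),
               np.outer(rng.standard_normal(50), [0.0, 1.0, 1.0])])
X += 0.01 * rng.standard_normal(X.shape)

# step 1: build a symmetric affinity matrix (placeholder: absolute cosine similarity;
# SSC/LRR instead derive it from a sparse or low-rank self-representation)
Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
W = np.abs(Xn @ Xn.T)

# step 2: spectral clustering on that affinity
labels = SpectralClustering(n_clusters=2, affinity='precomputed',
                            random_state=0).fit_predict(W)
print(labels[:5], labels[-5:])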

2013
Andreas Argyriou, Charles A. Micchelli, Alexander Smola

In this paper, we study the problem of learning a matrix W from a set of linear measurements. Our formulation consists in solving an optimization problem which involves regularization with a spectral penalty term. That is, the penalty term is a function of the spectrum of the covariance of W . Instances of this problem in machine learning include multi-task learning, collaborative filtering and...
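
As a concrete instance of such a spectrally penalized problem (an assumed setup, not necessarily the algorithm of the paper), take the penalty to be the nuclear norm, i.e. the Schatten 1-norm of W; its proximal operator is soft-thresholding of the singular values, which gives a simple proximal-gradient scheme.

import numpy as np

def svt(M, tau):
    # prox of tau * nuclear norm: soft-threshold the singular values of M
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))                         # linear measurement operator
Y = X @ rng.standard_normal((20, 5)) + 0.1 * rng.standard_normal((100, 5))
lam = 1.0
step = 1.0 / np.linalg.norm(X, 2) ** 2                     # 1 / Lipschitz constant of the gradient
W = np.zeros((20, 5))
for _ in range(200):                                       # min_W 0.5*||XW - Y||_F^2 + lam*||W||_*
    W = svt(W - step * (X.T @ (X @ W - Y)), step * lam)
print(np.linalg.matrix_rank(W, tol=1e-6))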

Journal: :Journal of Machine Learning Research 2010
Andreas Argyriou, Charles A. Micchelli, Massimiliano Pontil

In this paper, we study the problem of learning a matrix W from a set of linear measurements. Our formulation consists in solving an optimization problem which involves regularization with a spectral penalty term. That is, the penalty term is a function of the spectrum of the covariance of W . Instances of this problem in machine learning include multi-task learning, collaborative filtering and...

Journal: :CoRR 2017
Zhiyuan Zha, Xinggan Zhang, Yu Wu, Qiong Wang, Lan Tang

Since the matrix formed by nonlocal similar patches in a natural image is of low rank, nuclear norm minimization (NNM) has been widely used for image restoration. However, NNM tends to over-shrink the rank components and treats the different rank components equally, which limits its capability and flexibility. This paper proposes a new approach for image restoration, based on the ADMM framework, via ...
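
To illustrate the over-shrinking point numerically (a toy sketch; the singular values and weights below are made up, and the weighting rule is just one common heuristic from weighted nuclear-norm schemes, not necessarily the one proposed here): plain NNM subtracts the same threshold from every singular value, while a weighted rule can shrink the large, informative components less.

import numpy as np

s = np.array([10.0, 5.0, 1.0, 0.5])        # hypothetical singular values of a patch matrix
tau = 1.0

uniform = np.maximum(s - tau, 0.0)         # plain NNM / SVT: identical shrinkage for every component
w = tau / (s + 1e-8)                       # heuristic weights, inversely proportional to magnitude
weighted = np.maximum(s - w, 0.0)          # large singular values are barely shrunk, small ones heavily

print(uniform)    # approximately [9.  4.  0.  0.]
print(weighted)   # approximately [9.9 4.8 0.  0.]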

2014
Georgios Papamakarios, Yannis Panagakis, Stefanos Zafeiriou

The robust estimation of the low-dimensional subspace that spans the data from a set of high-dimensional observations, possibly corrupted by gross errors and outliers, is fundamental in many computer vision problems. The state-of-the-art robust principal component analysis (PCA) methods adopt convex relaxations of ℓ0 quasi-norm-regularised rank minimisation problems. That is, the nuclear norm an...
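
A deliberately simplified sketch of the convex relaxation mentioned here (nuclear norm for the rank term, ℓ1 norm for the sparse gross errors), solved by naive alternating minimization of a penalized objective rather than by any of the specific solvers the paper evaluates; all parameter choices are assumptions.

import numpy as np

def svt(M, tau):
    # singular value soft-thresholding (prox of the nuclear norm)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # elementwise soft-thresholding (prox of the l1 norm)
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

rng = np.random.default_rng(0)
D = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 50))   # rank-5 data
mask = rng.random(D.shape) < 0.05
D[mask] += 10.0 * rng.standard_normal(mask.sum())                  # sparse gross corruptions

lam, mu = 1.0 / np.sqrt(max(D.shape)), 0.25
L, S = np.zeros_like(D), np.zeros_like(D)
for _ in range(100):   # block coordinate descent on ||L||_* + lam*||S||_1 + (mu/2)*||D - L - S||_F^2
    L = svt(D - S, 1.0 / mu)
    S = soft(D - L, lam / mu)
print(np.linalg.matrix_rank(L, tol=1e-3), int((np.abs(S) > 1e-3).sum()))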

Journal: :Annals of Functional Analysis 2021

Let $${\mathcal {H}}={\mathcal {H}}_+\oplus {\mathcal {H}}_-$$ be a fixed orthogonal decomposition of the complex separable Hilbert space $${\mathcal {H}}$$ into two infinite-dimensional subspaces. We study the geometry of the set $${\mathcal {P}}^p$$ of selfadjoint projections in the Banach algebra $${\mathcal {A}}^p=\{A\in {\mathcal {B}}({\mathcal {H}}): [A,E_+]\in {\mathcal {B}}_p({\mathcal {H}})\},$$ where $$E_+$$ is the projection onto $${\mathcal {H}}_+$$ and $${\mathcal {B}}_p({\mathcal {H}})$$ is the Schatten ideal...

Journal: :Statistics & Probability Letters 2022

We correct a formula of Gavish and Donoho for singular value shrinkage with operator norm loss for non-square matrices. We also observe that in the classical regime, the optimal shrinker for any Schatten norm converges to the best linear predictor.
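
To make the setting concrete (a generic sketch only; the corrected shrinker from this note is not reproduced here, and the threshold rule below is just an assumed placeholder tied to the unit-variance noise bulk edge), singular value shrinkage applies a scalar rule eta to each empirical singular value and reconstructs the matrix.

import numpy as np

def shrink_singular_values(Y, eta):
    # apply a scalar shrinkage rule eta to every singular value of Y
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(eta(s)) @ Vt

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 80))   # low-rank signal
Y = X + rng.standard_normal(X.shape)                                 # unit-variance white noise

bulk_edge = np.sqrt(Y.shape[0]) + np.sqrt(Y.shape[1])                # rough edge of the noise spectrum
eta = lambda s: np.where(s > bulk_edge, s, 0.0)                      # placeholder hard-threshold rule
X_hat = shrink_singular_values(Y, eta)

# operator-norm loss before and after shrinkage (typically drops for a strongly low-rank signal)
print(np.linalg.norm(Y - X, 2), np.linalg.norm(X_hat - X, 2))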

Journal: :Linear Algebra and its Applications 2015

[Chart: number of search results per year]