Search results for: tensor decomposition

Number of results: 139824

2017
Yu-Fei Gao, Guan Gui, Wei Xie, Yanbin Zou, Yue Yang, Qun Wan

This paper investigates a two-dimensional angle of arrival (2D AOA) estimation algorithm for the electromagnetic vector sensor (EMVS) array based on Type-2 block component decomposition (BCD) tensor modeling. Such a tensor decomposition method can take full advantage of the multidimensional structural information of electromagnetic signals to accomplish blind estimation for array parameters wit...

Journal: Signal Processing, 2007
Peter J. Basser, Sinisa Pajevic

We propose a novel spectral decomposition of a 4th-order covariance tensor, S. Just as the variability of a vector-valued (i.e., 1st-order tensor) random variable is characterized by a covariance matrix (i.e., a 2nd-order tensor), Σ, the variability of a 2nd-order tensor-valued random variable, D, is characterized by a 4th-order covariance tensor, S. Accordingly, just as the spectral decomposit...
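The analogy in the abstract can be illustrated without the paper's machinery. A minimal numpy sketch (the 3×3/9×9 sizes, sample count, and variable names are illustrative assumptions, not the authors' construction): estimate a 4th-order covariance tensor from symmetric 2nd-order tensor samples, unfold it to a 9×9 matrix, and read its spectral decomposition as eigenvalue/eigentensor pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample symmetric 3x3 tensors D to estimate the 4th-order covariance
# tensor S_ijkl = Cov(D_ij, D_kl).
n_samples = 500
D = rng.standard_normal((n_samples, 3, 3))
D = (D + D.transpose(0, 2, 1)) / 2            # symmetrize each sample

flat = D.reshape(n_samples, 9)                # unfold each D to a 9-vector
S_mat = np.cov(flat, rowvar=False)            # 9x9 unfolding of S

# Spectral decomposition of the unfolded S: eigenvalues paired with 3x3
# "eigentensors" obtained by folding each eigenvector back.
eigvals, eigvecs = np.linalg.eigh(S_mat)
eigentensors = [eigvecs[:, k].reshape(3, 3) for k in range(9)]

# Reconstruction check: S = sum_k lambda_k vec(E_k) vec(E_k)^T
S_rec = sum(lam * np.outer(E.ravel(), E.ravel())
            for lam, E in zip(eigvals, eigentensors))
assert np.allclose(S_rec, S_mat)
```

Because each sample is symmetrized, the unfolded covariance is rank-deficient (at most 6 of the 9 eigenvalues are nonzero), mirroring the symmetry constraints the paper exploits.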

Journal: CoRR, 2017
Tai-Xiang Jiang, Ting-Zhu Huang, Xi-Le Zhao, Liang-Jian Deng

A recently developed tensor decomposition scheme, the tensor singular value decomposition (t-SVD), gives rise to a notion of rank referred to as the tubal rank. Many methods minimize its convex surrogate, the tensor nuclear norm (TNN), to enhance the low tubal-rankness of the underlying data. In general, however, minimizing the TNN may introduce bias. In this paper, to alleviate this bias phenomenon...
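The t-SVD quantities the abstract refers to can be sketched in a few lines of numpy: an FFT along the third mode followed by slice-wise SVDs in the Fourier domain (the function name and the TNN normalization by the number of slices are assumptions of this sketch, not the paper's code).

```python
import numpy as np

def tubal_rank_and_tnn(T, tol=1e-10):
    """t-SVD sketch: FFT along the 3rd mode, then SVD each frontal slice.

    Tubal rank = max slice rank in the Fourier domain; the tensor nuclear
    norm (TNN) sums all slice singular values (here normalized by n3,
    one common convention).
    """
    n3 = T.shape[2]
    That = np.fft.fft(T, axis=2)
    svals = [np.linalg.svd(That[:, :, k], compute_uv=False) for k in range(n3)]
    tubal_rank = max(int((s > tol * s.max()).sum()) for s in svals)
    tnn = sum(s.sum() for s in svals) / n3
    return tubal_rank, tnn

# A t-product of two rank-1 "tube" factors: slice-wise product in the
# Fourier domain, so every Fourier slice has rank 1.
rng = np.random.default_rng(1)
a, b = rng.standard_normal((4, 1, 5)), rng.standard_normal((1, 3, 5))
ahat, bhat = np.fft.fft(a, axis=2), np.fft.fft(b, axis=2)
That = np.einsum('ipk,pjk->ijk', ahat, bhat)
T = np.fft.ifft(That, axis=2).real
r, tnn = tubal_rank_and_tnn(T)
print(r)   # 1
```

The TNN is exactly the quantity such methods relax; the bias the abstract mentions comes from shrinking all these singular values uniformly, large ones included.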

Journal: Numerical Lin. Alg. with Applic., 2017
Kim Batselier, Ngai Wong

We propose the tensor Kronecker product singular value decomposition (TKPSVD), which decomposes a real k-way tensor A into a linear combination of tensor Kronecker products with an arbitrary number d of factors: A = ∑_{j=1}^{R} σ_j A_j^(d) ⊗ ··· ⊗ A_j^(1). We generalize the matrix Kronecker product to tensors such that each factor A_j^(i) in the TKPSVD is a k-way tensor. The algorithm relies on reshaping...
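For the matrix (k = 2) special case, the reshaping step the abstract alludes to is the classical Van Loan–Pitsianis rearrangement, under which the Kronecker sum becomes an ordinary SVD. A hedged numpy sketch (the name `kron_svd` and the block size are illustrative, not the authors' implementation):

```python
import numpy as np

def kron_svd(A, block=(2, 2)):
    """Kronecker-product SVD for matrices: A = sum_j s_j * B_j kron C_j.

    Rearranging A so that each (p x q) block becomes one row turns the
    Kronecker expansion into a plain matrix SVD of the rearranged R(A).
    """
    p, q = block                       # shape of the C_j factors
    m, n = A.shape
    mb, nb = m // p, n // q            # shape of the B_j factors
    R = (A.reshape(mb, p, nb, q)       # split rows/cols into blocks
          .transpose(0, 2, 1, 3)       # group (block index, in-block index)
          .reshape(mb * nb, p * q))
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    Bs = [U[:, j].reshape(mb, nb) for j in range(len(s))]
    Cs = [Vt[j].reshape(p, q) for j in range(len(s))]
    return s, Bs, Cs

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
s, Bs, Cs = kron_svd(A, block=(2, 2))
A_rec = sum(sj * np.kron(B, C) for sj, B, C in zip(s, Bs, Cs))
assert np.allclose(A_rec, A)
```

Truncating the sum after the leading σ_j gives the nearest-Kronecker-product approximation; the TKPSVD generalizes this two-factor matrix picture to d tensor factors.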

Journal: SIAM Review, 2009
Tamara G. Kolda, Brett W. Bader

This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, grap...
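As a concrete instance of the N-way arrays the survey treats, a short numpy sketch of mode-n unfolding applied to a rank-1 CP tensor (the `unfold` helper is an illustrative row-major variant; Kolda & Bader's column-major convention differs only by a permutation of the columns):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the chosen axis to the front and flatten
    the remaining axes into columns (row-major layout)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# A rank-1 three-way tensor: the outer product a o b o c.
rng = np.random.default_rng(3)
a, b, c = rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6)
T = np.einsum('i,j,k->ijk', a, b, c)

# Every unfolding of a CP rank-1 tensor is a rank-1 matrix.
for mode in range(3):
    assert np.linalg.matrix_rank(unfold(T, mode)) == 1
```

Unfoldings like this are the workhorse behind the CP and Tucker algorithms the survey covers, since they reduce tensor computations to matrix algebra.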

2015
Rong Ge, Tengyu Ma

Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to n^⌊p/2⌋ for a p-th order tensor in ℝ^(n^p). Previously, no efficient algorithm could decompose 3rd-order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the firs...
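The unfolding limitation described above can be seen directly in numpy (sizes are arbitrary choices for this sketch): once the CP rank exceeds the dimension, a matrix unfolding saturates and no longer reflects the true rank, which is the regime the sum-of-squares approach targets.

```python
import numpy as np

rng = np.random.default_rng(4)
n, r = 8, 12                     # CP rank r exceeds the dimension n

# Random CP tensor of rank r in R^{n x n x n}.
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# The mode-1 unfolding factors as A (C odot B)^T, so its matrix rank is
# at most min(n, r): for r > n it saturates at n and cannot certify
# the super-linear CP rank.
T1 = T.reshape(n, n * n)
print(np.linalg.matrix_rank(T1))   # 8, not 12
```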

2011
Dijun Luo, Chris H. Q. Ding, Heng Huang

A main challenge for many machine learning and data mining applications is that the amounts of data and features are very large, so that low-rank approximations of the original data are often required for efficient computation. We propose new multi-level clustering based low-rank matrix approximations which are comparable to, and even more compact than, the singular value decomposition (SVD). We ut...
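The SVD baseline such methods are compared against can be sketched in a few lines of numpy (matrix sizes and the target rank are arbitrary choices for this sketch): the truncated SVD is the best rank-k approximation in Frobenius norm, by the Eckart–Young theorem.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((100, 40)) @ rng.standard_normal((40, 80))  # rank <= 40

# Truncated SVD: keep the k leading singular triplets.
k = 10
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_k = U[:, :k] * s[:k] @ Vt[:k]     # scale columns of U, then project

# Eckart-Young: the error equals the energy in the discarded spectrum.
err = np.linalg.norm(X - X_k) / np.linalg.norm(X)
print(round(err, 3))
```

Compactness is the comparison point: storing X_k costs k(m + n + 1) numbers versus mn for X, and the clustering-based approximations in the paper aim to beat even that.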

Journal: International Journal of Research in Advent Technology, 2019

Journal: SIAM J. Matrix Analysis Applications, 2016
Anil Aswani

Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even computing the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-p...
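Because the exact best rank-1 approximation is NP-hard, practical codes usually fall back on heuristics. A numpy sketch of the standard higher-order power method (a generic heuristic for illustration only, not the paper's convex-optimization approach; names and iteration count are assumptions):

```python
import numpy as np

def rank1_hopm(T, iters=200, seed=0):
    """Higher-order power method: alternately contract T against two of
    the three unit vectors and renormalize, seeking sigma * a o b o c."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(T.shape[0])
    b = rng.standard_normal(T.shape[1])
    c = rng.standard_normal(T.shape[2])
    for _ in range(iters):
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    sigma = np.einsum('ijk,i,j,k->', T, a, b, c)
    return sigma, a, b, c

# On an exactly rank-1 tensor the heuristic recovers the factorization.
rng = np.random.default_rng(6)
u, v, w = (x / np.linalg.norm(x) for x in rng.standard_normal((3, 5)))
T = 2.0 * np.einsum('i,j,k->ijk', u, v, w)
sigma, a, b, c = rank1_hopm(T)
assert np.isclose(abs(sigma), 2.0)
```

On general tensors this iteration can stall in local optima, which is precisely the gap that motivates the polynomial-time guarantees the paper obtains for the positive-tensor case.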

[Chart: number of search results per year]