Tensor and Its Tucker Core: the Invariance Relationships
Authors
Abstract
In [12], Hillar and Lim famously demonstrated that “multilinear (tensor) analogues of many efficiently computable problems in numerical linear algebra are NP-hard”. Despite many recent advancements, the state-of-the-art methods for computing such ‘tensor analogues’ still suffer severely from the curse of dimensionality. In this paper we show that the Tucker core of a tensor nevertheless retains many properties of the original tensor, including the CP rank, the border rank, the tensor Schatten quasi-norms, and the Z-eigenvalues. Since the core is typically smaller than the original tensor, this property leads to considerable computational advantages, as confirmed by our numerical experiments. In our analysis, we in fact work with a generalized Tucker-like decomposition that can accommodate any full column-rank factorization matrices.
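To make the invariance claim concrete, here is a minimal NumPy sketch (not the authors' code) that computes a Tucker core via the standard truncated HOSVD: with full ranks the factor matrices are orthonormal, so the core retains, for instance, the Frobenius norm of the original tensor exactly, while a truncated core is a smaller object to compute with.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    """Multiply tensor T along axis `mode` by the matrix M."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hosvd_core(T, ranks):
    """Truncated HOSVD: per-mode SVD factors, then project T onto them."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_dot(core, U.T, m)
    return core, factors

rng = np.random.default_rng(0)
T = rng.standard_normal((6, 5, 4))

# A truncated 3x3x3 core: a smaller object carrying T's dominant structure.
core, factors = hosvd_core(T, (3, 3, 3))

# With full ranks the factor matrices are orthogonal, so the core keeps
# the Frobenius norm of T exactly -- one instance of such invariances.
full_core, _ = hosvd_core(T, T.shape)
assert np.isclose(np.linalg.norm(full_core), np.linalg.norm(T))
```

The paper's generalized setting allows any full column-rank factor matrices, not just the orthonormal singular-vector blocks used in this sketch.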
Similar Resources
An Iterative Reweighted Method for Tucker Decomposition of Incomplete Multiway Tensors
We consider the problem of low-rank decomposition of incomplete multiway tensors. Since many real-world data lie on an intrinsically low-dimensional subspace, tensor low-rank decomposition with missing entries has applications in many data analysis problems such as recommender systems and image inpainting. In this paper, we focus on Tucker decomposition which represents an Nth-order tensor in ...
On Tensor Tucker Decomposition: the Case for an Adjustable Core Size
This paper is concerned with the problem of finding a Tucker decomposition for tensors. Traditionally, solution methods for Tucker decomposition presume that the size of the core tensor is specified in advance, which may not be a realistic assumption in some applications. In this paper we propose a new computational model where the configuration and the size of the core become a part of the dec...
BaTFLED: Bayesian Tensor Factorization Linked to External Data
The vast majority of current machine learning algorithms are designed to predict single responses or a vector of responses, yet many types of response are more naturally organized as matrices or higher-order tensor objects where characteristics are shared across modes. We present a new machine learning algorithm BaTFLED (Bayesian Tensor Factorization Linked to External Data) that predicts value...
Tensor Decompositions for Very Large Scale Problems
Modern applications such as neuroscience, text mining, and large-scale social networks generate massive amounts of data with multiple aspects and high dimensionality. Tensors (i.e., multi-way arrays) provide a natural representation for such massive data. Consequently, tensor decompositions and factorizations are emerging as novel and promising tools for exploratory analysis of multidimensional...
Multifactor sparse feature extraction using Convolutive Nonnegative Tucker Decomposition
Multilinear algebra of the higher-order tensor has been proposed as a potential mathematical framework for machine learning to investigate the relationships among multiple factors underlying the observations. One popular model Nonnegative Tucker Decomposition (NTD) allows us to explore the interactions of different factors with nonnegative constraints. In order to reduce degeneracy problem of t...
Journal: Numerical Linear Algebra with Applications
Volume 24, Issue
Pages -
Publication date: 2017