Search results for: tensor decomposition

Number of results: 139824

2013
Animashree Anandkumar

This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models—including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation—which exploits a certain tensor structure in their low-order observable moments (typically, of second- and third-order). Specifically, parameter estimation is reduced to the pro...
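Reductions of this kind typically end in an orthogonal tensor decomposition, which can be computed by tensor power iteration. Below is a minimal numpy sketch of that power step, as a generic illustration rather than this paper's exact procedure; the function name `tensor_power_iteration` is a hypothetical label.

```python
import numpy as np

def tensor_power_iteration(T, n_iter=100):
    """Extract one robust eigenpair of a symmetric 3rd-order tensor.

    Generic sketch of the tensor power method; assumes T is
    (approximately) orthogonally decomposable.
    """
    v = np.random.randn(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        # contract T with v along the last two modes: T(I, v, v)
        v = np.einsum('ijk,j,k->i', T, v, v)
        v /= np.linalg.norm(v)
    # corresponding eigenvalue T(v, v, v)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v
```

In practice the recovered component is deflated (subtracted off) and the iteration repeated to extract the remaining components.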

Journal: SIAM J. Matrix Analysis Applications 2015
Kim Batselier Haotian Liu Ngai Wong

Abstract. We propose a novel and constructive algorithm that decomposes an arbitrary tensor into a finite sum of orthonormal rank-1 outer factors. The algorithm, named TTr1SVD, works by converting the tensor into a rank-1 tensor train (TT) series via singular value decomposition (SVD). TTr1SVD naturally generalizes the SVD to the tensor regime and delivers elegant notions of tensor rank and err...
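The closely related classic TT-SVD construction, which decomposes a tensor into a train of cores via sequential SVDs, can be sketched in a few lines of numpy. This is a generic illustration of the underlying idea, not the authors' TTr1SVD algorithm.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a tensor into tensor-train (TT) cores via sequential SVDs."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = int(np.sum(S > eps))          # truncate tiny singular values
        cores.append(U[:, :new_rank].reshape(rank, dims[k], new_rank))
        # absorb singular values into the remainder and refold for the next mode
        mat = (S[:new_rank, None] * Vt[:new_rank]).reshape(
            new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores
```

Contracting the cores back together reproduces the original tensor up to the truncation threshold.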

Journal: CoRR 2015
Linxiao Yang Jun Fang Hongbin Li Bing Zeng

We consider the problem of low-rank decomposition of incomplete multiway tensors. Since many real-world data lie on an intrinsically low-dimensional subspace, tensor low-rank decomposition with missing entries has applications in many data analysis problems such as recommender systems and image inpainting. In this paper, we focus on Tucker decomposition which represents an Nth-order tensor in ...
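For a fully observed tensor, a Tucker decomposition can be computed directly via the higher-order SVD (HOSVD); the missing-entry setting this paper addresses requires an iterative solver instead, so the sketch below is only the complete-data baseline.

```python
import numpy as np

def mode_mult(T, M, n):
    """Multiply tensor T by matrix M along mode n."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, n, 0), axes=1), 0, n)

def hosvd(T):
    """Tucker decomposition (core tensor, factor matrices) via the HOSVD."""
    factors = []
    for n in range(T.ndim):
        # left singular vectors of the mode-n unfolding
        unfold = np.moveaxis(T, n, 0).reshape(T.shape[n], -1)
        U, _, _ = np.linalg.svd(unfold, full_matrices=False)
        factors.append(U)
    # core = T multiplied by U_n^T along every mode n
    core = T
    for n, U in enumerate(factors):
        core = mode_mult(core, U.T, n)
    return core, factors
```

Multiplying the core back by each factor matrix reconstructs the tensor; truncating the columns of each factor gives a low-multilinear-rank approximation.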

2017
Matteo Ruffini Ricard Gavaldà Esther Limon

In this paper we present a method for the unsupervised clustering of high-dimensional binary data, with a special focus on electronic healthcare records. We present a robust and efficient heuristic to face this problem using tensor decomposition. We present the reasons why this approach is preferable to more commonly used distance-based methods for tasks such as clustering patient records. We ...

Journal: Neurocomputing 2014
Qiang Wu Liqing Zhang Andrzej Cichocki

Multilinear algebra of the higher-order tensor has been proposed as a potential mathematical framework for machine learning to investigate the relationships among multiple factors underlying the observations. One popular model, Nonnegative Tucker Decomposition (NTD), allows us to explore the interactions of different factors with nonnegative constraints. In order to reduce the degeneracy problem of t...

2012
Ben London Theodoros Rekatsinas Bert Huang Lise Getoor

In real-world network data, there often exist multiple types of relationships (edges) that we would like to model. For instance, in social networks, relationships between individuals may be personal, familial, or professional. In this paper, we examine a multi-relational learning scenario in which the learner is given a small set of training examples, sampled from the complete set of potential ...

2016
Hafiz Imtiaz Anand D. Sarwate

Differential privacy has recently received a significant amount of research attention for its robustness against known attacks. Decomposition of tensors has applications in many areas including signal processing, machine learning, computer vision and neuroscience. In this paper, we particularly focus on differentially-private orthogonal decomposition of symmetric tensors that arise in several l...
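As a generic illustration of input perturbation under differential privacy (the standard Gaussian mechanism, not the authors' specific algorithm), one can add symmetrized Gaussian noise to a symmetric tensor before decomposing it. The `l2_sensitivity` parameter is an assumption that must be derived separately for the statistic at hand.

```python
import math
from itertools import permutations

import numpy as np

def private_symmetric_tensor(T, epsilon, delta, l2_sensitivity):
    """Release a symmetric tensor under (epsilon, delta)-differential privacy.

    Generic Gaussian-mechanism sketch: noise scale follows the standard
    calibration sigma = s * sqrt(2 ln(1.25/delta)) / epsilon.
    """
    sigma = l2_sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    noise = np.random.normal(0.0, sigma, size=T.shape)
    # average the noise over all axis permutations so the released
    # tensor stays symmetric (symmetrizing is post-processing, so it
    # does not weaken the privacy guarantee)
    sym_noise = sum(np.transpose(noise, p) for p in permutations(range(T.ndim)))
    sym_noise /= math.factorial(T.ndim)
    return T + sym_noise
```

Any decomposition computed from the released tensor then inherits the privacy guarantee by post-processing.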

2016
Zhao Song David P. Woodruff Huan Zhang

A recent work (Wang et al., NIPS 2015) gives the fastest known algorithms for orthogonal tensor decomposition with provable guarantees. Their algorithm is based on computing sketches of the input tensor, which requires reading the entire input. We show that in a number of cases one can achieve the same theoretical guarantees in sublinear time, i.e., even without reading most of the input tensor. In...

2004
P. Comon

The Singular Value Decomposition (SVD) may be extended to tensors at least in two very different ways. One is the High-Order SVD (HOSVD), and the other is the Canonical Decomposition (CanD). Only the latter is closely related to the tensor rank. Important basic questions are raised in this short paper, such as the maximal achievable rank of a tensor of given dimensions, or the computation of a ...
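The two notions of rank the abstract contrasts can be seen on a toy example: the CanD (CP) rank counts rank-1 outer-product terms, while the HOSVD yields the multilinear ranks, i.e., the matrix ranks of the mode-n unfoldings. The numpy snippet below is a generic illustration, not taken from the paper.

```python
import numpy as np

# A rank-1 third-order tensor is an outer product of three vectors;
# its CanD (CP) rank is 1, and every mode-n unfolding also has matrix rank 1.
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])
c = np.array([0.5, 5.0])
T = np.einsum('i,j,k->ijk', a, b, c)

# multilinear ranks = ranks of the mode-n unfoldings
multilinear_ranks = [
    np.linalg.matrix_rank(np.moveaxis(T, n, 0).reshape(T.shape[n], -1))
    for n in range(T.ndim)
]
print(multilinear_ranks)  # [1, 1, 1]
```

For higher CP ranks the two notions diverge: the CP rank can exceed every dimension of the tensor, while each multilinear rank is bounded by the corresponding dimension.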

Journal: Comput. Graph. Forum 2009
Roland Ruiters Reinhard Klein

In this paper, we present a novel compression technique for Bidirectional Texture Functions based on a sparse tensor decomposition. We apply the K-SVD algorithm along two different modes of a tensor to decompose it into a small dictionary and two sparse tensors. This representation is very compact, allowing for considerably better compression ratios at the same RMS error than possible with curr...

[Chart: number of search results per year]