Search results for: matrix factorization

Number of results: 378,049

Journal: PVLDB 2017
Fan Yang, Fanhua Shang, Yuzhen Huang, James Cheng, Jinfeng Li, Yunjian Zhao, Ruihao Zhao

Tensors are higher-order generalizations of matrices used to model multi-aspect data, e.g., a set of purchase records with the schema (user id, product id, timestamp, feedback). Tensor factorization is a powerful technique for generating a model from a tensor, just as matrix factorization generates a model from a matrix, but with higher accuracy and richer information as more attributes are availa...
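
As a hedged illustration of the technique this abstract refers to (not the paper's own system), the NumPy sketch below fits a rank-R CP decomposition of a small synthetic third-order tensor by alternating least squares; all sizes, the rank, and the iteration count are made-up toy values.

```python
import numpy as np

# Toy third-order tensor (e.g., user x product x time) with known low CP rank,
# so an alternating-least-squares (ALS) fit can recover a good approximation.
rng = np.random.default_rng(0)
I, J, K, R = 20, 15, 10, 2
A0, B0, C0 = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)   # X[i,j,k] = sum_r A0[i,r] B0[j,r] C0[k,r]

# CP-ALS: update one factor at a time, holding the other two fixed.
A, B, C = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))
for _ in range(100):
    M = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)       # design matrix for mode 0
    A = np.linalg.lstsq(M, X.reshape(I, J * K).T, rcond=None)[0].T
    M = np.einsum('ir,kr->ikr', A, C).reshape(I * K, R)       # design matrix for mode 1
    B = np.linalg.lstsq(M, X.transpose(1, 0, 2).reshape(J, I * K).T, rcond=None)[0].T
    M = np.einsum('ir,jr->ijr', A, B).reshape(I * J, R)       # design matrix for mode 2
    C = np.linalg.lstsq(M, X.transpose(2, 0, 1).reshape(K, I * J).T, rcond=None)[0].T

X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))          # relative error, should be near 0
```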

2005
Oleg Okun, Helen Priisalu, Alexessander Alves

In this paper, dimensionality reduction via matrix factorization with nonnegativity constraints is studied. Because of these constraints, it stands apart from other linear dimensionality reduction methods. Here we explore nonnegative matrix factorization in combination with a classifier for protein fold recognition. Since matrix factorization is typically done iteratively, convergence can be sl...
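
For concreteness, a minimal NumPy sketch of the classical multiplicative-update NMF iteration is given below; it is an assumed, generic formulation rather than the authors' exact procedure, and the random matrix V only stands in for real protein feature data. The purely iterative nature of the updates also illustrates why convergence can be slow.

```python
import numpy as np

def nmf(V, rank, n_iter=500, eps=1e-10):
    """Nonnegative matrix factorization V ~= W @ H via multiplicative updates."""
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Multiplicative (Lee-Seung style) updates keep both factors nonnegative by construction.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy nonnegative data standing in for protein feature vectors (hypothetical sizes).
V = np.random.default_rng(1).random((100, 40))
W, H = nmf(V, rank=5)
# Rows of W give a 5-dimensional nonnegative representation that could feed a classifier.
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))   # reconstruction error of the rank-5 model
```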

2004
Jong-Hoon Ahn, Sang-Ki Kim, Jong-Hoon Oh, Seungjin Choi

We propose an extension of nonnegative matrix factorization (NMF) to a multilayer network model for dynamic myocardial PET image analysis. NMF has previously been applied to this analysis and shown to successfully extract three cardiac components and their time-activity curves from the image sequences. Here we apply triple nonnegative matrix factorization to dynamic PET images of a dog and show details...
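
Sketched below, under the assumption that "triple" factorization means a nonnegative tri-factorization X ≈ U S V, are plain multiplicative updates on random data; the PET-specific modeling from the paper is not reproduced.

```python
import numpy as np

# Nonnegative tri-factorization X ~= U @ S @ V with all three factors nonnegative.
# Multiplicative updates follow from splitting the gradient of ||X - U S V||_F^2.
rng = np.random.default_rng(0)
X = rng.random((60, 80))                       # stand-in for (pixels x time frames) data
k, l, eps = 3, 3, 1e-10
U, S, V = rng.random((60, k)), rng.random((k, l)), rng.random((l, 80))
for _ in range(1000):
    U *= (X @ V.T @ S.T) / (U @ S @ V @ V.T @ S.T + eps)
    S *= (U.T @ X @ V.T) / (U.T @ U @ S @ V @ V.T + eps)
    V *= (S.T @ U.T @ X) / (S.T @ U.T @ U @ S @ V + eps)
print(np.linalg.norm(X - U @ S @ V) / np.linalg.norm(X))   # relative reconstruction error
```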

2009
M. Bouri, S. Bourennane

In this paper, we propose a novel method for subspace estimation, used in high-resolution methods, that avoids eigendecomposition: the sample Cross-Spectral Matrix (CSM) is replaced by the upper triangular matrix obtained from its LU factorization. This reduces the computational complexity. The method relies on a recently published result on Rank-Revealing LU (RRLU) factorization. Simulation ...
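
A minimal sketch of the underlying idea, using SciPy's ordinary LU with partial pivoting on a synthetic cross-spectral matrix (the RRLU variant and the direction-finding details are not reproduced): the diagonal of the U factor drops sharply after as many elimination steps as there are dominant sources, which exposes the signal-subspace dimension without an eigendecomposition.

```python
import numpy as np
from scipy.linalg import lu

# Synthetic narrowband array data: 2 sources, 8 sensors (toy values, not the paper's setup).
rng = np.random.default_rng(1)
n_sensors, n_sources, n_snap = 8, 2, 500
steer = rng.standard_normal((n_sensors, n_sources)) + 1j * rng.standard_normal((n_sensors, n_sources))
sig = rng.standard_normal((n_sources, n_snap)) + 1j * rng.standard_normal((n_sources, n_snap))
noise = 0.01 * (rng.standard_normal((n_sensors, n_snap)) + 1j * rng.standard_normal((n_sensors, n_snap)))
Y = steer @ sig + noise
csm = Y @ Y.conj().T / n_snap                  # sample Cross-Spectral Matrix (CSM)

# LU with partial pivoting costs ~n^3/3 flops, cheaper than a full eigendecomposition.
P, L, U = lu(csm)
d = np.abs(np.diag(U))
rank_est = int(np.sum(d > 1e-2 * d.max()))     # sharp drop in |diag(U)| marks the numerical rank
print(rank_est)                                # expected: n_sources
```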

2017
Michael J. O'Connell, Eric F. Lock

In recent years, a number of methods have been developed for the dimension reduction and decomposition of multiple linked high-content data matrices. Typically these methods assume that just one dimension, rows or columns, is shared among the data sources. This shared dimension may represent common features that are measured for different sample sets (i.e., horizontal integration) or a common s...

2010
Lester W. Mackey, David J. Weiss, Michael I. Jordan


Journal: CoRR 2015
Gintare Karolina Dziugaite, Daniel M. Roy

Data often comes in the form of an array or matrix. Matrix factorization techniques attempt to recover missing or corrupted entries by assuming that the matrix can be written as the product of two low-rank matrices. In other words, matrix factorization approximates the entries of the matrix by a simple, fixed function—namely, the inner product—acting on the latent feature vectors for the corres...
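
The fixed inner-product model the abstract describes is X[i, j] ≈ ⟨u_i, v_j⟩. Below is a generic, hedged sketch of fitting that model to partially observed data by gradient descent; sizes, rank, learning rate, and the 40% observation mask are arbitrary toy choices, not taken from the paper.

```python
import numpy as np

# Low-rank matrix completion: each observed entry X[i, j] is approximated by the
# inner product of latent vectors U[i] and V[j], fit by gradient descent.
rng = np.random.default_rng(0)
m, n, r = 30, 25, 3
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # ground-truth low-rank matrix
mask = rng.random((m, n)) < 0.4                                  # which entries are observed

U = 0.1 * rng.standard_normal((m, r))
V = 0.1 * rng.standard_normal((n, r))
lr, reg = 0.02, 1e-3
for _ in range(3000):
    E = mask * (U @ V.T - X)            # residuals on observed entries only
    U -= lr * (E @ V + reg * U)
    V -= lr * (E.T @ U + reg * V)

err = np.sqrt(np.mean((U @ V.T - X)[~mask] ** 2))
print(err)   # error on the *unobserved* entries; small if the low-rank model generalizes
```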

1997
T. Fujita

We examine the factorization ansatz of the S-matrix in the massive Thirring model. Using the rapidity values obtained numerically for the vacuum as well as for the 1p − 1h and the 2p − 2h states, we show that the factorization of the S-matrix for particle-hole scattering does not hold. This is because crossing symmetry and factorization do not commute with each other. This indicates...

Journal: CoRR 2015
Hsiang-Fu Yu, Nikhil Rao, Inderjit S. Dhillon

Matrix factorization approaches have been applied to a variety of applications, from recommendation systems to multi-label learning. Standard low-rank matrix factorization methods fail when the data can be modeled as a time series, since they do not take into account the dependencies among factors, while EM algorithms designed for time series data are inapplicable to large multiple tim...
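
The general shape of such temporally regularized factorization objectives, written here as an assumption with hypothetical notation rather than the paper's exact formulation, is to factor the series matrix while constraining the temporal factors with an autoregressive penalty:

$$
\min_{F,\,X,\,\Theta}\ \sum_{(i,t)\in\Omega}\bigl(Y_{it}-f_i^{\top}x_t\bigr)^2
\;+\;\lambda_x\sum_{t}\Bigl\|x_t-\sum_{\ell\in\mathcal{L}}\Theta^{(\ell)}x_{t-\ell}\Bigr\|^2
\;+\;\lambda_f\|F\|_F^2
$$

where Y is the (series x time) data matrix observed on the index set Ω, f_i and x_t are latent factors for series i and time step t, and the lag set with weights Θ makes the learned temporal factors themselves follow a time-series model that can be extrapolated forward to predict future values.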

Journal: CoRR 2013
Behnam Neyshabur, Rina Panigrahy

We investigate the problem of factoring a matrix into several sparse matrices and propose an algorithm for this under randomness and sparsity assumptions. This problem can be viewed as a simplification of the deep learning problem where finding a factorization corresponds to finding edges in different layers and also values of hidden units. We prove that under certain assumptions on a sparse li...
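
As a rough illustration of the setting (a generic heuristic, not the paper's provable algorithm), the sketch below factors a planted product of two sparse matrices by alternating proximal gradient steps with an l1 soft-threshold; it finds some sparse approximate factorization rather than guaranteeing recovery of the planted factors.

```python
import numpy as np

def soft_threshold(Z, t):
    """Proximal operator of the l1 norm: shrink entries toward zero by t."""
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

# Planted problem: Y is a product of two sparse matrices (toy sizes, ~20% density).
rng = np.random.default_rng(0)
m, h, n = 40, 20, 40
A_true = rng.standard_normal((m, h)) * (rng.random((m, h)) < 0.2)
B_true = rng.standard_normal((h, n)) * (rng.random((h, n)) < 0.2)
Y = A_true @ B_true

# Alternating proximal gradient on 0.5*||Y - A B||_F^2 + lam*(||A||_1 + ||B||_1).
A = 0.01 * rng.standard_normal((m, h))
B = 0.01 * rng.standard_normal((h, n))
lam = 0.02
for _ in range(2000):
    step = 1.0 / (np.linalg.norm(B, 2) ** 2 + 1e-8)    # 1 / Lipschitz constant w.r.t. A
    A = soft_threshold(A - step * (A @ B - Y) @ B.T, step * lam)
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 1e-8)    # 1 / Lipschitz constant w.r.t. B
    B = soft_threshold(B - step * A.T @ (A @ B - Y), step * lam)

print(np.linalg.norm(Y - A @ B) / np.linalg.norm(Y))   # relative reconstruction error
print((A != 0).mean(), (B != 0).mean())                # fraction of nonzero entries per factor
```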
