Search results for: p local rank
Number of results: 1809616
Matrix approximation is a common tool in recommendation systems, text mining, and computer vision. A prevalent assumption in constructing matrix approximations is that the partially observed matrix is low-rank. In this paper, we propose, analyze, and experiment with two procedures, one parallel and the other global, for constructing local matrix approximations. The two approaches approximate th...
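The "low-rank" assumption these abstracts start from is the classical truncated-SVD approximation, which the local procedures then relax. A minimal NumPy sketch of that baseline (function and variable names are illustrative, not taken from any of the cited papers):

```python
import numpy as np

def truncated_svd_approx(M, r):
    """Best rank-r approximation of M in Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Example: approximate a noisy rating-like matrix with rank 2.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
M_hat = truncated_svd_approx(M, r=2)
print(np.linalg.norm(M - M_hat) / np.linalg.norm(M))
```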
The local rank modulation scheme was suggested recently for representing information in flash memories in order to overcome drawbacks of rank modulation. For 0 < s ≤ t ≤ n with s dividing n, an (s, t, n)-LRM scheme is a local rank modulation scheme in which the n cells are viewed cyclically through a sliding window of size t, resulting in a sequence of small permutations which requires less comp...
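The sliding-window idea in the (s, t, n)-LRM description can be made concrete with a small sketch: slide a cyclic window of size t over the n cells in steps of s and read each window as the permutation induced by the relative charge levels. Exact conventions vary across papers, and all names below are illustrative assumptions:

```python
import numpy as np

def local_rank_sequence(charges, s, t):
    """Sketch of an (s, t, n)-LRM-style readout: each cyclic window of size t,
    advanced in steps of s, is demodulated to the ranking of its charge levels."""
    n = len(charges)
    assert n % s == 0 and 0 < s <= t <= n
    perms = []
    for start in range(0, n, s):
        window = [charges[(start + j) % n] for j in range(t)]      # cyclic window
        ranks = np.argsort(np.argsort(window))                     # 0 = lowest charge
        perms.append(tuple(int(r) for r in ranks))
    return perms

# Six cells, window of size 3, shifted by 2 -> three small permutations.
print(local_rank_sequence([0.1, 0.9, 0.4, 0.7, 0.2, 0.5], s=2, t=3))
```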
By allowing the regression coefficients to change with certain covariates, the class of varying coefficient models offers a flexible approach to modeling nonlinearity and interactions between covariates. This paper proposes a novel estimation procedure for varying coefficient models based on local ranks. The new procedure provides a highly efficient and robust alternative to the local linea...
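As a rough illustration of what a "local rank" estimator for a varying coefficient model might look like, the sketch below minimizes a kernel-weighted, Wilcoxon-type pairwise dispersion of residuals at a point u0. This is an assumed toy version, not the paper's procedure; the Gaussian kernel, the loss, and names such as `local_rank_fit` are illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize

def local_rank_fit(u0, U, X, Y, h):
    """Toy local rank estimate of a varying coefficient a(u0) in Y = a(U) * X + noise."""
    w = np.exp(-0.5 * ((U - u0) / h) ** 2)            # kernel weights around u0
    def dispersion(theta):
        a, b = theta                                   # local linear expansion a(U) ~ a + b*(U - u0)
        r = Y - (a + b * (U - u0)) * X                 # residuals
        # pairwise Wilcoxon-type dispersion, kernel weighted
        return np.sum(np.outer(w, w) * np.abs(r[:, None] - r[None, :]))
    return minimize(dispersion, x0=np.zeros(2), method="Nelder-Mead").x[0]

rng = np.random.default_rng(1)
U = rng.uniform(0, 1, 200); X = rng.standard_normal(200)
Y = np.sin(2 * np.pi * U) * X + 0.3 * rng.standard_normal(200)
print(local_rank_fit(0.25, U, X, Y, h=0.1), np.sin(2 * np.pi * 0.25))  # estimate vs truth
```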
Y. Huang acknowledges partial support from a UCF Graduate College Presidential Fellowship and National Science Foundation (NSF) grant No. 1200566. C. Li acknowledges partial support from NSF grants No. 0806931 and No. 0963146. M. Georgiopoulos acknowledges partial support from NSF grants No. 1161228 and No. 0525429. G. G. Anagnostopoulos acknowledges partial support from NSF grant No. 1263011....
Matrix approximation is a common tool in machine learning for building accurate prediction models for recommendation systems, text mining, and computer vision. A prevalent assumption in constructing matrix approximations is that the partially observed matrix is of low rank. We propose a new matrix approximation model where we assume instead that the matrix is only locally of low rank, leading t...
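A toy sketch of the "locally low-rank" idea: several rank-r fits, each concentrated near an anchor entry by a kernel, are blended entry-wise into one estimate. The weighted ALS solver, the index-distance kernel, and all names below are assumptions made for illustration (practical systems typically use learned row/column similarities instead):

```python
import numpy as np

def weighted_als(M, W, r, iters=20, reg=0.1):
    """Generic rank-r fit of M under per-entry weights W via alternating least squares."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((m, r)), rng.standard_normal((n, r))
    for _ in range(iters):
        for i in range(m):   # update row factors
            Wi = np.diag(W[i])
            A[i] = np.linalg.solve(B.T @ Wi @ B + reg * np.eye(r), B.T @ Wi @ M[i])
        for j in range(n):   # update column factors
            Wj = np.diag(W[:, j])
            B[j] = np.linalg.solve(A.T @ Wj @ A + reg * np.eye(r), A.T @ Wj @ M[:, j])
    return A @ B.T

def local_low_rank_estimate(M, anchors, r, h=5.0):
    """Blend anchor-centred weighted low-rank fits into a single estimate."""
    m, n = M.shape
    num, den = np.zeros((m, n)), np.zeros((m, n))
    rows, cols = np.arange(m)[:, None], np.arange(n)[None, :]
    for (ia, ja) in anchors:
        # toy proximity kernel based on index distance to the anchor entry
        K = np.exp(-((rows - ia) ** 2 + (cols - ja) ** 2) / (2 * h ** 2))
        num += K * weighted_als(M, K, r)
        den += K
    return num / den

rng = np.random.default_rng(3)
M = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
M_hat = local_low_rank_estimate(M, anchors=[(5, 5), (15, 10), (25, 15)], r=2)
print(np.linalg.norm(M - M_hat) / np.linalg.norm(M))
```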
This paper presents a patch-wise low-rank based image denoising method with a constrained variational model involving local and nonlocal regularization. On one hand, recent patch-wise methods can be represented as a low-rank matrix approximation problem whose convex relaxation usually depends on nuclear norm minimization (NNM). Here, we extend the NNM to the nonconvex Schatten p-norm minimization...
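For reference, the two norms mentioned above, and the singular value soft-thresholding operator that underlies most NNM-based patch denoisers, fit in a few lines. This is a generic sketch; for general p the Schatten p-norm proximal step has no closed form, which is part of what makes the nonconvex extension harder:

```python
import numpy as np

def schatten_p_norm(M, p):
    """Schatten p-(quasi-)norm: the l_p norm of the singular values (p = 1 gives the nuclear norm)."""
    s = np.linalg.svd(M, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def svt(M, tau):
    """Singular value soft-thresholding: the proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

A = np.random.default_rng(5).standard_normal((8, 8))
print(schatten_p_norm(A, 1.0), schatten_p_norm(A, 0.5))
print(np.linalg.matrix_rank(svt(A, tau=2.0)), np.linalg.matrix_rank(A))  # thresholding lowers rank
```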
The problem of determining which abelian groups admit a full-rank normalized factorization is settled for the orders 64 = 2⁶, 81 = 3⁴, and 128 = 2⁷. By a computer-aided approach, it is shown that such groups of these orders are exactly those of type (2², 2², 2²), (2², 2², 2, 2), (2³, 2², 2²), (2³, 2², 2, 2), (2², 2², 2², 2), and (2², 2², 2, 2, 2). Mathematics Subject Classification (2000): Prim...
Low-rank tensor learning has many applications in machine learning. A series of batch learning algorithms have achieved great success. However, in many emerging applications, such as climate data analysis, we are confronted with large-scale tensor streams, which pose significant challenges to existing solutions. In this paper, we propose an accelerated online low-rank tensor learning algorithm...
Mixtures of Linear Regressions (MLR) is an important mixture model with many applications. In this model, each observation is generated from one of several unknown linear regression components, where the identity of the generating component is also unknown. Previous works either impose strong assumptions on the data distribution or have high complexity. This paper proposes a fixed parameter ...
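A short generative sketch of the MLR model described above may help: each response is produced by one of K unknown regression vectors, and the component label is never observed. Sizes and names below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
K, n, d = 2, 500, 3                        # components, samples, features (illustrative sizes)
betas = rng.standard_normal((K, d))        # unknown regression vectors, one per component
X = rng.standard_normal((n, d))
z = rng.integers(0, K, size=n)             # latent component labels (unobserved by the learner)
y = np.einsum("ij,ij->i", X, betas[z]) + 0.1 * rng.standard_normal(n)
# The learner sees only (X, y); recovering the betas without the labels z is the MLR problem.
```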
We study the common problem of approximating a target matrix with a matrix of lower rank. We provide a simple and efficient expectation-maximization (EM) algorithm for solving weighted low-rank approximation problems, which, unlike their unweighted version, do not admit a closed-form solution in general. We analyze, in addition, the nature of locally optimal solutions that arise in this context, demonstrate the utility ...
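One common EM-style scheme for weighted low-rank approximation alternates between filling in the matrix with the current low-rank estimate (blended by the observation weights) and re-truncating with an SVD. The sketch below follows that pattern; it is an assumed illustration consistent with the abstract, not a verbatim reproduction of the paper's algorithm:

```python
import numpy as np

def weighted_lowrank_em(M, W, r, iters=50):
    """EM-style weighted low-rank approximation (fill-in sketch).
    E-step: X = W*M + (1-W)*L blends observed entries with the current estimate.
    M-step: L = best rank-r approximation of X via truncated SVD.
    Assumes weights W in [0, 1]; names are illustrative."""
    L = np.zeros_like(M)
    for _ in range(iters):
        X = W * M + (1.0 - W) * L
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = (U[:, :r] * s[:r]) @ Vt[:r, :]
    return L

rng = np.random.default_rng(4)
M = rng.standard_normal((40, 4)) @ rng.standard_normal((4, 30))
W = (rng.uniform(size=M.shape) < 0.6).astype(float)   # 60% of entries observed (weight 1)
L = weighted_lowrank_em(M, W, r=4)
print(np.linalg.norm(W * (M - L)) / np.linalg.norm(W * M))
```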