Search results for: orthogonally invariant

Number of results: 78,400

2005
Catherine Fraikin, Yurii Nesterov, Paul Van Dooren

We consider the problem of finding the optimal correlation between two projected matrices U∗AU and V∗BV. The square matrices A and B may be of different dimensions, but the isometries U and V have a common column dimension k. The correlation is measured by the real function c(U,V) = Re tr(U∗AU V∗B∗V), which we maximize over the isometries satisfying U∗U = V∗V = I_k. This problem can be viewed as an exten...
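
As a quick illustration of this objective, the sketch below evaluates c(U,V) = Re tr(U∗AU V∗B∗V) for random isometries built by QR factorization. The dimensions, seed, and helper names (rand_isometry, correlation) are our illustrative choices, not the authors':

```python
import numpy as np

def rand_isometry(n, k, rng):
    # QR of a Gaussian matrix yields a random n-by-k matrix with U*U = I_k.
    q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    return q

def correlation(A, B, U, V):
    # c(U, V) = Re tr(U*AU V*B*V); U*AU and V*B*V are both k-by-k.
    return np.real(np.trace(U.conj().T @ A @ U @ V.conj().T @ B.conj().T @ V))

rng = np.random.default_rng(0)
n, m, k = 6, 4, 2          # A is n-by-n, B is m-by-m; common column dim k
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))
U = rand_isometry(n, k, rng)
V = rand_isometry(m, k, rng)
print(correlation(A, B, U, V))
```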

2013
Andreas Argyriou, Charles A. Micchelli, Alexander Smola

In this paper, we study the problem of learning a matrix W from a set of linear measurements. Our formulation consists in solving an optimization problem which involves regularization with a spectral penalty term. That is, the penalty term is a function of the spectrum of the covariance of W. Instances of this problem in machine learning include multi-task learning, collaborative filtering and...
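
One concrete instance of such a spectral penalty is the trace norm (the sum of the singular values of W), a common regularizer in multi-task learning and collaborative filtering. The sketch below is our illustration of that special case, not code from the paper: it sums a function of the eigenvalues of the covariance WᵀW:

```python
import numpy as np

def spectral_penalty(W, f=np.sqrt):
    # Sum f over the spectrum of the covariance W^T W.  With f = sqrt this
    # equals the trace norm of W (the sum of its singular values).
    eigs = np.linalg.eigvalsh(W.T @ W)
    return float(np.sum(f(np.clip(eigs, 0.0, None))))

W = np.arange(6.0).reshape(3, 2)
print(spectral_penalty(W))                         # trace norm via W^T W
print(np.linalg.svd(W, compute_uv=False).sum())    # same value via the SVD
```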

2003
T. Gorin, T. Prosen, T. H. Seligman

We propose to study echo dynamics in a random matrix framework, where we assume that the perturbation is time independent, random and orthogonally invariant. This allows us to use a basis in which the unperturbed Hamiltonian is diagonal, so that its properties are largely determined by its spectral statistics. We concentrate on the effect of spectral correlations usually associated with chaos and di...
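
A standard way to realize such a time-independent, orthogonally invariant random perturbation is to draw it from the Gaussian Orthogonal Ensemble (GOE). The following sketch is our minimal construction under that assumption (the name goe and the sizes are ours, not from the paper):

```python
import numpy as np

def goe(n, rng):
    # Symmetrized Gaussian matrix: a draw from the Gaussian Orthogonal
    # Ensemble, whose distribution is invariant under V -> O^T V O for
    # any fixed orthogonal O.
    g = rng.standard_normal((n, n))
    return (g + g.T) / np.sqrt(2.0 * n)

rng = np.random.default_rng(1)
V = goe(200, rng)
O, _ = np.linalg.qr(rng.standard_normal((200, 200)))
# Conjugation by O trivially preserves the spectrum; the ensemble-level
# statement is stronger: O^T V O has the same *distribution* as V.
print(np.allclose(np.linalg.eigvalsh(V), np.linalg.eigvalsh(O.T @ V @ O)))
```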

2004
Jesenko Vukadinovic

For two-dimensional periodic Kelvin-filtered Navier–Stokes systems, both positively and negatively invariant sets M_n, consisting of initial data for which solutions exist for all negative times and exhibit a certain asymptotic behaviour backwards in time, are investigated. They are proven to be rich in the sense that they project orthogonally onto the sets of lower modes corresponding to the...
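
To make the idea of orthogonal projection onto lower modes concrete, here is a small sketch of our own (not the paper's construction): it projects a periodic signal onto its lowest Fourier modes and removes the rest:

```python
import numpy as np

def project_low_modes(u, n):
    # Orthogonal (L^2) projection: keep Fourier coefficients with
    # |wavenumber| <= n and zero out the rest.
    c = np.fft.fft(u)
    k = np.fft.fftfreq(len(u), d=1.0 / len(u))   # integer wavenumbers
    c[np.abs(k) > n] = 0.0
    return np.real(np.fft.ifft(c))

x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.3 * np.sin(7.0 * x)
# Modes above wavenumber 3 are removed, leaving only sin(x).
print(np.allclose(project_low_modes(u, 3), np.sin(x)))
```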

Journal: Journal of Machine Learning Research 2010
Andreas Argyriou, Charles A. Micchelli, Massimiliano Pontil

In this paper, we study the problem of learning a matrix W from a set of linear measurements. Our formulation consists in solving an optimization problem which involves regularization with a spectral penalty term. That is, the penalty term is a function of the spectrum of the covariance of W. Instances of this problem in machine learning include multi-task learning, collaborative filtering and...

Journal: پژوهش های ریاضی 2022

In this paper, we introduce statistical cosymplectic manifolds and investigate some properties of their tensors. We define invariant and anti-invariant submanifolds and study invariant submanifolds with normal and tangent structure vector fields. We prove that an invariant submanifold of a statistical cosymplectic manifold with tangent structure vector field is a cosymplectic and minimal...

2014
Elina Robeva

In symmetric tensor decomposition one expresses a given symmetric tensor T as a sum of tensor powers of a number of vectors: T = v_1^{⊗d} + ⋯ + v_k^{⊗d}. Orthogonal decomposition is a special type of symmetric tensor decomposition in which, in addition, the vectors v_1, ..., v_k are required to be pairwise orthogonal. We study the properties of orthogonally decomposable tensors. In particular, we give...
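
The sketch below illustrates the definition under our own naming (it is not code from the paper): it builds an orthogonally decomposable tensor with d = 3 from pairwise orthonormal vectors and checks that contracting T with a component v_i in two slots returns v_i:

```python
import numpy as np

d, k, n = 3, 2, 3
Q, _ = np.linalg.qr(np.random.default_rng(2).standard_normal((n, n)))
vs = [Q[:, i] for i in range(k)]     # k pairwise orthonormal vectors

# T = v_1^{(x)d} + ... + v_k^{(x)d}, a sum of d-th tensor powers (d = 3).
T = sum(np.einsum('i,j,l->ijl', v, v, v) for v in vs)

# Because the v_i are orthonormal, contracting T with v_i in two slots
# recovers v_i itself -- a handy check of orthogonal decomposability.
for v in vs:
    print(np.allclose(np.einsum('ijl,j,l->i', T, v, v), v))
```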

1995
Peter W. Michor

The assumption in the main result of [2] is removed. Let G be a Lie group which acts isometrically on a Riemannian manifold M. A section of the Riemannian G-manifold M is a closed submanifold Σ which meets each orbit orthogonally. In this situation the trace on Σ of the G-action is a discrete group action by the generalized Weyl group W(Σ) = N_G(Σ)/Z_G(Σ), where N_G(Σ) := {g ∈ G : g.Σ = Σ} and Z_G...
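
A classical instance, given here as our own illustration rather than material from [2]: for G = O(n) acting by conjugation on the space M of symmetric matrices, the diagonal matrices form a section Σ, every orbit meets Σ via the eigendecomposition, and the generalized Weyl group acts on Σ by permuting diagonal entries. A numerical sanity check:

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((4, 4))
S = (S + S.T) / 2.0                    # a point of M (symmetric matrices)

# Every O(4)-orbit meets the section: the eigendecomposition conjugates
# S into a diagonal matrix, i.e. onto Sigma.
w, O = np.linalg.eigh(S)
print(np.allclose(O.T @ S @ O, np.diag(w)))

# Weyl-group action on Sigma: conjugation by a permutation matrix
# (which normalizes Sigma) permutes the diagonal entries.
perm = [2, 0, 3, 1]
P = np.eye(4)[perm]
print(np.allclose(P @ np.diag(w) @ P.T, np.diag(w[perm])))
```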

[Chart: number of search results per year]