Search results for: gramian matrix

Number of results: 364823

Journal: Mathematics of Control, Signals, and Systems 2022

Abstract: This paper considers large-scale linear stochastic systems representing, e.g., spatially discretized partial differential equations. Since asymptotic stability can often not be ensured in such a setting (e.g., due to larger noise), the main focus is on establishing model order reduction (MOR) schemes applicable to unstable systems. MOR is vital to reduce the dimension of the problem and to lower the enormous comp...
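
The paper's stochastic, possibly unstable setting needs specially adapted Gramians that are not reproduced here; as background for how Gramians drive MOR, below is a minimal Python sketch of classical balanced truncation for a stable deterministic LTI system. The system matrices, orders, and the psd_sqrt helper are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: Gramian-based balanced truncation for a *stable*
# deterministic LTI system dx/dt = A x + B u, y = C x.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def psd_sqrt(M):
    """Symmetric square root of a (numerically) PSD matrix."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

def balanced_truncation(A, B, C, r):
    """Reduce (A, B, C) to order r using the square-root method."""
    # Controllability / observability Gramians:
    #   A P + P A^T + B B^T = 0,   A^T Q + Q A + C^T C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lp, Lq = psd_sqrt(P), psd_sqrt(Q)
    U, s, Vt = np.linalg.svd(Lq.T @ Lp)           # Hankel singular values in s
    Sr = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt[:r].T @ Sr                        # right projection
    W = Lq @ U[:, :r] @ Sr                        # left projection (W^T T = I)
    return W.T @ A @ T, W.T @ B, C @ T

# Toy usage: a random stable system of order 30 reduced to order 5.
rng = np.random.default_rng(0)
n = 30
A = rng.standard_normal((n, n))
A -= (np.abs(np.linalg.eigvals(A)).max() + 1.0) * np.eye(n)   # force stability
B, C = rng.standard_normal((n, 3)), rng.standard_normal((2, n))
Ar, Br, Cr = balanced_truncation(A, B, C, r=5)
```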

Journal: IEEE Transactions on Control of Network Systems 2017

Journal: IFAC-PapersOnLine 2021

Identifiability, observability, and controllability are important structural properties of a dynamic system model. Our interest lies in the detection of a lack of identifiability/observability and/or controllability through computation and subsequent analysis of the exact nullspace of the gramian for non-linear systems. For this we have developed a user-friendly application with the name StrucID, which runs in Matlab. The App requires as input ...
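
StrucID itself and its exact gramian computation for non-linear models are not reproduced here; as a rough linear analogue, the sketch below inspects the nullspace of an observability Gramian to expose unobservable state directions. The toy (A, C) pair, in which the third state deliberately does not influence the output, is an assumption for illustration.

```python
# Linear analogue only: nullspace of the observability Gramian gives
# the unobservable directions of a stable linear system.
import numpy as np
from scipy.linalg import null_space, solve_continuous_lyapunov

# Assumed 3-state system in which x3 never reaches the output.
A = np.array([[-1.0, 0.5, 0.0],
              [ 0.0,-2.0, 0.0],
              [ 0.0, 0.0,-3.0]])
C = np.array([[1.0, 0.0, 0.0]])

# Observability Gramian: A^T Q + Q A + C^T C = 0 (A must be stable).
Q = solve_continuous_lyapunov(A.T, -C.T @ C)
N = null_space(Q, rcond=1e-10)
print("unobservable directions:\n", N)   # spans the e3 axis here
```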

2008
P. LOUBATON J. NAJIM

Consider an $N \times n$ random matrix $Y_n = (Y^n_{ij})$ whose entries are given by $Y^n_{ij} = \sigma(i/N, j/n)\, X^n_{ij} / \sqrt{n}$, the $X^n_{ij}$ being centered i.i.d. and $\sigma : [0,1]^2 \to (0,\infty)$ being a continuous function called a variance profile. Consider now a deterministic $N \times n$ matrix $\Lambda_n = (\Lambda_{ij})$ whose non-diagonal elements are zero. Denote by $\Sigma_n$ the non-centered matrix $Y_n + \Lambda_n$. Then under the assumption that $\lim_{n\to\infty} N/n$...
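
As a purely numerical illustration of this setup (the particular variance profile, the diagonal value of $\Lambda_n$, and the matrix sizes below are arbitrary choices, not from the paper), one can generate $\Sigma_n = Y_n + \Lambda_n$ and look at the spectrum of its Gram matrix:

```python
# Build Y^n_{ij} = sigma(i/N, j/n) X^n_{ij} / sqrt(n) with an assumed
# variance profile, add a pseudo-diagonal deterministic perturbation,
# and inspect the Gram matrix of the resulting non-centered model.
import numpy as np

rng = np.random.default_rng(1)
N, n = 400, 600                                   # aspect ratio N/n = 2/3

sigma = lambda u, v: 1.0 + 0.5 * np.cos(np.pi * u) * np.sin(np.pi * v)  # assumed profile
i = (np.arange(1, N + 1) / N)[:, None]
j = (np.arange(1, n + 1) / n)[None, :]

Y = sigma(i, j) * rng.standard_normal((N, n)) / np.sqrt(n)
Lam = np.zeros((N, n))
np.fill_diagonal(Lam, 2.0)                        # deterministic "diagonal" part
Sigma = Y + Lam                                   # the non-centered matrix

# Empirical eigenvalues of the Gram matrix Sigma Sigma^T.
eigs = np.linalg.eigvalsh(Sigma @ Sigma.T)
print(eigs[-5:])                                  # a few largest eigenvalues
```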

Journal: Neurocomputing 2015
Daniela Hofmann Andrej Gisbrecht Barbara Hammer

Due to its intuitive learning algorithms and classification behavior, learning vector quantization (LVQ) enjoys a wide popularity in diverse application domains. In recent years, the classical heuristic schemes have been accompanied by variants which can be motivated by a statistical framework such as robust soft LVQ (RSLVQ). In its original form, LVQ and RSLVQ can be applied to vectorial data ...
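
For context, the classical heuristic scheme referred to here is the LVQ1 prototype update; a minimal sketch follows, in which the prototype initialisation, learning rate, and toy data are assumptions for illustration. RSLVQ replaces this winner-take-all rule with gradient steps on a Gaussian-mixture likelihood.

```python
# Minimal LVQ1: move the closest prototype toward correctly labelled
# samples and away from incorrectly labelled ones.
import numpy as np

def lvq1(X, y, prototypes, proto_labels, lr=0.05, epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    W = prototypes.copy()
    for _ in range(epochs):
        for idx in rng.permutation(len(X)):
            x, label = X[idx], y[idx]
            k = np.argmin(np.sum((W - x) ** 2, axis=1))   # closest prototype
            sign = 1.0 if proto_labels[k] == label else -1.0
            W[k] += sign * lr * (x - W[k])                # attract or repel
    return W

# Toy usage: two Gaussian blobs, one prototype per class.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
W = lvq1(X, y, prototypes=X[[0, 150]], proto_labels=np.array([0, 1]))
```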

2008
E. K. Zavadskas Rasa Karbauskaitė Gintautas Dzemyda Virginijus Marcinkevičius

Abstract: This paper deals with a method called locally linear embedding. It is a nonlinear dimensionality reduction technique that computes low-dimensional, neighbourhood-preserving embeddings of high-dimensional data and attempts to discover nonlinear structure in high-dimensional data. The implementation of the algorithm is fairly straightforward, because the algorithm has only two control p...
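
A quick way to experiment with locally linear embedding is scikit-learn's implementation; in the sketch below the swiss-roll data, neighbourhood size, and regularisation value are illustrative choices, not the settings studied in the paper.

```python
# Locally linear embedding of a 3-D swiss roll into 2-D.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, reg=1e-3)
X2 = lle.fit_transform(X)          # neighbourhood-preserving 2-D embedding
print(X2.shape, lle.reconstruction_error_)
```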

2001
Nello Cristianini John Shawe-Taylor Jaz S. Kandola

In this paper we introduce new algorithms for unsupervised learning based on the use of a kernel matrix. All the information required by such algorithms is contained in the eigenvectors of the matrix or of closely related matrices. We use two different but related cost functions, the Alignment and the 'cut cost'. The first one is discussed in a companion paper [3], the second one is based on gr...
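
The empirical alignment between two Gram matrices is the normalised Frobenius inner product, A(K1, K2) = <K1, K2>_F / sqrt(<K1, K1>_F <K2, K2>_F). A small sketch is given below; the RBF kernel, its bandwidth, and the toy labels are assumptions for illustration, and the paper's cut-cost criterion is not reproduced.

```python
# Kernel (Gram-matrix) alignment between an RBF kernel and the
# "ideal" kernel y y^T built from class labels.
import numpy as np

def alignment(K1, K2):
    inner = lambda A, B: np.sum(A * B)            # Frobenius inner product
    return inner(K1, K2) / np.sqrt(inner(K1, K1) * inner(K2, K2))

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 1, (30, 5)), rng.normal(1, 1, (30, 5))])
y = np.array([-1] * 30 + [1] * 30, dtype=float)

sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq / (2 * X.shape[1]))                # RBF Gram matrix (assumed bandwidth)
print(alignment(K, np.outer(y, y)))               # kernel-target alignment
```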

Journal: Math. Comput. 2009
Kanat S. Abdukhalikov Rudolf Scharlau

All indecomposable unimodular hermitian lattices in dimensions 14 and 15 over the ring of integers in Q(√−3) are determined. Precisely one lattice in dimension 14 and two lattices in dimension 15 have minimal norm 3. In 1978 W. Feit [10] classified the unimodular hermitian lattices of dimensions up to 12 over the ring Z[ω] of Eisenstein integers, where ω is a primitive third root of unity. Th...

2007
Edin Andelic Martin Schafföner Marcel Katz Sven E. Krüger Andreas Wendemuth

A novel training algorithm for nonlinear discriminants for classification and regression in Reproducing Kernel Hilbert Spaces (RKHSs) is presented. It is shown how the overdetermined linear least-squares problem in the corresponding RKHS may be solved within a greedy forward selection scheme by updating the pseudoinverse in an order-recursive way. The described construction of the pseudoinverse ...
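
The order-recursive pseudoinverse update at the heart of such forward-selection schemes can be realised with a Greville-style column append; the sketch below is a generic version of that building block, not the authors' exact construction, and the function name and the check against numpy's batch pseudoinverse are assumptions for illustration.

```python
# Greville recursion: update pinv(A) when a column a is appended to A.
import numpy as np

def append_column_pinv(A, A_pinv, a, tol=1e-12):
    """Return pinv([A, a]) given pinv(A)."""
    a = a.reshape(-1, 1)
    d = A_pinv @ a                      # coefficients of a in range(A)
    c = a - A @ d                       # residual orthogonal to range(A)
    if np.linalg.norm(c) > tol:
        b = c.T / (c.T @ c).item()      # new column adds a new direction
    else:
        b = (d.T @ A_pinv) / (1.0 + d.T @ d).item()   # a is (nearly) dependent
    return np.vstack([A_pinv - d @ b, b])

rng = np.random.default_rng(4)
A = rng.standard_normal((50, 4))
a = rng.standard_normal(50)
P = np.linalg.pinv(A)
P_new = append_column_pinv(A, P, a)
print(np.allclose(P_new, np.linalg.pinv(np.column_stack([A, a]))))   # True
```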

Chart: number of search results per year
