Search results for: random subspace

Number of results: 300614

Journal: J. Artif. Intell. Res. 2008
Fei Tony Liu, Kai Ming Ting, Yang Yu, Zhi-Hua Zhou

In this paper, we show that a continuous spectrum of randomisation exists, and that most existing tree randomisation methods operate only around its two ends. That leaves a large part of the spectrum unexplored. We propose a base learner, VR-Tree, which generates trees with variable randomness. VR-Trees are able to span from the conventional deterministic trees to the comple...
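
As a rough illustration of that spectrum, the sketch below interpolates between a deterministic and a fully random split choice with a single parameter; the function choose_split, the gain argument, and the parameter alpha are invented here for illustration and are not the paper's VR-Tree definition.

```python
import random

def choose_split(candidate_splits, gain, alpha, rng=random):
    """Choose a split with tunable randomness (illustrative sketch only).

    alpha = 0 always takes the best-gain split (deterministic tree end);
    alpha = 1 always takes a uniformly random split (completely random end);
    values in between sweep the spectrum of randomisation.
    """
    if rng.random() < alpha:
        return rng.choice(candidate_splits)   # fully random choice
    return max(candidate_splits, key=gain)    # best split by the gain criterion
```

Varying alpha per tree (or per node) would then produce an ensemble whose members sit at different points of the spectrum.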

2001
Philippe Ciblat, Alban Quadrat

In the blind equalization problem, it has recently been observed that the unknown filter can be recovered from knowledge of the second-order statistics of the received signal alone. Building on this fact, a powerful algorithm, the so-called subspace method, has been developed. The subspace method was previously described in the framework of rational spaces, which seemed inappropriate. In...

2008
Martin Sewell

This note presents a chronological review of the literature on ensemble learning that has accumulated over the past twenty years. The idea of ensemble learning is to employ multiple learners and combine their predictions. If we have a committee of M models with uncorrelated errors, then simply averaging their predictions reduces the average error of a single model by a factor of M. Unfortunately, the key ass...
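
The factor-of-M claim is the standard committee-averaging calculation, sketched below under the assumptions the note alludes to (zero-mean, uncorrelated, equal-variance errors); the symbols E_AV and E_COM are introduced here for illustration.

```latex
% Committee of M models y_m(x) = f(x) + \epsilon_m(x), with averaged prediction
% y_{\mathrm{COM}}(x) = \frac{1}{M}\sum_{m=1}^{M} y_m(x).
% Assume zero-mean errors, \mathbb{E}[\epsilon_m \epsilon_\ell] = 0 for m \neq \ell,
% and average single-model error E_{\mathrm{AV}} = \frac{1}{M}\sum_{m}\mathbb{E}[\epsilon_m^2].
E_{\mathrm{COM}}
  = \mathbb{E}\!\left[\Big(\tfrac{1}{M}\sum_{m=1}^{M}\epsilon_m(x)\Big)^{2}\right]
  = \frac{1}{M^{2}}\sum_{m=1}^{M}\mathbb{E}\!\left[\epsilon_m(x)^{2}\right]
  = \frac{1}{M}\,E_{\mathrm{AV}}.
```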

Journal: JCIT 2009
Jinn-Min Yang, Pao-Ta Yu

Classifier combining techniques have become popular in recent years for improving weak classifiers. The random subspace method (RSM) is an efficient classifier combining technique that can improve the classification performance of weak classifiers on small sample size (SSS) problems. In RSM, feature subsets are randomly selected and the resulting datasets are used to train classifiers. How...
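
As a generic picture of the RSM procedure described above (not the JCIT paper's specific method), the sketch below trains one base classifier per randomly drawn feature subset and combines them by majority vote; the function names, the decision-tree base learner, and the assumption of integer class labels are illustrative choices.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # stand-in base learner

def train_rsm(X, y, n_estimators=10, subspace_size=5, seed=0):
    """Random subspace method: one classifier per random feature subset."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        feats = rng.choice(X.shape[1], size=subspace_size, replace=False)
        clf = DecisionTreeClassifier(random_state=seed).fit(X[:, feats], y)
        models.append((feats, clf))
    return models

def predict_rsm(models, X):
    """Combine the ensemble by majority vote (assumes integer class labels)."""
    votes = np.stack([clf.predict(X[:, feats]) for feats, clf in models]).astype(int)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```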

2014

Classical methods of DOA estimation, such as the MUSIC algorithm, are based on estimating the signal and noise subspaces from the sample covariance matrix. For a small number of samples, such methods are prone to performance breakdown, because the sample covariance matrix can deviate substantially from the true covariance matrix. We investigate this DOA estimation performance breakdown. Specifically, we consid...
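
For reference, the subspace step the abstract builds on can be sketched as below: form the sample covariance, split its eigenvectors into signal and noise subspaces, and scan steering vectors over a grid of angles. This is a textbook MUSIC sketch for an assumed half-wavelength uniform linear array, not the cited breakdown analysis.

```python
import numpy as np

def music_spectrum(X, n_sources, angles_deg, spacing=0.5):
    """MUSIC pseudospectrum for a uniform linear array (illustrative sketch).

    X       : (n_sensors, n_snapshots) complex snapshot matrix.
    spacing : sensor spacing in wavelengths (assumed half-wavelength ULA).
    Peaks of the returned pseudospectrum indicate the estimated DOAs.
    """
    n_sensors, n_snapshots = X.shape
    R = X @ X.conj().T / n_snapshots            # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(R)        # eigenvalues in ascending order
    En = eigvecs[:, : n_sensors - n_sources]    # noise-subspace eigenvectors
    k = np.arange(n_sensors)
    spectrum = []
    for theta in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * spacing * k * np.sin(theta))   # steering vector
        spectrum.append(1.0 / (np.linalg.norm(En.conj().T @ a) ** 2))
    return np.array(spectrum)
```

With few snapshots, the eigenvectors of R are poor estimates of the true subspaces, which is the breakdown regime the abstract studies.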

Journal: SIAM J. Scientific Computing 1997
Tony F. Chan, Wing Lok Wan

We analyze a class of Krylov projection methods, but concentrate mainly on a specific conjugate gradient (CG) implementation by Smith, Peterson, and Mittra [IEEE Transactions on Antennas and Propagation, 37 (1989), pp. 1490–1493] for solving the linear system AX = B, where A is symmetric positive definite and B is a block of multiple right-hand sides. This method generates a Krylov subspace from a set of...
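
The idea of reusing a single Krylov subspace across several right-hand sides can be pictured with the simple Galerkin "seed" variant of CG sketched below (zero initial guesses and a real symmetric positive definite A assumed); it illustrates the projection idea only and is not the exact published implementation, and the name seed_cg is invented here.

```python
import numpy as np

def seed_cg(A, B, seed_col=0, tol=1e-8, maxiter=None):
    """Galerkin 'seed' CG sketch for AX = B with several right-hand sides.

    Ordinary CG is run on one seed column; each A-conjugate search direction
    it produces is also used to project every other residual, so all columns
    share the single Krylov subspace built for the seed system.
    """
    n, s = B.shape
    maxiter = maxiter or n
    X = np.zeros((n, s))
    R = B.astype(float)                   # residuals of all systems (X = 0 start)
    p = R[:, seed_col].copy()             # first search direction = seed residual
    rs_old = p @ p
    for _ in range(maxiter):
        Ap = A @ p
        pAp = p @ Ap
        coeffs = (p @ R) / pAp            # Galerkin step length for every column
        X += np.outer(p, coeffs)
        R -= np.outer(Ap, coeffs)
        r = R[:, seed_col]
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p     # standard CG direction update (seed)
        rs_old = rs_new
    return X
```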

2012
Matthias Schneider, Sven Hirsch, Gábor Székely, Bruno Weber, Bjoern H. Menze

We propose a machine learning-based framework using oblique random forests for 3-D vessel segmentation. Two different kinds of features are compared. One is based on orthogonal subspace filtering where we learn 3-D eigenspace filters from local image patches that return task optimal feature responses. The other uses a specific set of steerable filters that show, qualitatively, similarities to t...
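
One way to read "orthogonal subspace filtering" is as a PCA over flattened local patches whose leading components act as filters; the sketch below shows that reading only, with invented function names, and is not the authors' exact filter-learning procedure.

```python
import numpy as np

def learn_eigenspace_filters(patches, n_filters=8):
    """Learn orthogonal 'eigenspace' filters from local patches via PCA (sketch).

    patches : (n_patches, patch_size) array of flattened local image patches.
    Returns the leading principal directions; their inner products with a new
    patch serve as feature responses for a downstream (random-forest) classifier.
    """
    centred = patches - patches.mean(axis=0)
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)  # rows of Vt: principal directions
    return Vt[:n_filters]

def filter_responses(patches, filters):
    """One feature per learned filter: projection of each patch onto that filter."""
    return patches @ filters.T
```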

2008
Arnak Dalalyan

The regression model Y_i = f(x_i) + ε_i = g(Θ^T x_i) + ε_i, i = 1, . . . , n, is addressed. In the general setup we are interested in, the covariates x_i ∈ R^d, Θ is a d × m* orthogonal matrix (Θ^T Θ = I_{m*}), and g : R^{m*} → R is an unknown function. To be able to estimate Π consistently, we assume that S = Im(Θ) is the smallest subspace satisfying f(x_i) = f(Π_S x_i), ∀ i = 1, . . . , n, where Π_S stands for the orthogonal projector in R^d ...

2008
Jihun Ham, Daniel D. Lee

In this paper we tackle the problem of learning the appearances of a person’s face from images with both unknown pose and illumination. The unknown, simultaneous change in pose and illumination makes it difficult to learn 3D face models from data without manual labeling and tracking of features. In comparison, image-based models do not require geometric knowledge of faces but only the statistic...
