Search results for: bootstrap aggregating

Number of results: 18325


2007
Xueqin Zhang Yuehui Chen Jack Y. Yang

Stock market analysis is one of the most important and difficult problems in the field of financial analysis. Recently, the use of intelligent systems for stock market prediction has become widespread. In this paper, a PSO based selective neural network ensemble (PSOSEN) algorithm is proposed, which is applied to the analysis of the Nasdaq-100 index of the Nasdaq Stock Market and the S&P CNX NIFTY stock index. In...

1999
Tom Bylander Dennis Hanzlik

We provide a method for estimating the generalization error of a bag using out-of-bag estimates. In bagging, each predictor (single hypothesis) is learned from a bootstrap sample of the training examples; the output of a bag (a set of predictors) on an example is determined by voting. The out-of-bag estimate is based on recording the votes of each predictor on those training examples omitted fro...
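A minimal sketch of the out-of-bag idea described in this abstract, not the paper's exact estimator: each tree is trained on a bootstrap sample, and every training example is scored only by the trees whose bootstrap omitted it. The dataset, base learner, and number of trees below are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
n = len(y)
rng = np.random.default_rng(0)
n_trees = 25
votes = np.zeros((n, 2))  # per-example vote counts, filled only by trees that omitted the example

for _ in range(n_trees):
    idx = rng.integers(0, n, n)                    # bootstrap sample, drawn with replacement
    oob = np.setdiff1d(np.arange(n), idx)          # training examples omitted from this bootstrap
    tree = DecisionTreeClassifier().fit(X[idx], y[idx])
    pred = tree.predict(X[oob])
    votes[oob, pred] += 1                          # record votes only on the omitted examples

scored = votes.sum(axis=1) > 0                     # examples that received at least one out-of-bag vote
oob_pred = votes[scored].argmax(axis=1)
print("out-of-bag error estimate:", np.mean(oob_pred != y[scored]))
```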

2008
Md. Rafiul Hassan M. Maruf Hossain James Bailey Geoff Macintyre Joshua W. K. Ho Kotagiri Ramamohanarao

• C4.5 achieves an average accuracy of 88.49%, using at most 1 to 4 genes across the different folds; on the test set, the same classifier used only one gene to reach a maximum accuracy of 84.52%.
• C4.5 with boosting achieves an average accuracy of 89.54%, using at most 1 to 5 genes across the different folds; on the test set, the same classifier used 4 genes to reach a maximum acc...

2004
Prem Melville Nishit Shah Lilyana Mihalkova Raymond J. Mooney

One of the potential advantages of multiple classifier systems is an increased robustness to noise and other imperfections in data. Previous experiments on classification noise have shown that bagging is fairly robust but that boosting is quite sensitive. Decorate is a recently introduced ensemble method that constructs diverse committees using artificial data. It has been shown to generally ou...
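A hedged illustration of the robustness contrast this abstract refers to: inject random label noise into the training set and compare bagged versus boosted decision trees on clean test data. Decorate itself is not reproduced here; the dataset, noise rate, and ensemble sizes are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# flip 20% of the training labels to simulate classification noise
rng = np.random.default_rng(0)
flip = rng.random(len(y_tr)) < 0.2
y_noisy = np.where(flip, 1 - y_tr, y_tr)

for name, model in [
    ("bagging", BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)),
    ("boosting", AdaBoostClassifier(n_estimators=50, random_state=0)),
]:
    acc = model.fit(X_tr, y_noisy).score(X_te, y_te)
    print(f"{name}: test accuracy with noisy labels = {acc:.3f}")
```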

1995
Dirk Ormoneit Volker Tresp

We compare two regularization methods which can be used to improve the generalization capabilities of Gaussian mixture density estimates. The first method uses a Bayesian prior on the parameter space. We derive EM (Expectation Maximization) update rules which maximize the a posteriori parameter probabili...
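A rough sketch of the regularization idea, not the paper's MAP-EM derivation: scikit-learn's BayesianGaussianMixture places priors on the mixture parameters (fitted by variational inference), while GaussianMixture does plain maximum likelihood, so the held-out log-likelihood contrast loosely mirrors the effect of a Bayesian prior on a small training sample. The data, component count, and sample sizes are assumptions.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture, GaussianMixture

rng = np.random.default_rng(0)
# small training sample from a two-component mixture, larger held-out sample from the same density
train = np.concatenate([rng.normal(-2, 1, 40), rng.normal(2, 1, 40)]).reshape(-1, 1)
test = np.concatenate([rng.normal(-2, 1, 2000), rng.normal(2, 1, 2000)]).reshape(-1, 1)

ml = GaussianMixture(n_components=6, random_state=0).fit(train)          # maximum likelihood
bayes = BayesianGaussianMixture(n_components=6, random_state=0).fit(train)  # prior-regularized

print("held-out log-likelihood, maximum likelihood:", ml.score(test))
print("held-out log-likelihood, Bayesian prior:   ", bayes.score(test))
```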

2001
Gregory Shakhnarovich Ran El-Yaniv Yoram Baram

This paper is concerned with the estimation of a classifier’s accuracy. We present a number of novel bootstrap estimators, based on kernel smoothing, that consistently show superior performance on both synthetic and real data, with respect to other established methods. We call the process of (re)sampling the data via kernel-based smoothed bootstrap data cloning. The new cloning methods outperfo...
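A minimal sketch of a kernel-smoothed bootstrap ("cloned" resamples) for accuracy estimation: each resample is drawn with replacement and perturbed by Gaussian kernel noise, the classifier is refit, and accuracy is measured on the points left out of that resample. The bandwidth, classifier, and dataset are illustrative assumptions, not the estimators proposed in the paper.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
n = len(y)
rng = np.random.default_rng(0)
bandwidth = 0.1 * X.std(axis=0)        # per-feature kernel width (assumed)
estimates = []

for _ in range(200):
    idx = rng.integers(0, n, n)                                  # ordinary bootstrap indices
    holdout = np.setdiff1d(np.arange(n), idx)                    # points left out of this resample
    X_clone = X[idx] + rng.normal(0, bandwidth, size=(n, X.shape[1]))  # kernel-smoothed "clone"
    clf = LogisticRegression(max_iter=1000).fit(X_clone, y[idx])
    estimates.append(clf.score(X[holdout], y[holdout]))

print("smoothed-bootstrap accuracy estimate:", np.mean(estimates))
```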

2006
Anneleen Van Assche Hendrik Blockeel

Bagging is a well-known and widely used ensemble method. It operates by repeatedly bootstrapping the data set and invoking a base classifier on these different bootstraps. By learning several models (and combining them), it tends to increase predictive accuracy at the cost of efficiency. As a result, it becomes slow for large-scale data sets. In this paper we propose a method that simulate...
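A small timing sketch of the efficiency point above: with standard bagging the base learner is retrained once per bootstrap replicate, so fitting cost grows roughly linearly in the number of replicates. The paper's simulation method is not reproduced here; the dataset and ensemble sizes are arbitrary.

```python
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=20000, n_features=30, random_state=0)

for n_estimators in (1, 10, 50):
    model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=n_estimators, random_state=0)
    start = time.perf_counter()
    model.fit(X, y)                                  # one base-learner fit per bootstrap
    print(f"{n_estimators:3d} bootstraps -> fit time {time.perf_counter() - start:.2f} s")
```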

2007
Hossein Ebrahimpour Abbas Kouzani

In this paper a novel ensemble-based technique for face recognition is presented. In ensemble learning, a group of methods is employed and their results are combined to form the final output of the system. Achieving a higher accuracy rate is the main advantage of this approach. Two of the most successful wrapping classification methods are bagging and boosting. In this paper we used the K neare...
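A hedged sketch of bagging a nearest-neighbour base classifier for face recognition, since the truncated abstract appears to reference a K-nearest-neighbour learner. The base learner, dataset (Olivetti faces as a stand-in), and ensemble size are assumptions, not the configuration used in the paper.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

faces = fetch_olivetti_faces()                      # downloads a small image archive on first call
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0
)

# bag of k-NN classifiers, each fit on a bootstrap of the training faces
bagged_knn = BaggingClassifier(KNeighborsClassifier(n_neighbors=3),
                               n_estimators=20, random_state=0)
print("bagged k-NN accuracy:", bagged_knn.fit(X_tr, y_tr).score(X_te, y_te))
```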

[Chart: number of search results per year]