Search results for: committee bagging

Number of results: 4107

2004
Vicent Estruch, César Ferri, José Hernández-Orallo, M. José Ramírez-Quintana

Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. A well-known method for generating hypothesis ensembles is Bagging. One of the main drawbacks of ensemble methods in general, and Bagging in particular, is the huge amount of computational resources required to learn, store, and apply the set of models. Another problem is that even using the bootstr...
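
The resource cost the authors point to grows linearly with the ensemble: every bootstrap replicate trains, stores, and queries its own model. Below is a minimal sketch of plain Bagging, assuming scikit-learn is available; the synthetic data and the ensemble size of 50 are illustrative, not the paper's setup.

```python
# A minimal Bagging sketch (assumes scikit-learn). The default base learner
# of BaggingClassifier is a decision tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 50 members is trained on a bootstrap replicate of the training
# set; learning, storing, and querying all 50 models is the resource cost
# the abstract refers to.
bag = BaggingClassifier(n_estimators=50, random_state=0)
bag.fit(X_train, y_train)
print("test accuracy:", bag.score(X_test, y_test))
```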

2002
Philip Derbeko, Ran El-Yaniv, Ron Meir

We propose and study a new technique for aggregating an ensemble of bootstrapped classifiers. In this method we seek a linear combination of the base-classifiers such that the weights are optimized to reduce variance. Minimum variance combinations are computed using quadratic programming. This optimization technique is borrowed from Mathematical Finance where it is called Markowitz Mean-Varianc...
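
With only the sum-to-one constraint kept, the Markowitz-style quadratic program min_w wᵀCw subject to 1ᵀw = 1 has a closed form, which the sketch below uses in place of a full QP solver; the validation residuals and the error covariance are our illustrative assumptions, and the method in the abstract would add further constraints (e.g., non-negative weights) and solve the QP directly.

```python
# Minimum-variance linear combination of base models (closed form for the
# sum-to-one constraint only). The residual matrix is hypothetical.
import numpy as np

def min_variance_weights(errors):
    """errors: (n_samples, n_models) residuals of each base model."""
    C = np.cov(errors, rowvar=False)       # error covariance across models
    C += 1e-8 * np.eye(C.shape[0])         # regularize for invertibility
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)           # w ∝ C^{-1} 1
    return w / (ones @ w)                  # normalize so weights sum to 1

# Hypothetical usage: residuals of 5 bootstrapped models on 200 points.
rng = np.random.default_rng(0)
errors = rng.normal(size=(200, 5)) * np.array([1.0, 1.2, 0.8, 1.5, 1.1])
w = min_variance_weights(errors)
print("weights:", w, "sum:", w.sum())
```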

Journal: Entropy 2018
Hossein Foroozand, Valentina Radic, Steven V. Weijs

Recently, the Entropy Ensemble Filter (EEF) method was proposed to mitigate the computational cost of the Bootstrap AGGregatING (bagging) method. This method uses the most informative training data sets in the model ensemble rather than all ensemble members created by the conventional bagging. In this study, we evaluate, for the first time, the application of the EEF method in Neural Network (N...
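
A sketch of the selection step follows, under our own illustrative assumption that "most informative" is scored by the Shannon entropy of a histogram of each bootstrap sample; the data, replicate count, and cutoff are hypothetical, not the paper's configuration.

```python
# Select the most informative bootstrap training sets instead of using all
# of them, in the spirit of the Entropy Ensemble Filter (EEF).
import numpy as np

def shannon_entropy(x, bins=20):
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
data = rng.normal(size=1000)                 # stand-in training series
boots = [rng.choice(data, size=data.size) for _ in range(30)]  # replicates

# Train only on the k highest-entropy replicates rather than all 30; this
# pruning is where the saving over conventional bagging comes from.
k = 10
scores = np.array([shannon_entropy(b) for b in boots])
selected = [boots[i] for i in np.argsort(scores)[-k:]]
print("selected", len(selected), "of", len(boots), "bootstrap sets")
```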

2000
John C. Henderson, Eric Brill

Bagging and boosting, two effective machine learning techniques, are applied to natural language parsing. Experiments using these techniques with a trainable statistical parser are described. The best resulting system provides roughly as large a gain in F-measure as doubling the corpus size. Error analysis of the result of the boosting technique reveals some inconsistent annotations in the P...

Journal: Astronomische Nachrichten 2008

Journal: Soft Computing 2022

It is hard to come up with a strong learning algorithm with high cross-media retrieval accuracy, but finding a weak one with accuracy slightly higher than random prediction is simple. This paper proposes an innovative Bagging-based cross-media retrieval method (called BCMR) based on this concept. First, we use bootstrap sampling to take samples from the original set. The amount of data abstracted by bootstrapping is set to be the same as the original dataset. Secondly, 50 ...
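
A sketch of the bootstrap step as described: each resample is drawn with replacement and has the same size as the original dataset. The figure of 50 resamples follows the abstract; the stand-in dataset is ours.

```python
# Bootstrap resampling with replacement, each resample the size of the
# original dataset, as in the abstract's first step.
import numpy as np

rng = np.random.default_rng(42)
dataset = np.arange(1000)                   # stand-in for the original set

resamples = [rng.choice(dataset, size=dataset.size, replace=True)
             for _ in range(50)]            # 50 bootstrap training sets

# On average each resample contains about 63.2% of the original items.
unique_frac = np.mean([np.unique(r).size / dataset.size for r in resamples])
print(f"mean unique fraction: {unique_frac:.3f}")
```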

1996
David H. Wolpert, William G. Macready

Bagging [1] is a technique that tries to improve a learning algorithm's performance by using bootstrap replicates of the training set [5, 4]. The computational requirements for estimating the resultant generalization error on a test set by means of cross-validation are often prohibitive: for leave-one-out cross-validation one needs to train the underlying algorithm on the order of m times, where...
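
One way to sidestep those order-m retrainings is an out-of-bag estimate: each bootstrap replicate leaves out roughly 1/e of the training points, so every point can be scored by the sub-ensemble that never saw it. The sketch below illustrates that idea, assuming scikit-learn; it is not necessarily the exact estimator this paper derives.

```python
# Out-of-bag generalization error estimate for a bagged ensemble; no extra
# retraining beyond the B models of the ensemble itself is needed.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, random_state=0)
rng = np.random.default_rng(0)
n, B = len(y), 50

votes = np.zeros((n, 2))                    # per-point class votes, OOB only
for _ in range(B):
    idx = rng.integers(0, n, size=n)        # bootstrap replicate
    oob = np.setdiff1d(np.arange(n), idx)   # points this model never saw
    model = DecisionTreeClassifier().fit(X[idx], y[idx])
    votes[oob, model.predict(X[oob])] += 1  # vote only on unseen points

covered = votes.sum(axis=1) > 0
oob_error = np.mean(votes[covered].argmax(axis=1) != y[covered])
print(f"out-of-bag error estimate: {oob_error:.3f}")
```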

Journal: IEEE Transactions on Neural Networks 2001
Ramazan Gencay, Min Qi

We study the effectiveness of cross validation, Bayesian regularization, early stopping, and bagging to mitigate overfitting and improve generalization for pricing and hedging derivative securities with daily S&P 500 index call options from January 1988 to December 1993. Our results indicate that Bayesian regularization can generate significantly smaller pricing and delta-hedging errors...
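
Of the four regularizers compared, early stopping is the simplest to sketch: training halts once loss on a held-out validation split stops improving. The example below assumes scikit-learn and replaces the option-pricing data with a synthetic regression problem, so the numbers are purely illustrative.

```python
# Early stopping on a validation split as an overfitting control.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=2000, n_features=5, noise=10.0,
                       random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Training stops when validation score fails to improve for 10 epochs,
# limiting overfitting without an explicit weight penalty.
net = MLPRegressor(hidden_layer_sizes=(32,), early_stopping=True,
                   validation_fraction=0.2, n_iter_no_change=10,
                   max_iter=2000, random_state=0)
net.fit(X_train, y_train)
print("test R^2:", net.score(X_test, y_test))
```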

[Chart: number of search results per year]