A Unified Bias-Variance Decomposition and its Applications
Author
Abstract
This paper presents a unified bias-variance decomposition that is applicable to squared loss, zero-one loss, variable misclassification costs, and other loss functions. The unified decomposition sheds light on a number of significant issues: the relation between some of the previously-proposed decompositions for zero-one loss and the original one for squared loss, the relation between bias, variance and Schapire et al.'s (1997) notion of margin, and the nature of the trade-off between bias and variance in classification. While the bias-variance behavior of zero-one loss and variable misclassification costs is quite different from that of squared loss, this difference derives directly from the different definitions of loss. We have applied the proposed decomposition to decision tree learning, instance-based learning and boosting on a large suite of benchmark data sets, and made several significant observations.
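A sketch of the general form such a unified decomposition can take may help fix ideas. The notation below (optimal prediction y_*, main prediction y_m, and loss-dependent factors c_1, c_2) is assumed here for illustration and is not quoted from the paper:

\begin{aligned}
N(x) &= \mathbb{E}_t\bigl[L(t, y_*)\bigr] && \text{(noise)}\\
B(x) &= L(y_*, y_m) && \text{(bias)}\\
V(x) &= \mathbb{E}_D\bigl[L(y_m, y)\bigr] && \text{(variance)}\\
\mathbb{E}_{D,t}\bigl[L(t, y)\bigr] &= c_1\,N(x) + B(x) + c_2\,V(x)
\end{aligned}

Under this form, squared loss corresponds to c_1 = c_2 = 1, recovering the familiar additive decomposition, while for zero-one loss the factors depend on the example, which is one way the different bias-variance behavior noted in the abstract can arise.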
Related resources
A Unified Bias-Variance Decomposition for Zero-One and Squared Loss
The bias-variance decomposition is a very useful and widely-used tool for understanding machine-learning algorithms. It was originally developed for squared loss. In recent years, several authors have proposed decompositions for zero-one loss, but each has significant shortcomings. In particular, all of these decompositions have only an intuitive relationship to the original squared-loss one. I...
Evaluating Quasi-Monte Carlo (QMC) algorithms in blocks decomposition of de-trended
The length of the equal minimal and maximal blocks has an effect, on a logarithmic scale, on the variance and bias of de-trended fluctuation analysis. By using Quasi-Monte Carlo (QMC) simulation and Cholesky decomposition, the minimal and maximal block pair is found that minimizes the summation of mean squared error in the Hurst exponent.
Bias-Variance Analysis of Support Vector Machines for the Development of SVM-Based Ensemble Methods
Bias-variance analysis provides a tool to study learning algorithms and can be used to properly design ensemble methods well tuned to the properties of a specific base learner. Indeed the effectiveness of ensemble methods critically depends on accuracy, diversity and learning characteristics of base learners. We present an extended experimental analysis of bias-variance decomposition of the err...
Bias Plus Variance Decomposition for Zero-One Loss Functions
We present a bias-variance decomposition of expected misclassification rate, the most commonly used loss function in supervised classification learning. The bias-variance decomposition for quadratic loss functions is well known and serves as an important tool for analyzing learning algorithms, yet no decomposition was offered for the more commonly used zero-one misclassification loss functions until t...
Ensemble Methods Based on Bias–variance Analysis
Ensembles of classifiers represent one of the main research directions in machine learning. Two main theories are invoked to explain the success of ensemble methods. The first one considers ensembles in the framework of large margin classifiers, showing that ensembles enlarge the margins and thereby enhance the generalization capabilities of learning algorithms. The second is based on the classical b...
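The abstract above mentions applying the decomposition experimentally to decision tree learning, instance-based learning and boosting. The sketch below shows one common way bias and variance are estimated empirically under zero-one loss; it assumes scikit-learn and NumPy, bootstrap resampling, a majority-vote main prediction, and noise-free labels, so it illustrates the general procedure rather than the exact experimental protocol of the papers listed here.

# Minimal sketch: empirical bias/variance estimation under zero-one loss.
# Assumptions (illustrative, not taken from the papers above): models are
# trained on bootstrap resamples, the "main prediction" is the majority
# vote over resamples, and the labels are treated as noise-free.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_rounds = 50
rng = np.random.default_rng(0)
preds = np.empty((n_rounds, len(y_te)), dtype=int)

for r in range(n_rounds):
    idx = rng.integers(0, len(y_tr), size=len(y_tr))       # bootstrap resample
    tree = DecisionTreeClassifier(random_state=r).fit(X_tr[idx], y_tr[idx])
    preds[r] = tree.predict(X_te)

# Majority vote over rounds gives the main prediction for each test point.
main_pred = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)

bias = np.mean(main_pred != y_te)       # zero-one bias: main prediction is wrong
variance = np.mean(preds != main_pred)  # disagreement with the main prediction
avg_loss = np.mean(preds != y_te)       # average zero-one loss over rounds

# Note: under zero-one loss, avg_loss is generally NOT bias + variance;
# the unified decomposition weights the terms by example-dependent factors.
print(f"bias={bias:.3f}  variance={variance:.3f}  average loss={avg_loss:.3f}")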
Publication date: 2000