Search results for: forward selection

Number of results: 430598

2011
Jan Ulbricht Gerhard Tutz

Quadratic penalties can be used to incorporate external knowledge about the association structure among regressors. Unfortunately, they do not force individual estimated regression coefficients to be exactly zero. In this paper we propose a new approach that combines quadratic penalization and variable selection within the framework of generalized linear models. The new method is called Forward Boosting ...

2016
Shikai Luo Subhashis Ghosal

We propose a new variable selection and estimation technique for high dimensional single index models with unknown monotone smooth link function. Among many predictors, typically, only a small fraction of them have significant impact on prediction. In such a situation, more interpretable models with better prediction accuracy can be obtained by variable selection. In this article, we propose a ...

2009
Sheng CHEN

The objective of modelling from data is not that the model simply fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease of knowledge extraction. All these desired properties depend crucially on the ability to construct appropriate parsimonious models by the modelling process, and a basic principle in practical ...

2013
Robert Dürichen Tobias Wissel Floris Ernst Achim Schweikard

In robotic radiotherapy, systematic latencies have to be compensated by prediction of external optical surrogates. We investigate possibilities to increase the prediction accuracy using multi-modal sensors. The measurement setup includes position, acceleration, strain and flow sensors. To select the most relevant and least redundant information from the sensors and to limit the size of the feat...

2013
Frank Hutter Holger H. Hoos Kevin Leyton-Brown

Most state-of-the-art algorithms for large scale optimization expose free parameters, giving rise to combinatorial spaces of possible configurations. Typically, these spaces are hard for humans to understand. In this work, we study a model-based approach for identifying a small set of both algorithm parameters and instance features that suffices for predicting empirical algorithm perf...

2015
P. J. Lingras

Attribute reduction of an information system is a key problem in rough set theory and its applications. Rough set theory has been one of the most successful methods used for feature selection and is among the most useful data mining techniques. This paper proposes a relative reduct to solve the attribute reduction problem in rough set theory. It is the most promising technique in the Rough...

2006
Vincent Lemaire Raphaël Féraud

In the field of neural networks, feature selection has been studied for the last ten years, and classical as well as original methods have been employed. This paper reviews the efficiency of four approaches to performing a driven forward feature selection on neural networks. We assess the efficiency of these methods compared to the simple Pearson criterion in the case of a regression problem.
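The Pearson criterion mentioned in this abstract is a simple filter-style baseline: rank each feature by the absolute value of its correlation with the target before any model-based forward selection. A minimal sketch, using a toy dataset and hypothetical helper names (`pearson`, `rank_features`) that are not taken from the paper:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rank_features(columns, y):
    """Rank feature names by |Pearson correlation| with the target.

    columns: dict mapping feature name -> list of values.
    """
    scores = {name: abs(pearson(col, y)) for name, col in columns.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: f1 is perfectly correlated with y, f2 only weakly.
X = {"f1": [1, 2, 3, 4], "f2": [4, 1, 3, 2]}
y = [2, 4, 6, 8]
ranking = rank_features(X, y)  # ["f1", "f2"]
```

A wrapper-style forward selection would instead retrain the network at each step and keep the feature that most improves validation error; the filter ranking above is the cheap baseline the abstract compares against.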

2003
Matthias W. Seeger Christopher K. I. Williams Neil D. Lawrence

We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the “support” patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of...

Journal: Pattern Recognition, 2014
Matthias Reif Faisal Shafait

Most of the widely used pattern classification algorithms, such as Support Vector Machines (SVM), are sensitive to the presence of irrelevant or redundant features in the training data. Automatic feature selection algorithms aim at selecting a subset of features present in a given dataset so that the achieved accuracy of the following classifier can be maximized. Feature selection algorithms ar...

Chart of the number of search results per year

Click on the chart to filter the results by publication year