Search results for: ridge regression
Number of results: 331,006
This paper deals with ridge estimation of fuzzy nonparametric regression models using triangular fuzzy numbers. The estimation method is obtained by implementing a ridge regression learning algorithm in the Lagrangian dual space. The distance measure for fuzzy numbers suggested by Diamond is used, along with the local linear smoothing technique and a cross-validation procedure for selecting the o...
Computational efficiency is important for learning algorithms operating in the "large p, small n" setting. In computational biology, the analysis of data sets containing tens of thousands of features ("large p"), but only a few hundred samples ("small n"), is nowadays routine, and regularized regression approaches such as ridge-regression, lasso, and elastic-net are popular choices. In this pap...
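The "large p, small n" setting described above admits a well-known computational shortcut for ridge regression: the p × p normal equations can be rewritten as an n × n system via the identity (XᵀX + αI_p)⁻¹Xᵀ = Xᵀ(XXᵀ + αI_n)⁻¹. A minimal numpy sketch of that trick (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def ridge_large_p(X, y, alpha):
    """Ridge regression via the n x n Gram matrix:
    w = X^T (X X^T + alpha I_n)^{-1} y.
    When p >> n this avoids forming the p x p system entirely."""
    n = X.shape[0]
    return X.T @ np.linalg.solve(X @ X.T + alpha * np.eye(n), y)

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 2000))   # "large p, small n": 2000 features, 30 samples
y = rng.normal(size=30)
w = ridge_large_p(X, y, alpha=1.0)   # solves a 30 x 30 system, not 2000 x 2000
```

The result is identical to the direct p × p solution; only the cost changes (O(n²p + n³) instead of O(p³)).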
We introduce single-set spectral sparsification as a deterministic sampling-based feature selection technique for regularized least-squares classification, which is the classification analog to ridge regression. The method is unsupervised and gives worst-case guarantees of the generalization power of the classification function after feature selection with respect to the classification function...
Tikhonov regularization, or ridge regression, is a popular technique to deal with collinearity in multivariate regression. We unveil a formal analogy between ridge regression and statistical mechanics, where the objective function is comparable to a free energy, and the ridge parameter plays the role of temperature. This analogy suggests two novel criteria for selecting a suitable ridge paramet...
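As a concrete reference point for the ridge estimators these abstracts discuss: the standard ridge solution has the closed form w = (XᵀX + αI)⁻¹Xᵀy, where the ridge parameter α (the "temperature" in the analogy above) controls how strongly coefficients are shrunk. A minimal numpy sketch on synthetic collinear data (all names illustrative):

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression: w = (X^T X + alpha I)^{-1} X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy data with two nearly collinear features, the case ridge is built for.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=50)   # near-duplicate column
y = X[:, 0] + 0.1 * rng.normal(size=50)

w_ols = ridge_fit(X, y, alpha=0.0)     # ordinary least squares (alpha = 0)
w_ridge = ridge_fit(X, y, alpha=1.0)   # shrunk, better-conditioned solution
```

Raising α always reduces the norm of the solution, which is exactly the stabilizing effect exploited under collinearity.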
We apply a general algorithm for merging prediction strategies (the Aggregating Algorithm) to the problem of linear regression with the square loss; our main assumption is that the response variable is bounded. It turns out that for this particular problem the Aggregating Algorithm resembles, but is slightly different from, the well-known ridge estimation procedure. From general results about th...
In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be due to the correlation structure among the predictor variables or the coefficient matrix itself being of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced...
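The idea sketched in the abstract above, combining a ridge penalty with a rank constraint, can be illustrated roughly as follows: fit a ridge solution for all responses jointly, then project the fitted values onto their leading singular directions. This is only a schematic numpy illustration of the combination, not necessarily the paper's exact estimator:

```python
import numpy as np

def reduced_rank_ridge(X, Y, alpha, rank):
    """Sketch: ridge-penalized multivariate fit, then rank truncation via SVD.
    Illustrates combining the ridge and reduced-rank ideas; not claimed to be
    the estimator of the paper above."""
    p = X.shape[1]
    # Joint ridge solution for all response columns (full rank).
    B_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ Y)
    # Project fitted values onto their top `rank` right-singular directions.
    _, _, Vt = np.linalg.svd(X @ B_ridge, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # rank-`rank` projector in response space
    return B_ridge @ P            # coefficient matrix of rank <= `rank`

# Synthetic data whose true coefficient matrix is rank 1.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
B_true = np.outer(rng.normal(size=5), rng.normal(size=3))
Y = X @ B_true + 0.1 * rng.normal(size=(100, 3))
B_hat = reduced_rank_ridge(X, Y, alpha=1.0, rank=1)
```

The returned coefficient matrix has rank at most `rank`, while the ridge penalty keeps the underlying fit well-conditioned.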
In this paper, some new methods which have very recently been introduced for parameter estimation and variable selection in regression models are reviewed. Furthermore, we simulate several models in order to evaluate the performance of these methods under different situations. Finally, we compare the performance of these methods with that of the regular traditional variable selection methods such ...
Regularized regression methods for linear regression have been developed the last few decades to overcome the flaws of ordinary least squares regression with regard to prediction accuracy. In this chapter, three of these methods (Ridge regression, the Lasso, and the Elastic Net) are incorporated into CATREG, an optimal scaling method for both linear and nonlinear transformation of variables in ...
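For readers comparing the three penalties named above: ridge shrinks all coefficients, the lasso's L1 penalty sets some exactly to zero via soft-thresholding, and the elastic net mixes the two. A minimal coordinate-descent sketch of the lasso in numpy (purely illustrative; this is not the CATREG implementation):

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1.
    Ridge would replace soft-thresholding with plain multiplicative
    shrinkage; the elastic net applies both."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual without j
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Sparse ground truth: only feature 0 matters.
rng = np.random.default_rng(2)
X = rng.normal(size=(80, 10))
y = X[:, 0] * 3.0 + 0.1 * rng.normal(size=80)
w = lasso_cd(X, y, lam=20.0)
print(np.nonzero(np.abs(w) > 1e-8)[0])   # surviving features (sparse)
```

The soft-thresholding step is what produces exact zeros, i.e. automatic variable selection, which plain ridge never does.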
In 1996 an Introduction to Radial Basis Function Networks was published on the web along with a package of Matlab functions. The emphasis was on the linear character of RBF networks and two techniques borrowed from statistics: forward selection and ridge regression. This document is an update on developments between 1996 and 1999 and is associated with a second version of the Matlab packa...
An obvious Bayesian nonparametric generalization of ridge regression assumes that coefficients are exchangeable, from a prior distribution of unknown form, which is given a Dirichlet process prior with a normal base measure. The purpose of this paper is to explore predictive performance of this generalization, which does not seem to have received any detailed attention, despite related applicat...