Search results for: ridge regression

Number of results: 331,006

2013
NP Slagle

As part of a survey of state-of-the-art kernel approximation algorithms, we present a new sampling algorithm for circulant matrix construction to perform fast kernel matrix inversion in kernel ridge regression, comparing its theoretical and experimental performance with that of multilevel circulant kernel approximation, incomplete Cholesky decomposition, and random features, all recent advances in th...
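
For context, a minimal NumPy sketch of the plain kernel ridge regression solve whose cubic-cost kernel-matrix inversion these approximation schemes are designed to avoid; the RBF kernel, the regularization value, and the toy data are illustrative assumptions, not the paper's circulant construction.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    """Solve (K + lam*I) alpha = y -- the O(n^3) step that approximation schemes target."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    """Predict f(x) = sum_i alpha_i k(x, x_i) for new points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# toy usage
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
alpha = kernel_ridge_fit(X, y)
preds = kernel_ridge_predict(X, alpha, X[:5])
```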

2014
Nina Hofheinz Matthias Frisch

Ridge regression with heteroscedastic marker variances provides an alternative to Bayesian genome-wide prediction methods. Our objectives were to suggest new methods to determine marker-specific shrinkage factors for heteroscedastic ridge regression and to investigate their properties with respect to computational efficiency and accuracy of estimated effects. We analyzed published data sets of ...
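
A minimal sketch of generalized (heteroscedastic) ridge regression with marker-specific shrinkage factors, i.e. solving (X'X + diag(λ))β = X'y; the rule for choosing λ below (residual variance over an assumed per-marker effect variance) is only one common heuristic, not the shrinkage-factor methods proposed in the paper.

```python
import numpy as np

def heteroscedastic_ridge(X, y, lam):
    """
    Generalized ridge: beta = (X'X + diag(lam))^{-1} X'y,
    where lam[j] is a marker-specific shrinkage factor.
    """
    lam = np.asarray(lam, dtype=float)
    return np.linalg.solve(X.T @ X + np.diag(lam), X.T @ y)

# toy usage: markers with larger assumed effect variance get less shrinkage
rng = np.random.default_rng(1)
X = rng.choice([0.0, 1.0, 2.0], size=(100, 50))   # marker genotypes
beta_true = rng.normal(scale=0.3, size=50)
y = X @ beta_true + rng.normal(size=100)
marker_var = np.full(50, 0.3 ** 2)    # assumed per-marker effect variances
lam = 1.0 / marker_var                # heuristic: residual var (here 1) / effect var
beta_hat = heteroscedastic_ridge(X, y, lam)
```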

2018
Michiel Stock Tapio Pahikkala Antti Airola Bernard De Baets Willem Waegeman

Many machine learning problems can be formulated as predicting labels for a pair of objects. Problems of this kind are often referred to as pairwise learning, dyadic prediction or network inference problems. During the last decade, kernel methods have played a dominant role in pairwise learning. They still achieve state-of-the-art predictive performance, but a theoretical analysis of their beha...
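
As an illustration of kernel methods for pairwise learning, a sketch of dyadic kernel ridge regression with a Kronecker-product pairwise kernel; the linear kernels, the complete label matrix, and the toy problem sizes are assumptions for the example, not the setting analyzed in the paper.

```python
import numpy as np

def linear_kernel(A, B):
    return A @ B.T

def kronecker_krr(U, V, Y, lam=1e-1):
    """
    Pairwise (dyadic) kernel ridge regression sketch: the kernel between pairs
    (u_i, v_j) and (u_k, v_l) is Ku[i, k] * Kv[j, l], i.e. the Kronecker
    product of the two object kernels.
    """
    Ku = linear_kernel(U, U)            # n_u x n_u kernel over first objects
    Kv = linear_kernel(V, V)            # n_v x n_v kernel over second objects
    K = np.kron(Ku, Kv)                 # (n_u*n_v) x (n_u*n_v) pairwise kernel
    y = Y.reshape(-1)                   # labels for all pairs, row-major order
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return alpha.reshape(Y.shape)

# toy usage: 6 x 5 dyads with a complete label matrix
rng = np.random.default_rng(2)
U, V = rng.normal(size=(6, 3)), rng.normal(size=(5, 4))
Y = rng.normal(size=(6, 5))
dual_coefs = kronecker_krr(U, V, Y)
```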

2017
Lee H. Dicker Dean P. Foster Daniel Hsu

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We d...
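
Both methods named here can be written as spectral filters on the singular values of the design matrix, which is one way to put them side by side: ridge shrinks every component smoothly, while principal component regression truncates. The sketch below assumes a small dense problem and is not the article's analysis.

```python
import numpy as np

def spectral_filter_regression(X, y, lam=1.0, n_components=None):
    """
    Ridge regression and principal component regression as spectral filters on
    the singular values s of X:
      ridge applies  s / (s^2 + lam)                (smooth shrinkage)
      PCR   applies  1 / s  for the top components, 0 otherwise (truncation)
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    if n_components is None:                                   # ridge
        f = s / (s ** 2 + lam)
    else:                                                      # PCR
        f = np.where(np.arange(len(s)) < n_components, 1.0 / s, 0.0)
    return Vt.T @ (f * (U.T @ y))

rng = np.random.default_rng(3)
X = rng.normal(size=(80, 10))
y = X @ rng.normal(size=10) + rng.normal(size=80)
beta_ridge = spectral_filter_regression(X, y, lam=5.0)
beta_pcr = spectral_filter_regression(X, y, n_components=4)
```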

2014
Paula Parpart Matt Jones Bradley C. Love

Probabilistic inference models (e.g. Bayesian models) are often cast as being rational and at odds with simple heuristic approaches. We show that prominent decision heuristics, take-the-best and tallying, are special cases of Bayesian inference. We developed two Bayesian learning models by extending two popular regularized regression approaches, lasso and ridge regression. The priors ...
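
A minimal sketch of the ridge half of this correspondence: the MAP estimate of a linear model under a Gaussian prior on the coefficients coincides with ridge regression with λ = σ²/τ². The noise and prior variances below are arbitrary illustrative values, not the paper's models.

```python
import numpy as np

def ridge_closed_form(X, y, lam):
    """Standard ridge solution (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def gaussian_map(X, y, sigma2=1.0, tau2=0.5):
    """
    MAP estimate for y ~ N(X beta, sigma2 I) with prior beta ~ N(0, tau2 I).
    Identical to ridge regression with lam = sigma2 / tau2.
    """
    return ridge_closed_form(X, y, sigma2 / tau2)

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 8))
y = X @ rng.normal(size=8) + rng.normal(size=60)
assert np.allclose(gaussian_map(X, y, 1.0, 0.5), ridge_closed_form(X, y, 2.0))
```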

2010
Jinyu Li Yu Tsao Chin-Hui Lee

We propose a parameter shrinkage adaptation framework that estimates models from only a limited set of adaptation data to improve automatic speech recognition accuracy, by regularizing an objective function with a sum of parameter-wise power-q constraints. As a first attempt, we formulate ridge maximum likelihood linear regression (MLLR) and ridge constrained MLLR (CMLLR) with an element-wise ...
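
To illustrate parameter shrinkage for a linear transform estimated from little data, a sketch of a least-squares transform with an L2 penalty that shrinks it toward the identity; this is a generic ridge analogue for intuition only, not the paper's MLLR/CMLLR objective or its power-q constraint.

```python
import numpy as np

def ridge_transform_toward_identity(M, Y, lam=1.0):
    """
    Estimate a square transform A mapping model means M (columns) toward
    adaptation targets Y, shrinking A toward the identity:
        min_A ||Y - A M||_F^2 + lam * ||A - I||_F^2
    Closed form: A = (Y M' + lam I)(M M' + lam I)^{-1}.
    With little adaptation data (few columns), A stays close to I.
    """
    d = M.shape[0]
    return (Y @ M.T + lam * np.eye(d)) @ np.linalg.inv(M @ M.T + lam * np.eye(d))

rng = np.random.default_rng(5)
M = rng.normal(size=(4, 10))                    # 10 mean vectors, 4-dim features
A_true = np.eye(4) + 0.1 * rng.normal(size=(4, 4))
Y = A_true @ M + 0.05 * rng.normal(size=(4, 10))
A_hat = ridge_transform_toward_identity(M, Y, lam=5.0)
```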

2006
Erika Chin

We built 3-D and 1-D lookup tables (LUTs) to transform a user's desired device-independent colors (CIELab) to the device-dependent color space (RGB). We considered adaptive neighborhood and estimation methods for building the 3-D and 1-D LUTs and evaluated them experimentally. Methods of finding neighborhoods include: smallest enclosing neighborhood (SEN), smallest enclosing inclusive neighborhood (SENR), natural ...
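
A hedged sketch of how one 3-D LUT node could be estimated by a ridge-regularized local affine fit from CIELab to RGB; the k-nearest-neighbor rule stands in for the neighborhood methods named above (SEN, SENR, ...), and the synthetic patch data are purely illustrative.

```python
import numpy as np

def local_ridge_lut_node(lab_node, lab_meas, rgb_meas, k=20, lam=1e-2):
    """
    Estimate one 3-D LUT entry (CIELab -> RGB) by a ridge-regularized local
    affine fit over the k nearest measured patches.  Plain k-NN replaces the
    paper's neighborhood-selection rules for this illustration.
    """
    idx = np.argsort(((lab_meas - lab_node) ** 2).sum(axis=1))[:k]
    X = np.hstack([lab_meas[idx], np.ones((k, 1))])        # affine design matrix
    W = np.linalg.solve(X.T @ X + lam * np.eye(4), X.T @ rgb_meas[idx])
    return np.clip(np.append(lab_node, 1.0) @ W, 0.0, 1.0)

rng = np.random.default_rng(6)
rgb_meas = rng.uniform(size=(500, 3))                      # printed patch RGB values
lab_meas = rgb_meas @ rng.uniform(0.5, 1.5, (3, 3))        # stand-in for measured CIELab
node = lab_meas.mean(axis=0)                               # one grid node of the LUT
print(local_ridge_lut_node(node, lab_meas, rgb_meas))
```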

Journal: Pattern Recognition Letters, 2011
Sujeevan Aseervatham Anestis Antoniadis Éric Gaussier Michel Burlet Yves Denneulin

Ridge logistic regression has been used successfully in text categorization problems and has been shown to match the performance of the Support Vector Machine, with the main advantage of computing a probability value rather than a score. However, the dense solution of the ridge makes it impractical for large-scale categorization. On the other hand, LASSO regularization is ab...
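
For a side-by-side view of the two penalties discussed here, a small scikit-learn sketch contrasting L2- (ridge) and L1- (LASSO) regularized logistic regression on synthetic term-count data; the estimator, solver choices, and data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
X = rng.poisson(0.3, size=(400, 1000)).astype(float)   # stand-in for term-count features
w_true = np.zeros(1000)
w_true[:20] = 1.0                                       # only a few informative terms
y = (X @ w_true + rng.normal(size=400) > 6).astype(int)

# L2 (ridge) penalty: every feature keeps a non-zero weight -> dense model
ridge_lr = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)

# L1 (LASSO) penalty: most weights are driven exactly to zero -> compact model
lasso_lr = LogisticRegression(penalty="l1", C=1.0, solver="liblinear").fit(X, y)

print("non-zero weights, ridge:", np.count_nonzero(ridge_lr.coef_))
print("non-zero weights, lasso:", np.count_nonzero(lasso_lr.coef_))
```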
