Search results for: ridge regression

Number of results: 331006

2006
Ling Wang Liefeng Bo Licheng Jiao

Based on the feature map principle, a Sparse Kernel Ridge Regression (SKRR) model is proposed. SKRR obtains sparseness through a backward-deletion feature selection procedure that recursively removes the feature with the smallest leave-one-out score until the stopping criterion is satisfied. Besides good generalization performance, the most compelling property of SKRR is that it is rather sparse, and moreover, the...
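The abstract gives only the outline of the procedure; a rough sketch of the general idea (kernel ridge regression whose basis functions are pruned by backward deletion guided by a leave-one-out score) might look as follows. The RBF feature map, the regularization strength `lam`, the stopping rule `keep`, and the exact scoring rule are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def rbf_features(X, centers, gamma):
    """Kernel feature map: one RBF basis function per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_press(Phi, y, lam):
    """PRESS (leave-one-out squared error) of ridge regression on features Phi."""
    n, m = Phi.shape
    A = Phi.T @ Phi + lam * np.eye(m)
    H = Phi @ np.linalg.solve(A, Phi.T)          # ridge hat matrix
    resid = y - H @ y
    return np.sum((resid / (1.0 - np.diag(H))) ** 2)

def backward_sparse_krr(X, y, gamma=1.0, lam=1e-2, keep=10):
    """Greedy backward deletion of kernel basis functions guided by the LOO score."""
    active = list(range(X.shape[0]))             # start with one basis per training point
    while len(active) > keep:
        scores = []
        for j in active:                         # score = PRESS after removing basis j
            trial = [i for i in active if i != j]
            scores.append(loo_press(rbf_features(X, X[trial], gamma), y, lam))
        active.pop(int(np.argmin(scores)))       # drop the basis whose removal hurts least
    Phi = rbf_features(X, X[active], gamma)
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(active)), Phi.T @ y)
    return X[active], w

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
centers, w = backward_sparse_krr(X, y)
print(len(centers), "basis functions kept")
```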

Journal: Technometrics, 2000
Arthur E. Hoerl Robert W. Kennard

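This is the classic Hoerl and Kennard paper on ridge regression; their estimator stabilizes least squares by adding a positive constant k to the diagonal of X'X before inversion. A minimal numpy sketch of that closed form (the data and the value of k are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
beta_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta_true + 0.5 * rng.standard_normal(100)

k = 1.0                                          # ridge constant (illustrative choice)
p = X.shape[1]
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
beta_ols   = np.linalg.solve(X.T @ X, X.T @ y)   # k = 0 recovers ordinary least squares
print(beta_ridge, beta_ols, sep="\n")
```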

2010
Marcin Budka Bogdan Gabrys

Traditional methods of assessing the chemical toxicity of various compounds require tests on animals, which raises ethical concerns and is expensive. Current legislation may lead to a further increase in demand for laboratory animals in the coming years. As a result, automatically generated predictions using Quantitative Structure–Activity Relationship (QSAR) modelling approaches appear as an attract...

2014
Vedide Rezan Uslu Erol Egrioglu Eren Bas

A multiple regression model rests on a set of standard assumptions. If the data do not satisfy these assumptions, problems arise that have serious undesired effects on the parameter estimates. One of these problems is multicollinearity, which means that there is a nearly perfect linear relationship between the explanatory variables used in a multiple regression model. This undesirable prob...
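As a small illustration of the effect described above (not taken from the paper), the following sketch generates two nearly collinear predictors and compares how ordinary least squares and a ridge-type estimator behave over repeated samples:

```python
import numpy as np

rng = np.random.default_rng(1)

def one_fit(lam):
    x1 = rng.standard_normal(50)
    x2 = x1 + 0.01 * rng.standard_normal(50)     # nearly perfect linear relationship
    X = np.column_stack([x1, x2])
    y = 2.0 * x1 + 1.0 * x2 + rng.standard_normal(50)
    return np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

ols   = np.array([one_fit(0.0) for _ in range(500)])
ridge = np.array([one_fit(1.0) for _ in range(500)])
print("OLS   coefficient std devs:", ols.std(axis=0))    # large: estimates are unstable
print("ridge coefficient std devs:", ridge.std(axis=0))  # much smaller, at the cost of bias
```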

Journal: Technometrics, 2011
Ricardo A. Maronna

Ridge regression, being based on the minimization of a quadratic loss function, is sensitive to outliers. Current proposals for robust ridge regression estimators are sensitive to “bad leverage observations”, cannot be employed when the number of predictors p is larger than the number of observations n, and have low robustness when the ratio p/n is large. In this paper a ridge regression esti...
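The snippet does not reveal the estimator proposed in the paper; as a generic illustration of why the quadratic loss is the weak point, the sketch below swaps it for a Huber loss inside a ridge-penalized fit via iteratively reweighted least squares. This is a standard robustification device only, not the paper's estimator, and it does not address the bad-leverage or p > n issues mentioned above.

```python
import numpy as np

def huber_ridge(X, y, lam=1.0, delta=1.345, n_iter=50):
    """Ridge-penalized regression with a Huber loss, fitted by
    iteratively reweighted least squares (IRLS)."""
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)   # plain ridge start
    for _ in range(n_iter):
        r = np.maximum(np.abs(y - X @ beta), 1e-12)
        scale = np.median(r) / 0.6745                            # robust scale (MAD)
        w = np.clip(delta * scale / r, None, 1.0)                # Huber weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + lam * np.eye(p), X.T @ W @ y)
    return beta

rng = np.random.default_rng(2)
X = rng.standard_normal((80, 3))
y = X @ np.array([1.0, 2.0, -1.0]) + 0.3 * rng.standard_normal(80)
y[:5] += 20.0                                                    # gross outliers in y
print("plain ridge:", np.linalg.solve(X.T @ X + np.eye(3), X.T @ y))
print("Huber ridge:", huber_ridge(X, y))
```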

2012
Haimao Zhan Shizhong Xu

It is widely believed that both common and rare variants contribute to the risks of common diseases or complex traits and the cumulative effects of multiple rare variants can explain a significant proportion of trait variances. Advances in high-throughput DNA sequencing technologies allow us to genotype rare causal variants and investigate the effects of such rare variants on complex traits. We...

2015
Xi Peng Zhang Yi Huajin Tang

In this material, we provide the theoretical analyses to show that the trivial coefficients always correspond to the codes over errors. Lemmas 1–3 show that our errors-removing strategy will perform well when the ℓp-norm is enforced over the representation, where p ∈ {1, 2, ∞}. Let x ≠ 0 be a data point in the union of subspaces S_D that is spanned by D = [D_x D_−x], where D_x and D_−x consist of th...

Journal: Journal of Machine Learning Research, 2010
Zhihua Zhang Guang Dai Congfu Xu Michael I. Jordan

Fisher linear discriminant analysis (FDA) and its kernel extension—kernel discriminant analysis (KDA)—are well known methods that consider dimensionality reduction and classification jointly. While widely deployed in practical problems, there are still unresolved issues surrounding their efficient implementation and their relationship with least mean squares procedures. In this paper we address...
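One classical piece of the least-squares connection mentioned here: for two classes, the Fisher discriminant direction S_W^{-1}(m1 − m2) is proportional to the coefficient vector obtained by least-squares regression of coded class labels on the inputs. A small numerical check on illustrative data (this is the textbook identity, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(3)
X1 = rng.multivariate_normal([0, 0, 0], np.diag([1.0, 2.0, 0.5]), size=120)
X2 = rng.multivariate_normal([2, 1, -1], np.diag([1.0, 2.0, 0.5]), size=80)
X = np.vstack([X1, X2])
t = np.concatenate([np.ones(len(X1)), -np.ones(len(X2))])    # ±1 label coding

# Fisher direction: within-class scatter inverse times the mean difference
m1, m2 = X1.mean(0), X2.mean(0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_fda = np.linalg.solve(Sw, m1 - m2)

# Least squares on the coded labels (with an intercept column)
A = np.column_stack([X, np.ones(len(X))])
w_ls = np.linalg.lstsq(A, t, rcond=None)[0][:3]

cos = w_fda @ w_ls / (np.linalg.norm(w_fda) * np.linalg.norm(w_ls))
print("cosine between directions:", cos)   # ≈ +1: the two directions coincide
```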

Journal: IACR Cryptology ePrint Archive, 2017
Irene Giacomelli Somesh Jha C. David Page Kyonghwan Yoon

Linear regression is an important statistical tool that models the relationship between some explanatory values and an outcome value using a linear function. In many current applications (e.g. predictive modelling in personalized healthcare), these values represent sensitive data owned by several different parties that are unwilling to share them. In this setting, training a linear regression m...
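The cryptographic machinery is beyond a short sketch, but the algebraic fact that makes multi-party training of a linear model plausible is simple: when the records are split horizontally across parties, the normal-equation statistics X'X and X'y are sums of per-party contributions, so only those aggregates need to be combined. The three-party split and plain solver below are illustrative assumptions, not the protocol from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((90, 4))
y = X @ np.array([0.5, -1.0, 2.0, 0.0]) + 0.1 * rng.standard_normal(90)

# Three parties, each holding a horizontal slice of the records.
parts = np.array_split(np.arange(90), 3)

# Each party computes only its local aggregates X_j'X_j and X_j'y_j.
XtX = sum(X[idx].T @ X[idx] for idx in parts)
Xty = sum(X[idx].T @ y[idx] for idx in parts)

beta_joint = np.linalg.solve(XtX, Xty)                 # model from aggregates only
beta_full  = np.linalg.solve(X.T @ X, X.T @ y)         # model from the pooled raw data
print(np.allclose(beta_joint, beta_full))              # True: identical estimates
```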

2013
Yuchen Zhang John C. Duchi Martin J. Wainwright

We study a decomposition-based scalable approach to performing kernel ridge regression. The method is simple to describe: it randomly partitions a dataset of size N into m subsets of equal size, computes an independent kernel ridge regression estimator for each subset, then averages the local solutions into a global predictor. This partitioning leads to a substantial reduction in computation ti...
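A minimal numpy sketch of the partition-and-average scheme described here (the RBF kernel, the fixed regularization parameter, and the toy data are assumptions; the paper's analysis concerns when this averaging matches the full-data estimator):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    """Standard kernel ridge regression on one subset: alpha = (K + lam*n*I)^{-1} y."""
    K = rbf(X, X, gamma)
    alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)
    return lambda Xq: rbf(Xq, X, gamma) @ alpha

rng = np.random.default_rng(5)
N, m = 3000, 10
X = rng.uniform(-3, 3, size=(N, 1))
y = np.sin(2 * X[:, 0]) + 0.2 * rng.standard_normal(N)

# Randomly partition the N points into m subsets of (roughly) equal size,
# fit an independent KRR estimator on each, and average the local predictions.
perm = rng.permutation(N)
local = [krr_fit(X[idx], y[idx]) for idx in np.array_split(perm, m)]
Xq = np.linspace(-3, 3, 200)[:, None]
f_avg = np.mean([f(Xq) for f in local], axis=0)
print(np.abs(f_avg - np.sin(2 * Xq[:, 0])).mean())     # small average error
```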

Chart of the number of search results per year

Click on the chart to filter the results by publication year