Search results for: partial linear model preliminary test lasso
Number of results: 3,367,252
Regularized regression methods for linear regression have been developed over the last few decades to overcome the flaws of ordinary least squares regression with regard to prediction accuracy. In this chapter, three of these methods (Ridge regression, the Lasso, and the Elastic Net) are incorporated into CATREG, an optimal scaling method for both linear and nonlinear transformation of variables in ...
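The three penalties named above can be illustrated outside the CATREG optimal-scaling setting. Below is a minimal sketch, assuming a plain linear regression on simulated data with scikit-learn; the penalty strengths are arbitrary illustrative values, not the chapter's settings.

```python
# Minimal sketch: Ridge, Lasso, and Elastic Net on an ordinary linear
# regression problem (not the CATREG optimal-scaling setting itself).
# Penalty strengths (alpha, l1_ratio) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]          # sparse true signal
y = X @ beta + rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.1),
    "elastic net": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    nonzero = np.sum(model.coef_ != 0)
    print(f"{name:12s}  test MSE = {mse:5.2f}  nonzero coefs = {nonzero}")
```

The print-out makes the qualitative difference visible: Ridge keeps all coefficients, while the Lasso and the Elastic Net zero many of them out.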
Robust conversion between microarray platforms is needed to leverage the wide variety of microarray expression studies that have been conducted to date. Currently available conversion methods rely on manufacturer annotations, which are often incomplete, or on direct alignment of probes from different platforms, which often fails to yield acceptable genewise correlation. Here, we descr...
Nowadays, in trade and economic issues, prediction is regarded as one of the most important branches of science. The existence of influential variables has led executives in various economic and business sectors to prefer mechanisms that can be used in their decisions. In recent years, several advances have led to various challenges in the science of forecasting. Economic managers in various fi...
Lasso is a popular method for variable selection in regression. Much theoretical understanding has been obtained recently on its model selection or sparsity recovery properties under sparse and homoscedastic linear regression models. Since these standard model assumptions are often not met in practice, it is important to understand how Lasso behaves under nonstandard model assumptions. In this ...
We devise a communication-efficient approach to distributed sparse regression in the high-dimensional setting. The key idea is to average “debiased” or “desparsified” lasso estimators. We show the approach converges at the same rate as the lasso as long as the dataset is not split across too many machines, and consistently estimates the support under weaker conditions than the lasso. On the comp...
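A minimal sketch of the averaging idea, assuming simulated data split across a few machines. The approximate inverse Gram matrix used for debiasing is a simple ridge-regularized inverse here, which may differ from the paper's construction (e.g., nodewise lasso); sample sizes and penalties are illustrative.

```python
# Sketch of averaging "debiased" lasso estimators across machines.
# Theta is a ridge-regularized approximate precision matrix; the paper's
# construction may differ -- this only illustrates the averaging step.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
p, n_per_machine, n_machines = 100, 200, 5
beta = np.zeros(p); beta[:5] = 2.0

def debiased_lasso(X, y, alpha=0.1, ridge=0.5):
    n = X.shape[0]
    b = Lasso(alpha=alpha).fit(X, y).coef_
    Sigma = X.T @ X / n
    Theta = np.linalg.inv(Sigma + ridge * np.eye(p))   # approximate precision
    # one-step bias correction: b + Theta X^T (y - X b) / n
    return b + Theta @ X.T @ (y - X @ b) / n

estimates = []
for _ in range(n_machines):
    X = rng.standard_normal((n_per_machine, p))
    y = X @ beta + rng.standard_normal(n_per_machine)
    estimates.append(debiased_lasso(X, y))

beta_avg = np.mean(estimates, axis=0)          # one round of communication
print("estimation error:", np.linalg.norm(beta_avg - beta))
```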
I briefly report on some unexpected results that I obtained when optimizing the model parameters of the Lasso. In simulations with varying observations-to-variables ratio n/p, I typically observe a strong peak in the test error curve at the transition point n/p = 1. This peaking phenomenon is well-documented in scenarios that involve the inversion of the sample covariance matrix, and as I illus...
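The n/p = 1 peak tied to inverting the sample covariance matrix can be reproduced with a short simulation. The sketch below uses minimum-norm least squares, where the peak is classical; whether and how it appears for the tuned Lasso is the question the note raises, and all sizes and noise levels here are illustrative assumptions.

```python
# Sketch of the n/p = 1 peaking phenomenon using minimum-norm least squares
# (np.linalg.lstsq); the note observes a similar peak when tuning the Lasso.
import numpy as np

rng = np.random.default_rng(2)
p, n_test, reps = 40, 500, 50
beta = rng.standard_normal(p)

for n in [10, 20, 30, 38, 40, 42, 60, 120, 400]:
    errs = []
    for _ in range(reps):
        X = rng.standard_normal((n, p))
        y = X @ beta + rng.standard_normal(n)
        b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # min-norm solution
        X_te = rng.standard_normal((n_test, p))
        y_te = X_te @ beta + rng.standard_normal(n_test)
        errs.append(np.mean((X_te @ b_hat - y_te) ** 2))
    print(f"n/p = {n/p:5.2f}   mean test MSE = {np.mean(errs):8.2f}")
```

The printed test error rises sharply as n/p approaches 1 and falls again on either side of the transition point.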
Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding procedure based on the Lasso – we call it the Thresholded Lasso – can accurately estimate a sparse vector β ∈ ℝ^p in a linear model Y = Xβ + ε, where X_{n×p} is a design matrix normalized to have column ℓ2 norm √n, and ε ∼ N(0, σ²I_n). We show that under the restricted eigenvalue (RE) condition (Bickel-Rito...
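A minimal sketch of a multi-step thresholding procedure of this kind: fit the Lasso, zero out small coefficients, then refit least squares on the retained support. The penalty level and threshold below are illustrative choices and need not match the paper's procedure.

```python
# Sketch of a multi-step "Thresholded Lasso": fit the Lasso, zero out small
# coefficients, then refit ordinary least squares on the retained support.
# alpha and the threshold are illustrative, not the paper's choices.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(3)
n, p, s = 100, 400, 5                      # n << p, s-sparse beta
beta = np.zeros(p); beta[:s] = 3.0
X = rng.standard_normal((n, p))
X *= np.sqrt(n) / np.linalg.norm(X, axis=0)   # columns scaled to l2 norm sqrt(n)
y = X @ beta + rng.standard_normal(n)

lasso = Lasso(alpha=0.2, fit_intercept=False).fit(X, y)
support = np.flatnonzero(np.abs(lasso.coef_) > 0.5)   # threshold small coefs

beta_hat = np.zeros(p)
if support.size:
    ols = LinearRegression(fit_intercept=False).fit(X[:, support], y)
    beta_hat[support] = ols.coef_

print("selected support:", support)
print("l2 estimation error:", np.linalg.norm(beta_hat - beta))
```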
Background & Aim: One of the most important and useful models for assessing hospital performance is the Pabon Lasso Model, a graphical model that determines the relative performance of hospitals using three indicators: 1. Bed Occupancy Rate (BOR); 2. Bed Turnover (BTO); 3. Average Length of Stay (ALS). The aim of this research is to investigate the performance of the hospitals affiliated with Te...
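For reference, the three indicators can be computed directly from routine hospital statistics. The sketch below uses the standard definitions with made-up numbers; only the formulas are meant to be informative.

```python
# Sketch: computing the three Pabon Lasso indicators for hypothetical
# hospital data. Formulas are the standard definitions; the numbers are
# made up for illustration only.
beds, patient_days, discharges, period_days = 120, 32850, 5400, 365

bor = patient_days / (beds * period_days) * 100   # Bed Occupancy Rate (%)
bto = discharges / beds                            # Bed Turnover (per bed)
als = patient_days / discharges                    # Average Length of Stay (days)

print(f"BOR = {bor:.1f}%   BTO = {bto:.1f}   ALS = {als:.1f} days")
# In the Pabon Lasso chart, hospitals are plotted by BTO against BOR and the
# plane is split at the mean BOR and mean BTO into four performance zones.
```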
Estimation in generalized linear models (GLM) is complicated by the presence of constraints. One can handle constraints by maximizing a penalized log-likelihood. Penalties such as the lasso are effective in high dimensions, but often lead to unwanted shrinkage. This paper explores instead penalizing the squared distance to constraint sets. Distance penalties are more flexible than algebraic and...
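A minimal sketch of a squared-distance penalty, assuming a logistic regression constrained to nonnegative coefficients; the penalty weight and the generic BFGS optimizer are illustrative choices, not the paper's algorithm.

```python
# Sketch: logistic regression with a squared-distance penalty to a constraint
# set (nonnegative coefficients). dist(b, C)^2 for the nonnegative orthant is
# sum(min(b_i, 0)^2). The penalty weight rho and scipy's BFGS are
# illustrative choices, not the paper's algorithm.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n, p = 300, 10
X = rng.standard_normal((n, p))
beta_true = np.abs(rng.standard_normal(p))          # truth satisfies b >= 0
y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)

def objective(b, rho=50.0):
    eta = X @ b
    # negative log-likelihood of logistic regression
    nll = np.sum(np.logaddexp(0.0, eta) - y * eta)
    dist_sq = np.sum(np.minimum(b, 0.0) ** 2)       # squared distance to C
    return nll + 0.5 * rho * dist_sq

res = minimize(objective, np.zeros(p), method="BFGS")
print("min coefficient:", res.x.min())              # near-feasible solution
```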
We study high-dimensional generalized linear models and empirical risk minimization using the Lasso. An oracle inequality is presented under a so-called compatibility condition. Our aim is threefold: to prove a result announced in van de Geer (2007), to provide a simple proof with simple constants, and to separate the stochastic problem from the deterministic one.