Search results for: regression problems
Number of results: 883,874
In recent years, many methods have been developed for regression in high-dimensional settings. We propose covariance-regularized regression, a family of methods that use a shrunken estimate of the inverse covariance matrix of the features in order to achieve superior prediction. An estimate of the inverse covariance matrix is obtained by maximizing its log likelihood, under a multivariate norma...
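The plug-in idea sketched in this abstract can be illustrated in a few lines. The sketch below is an assumption-laden approximation, not the paper's method: it uses scikit-learn's GraphicalLasso as the shrunken (L1-penalized maximum-likelihood) inverse-covariance estimate and plugs it into the usual least-squares formula; the variable names and penalty value are illustrative.

import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy data with relatively few samples per feature.
rng = np.random.default_rng(0)
n, p = 50, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Shrunken inverse covariance via an L1-penalized Gaussian log-likelihood.
Xc, yc = X - X.mean(axis=0), y - y.mean()
theta = GraphicalLasso(alpha=0.1).fit(Xc).precision_

# Plug-in regression coefficients: beta = Theta * X'y / n.
beta_hat = theta @ (Xc.T @ yc) / n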
The problem of computing an approximate solution of an overdetermined system of linear equations is considered. The usual approach to the problem is least squares, in which the 2-norm of the residual is minimized. This produces the minimum variance unbiased estimator of the solution when the errors in the observations are independent and normally distributed with mean 0 and constant variance. I...
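As a small worked example of the least-squares baseline described above (minimizing the 2-norm of the residual of an overdetermined system A x ≈ b), the following uses NumPy; the numbers are made up purely for illustration.

import numpy as np

# Overdetermined system: 4 equations, 2 unknowns (intercept and slope).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Minimize ||A x - b||_2; lstsq returns the minimum-norm least-squares solution.
x, residual_ss, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(x)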
We propose a new nonparametric learning method based on multivariate dyadic regression trees (MDRTs). Unlike traditional dyadic decision trees (DDTs) or classification and regression trees (CARTs), MDRTs are constructed using penalized empirical risk minimization with a novel sparsity-inducing penalty. Theoretically, we show that MDRTs can simultaneously adapt to the unknown sparsity and smooth...
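The paper's multivariate dyadic regression trees and their sparsity-inducing penalty are not reproduced here; the following is only a toy one-dimensional sketch of the underlying dyadic-splitting idea, with a crude additive complexity penalty chosen for illustration.

import numpy as np

def dyadic_fit(x, y, lo=0.0, hi=1.0, penalty=0.5, depth=0, max_depth=6):
    """Return (lo, hi, mean) leaves of a dyadic tree fit by penalized squared error."""
    mask = (x >= lo) & (x < hi)
    if mask.sum() == 0 or depth == max_depth:
        return [(lo, hi, y[mask].mean() if mask.any() else 0.0)]
    mid = (lo + hi) / 2.0
    left, right = (x >= lo) & (x < mid), (x >= mid) & (x < hi)
    sse_leaf = ((y[mask] - y[mask].mean()) ** 2).sum()
    sse_split = sum(((y[m] - y[m].mean()) ** 2).sum() for m in (left, right) if m.any())
    if sse_leaf - sse_split > penalty:  # split only if it pays for its penalty
        return (dyadic_fit(x, y, lo, mid, penalty, depth + 1, max_depth)
                + dyadic_fit(x, y, mid, hi, penalty, depth + 1, max_depth))
    return [(lo, hi, y[mask].mean())]

rng = np.random.default_rng(1)
x = rng.uniform(size=200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
leaves = dyadic_fit(x, y)  # piecewise-constant fit on a dyadic partition of [0, 1)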
In adaptive boosting, several weak learners trained sequentially are combined to boost the overall algorithm performance. Recently, adaptive boosting methods for classification problems have been derived as gradient descent algorithms. This formulation justifies key elements and parameters in the methods, all chosen to optimize a single common objective function. We propose an analogous formulation...
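A minimal sketch of the "boosting as gradient descent" view for regression: each weak learner is fit to the negative gradient of a squared-error objective, which is simply the current residual. The learner type, learning rate, and number of rounds below are illustrative choices, not the paper's.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)

F = np.full_like(y, y.mean())  # start from the best constant model
learners, lr = [], 0.1
for _ in range(100):
    residual = y - F                                   # negative gradient of 0.5*(y - F)^2
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    learners.append(tree)
    F += lr * tree.predict(X)                          # gradient step in function space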
Statistical modelling and inference problems with sample sizes substantially smaller than the number of available covariates are challenging. This is known as the large p, small n problem. We develop nonlinear regression models in this setup for accurate prediction. In this paper, we introduce a full Bayesian support vector regression model with Vapnik's ε-insensitive loss function, based on reprodu...
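For reference, Vapnik's ε-insensitive loss mentioned above is zero inside a tube of half-width ε around the prediction and grows linearly outside it; a small self-contained illustration follows (the paper's Bayesian RKHS construction is not reproduced).

import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # Zero inside the epsilon tube, linear outside it.
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

print(eps_insensitive_loss(np.array([1.0, 2.0, 3.0]),
                           np.array([1.05, 2.5, 2.0])))  # -> [0.  0.4 0.9]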
Several efforts have been made to bring ROC analysis beyond (binary) classification, especially to regression. However, the mappings and possibilities of these proposals do not correspond to what we expect from the analysis of operating conditions, dominance, hybrid methods, etc. In this paper we present a new representation of regression models in the so-called regression ROC (RROC) space. The ...
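A minimal sketch of placing a regression model in an ROC-style space. Following the common RROC convention (stated here as an assumption about the paper's definitions), a model maps to a point whose coordinates are its total over-estimation and total under-estimation, and sweeping an additive shift over its predictions traces a curve across operating conditions.

import numpy as np

def rroc_point(y_true, y_pred):
    err = y_pred - y_true
    over = err[err > 0].sum()     # total over-estimation (>= 0)
    under = err[err < 0].sum()    # total under-estimation (<= 0)
    return over, under

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 6.0, 6.5, 10.0])

# Sweep an additive shift (an "operating condition") to trace the model's curve.
curve = [rroc_point(y_true, y_pred + s) for s in np.linspace(-2.0, 2.0, 9)]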
We consider model selection in a hierarchical Bayes formulation of the sparse normal linear model in which individual variables have, independently, an unknown prior probability of being included in the model. The focus is on orthogonal designs, which are of particular importance in nonparametric regression via wavelet shrinkage. Empirical Bayes estimates of hyperparameters are easily obtained ...
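A toy sketch of the sparse normal-means setup behind this abstract: each coefficient is nonzero with an unknown prior probability w, and w is estimated by empirical Bayes, i.e., by maximizing the marginal likelihood of the observed coefficients (a crude grid search below). The spike-and-slab prior, grid, and variances are illustrative assumptions; the paper's wavelet-specific machinery and model-selection rules are not reproduced.

import numpy as np
from scipy.stats import norm

def marginal_loglik(w, z, tau=2.0, sigma=1.0):
    # Marginal density of z under z ~ (1 - w) N(0, sigma^2) + w N(0, sigma^2 + tau^2).
    dens = (1 - w) * norm.pdf(z, scale=sigma) + w * norm.pdf(z, scale=np.hypot(sigma, tau))
    return np.log(dens).sum()

rng = np.random.default_rng(3)
theta = np.where(rng.uniform(size=200) < 0.1, rng.normal(0.0, 2.0, 200), 0.0)
z = theta + rng.standard_normal(200)   # noisy observations of sparse means

grid = np.linspace(0.01, 0.99, 99)
w_hat = grid[np.argmax([marginal_loglik(w, z) for w in grid])]  # empirical Bayes estimate of w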
In this paper we develop several regression algorithms for solving general stochastic optimal control problems via Monte Carlo. This type of algorithm is particularly useful for problems with a high-dimensional state space and a complex dependence structure of the underlying Markov process with respect to some control. The main idea behind the algorithms is to simulate a set of trajectories unde...
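The regression-on-simulated-trajectories idea can be illustrated on a simple optimal-stopping problem (a Bermudan put under Black-Scholes), in the spirit of Longstaff-Schwartz: simulate paths, then step backwards in time, regressing future values on the current state to approximate conditional expectations. The model, basis, and parameters are illustrative and not taken from the paper, whose algorithms target more general control problems.

import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps = 10_000, 50
dt, r, sigma, s0, strike = 1.0 / 50, 0.06, 0.2, 1.0, 1.0

# Simulate geometric Brownian motion trajectories of the state.
dw = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
S = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * dw, axis=1))

value = np.maximum(strike - S[:, -1], 0.0)   # payoff at maturity
for t in range(n_steps - 2, -1, -1):
    value *= np.exp(-r * dt)                 # discount one step back
    itm = strike - S[:, t] > 0               # regress only where exercise matters
    basis = np.vander(S[itm, t], 4)          # cubic polynomial basis in the state
    coef, *_ = np.linalg.lstsq(basis, value[itm], rcond=None)
    continuation = basis @ coef              # regression estimate of E[future value | state]
    exercise = strike - S[itm, t]
    value[itm] = np.where(exercise > continuation, exercise, value[itm])
price = np.exp(-r * dt) * value.mean()       # discount the remaining first step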
A new algorithm for isotonic regression is presented based on recursively partitioning the solution space. We develop efficient methods for each partitioning subproblem through an equivalent representation as a network flow problem, and prove that this sequence of partitions converges to the global solution. These network flow problems can further be decomposed in order to solve very large prob...
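The paper's recursive-partitioning / network-flow algorithm is what makes very large instances tractable; for a small point of reference, the same fitting problem (a least-squares fit subject to a monotone-nondecreasing constraint) can be solved with scikit-learn's pool-adjacent-violators implementation.

import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(5)
x = np.arange(100)
y = np.log1p(x) + 0.3 * rng.standard_normal(100)

iso = IsotonicRegression(increasing=True)
y_fit = iso.fit_transform(x, y)   # monotone nondecreasing least-squares fit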