Search results for: regularization parameter estimation
Number of results: 467,554
For high-dimensional sparse parameter estimation problems, Log-Sum Penalty (LSP) regularization effectively reduces the required sample size in practice. However, this empirical advantage has so far lacked theoretical support. The analysis in this article shows that, as with ℓ0-regularization, a sampling size of O(s) suffices for a properly tuned LSP, where s is the number of non-zero components of...
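The Log-Sum Penalty named in the abstract has a standard form, lam · Σ log(1 + |x_i|/eps), which approaches a scaled ℓ0 count as eps → 0. A minimal sketch (the parameter names `eps` and `lam` are generic, not taken from the paper):

```python
import numpy as np

def log_sum_penalty(x, eps=1e-2, lam=1.0):
    """Log-Sum Penalty: lam * sum_i log(1 + |x_i| / eps).

    As eps shrinks, each term approaches a (scaled) indicator of
    x_i != 0, which is why LSP mimics l0-regularization on sparse
    vectors.
    """
    x = np.asarray(x, dtype=float)
    return lam * np.sum(np.log1p(np.abs(x) / eps))

# A sparse vector is penalized far less than a dense vector of the
# same energy (both have squared norm 100 here).
sparse = np.zeros(100)
sparse[0] = 10.0
dense = np.ones(100)
assert log_sum_penalty(sparse) < log_sum_penalty(dense)
```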
Under the framework of the Kullback-Leibler (KL) distance, we show that a particular case of the Gaussian probability function for feedforward neural networks (NNs) reduces to the first-order Tikhonov regularizer. The smoothing parameter in kernel density estimation plays the role of the regularization parameter. Under some approximations, a formula is derived for estimating the regularization p...
Inverse problems are typically ill-posed or ill-conditioned and require regularization. Tikhonov regularization is a popular approach, and it requires an additional parameter, called the regularization parameter, that has to be estimated. The χ² method introduced by Mead in [8] uses the χ² distribution of the Tikhonov functional for linear inverse problems to estimate the regularization parameter. H...
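The idea behind the χ² principle can be sketched numerically: at the minimizer, the Tikhonov functional behaves like a χ² variable with m degrees of freedom (m data points, unit-variance noise), so one picks λ to make the functional value equal its expectation m. The bisection scheme and interval bounds below are illustrative assumptions, not Mead's actual algorithm:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Minimizer of ||Ax - b||^2 + lam * ||x||^2 via normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def chi2_lambda(A, b, lo=1e-8, hi=1e8, iters=200):
    """Sketch of the chi^2 principle: choose lam so that the Tikhonov
    functional at its minimizer equals m = len(b), the mean of a
    chi^2_m variable (unit-variance noise assumed).

    J(lam) is nondecreasing in lam (envelope theorem: dJ/dlam =
    ||x(lam)||^2 >= 0), so a geometric bisection suffices when a root
    lies inside [lo, hi].
    """
    m = len(b)

    def J(lam):
        x = tikhonov_solve(A, b, lam)
        return np.sum((A @ x - b) ** 2) + lam * np.sum(x ** 2)

    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        if J(mid) < m:
            lo = mid
        else:
            hi = mid
    return np.sqrt(lo * hi)
```

The normal-equations solve is fine for small dense problems; large or ill-conditioned systems would call for an SVD or iterative solver instead.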
The solution, x, of the linear system of equations Ax ≈ b arising from the discretization of an ill-posed integral equation g(s) = ∫ H(s, t) f(t) dt with a square-integrable kernel H(s, t) is considered. The Tikhonov regularized solution x(λ), approximating the Galerkin coefficients of f(t), is found as the minimizer of J(x) = ‖Ax − b‖² + λ‖Lx‖², where b is given by the Galerkin coefficients of g(...
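The minimizer of J(x) = ‖Ax − b‖² + λ‖Lx‖² can be computed by stacking the penalty into one ordinary least-squares problem, [A; √λ L] x ≈ [b; 0]. A minimal sketch, using a first-difference matrix as a typical (assumed, not paper-specified) choice of L:

```python
import numpy as np

def tikhonov_galerkin(A, b, L, lam):
    """Minimizer of ||Ax - b||^2 + lam * ||Lx||^2, obtained by solving
    the stacked least-squares problem [A; sqrt(lam)*L] x = [b; 0]."""
    K = np.vstack([A, np.sqrt(lam) * L])
    rhs = np.concatenate([b, np.zeros(L.shape[0])])
    x, *_ = np.linalg.lstsq(K, rhs, rcond=None)
    return x

def first_diff(n):
    """First-difference operator: rows of the form (-1, 1, 0, ...),
    so ||Lx|| is a seminorm penalizing roughness of x."""
    return np.eye(n - 1, n, 1) - np.eye(n - 1, n)
```

Larger λ trades data fit for a smaller seminorm ‖Lx‖, i.e. a smoother reconstruction of the Galerkin coefficients.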
Regularization addresses the unstable estimation of the covariance matrix from a small sample set in a Gaussian classifier, and estimating multiple regularization parameters is harder than estimating a single one. In this paper, the KLIM_L covariance matrix estimator is derived theoretically from the MDL (minimum description length) principle for the small-sample problem...
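The small-sample instability being regularized is easy to demonstrate: with fewer samples than dimensions the sample covariance is singular. A generic shrinkage estimator fixes this; it is a stand-in illustration only, since the abstract does not spell out the KLIM_L formula:

```python
import numpy as np

def shrink_cov(X, gamma):
    """Shrinkage-regularized covariance (generic illustration, not the
    paper's KLIM_L estimator): (1 - gamma) * S + gamma * (tr(S)/d) * I.

    X has shape (n_samples, d). For gamma in (0, 1] the result is
    positive definite even when n_samples < d leaves S rank-deficient.
    """
    S = np.cov(X, rowvar=False)
    d = S.shape[0]
    return (1 - gamma) * S + gamma * (np.trace(S) / d) * np.eye(d)
```

Here a single scalar gamma is tuned; the multi-parameter setting the abstract describes would replace it with one parameter per regularization term.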
High-dimensional feature selection has become increasingly crucial for seeking parsimonious models in estimation. For selection consistency, we derive one necessary and sufficient condition formulated in terms of the degree of separation. The minimal degree of separation is necessary for any method to be selection consistent. At a level slightly higher than the minimal degree of separation, se...
In this paper Tikhonov regularization for nonlinear ill-posed problems is investigated. The regularization term is characterized by a closed linear operator, permitting seminorm regularization in applications. Results for existence, stability, convergence, and convergence rates of the solution of the regularized problem in terms of the noise level are given. An illustrative example involving para...