Iterative regularization with minimum-residual methods
Authors
Abstract
Similar Resources
Scale-space Properties of Nonstationary Iterative Regularization Methods
Most scale-space concepts have been expressed as parabolic or hyperbolic partial differential equations (PDEs). In this paper we extend our work on scale-space properties of elliptic PDEs arising from regularization methods: we study linear and nonlinear regularization methods that are applied iteratively and with different regul...
Iterative Projection Methods for Structured Sparsity Regularization
In this paper we propose a general framework to characterize and solve the optimization problems underlying a large class of sparsity-based regularization algorithms. More precisely, we study the minimization of learning functionals that are sums of a differentiable data term and a convex non-differentiable penalty. These latter penalties have recently become popular in machine learning since t...
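The problem class described above (a smooth data term plus a convex non-differentiable penalty) is commonly handled by proximal gradient iteration; a minimal sketch for the least-squares-plus-ℓ1 case, using iterative soft-thresholding (ISTA), with illustrative parameter choices that are my own assumptions:

```python
import numpy as np

def ista(A, b, lam, step=None, iters=200):
    """Iterative soft-thresholding, a sketch for
    min_x 0.5*||A x - b||^2 + lam*||x||_1:
    gradient step on the smooth data term, then the proximal
    operator of the l1 penalty (soft threshold)."""
    m, n = A.shape
    if step is None:
        # 1/L with L = ||A||_2^2, the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - b)      # gradient of the data term
        z = x - step * g           # forward (gradient) step
        # backward (proximal) step: soft-thresholding shrinks toward zero
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

For an identity design matrix the fixed point is simply the soft-thresholded data, which makes the shrinkage effect of the penalty easy to see.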
Iterative Regularization Methods in Inverse Scattering
We consider the problem of reconstructing the shape of an acoustic or electromagnetic scatterer from far-field measurements of the scattered wave corresponding to one incident time-harmonic wave in the resonance region. This problem is difficult to solve since it is nonlinear and severely ill-posed. The characterization of the Fréchet derivatives, which was accomplished some years ago by Kirs...
Residual Smoothing Techniques for Iterative Methods
An iterative method for solving a linear system Ax = b produces iterates {x_k} with associated residual norms that, in general, need not decrease "smoothly" to zero. "Residual smoothing" techniques are considered that generate a second sequence {y_k} via the simple relation y_k = (1 − η_k) y_{k−1} + η_k x_k. The authors first review and comment on a technique of this form introduced by Schönauer and Weiss that resul...
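The smoothing relation above can be sketched in a few lines; this is a minimal illustration (not the authors' exact scheme) that picks each η_k to minimize the smoothed residual norm, in the spirit of the Schönauer-Weiss technique:

```python
import numpy as np

def smooth_residuals(xs, A, b):
    """Minimal-residual smoothing, a sketch: given primary iterates
    x_k, build y_k = (1 - eta_k) * y_{k-1} + eta_k * x_k with eta_k
    chosen to minimize ||b - A y_k||_2.  The smoothed residual norms
    are then nonincreasing even if the primary ones oscillate."""
    y = xs[0].copy()
    r = b - A @ y                      # smoothed residual r_{k-1}
    smoothed = [y.copy()]
    for x in xs[1:]:
        s = (b - A @ x) - r            # r_k = r_{k-1} + eta * s
        denom = s @ s
        eta = 0.0 if denom == 0 else -(r @ s) / denom  # argmin ||r + eta*s||
        y = (1 - eta) * y + eta * x
        r = r + eta * s
        smoothed.append(y.copy())
    return smoothed
```

The one-dimensional minimization over η has the closed form −(r·s)/(s·s), so each smoothing step costs only a few inner products on top of the primary iteration.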
Learning with Incremental Iterative Regularization
Within a statistical learning setting, we propose and study an iterative regularization algorithm for least squares defined by an incremental gradient method. In particular, we show that, if all other parameters are fixed a priori, the number of passes over the data (epochs) acts as a regularization parameter, and prove strong universal consistency, i.e. almost sure convergence of the risk, as ...
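The epochs-as-regularization idea can be illustrated with a short sketch; the helper below is hypothetical (my own construction, not the paper's algorithm) and shows an incremental gradient pass over the rows of a least-squares problem, with the number of passes acting as the early-stopping parameter:

```python
import numpy as np

def incremental_ls(A, b, epochs, step):
    """Incremental gradient for least squares, a sketch: each epoch
    cycles once through the data (a_i, b_i), updating x with the
    gradient of the single-sample loss 0.5*(a_i @ x - b_i)^2.
    Stopping after few epochs regularizes; many epochs approach the
    unregularized least-squares solution."""
    x = np.zeros(A.shape[1])
    for _ in range(epochs):
        for a_i, b_i in zip(A, b):
            x -= step * (a_i @ x - b_i) * a_i
    return x
```

Here no explicit penalty appears anywhere: the regularization strength is controlled entirely by `epochs`, matching the abstract's point that the number of passes over the data plays the role of the regularization parameter.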
Journal
Journal title: BIT Numerical Mathematics
Year: 2007
ISSN: 0006-3835, 1572-9125
DOI: 10.1007/s10543-006-0109-5