Search results for: newton basis functions
Number of results: 862303
In this paper, we study the problem of minimizing the ratio of two quadratic functions subject to a quadratic constraint. First we introduce a parametric equivalent of the problem. Then a bisection algorithm and a generalized Newton-based algorithm are presented to solve it. In order to solve the quadratically constrained quadratic minimization problem within both algorithms, a semidefinite optim...
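The reduction described in this abstract can be sketched roughly as follows. Assuming the parametric equivalent is the Dinkelbach-style value function F(λ) = min { f1(x) − λ f2(x) : g(x) ≤ 0 }, whose root λ* equals the optimal ratio, bisection on λ drives F to zero. The SLSQP inner solver and all problem data below are illustrative stand-ins, not the paper's semidefinite subproblem solver.

```python
# Hedged sketch of a parametric (Dinkelbach-style) reduction with bisection:
# minimize f1(x)/f2(x) subject to g(x) <= 0, with f1, f2, g quadratic.
# All matrices and the bracket [lo, hi] are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def quad(Q, b, c):
    """Return the quadratic x -> 0.5 x'Qx + b'x + c."""
    return lambda x: 0.5 * x @ Q @ x + b @ x + c

n = 2
f1 = quad(np.array([[2.0, 0.0], [0.0, 4.0]]), np.array([1.0, -1.0]), 3.0)
f2 = quad(np.array([[2.0, 0.0], [0.0, 2.0]]), np.zeros(n), 1.0)   # positive denominator
g  = quad(np.array([[2.0, 0.0], [0.0, 2.0]]), np.zeros(n), -4.0)  # ||x||^2 <= 4

def F(lam):
    """Optimal value (and minimizer) of the parametric subproblem for fixed lambda."""
    res = minimize(lambda x: f1(x) - lam * f2(x), x0=np.zeros(n),
                   constraints=[{"type": "ineq", "fun": lambda x: -g(x)}],
                   method="SLSQP")
    return res.fun, res.x

lo, hi = -10.0, 10.0              # assumed bracket with F(lo) > 0 > F(hi)
for _ in range(40):               # bisection on lambda
    mid = 0.5 * (lo + hi)
    val, x_mid = F(mid)
    lo, hi = (mid, hi) if val > 0 else (lo, mid)

print("lambda* ~", 0.5 * (lo + hi),
      " minimizer ~", x_mid,
      " ratio ~", f1(x_mid) / f2(x_mid))
```

Each bisection step only needs the sign of F(λ), which is one reason a certified solver for the quadratically constrained subproblem (such as the semidefinite approach the abstract mentions) is attractive; the local SLSQP call here is merely a placeholder.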
We propose a randomized second-order method for optimization known as the Newton Sketch: it is based on performing an approximate Newton step using a randomly projected or sub-sampled Hessian. For self-concordant functions, we prove that the algorithm has super-linear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition nu...
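As a rough illustration of the idea in this abstract (not the authors' implementation), one Newton Sketch iteration for a problem whose Hessian has a square-root factorization H = AᵀA, here logistic regression with A = D^{1/2}X, might look like the following. The Gaussian sketch, the sketch size m, and the plain unit step in place of the paper's line search are all assumptions.

```python
# Hedged sketch of a Newton Sketch iteration for logistic regression: a random
# Gaussian sketch compresses the Hessian square root before the Newton system
# is solved.  Data, sketch size, and the absence of a line search are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 20, 200                  # samples, features, sketch size (m << n)
X = rng.standard_normal((n, d))
w_true = 0.5 * rng.standard_normal(d)    # illustrative ground-truth weights
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def newton_sketch_step(w):
    p = 1.0 / (1.0 + np.exp(-X @ w))              # predicted probabilities
    grad = X.T @ (p - y) / n                      # gradient of the average loss
    sqrtD = np.sqrt(p * (1.0 - p) / n)            # Hessian square-root weights
    S = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sketch matrix
    SA = S @ (sqrtD[:, None] * X)                 # sketched Hessian square root
    H_sketch = SA.T @ SA + 1e-8 * np.eye(d)       # approximate Hessian (regularized)
    return w - np.linalg.solve(H_sketch, grad)    # approximate Newton step

w = np.zeros(d)
for _ in range(15):
    w = newton_sketch_step(w)
print("training loss:",
      np.mean(np.log(1.0 + np.exp(-(2 * y - 1) * (X @ w)))))
```

The point of the sketch is that the dominant cost per iteration scales with the m-by-d sketched factor rather than the full n-by-d data, while the step stays close to the exact Newton direction.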
In this paper we compare the performance, scalability, and robustness of different parallel algorithms for the numerical solution of nonlinear boundary value problems arising in the magnetic field computation and in solid mechanics. These problems are discretized by using the finite element method with triangular meshes and piecewise linear functions. The nonlinearity is handled by a nested Newton ...
We show how certain widely used multistep approximation algorithms can be interpreted as instances of an approximate Newton method. It was shown in an earlier paper by the second author that the convergence rates of approximate Newton methods (in the context of the numerical solution of PDEs) suffer from a "loss of derivatives", and that the subsequent linear rate of convergence can be improved ...
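The contrast this abstract draws, between an exact Newton step and an approximate one whose convergence degrades to a linear rate, can be seen on a toy nonlinear system; the chord-type frozen-Jacobian step below is only a stand-in for the multistep approximation operators the paper actually analyzes.

```python
# Hedged illustration of exact vs. approximate Newton on a small system F(x) = 0.
# The system and the frozen-Jacobian "approximation" are illustrative assumptions.
import numpy as np

def F(x):
    return np.array([x[0]**2 + x[1]**2 - 4.0, np.exp(x[0]) + x[1] - 1.0])

def J(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]], [np.exp(x[0]), 1.0]])

def iterate(step, x0, iters=8):
    x, errs = x0.copy(), []
    for _ in range(iters):
        x = x - step(x)
        errs.append(np.linalg.norm(F(x)))
    return errs

x0 = np.array([-1.0, 1.5])
J0 = J(x0)  # frozen Jacobian: the "approximate" Newton operator

exact = iterate(lambda x: np.linalg.solve(J(x), F(x)), x0)   # quadratic convergence
chord = iterate(lambda x: np.linalg.solve(J0, F(x)), x0)     # linear convergence

for k, (e, c) in enumerate(zip(exact, chord)):
    print(f"iter {k}: exact Newton {e:.2e}   approximate (chord) {c:.2e}")
```

The residual columns show the familiar picture: the exact Newton residuals drop quadratically, while the frozen-Jacobian residuals shrink by a roughly constant factor per step.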
We studied the source localization accuracy in MEG with single-surface brain-shaped conductor models. The Galerkin method with linear basis functions was used as a discretization method. To find a dipolar source, a variant of the Newton method was used for solving nonlinear least-squares problems. Reference magnetic fields were computed using a very large number of unknowns to obtain accurate sol...
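A minimal sketch of the fitting loop this abstract describes, with a toy forward model in place of the MEG lead field: a Gauss-Newton step (one common Newton variant for nonlinear least squares) updates the source location from the sensor residuals. The 1/r² field, sensor layout, and small Levenberg-style damping term are illustrative assumptions.

```python
# Hedged sketch of nonlinear least-squares source fitting with a Gauss-Newton
# (Newton-variant) iteration.  The toy 1/r^2 forward model stands in for the
# MEG dipole field; sensors, noise, and damping are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
sensors = rng.uniform(-1.0, 1.0, size=(50, 3)) + np.array([0.0, 0.0, 2.0])
true_src = np.array([0.1, -0.2, 0.3])

def forward(src):
    """Toy field: amplitude 1/||sensor - source||^2 at each sensor."""
    return 1.0 / np.sum((sensors - src) ** 2, axis=1)

data = forward(true_src) + 1e-4 * rng.standard_normal(len(sensors))

def jacobian(src):
    """Analytic Jacobian of the toy forward model, shape (n_sensors, 3)."""
    diff = sensors - src                        # (n, 3)
    r2 = np.sum(diff ** 2, axis=1)              # (n,)
    return 2.0 * diff / (r2 ** 2)[:, None]      # derivative of 1/r^2 w.r.t. src

src = np.zeros(3)                               # initial guess
for _ in range(20):                             # Gauss-Newton iterations
    r = forward(src) - data                     # residual at current estimate
    Jm = jacobian(src)
    # tiny Levenberg-style damping for numerical safety
    src = src - np.linalg.solve(Jm.T @ Jm + 1e-9 * np.eye(3), Jm.T @ r)

print("recovered source:", src, " true source:", true_src)
```

In the paper's setting the forward evaluation would be the Galerkin boundary-element solution on the brain-shaped conductor, which is why the accuracy of that discretization directly limits the localization accuracy studied in the abstract.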