Search results for: compact quasi newton representation
Number of results: 418611
This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited-memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique reduces the number of subproblems that must be solved, while exploiting the structure of limited memory quasi-Newt...
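For context, the compact representation of the (L-)BFGS matrix referred to above is usually written in the standard Byrd–Nocedal–Schnabel form (shown here for orientation; the cited study may use different notation):

\[
B_k \;=\; B_0 \;-\; \begin{bmatrix} B_0 S_k & Y_k \end{bmatrix}
\begin{bmatrix} S_k^{T} B_0 S_k & L_k \\ L_k^{T} & -D_k \end{bmatrix}^{-1}
\begin{bmatrix} S_k^{T} B_0 \\ Y_k^{T} \end{bmatrix},
\]

where $S_k = [\,s_0,\dots,s_{k-1}\,]$ and $Y_k = [\,y_0,\dots,y_{k-1}\,]$ collect the stored step and gradient-difference vectors, $(L_k)_{ij} = s_{i-1}^{T} y_{j-1}$ for $i > j$ and $0$ otherwise, and $D_k = \mathrm{diag}(s_0^{T} y_0, \dots, s_{k-1}^{T} y_{k-1})$. In the limited-memory setting only the most recent pairs are kept, so the middle matrix stays small.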
Let $\varpi$ be a representation of the homogeneous space $G/H$, where $G$ is a locally compact group and $H$ is a compact subgroup of $G$. For an admissible wavelet $\zeta$ for $\varpi$ and $\psi \in L^p(G/H)$, $1 \leq p < \infty$, we determine a class of bounded compact operators which are related to continuous wavelet transforms on homogeneous spaces; they are called localization operators.
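As a rough illustration (the precise definition in the paper may differ), a localization operator with symbol $\psi$ associated with the continuous wavelet transform typically acts as

\[
L_{\psi,\zeta} f \;=\; \int_{G/H} \psi(\dot g)\, \big\langle f, \varpi(\dot g)\zeta \big\rangle\, \varpi(\dot g)\zeta \; d\mu(\dot g), \qquad f \in \mathcal{H},
\]

where $\mu$ is a suitable (relatively) invariant measure on $G/H$ and $\mathcal{H}$ is the representation space of $\varpi$; boundedness and compactness are then studied in terms of the symbol $\psi$ and the admissibility of $\zeta$.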
Quasi-hierarchical Powell–Sabin splines are $C^1$-continuous quadratic splines defined on a locally refined hierarchical triangulation. They admit a compact representation in a normalized B-spline basis. We prove that the quasi-hierarchical basis is in general weakly $L_p$-stable, but for a broad class of hierarchical triangulations it is even strongly $L_p$-stable.
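For reference, a (suitably normalized) basis $\{B_i\}$ is called $L_p$-stable if there exist constants $c_1, c_2 > 0$ such that

\[
c_1 \, \| c \|_p \;\leq\; \Big\| \sum_i c_i B_i \Big\|_{L_p} \;\leq\; c_2 \, \| c \|_p
\]

for all coefficient vectors $c$ (possibly after a rescaling of the coefficients that accounts for the local mesh size). In this hierarchical setting, "weakly" stable is usually taken to mean that the constants may depend on the number of levels in the hierarchy, while "strongly" stable means they are uniform.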
This paper examines the effectiveness of quasi-Newton based training of a feedforward neural network for forecasting. We have developed a novel quasi-Newton based training algorithm using a generalized logistic function. We have shown that a well designed feedforward structure can lead to a good forecast without the use of the more complicated feedback/feedforward structure of the recur...
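A minimal sketch of the idea follows: a one-hidden-layer feedforward network whose weights are trained by a quasi-Newton (BFGS) optimizer. The activation generalized_logistic, the network sizes, the synthetic data, and the use of scipy.optimize are illustrative assumptions, not the paper's implementation.

    import numpy as np
    from scipy.optimize import minimize

    def generalized_logistic(x, a=1.0, b=1.0):
        # Illustrative "generalized" logistic (assumed form, not the paper's exact function)
        return 1.0 / (1.0 + np.exp(-b * x)) ** a

    def unpack(theta, n_in, n_hid):
        # Split the flat parameter vector into layer weights and biases
        i = 0
        W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = theta[i:i + n_hid]; i += n_hid
        W2 = theta[i:i + n_hid].reshape(n_hid, 1); i += n_hid
        b2 = theta[i:i + 1]
        return W1, b1, W2, b2

    def mse_loss(theta, X, y, n_in, n_hid):
        W1, b1, W2, b2 = unpack(theta, n_in, n_hid)
        hidden = generalized_logistic(X @ W1 + b1)   # hidden layer
        pred = (hidden @ W2 + b2).ravel()            # linear output layer
        return np.mean((pred - y) ** 2)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # toy forecasting target
    n_in, n_hid = 3, 5
    theta0 = 0.1 * rng.normal(size=n_in * n_hid + n_hid + n_hid + 1)
    # BFGS is a quasi-Newton method; gradients are approximated numerically here
    res = minimize(mse_loss, theta0, args=(X, y, n_in, n_hid), method="BFGS")
    print(res.fun)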
One of the widely used methods for solving a nonlinear system of equations is the quasi-Newton method. The basic idea underlying this type of method is to approximate the solution of Newton's equation by approximating the Jacobian matrix via a quasi-Newton update. Applying quasi-Newton methods to large scale problems requires, in principle, vast computational resources to form and...
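As an illustration of the basic mechanism, here is a minimal sketch of Broyden's "good" method, one common quasi-Newton scheme for nonlinear systems; the test system, tolerances, and initial Jacobian are illustrative choices.

    import numpy as np

    def broyden_solve(F, x0, tol=1e-10, max_iter=50):
        """Solve F(x) = 0 using Broyden's rank-one quasi-Newton update of the Jacobian approximation."""
        x = np.asarray(x0, dtype=float)
        B = np.eye(len(x))                  # initial Jacobian approximation
        fx = F(x)
        for _ in range(max_iter):
            if np.linalg.norm(fx) < tol:
                break
            dx = np.linalg.solve(B, -fx)    # quasi-Newton step
            x_new = x + dx
            fx_new = F(x_new)
            df = fx_new - fx
            # Rank-one update enforcing the secant condition B_new dx = df
            B += np.outer(df - B @ dx, dx) / (dx @ dx)
            x, fx = x_new, fx_new
        return x

    # Illustrative system: x0^2 + x1^2 - 1 = 0 and x0 - x1 = 0
    F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
    print(broyden_solve(F, [1.0, 0.5]))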
The smoothness-constrained least-squares method is widely used for two-dimensional (2D) and three-dimensional (3D) inversion of apparent resistivity data sets. The Gauss–Newton method, which recalculates the Jacobian matrix of partial derivatives at every iteration, is commonly used to solve the least-squares equation. The quasi-Newton method has also been used to reduce the computer time. In this...
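One common form of the smoothness-constrained (Occam-type) least-squares update, written here for context rather than as the exact system used in the paper, is

\[
\left( J_k^{T} J_k + \lambda\, C^{T} C \right) \Delta m_k \;=\; J_k^{T}\!\left( d - f(m_k) \right) \;-\; \lambda\, C^{T} C\, m_k ,
\]

where $m_k$ is the current model, $f(m_k)$ the forward response, $d$ the observed apparent resistivities (usually after a logarithmic transform), $J_k$ the Jacobian of partial derivatives, $C$ a roughness (smoothness) operator, and $\lambda$ the damping factor. A quasi-Newton variant updates $J_k$ cheaply from iteration to iteration instead of recomputing it, which is the time saving referred to above.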
We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive defi...
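The snippet does not spell out the paper's particular hybrid vector; as a generic illustration of damping, Powell's classical scheme replaces the gradient difference y by a convex combination with Bs before the BFGS update, roughly as below (function and parameter names are illustrative).

    import numpy as np

    def damped_bfgs_update(B, s, y, phi=0.2):
        """One damped BFGS update of the Hessian approximation B (Powell-style damping)."""
        Bs = B @ s
        sBs = s @ Bs
        sy = s @ y
        # Damping: blend y with Bs when the curvature s^T y is too small,
        # so the updated matrix stays sufficiently positive definite.
        if sy < phi * sBs:
            theta = (1.0 - phi) * sBs / (sBs - sy)
        else:
            theta = 1.0
        y_hat = theta * y + (1.0 - theta) * Bs
        # Standard BFGS formula applied with the damped vector y_hat
        return B - np.outer(Bs, Bs) / sBs + np.outer(y_hat, y_hat) / (s @ y_hat)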
Quasi-Newton methods are a class of numerical methods for solving the problem of unconstrained optimization. To improve the overall efficiency of the resulting algorithms, we use an interesting quasi-Newton equation. In this manuscript, we present a modified BFGS update formula based on the new equation, and give a search direction for solving optimization problems. We analyse the convergence ra...
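For orientation, the standard BFGS update and the quasi-Newton (secant) equation it satisfies are

\[
B_{k+1} \;=\; B_k \;-\; \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k} \;+\; \frac{y_k y_k^{T}}{s_k^{T} y_k}, \qquad B_{k+1} s_k = y_k,
\]

with $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$. Modified quasi-Newton equations of the kind described above typically replace $y_k$ by a corrected vector that incorporates extra function-value or curvature information.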
The problem of minimizing an objective that can be written as the sum of a set of n smooth and strongly convex functions is challenging because the cost of evaluating the function and its derivatives is proportional to the number of elements in the sum. The Incremental Quasi-Newton (IQN) method proposed here belongs to the family of stochastic and incremental methods that have a cost per iterat...
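As a rough sketch of the incremental idea (the exact IQN recursion in the paper may differ in its details), the finite-sum objective and an aggregated quasi-Newton iterate can be written as

\[
\min_{x}\; f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x), \qquad
x^{t+1} \;=\; \Big( \sum_{i=1}^{n} B_i^{t} \Big)^{-1} \Big( \sum_{i=1}^{n} B_i^{t} z_i^{t} \;-\; \sum_{i=1}^{n} \nabla f_i(z_i^{t}) \Big),
\]

where $z_i^{t}$ is the most recent point at which component $f_i$ was visited and $B_i^{t}$ is its BFGS-type Hessian approximation; only one index $i$ is refreshed per iteration, so the per-iteration cost is independent of $n$.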