Search results for: iteration complexity
Number of results: 356,024
A new class of infeasible interior point methods for solving sufficient linear complementarity problems, requiring one matrix factorization and m backsolves at each iteration, is proposed and analyzed. The algorithms from this class use a large (N_∞^-) neighborhood of an infeasible central path associated with the complementarity problem and an initial positive, but not necessarily feasible, starti...
In this paper we revisit Newton's iteration as a method to find the G or R matrix in M/G/1-type and GI/M/1-type Markov chains. We start by reconsidering the method proposed in [14], which required O(m^6 + Nm^4) time per iteration, and show that it can be reduced to O(Nm^4), where m is the block size and N the number of blocks. Moreover, we show how this method is able to further reduce this time complex...
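For a quasi-birth-death (QBD) special case of such chains, the matrix G solves the quadratic matrix equation G = A0 + A1·G + A2·G². The snippet's Newton method converges quadratically but needs a Sylvester-type solve per step; a minimal sketch of the simpler, linearly convergent classical functional iteration for the same equation, with made-up 2x2 transition blocks A0, A1, A2 (not taken from the paper), looks like this:

```python
import numpy as np

# Hypothetical substochastic 2x2 transition blocks of a discrete-time QBD:
# A0 = one level down, A1 = same level, A2 = one level up.
# Rows of A0 + A1 + A2 sum to 1, and the downward drift dominates,
# so the chain is positive recurrent and G is stochastic.
A0 = np.array([[0.5, 0.0], [0.0, 0.5]])
A1 = np.array([[0.2, 0.1], [0.1, 0.2]])
A2 = np.array([[0.1, 0.1], [0.1, 0.1]])

# Classical functional iteration for G = A0 + A1 G + A2 G^2.
# (Newton's iteration, as in the snippet, replaces this fixed-point
# map with a linearized solve per step for quadratic convergence.)
G = np.zeros_like(A0)
for _ in range(500):
    G = A0 + A1 @ G + A2 @ G @ G

residual = np.linalg.norm(G - (A0 + A1 @ G + A2 @ G @ G))
```

For a recurrent chain the fixed point G is a stochastic matrix, which gives a cheap sanity check on the result.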
It is known that predictor-corrector methods in a large neighborhood of the central path are among the most efficient interior point methods (IPMs) for linear optimization (LO) problems. The best iteration bound based on the classical logarithmic barrier function is O(n log(n/ε)). In this paper we propose a family of self-regular proximity based predictor-corrector (SR-PC) IPMs for LO in a lar...
Principal component analysis (PCA) is one of the most powerful tools in machine learning. The simplest method for PCA, the power iteration, requires O(1/∆) full-data passes to recover the principal component of a matrix with eigen-gap ∆. Lanczos, a significantly more complex method, achieves an accelerated rate of O(1/√∆) passes. Modern applications, however, motivate methods that only ingest...
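The power iteration the snippet refers to is short enough to sketch directly; here is a minimal version on a toy diagonal matrix (the matrix and pass count are illustrative, not from the paper), where each loop iteration is one full-data pass:

```python
import numpy as np

def power_iteration(A, num_passes=200, seed=0):
    """Recover the top eigenvector of a symmetric matrix A.

    Each loop body does one matrix-vector product, i.e. one full pass
    over the data; the number of passes needed to reach a fixed
    accuracy scales like O(1/Delta), with Delta the eigen-gap of A.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_passes):
        v = A @ v              # one full-data pass
        v /= np.linalg.norm(v) # renormalize to avoid over/underflow
    return v

# Toy example: the top eigenvector of this matrix is +/- e_1.
A = np.diag([3.0, 1.0, 0.5])
v = power_iteration(A)
```

The contraction factor per pass is the ratio of the second to the first eigenvalue (here 1/3), which is why a small eigen-gap ∆ forces many passes.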
Large-scale constrained convex optimization problems arise in several application domains. First-order methods are good candidates to tackle such problems due to their low iteration complexity and memory requirement. The level-set framework extends the applicability of first-order methods to tackle problems with complicated convex objectives and constraint sets. Current methods based on...
In this paper, we consider a prototypical convex optimization problem with multi-block variables and separable structure. By adding the Logarithmic Quadratic Proximal (LQP) regularizer with a suitable proximal parameter to each of the first grouped subproblems, we develop a partial LQP-based Alternating Direction Method of Multipliers (ADMM-LQP). The dual variable is updated twice with relatively larger stepsizes tha...
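For orientation, here is a sketch of plain two-block ADMM on the lasso problem, not the paper's LQP-regularized multi-block variant (note in particular that it updates the dual once per iteration, whereas the snippet's ADMM-LQP updates it twice); the problem data and parameters are made up for illustration:

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, num_iters=500):
    """Two-block ADMM for  min_x 0.5*||A x - b||^2 + lam*||z||_1
    subject to the consensus constraint x = z."""
    n = A.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Form the x-update system once; its solve dominates each iteration.
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(num_iters):
        # x-block: smooth quadratic subproblem (linear solve).
        x = np.linalg.solve(M, Atb + rho * (z - u))
        # z-block: soft-thresholding (prox of the l1 norm).
        w = x + u
        z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0)
        # Single dual ascent step with stepsize rho.
        u = u + x - z
    return x, z

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
x, z = admm_lasso(A, b)
primal_residual = np.linalg.norm(x - z)
```

The primal residual ||x - z|| shrinking toward zero is the standard convergence diagnostic for this splitting.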
MOTIVATION Iteration has been used a number of times as an optimization method to produce multiple alignments, either alone or in combination with other methods. Iteration has a great advantage in that it is often very simple, both in terms of coding the algorithms and in its time and memory requirements. In this paper, we systematically test several different iteration strategies...