Search results for: global gradient algorithm
Number of results: 1,260,152
The augmented downhill simplex method (ADSM) introduced here is a heuristic combination of the downhill simplex method (DSM) with a random search algorithm. DSM is an interpretable nonlinear local optimization method; however, as a local exploitation algorithm it can become trapped in a local minimum. In contrast, random search provides global exploration but is less efficient. Here, rand...
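The hybrid the abstract describes, local simplex descent restarted from random points, can be sketched as follows. This is a minimal, assumption-laden illustration, not the paper's ADSM: `nelder_mead` is a bare-bones downhill simplex and `restarted_simplex` supplies the random global exploration; all names and parameters are illustrative.

```python
import random

def nelder_mead(f, x0, step=0.5, iters=200):
    """Minimal downhill simplex (Nelder-Mead) for f: R^n -> R."""
    n = len(x0)
    # Initial simplex: x0 plus n vertices perturbed along each axis.
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # Centroid of all vertices except the worst.
        c = [sum(v[j] for v in simplex[:-1]) / n for j in range(n)]
        # Reflect the worst vertex through the centroid.
        xr = [c[j] + (c[j] - worst[j]) for j in range(n)]
        if f(xr) < f(best):
            # Expansion: try a longer step in the same direction.
            xe = [c[j] + 2.0 * (c[j] - worst[j]) for j in range(n)]
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        else:
            # Inside contraction toward the centroid.
            xc = [c[j] + 0.5 * (worst[j] - c[j]) for j in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:
                # Shrink the whole simplex toward the best vertex.
                simplex = [best] + [
                    [best[j] + 0.5 * (v[j] - best[j]) for j in range(n)]
                    for v in simplex[1:]
                ]
    return min(simplex, key=f)

def restarted_simplex(f, bounds, restarts=20, seed=0):
    """Random global exploration + local simplex exploitation."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = nelder_mead(f, x0)
        if best is None or f(x) < f(best):
            best = x
    return best
```

Each restart can land in a different basin of attraction, which is exactly how the random component compensates for DSM's purely local descent.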
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL, finding that NCL is not an essential component of the algorithm, while implicit fitness sharing is. Furth...
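For readers unfamiliar with NCL, the per-member penalty it adds to the usual squared error has a simple closed form; the sketch below follows the standard Negative Correlation Learning formulation (the function names and the combined-error helper are illustrative, not taken from the EENCL paper).

```python
def ncl_penalty(outputs, i):
    """NCL penalty for ensemble member i:
    p_i = (f_i - fbar) * sum over j != i of (f_j - fbar),
    where fbar is the ensemble mean output."""
    fbar = sum(outputs) / len(outputs)
    others = sum(f - fbar for j, f in enumerate(outputs) if j != i)
    return (outputs[i] - fbar) * others

def ncl_error(outputs, i, target, lam):
    """Per-member training error: squared error plus lambda-weighted penalty.
    lam = 0 recovers independent training; lam > 0 rewards members whose
    errors are negatively correlated with the rest of the ensemble."""
    return 0.5 * (outputs[i] - target) ** 2 + lam * ncl_penalty(outputs, i)
```

Note the identity sum over j != i of (f_j - fbar) = -(f_i - fbar), so the penalty reduces to -(f_i - fbar)^2: it pushes each member away from the ensemble mean, which is one way to see why diversity can also arise from other mechanisms, as the paper's analysis of fitness sharing suggests.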
We study the problem of recovering a vector x ∈ R^n from its magnitude measurements y_i = |⟨a_i, x⟩|, i = 1, ..., m. Our work is along the line of the Wirtinger flow (WF) approach Candès et al. [2015], which solves the problem by minimizing a nonconvex loss function via a gradient algorithm and can be shown to converge to a global optimal point under good initialization. In contrast to the smooth los...
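The WF-style gradient step on the intensity loss f(x) = (1/2m) * sum_i ((a_i . x)^2 - b_i)^2, with b_i = y_i^2, can be sketched in a few lines. This is a generic real-valued illustration under assumed names and step size, not the smoothed loss the abstract goes on to introduce; in particular the "good initialization" is simulated by starting near the truth rather than via the spectral method WF actually uses.

```python
import random

def wf_descent(a, b, x0, mu=0.02, iters=1000):
    """Plain gradient descent on the intensity loss
    f(x) = (1/2m) * sum_i ((a_i . x)^2 - b_i)^2,  b_i = y_i^2.
    Gradient: (2/m) * sum_i ((a_i.x)^2 - b_i) * (a_i.x) * a_i."""
    m, n = len(a), len(x0)
    x = list(x0)
    for _ in range(iters):
        g = [0.0] * n
        for ai, bi in zip(a, b):
            s = sum(aij * xj for aij, xj in zip(ai, x))
            coef = 2.0 * (s * s - bi) * s / m
            for j in range(n):
                g[j] += coef * ai[j]
        x = [xj - mu * gj for xj, gj in zip(x, g)]
    return x
```

Since only magnitudes are observed, x and -x are indistinguishable, so recovery is judged up to a global sign.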
The motivation for this thesis has been to improve the robustness of image processing applications to motion estimation failure, in particular applications for the restoration of archived film. The thesis is divided into two parts. The first part is concerned with the development of a missing-data detection algorithm that is robust to Pathological Motion (PM). PM can cause clean image...
In this study, an effective method for the nonlinear constrained optimization of shallow foundations is presented. A newly developed heuristic global optimization algorithm called the Gravitational Search Algorithm (GSA) is introduced and applied to the optimization of the foundation. The algorithm is classified as a random search algorithm; it does not require initial values and uses a random search instead ...
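The core GSA update, agents whose fitness determines their "mass" and who attract one another with a decaying gravitational constant, can be sketched as below. This is a generic unconstrained toy version with illustrative parameter choices (`g0`, the exponential decay, the bound clamping), not the constrained foundation-design formulation of the study.

```python
import math
import random

def gsa(f, bounds, agents=20, iters=100, g0=5.0, seed=0):
    """Minimal Gravitational Search Algorithm sketch (minimization).
    Better (lower-fitness) agents get larger mass and pull the swarm."""
    rng = random.Random(seed)
    n = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(agents)]
    V = [[0.0] * n for _ in range(agents)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [f(x) for x in X]
        for x, fx in zip(X, fit):
            if fx < best_f:
                best_f, best_x = fx, list(x)
        worst, best = max(fit), min(fit)
        span = (worst - best) or 1.0
        m = [(worst - fi) / span for fi in fit]      # raw masses in [0, 1]
        tot = sum(m) or 1.0
        M = [mi / tot for mi in m]                   # normalized masses
        G = g0 * math.exp(-20.0 * t / iters)         # decaying constant
        for i in range(agents):
            acc = [0.0] * n
            for j in range(agents):
                if i == j:
                    continue
                R = math.dist(X[i], X[j])
                for d in range(n):
                    # Stochastic gravitational pull of agent j on agent i.
                    acc[d] += rng.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + 1e-9)
            for d in range(n):
                V[i][d] = rng.random() * V[i][d] + acc[d]
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
    return best_x, best_f
```

Unlike gradient methods, no derivatives or initial design values are needed; the random initial population is the starting point, which matches the abstract's description of GSA as a random search algorithm.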
We propose a modified algorithm for the gradient method to determine the near-edge smoke plume boundaries using backscatter signals of a scanning lidar. The running derivative of the ratio of the signal standard deviation (STD) to the accumulated sum of the STD is calculated, and the location of the global maximum of this function is found. No empirical criteria are required to determine smoke ...
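A direct reading of the described pipeline, running STD over a sliding window, its ratio to the accumulated STD sum, and the global maximum of that ratio's running derivative, can be sketched as follows. The window size and the exact windowing convention are assumptions for illustration; the lidar paper's modified algorithm may differ in detail.

```python
import statistics

def plume_boundary(signal, window=10):
    """Locate a boundary as the global maximum of the running derivative
    of (window STD) / (accumulated sum of window STDs)."""
    acc = 0.0
    ratios = []
    for i in range(len(signal) - window):
        s = statistics.pstdev(signal[i:i + window])  # running STD
        acc += s                                      # accumulated STD sum
        ratios.append(s / (acc + 1e-12))
    # Running derivative of the ratio; its global max marks the boundary.
    deriv = [ratios[i + 1] - ratios[i] for i in range(len(ratios) - 1)]
    return max(range(len(deriv)), key=deriv.__getitem__)
```

Because the boundary is defined by a global maximum of a data-derived function, no hand-tuned threshold enters the detection, consistent with the abstract's claim that no empirical criteria are required.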