Supervised Training Using Global Search Methods
Author
Abstract
Supervised learning in neural networks based on the popular backpropagation method can often become trapped in a local minimum of the error function. Backpropagation-type training algorithms are local minimization methods and have no mechanism that allows them to escape the influence of a local minimum. Local minima arise because the error function is a superposition of nonlinear activation functions that may have minima at different points, which sometimes results in a nonconvex error function. This work investigates the use of global search methods for batch-mode training of feedforward multilayer perceptrons. Global search methods are expected to lead to “optimal” or “near-optimal” weight configurations by allowing the network to escape local minima during training; in that sense, they improve the efficiency of the learning process. The paper reviews the fundamentals of simulated annealing, genetic and evolutionary algorithms, and some recently proposed deflection procedures. Simulations and comparisons are presented.
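To make the reviewed ideas concrete, here is a minimal sketch of how simulated annealing, the first of the reviewed methods, can drive batch-mode weight search for a small multilayer perceptron. Everything in it (the names `mlp_error` and `anneal`, the one-hidden-layer architecture, the geometric cooling schedule, and the XOR test problem) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_error(w, X, T, n_hidden):
    """Batch sum-of-squared-errors of a one-hidden-layer MLP whose
    weights (and biases) are packed into the flat vector w."""
    n_in = X.shape[1]
    k = n_hidden * (n_in + 1)
    W1 = w[:k].reshape(n_hidden, n_in + 1)        # input -> hidden (+bias)
    W2 = w[k:].reshape(1, n_hidden + 1)           # hidden -> output (+bias)
    H = np.tanh(W1 @ np.vstack([X.T, np.ones(len(X))]))
    Y = np.tanh(W2 @ np.vstack([H, np.ones(len(X))]))
    return np.sum((Y.ravel() - T) ** 2)

def anneal(X, T, n_hidden, temp=1.0, alpha=0.995, steps=20000):
    """Simulated annealing over the weight space with geometric cooling."""
    n_w = n_hidden * (X.shape[1] + 1) + (n_hidden + 1)
    w = rng.standard_normal(n_w)
    e = mlp_error(w, X, T, n_hidden)
    for _ in range(steps):
        w_new = w + rng.normal(scale=0.1, size=n_w)      # random perturbation
        e_new = mlp_error(w_new, X, T, n_hidden)
        # Metropolis criterion: always accept improvements, and accept
        # uphill moves with probability exp(-dE / temp).
        if e_new < e or rng.random() < np.exp(-(e_new - e) / temp):
            w, e = w_new, e_new
        temp *= alpha                                    # cool down
    return w, e

# Usage: the 2-bit XOR problem, a classic small task with local minima.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.]) * 2 - 1     # targets scaled into tanh's range
w, e = anneal(X, T, n_hidden=2)
print(f"final batch SSE: {e:.4f}")
```

Because acceptance of uphill moves is governed by the temperature, early iterations explore freely while later ones behave like a greedy local search, which is what lets the method escape shallow local minima.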
Similar Resources
Self-adaptive global best harmony search algorithm for training neural networks
This paper addresses the application of the Self-adaptive Global Best Harmony Search (SGHS) algorithm to the supervised training of feed-forward neural networks (NNs). A structure suitable for the data representation of NNs is adapted to the SGHS algorithm. The technique is empirically tested and verified by training NNs on two classification benchmarking problems. Overall training time, sum of squared err...
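For readers unfamiliar with harmony search, the sketch below shows the basic loop that SGHS builds on: a memory of candidate weight vectors from which new "harmonies" are improvised, pitch-adjusted, and used to replace the worst member. The self-adaptive and global-best refinements that define SGHS are not reproduced here; the names and parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def harmony_search(f, dim, lo=-5.0, hi=5.0, hms=20, hmcr=0.9,
                   par=0.3, bw=0.05, iters=5000):
    """Basic harmony search over a real weight vector.

    f    : objective, e.g. a network's batch training error
    hms  : harmony memory size
    hmcr : harmony memory considering rate
    par  : pitch adjusting rate; bw : pitch bandwidth
    """
    hm = rng.uniform(lo, hi, size=(hms, dim))        # harmony memory
    cost = np.array([f(x) for x in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                  # take from memory...
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:               # ...then pitch-adjust
                    new[j] += bw * rng.uniform(-1, 1)
            else:                                    # ...or improvise randomly
                new[j] = rng.uniform(lo, hi)
        c = f(new)
        worst = np.argmax(cost)
        if c < cost[worst]:                          # replace worst harmony
            hm[worst], cost[worst] = new, c
    best = np.argmin(cost)
    return hm[best], cost[best]

# Usage with any weight-to-error function, e.g. the mlp_error sketch above:
# w, e = harmony_search(lambda w: mlp_error(w, X, T, 2), dim=9)
```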
Global Optimization for Neural Network Training
In this paper, we study various supervised learning methods for training feed-forward neural networks. In general, such learning can be considered a nonlinear global optimization problem in which the goal is to minimize a nonlinear error function that spans the space of weights, using heuristic strategies that look for global optima (in contrast to local optima). We survey various global opti...
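One of the simplest heuristic strategies such surveys cover is multistart: run a local minimizer from many random initial weight vectors and keep the best result. The sketch below is a generic illustration of that idea, not this paper's own method; `mlp_error` refers to the hypothetical error function sketched earlier.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def multistart(f, dim, n_starts=25):
    """Local search from many random starts; keep the best minimum found.
    A crude but effective way to trade local optima for (approximately)
    global ones."""
    best = None
    for _ in range(n_starts):
        res = minimize(f, rng.standard_normal(dim), method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    return best.x, best.fun

# Usage: w, e = multistart(lambda w: mlp_error(w, X, T, 2), dim=9)
```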
A Global Optimization Method for Neural Network Training
In this paper, we present a new supervised learning method called NOVEL (Nonlinear Optimization Via External Lead) for training feed-forward neural networks. In general, such learning can be considered a nonlinear global optimization problem in which the goal is to minimize the nonlinear error function that spans the space of weights. NOVEL is a trajectory-based nonlinear optimization method ...
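The snippet is cut off before describing NOVEL's trajectory, so the following is only a generic two-phase trajectory search in a similar spirit (an exploratory trajectory driven partly by an external "lead", followed by local refinement). It should not be read as the NOVEL algorithm itself; every detail here is an assumption.

```python
import numpy as np
from scipy.optimize import minimize

def trajectory_then_refine(f, grad, x0, steps=2000, dt=0.01, amp=2.0):
    """Phase 1: follow a trajectory mixing gradient descent with an external
    oscillating lead so the search can pass over barriers, recording the best
    point seen.  Phase 2: refine that point with a local minimizer."""
    x, best_x, best_f = x0.copy(), x0.copy(), f(x0)
    for k in range(steps):
        lead = amp * np.sin(0.01 * k * np.arange(1, len(x) + 1))
        x = x - dt * grad(x) + dt * (lead - x)   # descend while tracking the lead
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return minimize(f, best_x, method="BFGS")    # local refinement

# Usage: res = trajectory_then_refine(E, grad_E, np.zeros(n_weights))
```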
Global Search Methods for Neural Network Training
In many cases, supervised neural network training using a backpropagation-based learning rule can be trapped in a local minimum of the error function. These training algorithms are local minimization methods and have no mechanism that allows them to escape the influence of a local minimum. The existence of local minima is due to the fact that the error function is the superposition of nonline...
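One escape mechanism associated with this line of work is deflection: transform the error function so that minima already found repel subsequent searches. The sketch below assumes one common formulation, which divides the error by a tanh factor per stored minimum; the name `deflect` and the parameter `lam` are illustrative, not necessarily the paper's exact construction.

```python
import numpy as np

def deflect(f, minima, lam=1.0):
    """Return a deflected objective F(x) = f(x) / prod_i tanh(lam*||x - x_i||).
    Each stored minimum x_i makes the denominator vanish there, so F 'lifts'
    the landscape around it and pushes a new descent run elsewhere (assumes
    f > 0 near its minima, as a typical sum-of-squared-errors is)."""
    def F(x):
        d = np.prod([np.tanh(lam * np.linalg.norm(x - m)) for m in minima])
        return f(x) / d if d > 0 else np.inf
    return F

# Usage: after a local method finds w1, minimize deflect(E, [w1]) to search
# for a different minimum of the original error E.
```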
Dimensionality Reduction by Supervised Neighbor Embedding Using Laplacian Search
Dimensionality reduction is an important issue for numerous applications, including biomedical image analysis and living-system analysis. Neighbor embedding methods, which represent global and local structure and can deal with multiple manifolds, such as the elastic embedding techniques, can go beyond traditional dimensionality reduction methods and find better optima. Nevertheless, existi...
Journal:
Volume, Issue:
Pages: -
Publication date: 2001