Search results for: adaptive learning rate
Number of results: 1,694,493
A normalized algorithm for on-line adaptation of a recurrent perceptron is derived. The algorithm builds upon the normalized backpropagation (NBP) algorithm for feedforward neural networks, and provides an adaptive learning rate and normalization for a recurrent perceptron learning algorithm. The algorithm is based upon local linearization about the current point in the state-space of the netwo...
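For intuition, here is a minimal NLMS-style sketch of a normalized, input-power-dependent learning rate for a single on-line neuron; this is not the NBP derivation itself, and the names and defaults below are illustrative only.

```python
import numpy as np

def nlms_neuron_step(w, x, d, mu=1.0, eps=1e-8):
    """One on-line update of a single linear neuron with a normalized
    (input-power dependent) learning rate, in the spirit of NLMS/NBP.

    w   : current weight vector
    x   : current input vector
    d   : desired output
    mu  : base step size (0 < mu < 2 for NLMS-type stability)
    eps : regularizer preventing division by zero
    """
    y = w @ x                    # neuron output, linearized about the current point
    e = d - y                    # instantaneous error
    eta = mu / (eps + x @ x)     # adaptive, normalized learning rate
    return w + eta * e * x       # gradient step on the squared error
```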
We study the rates of convergence in classification error achievable by active learning in the presence of label noise. Additionally, we study the more general problem of active learning with a nested hierarchy of hypothesis classes, and propose an algorithm whose error rate provably converges to the best achievable error among classifiers in the hierarchy at a rate adaptive to both the complex...
We study the rates of convergence in generalization error achievable by active learning under various types of label noise. Additionally, we study the general problem of model selection for active learning with a nested hierarchy of hypothesis classes, and propose an algorithm whose error rate provably converges to the best achievable error among classifiers in the hierarchy at a rate adaptive ...
The aim of the current study was to investigate the relationship among EFL learners' learning style preferences, use of language learning strategies, and autonomy. A total of 148 male and female learners, between the ages of 18 and 30, majoring in English literature and English translation at Islamic Azad University, Central Tehran, were randomly selected. A package of three questionnaires was ad...
The recursive least squares (RLS) learning approach is proposed for controlling the learning rate in parallel principal subspace analysis (PSA) and in a wide class of principal component analysis (PCA) associated algorithms with a quasi-parallel extraction ability. The purpose is to provide a useful tool for applications where the learning process has to be repeated in an on-line self-adaptive ...
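A compact illustration of an RLS-style, self-adaptive gain in place of a hand-tuned learning rate, applied here to Oja's rule for the first principal component; the abstract covers a broader PSA/PCA class, so this pairing is an assumption made for illustration.

```python
import numpy as np

def oja_rls_step(w, x, p):
    """One on-line step of Oja's rule for the leading principal component,
    with an RLS-style gain replacing a fixed learning rate.

    w : current direction estimate (unit norm)
    x : current data sample
    p : running sum of squared projections (RLS "power" term)
    """
    y = w @ x                         # projection onto the current direction
    p = p + y * y                     # recursive power estimate
    eta = 1.0 / p                     # self-adaptive learning rate, no tuning
    w = w + eta * y * (x - y * w)     # Oja update with the RLS gain
    return w / np.linalg.norm(w), p   # re-normalize for numerical safety
```

Starting from a random unit-norm `w` and a small positive `p`, repeated calls on streaming samples drive `w` toward the leading eigenvector without any learning-rate tuning.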
In this paper, we study the ability of learning automata-based schemes to escape from local minima when standard backpropagation (BP) fails to find the global minimum. It is demonstrated through simulation that learning automata-based schemes, compared to other schemes such as SAB, SuperSAB, fuzzy BP, the adaptive steepness method, and the variable learning rate method, have a higher ability to escape f...
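For reference, one of the baselines named above, the classic variable-learning-rate heuristic, sketched in generic form; the learning-automata scheme itself is not shown, and the function arguments are illustrative.

```python
def variable_lr_step(params, grad_fn, loss_fn, lr, inc=1.05, dec=0.7, max_rise=1.04):
    """One step of the variable-learning-rate BP heuristic: grow the rate while
    the error keeps falling, shrink it and reject the step when the error rises
    by more than a small tolerance.

    params  : list of parameter arrays
    grad_fn : function returning the list of gradients at params
    loss_fn : function returning the scalar training error at params
    """
    old_loss = loss_fn(params)
    grads = grad_fn(params)
    trial = [p - lr * g for p, g in zip(params, grads)]
    new_loss = loss_fn(trial)
    if new_loss > max_rise * old_loss:       # step made things clearly worse
        return params, lr * dec              # reject step, reduce learning rate
    lr = lr * inc if new_loss < old_loss else lr
    return trial, lr
```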
The architecture of a forecasting adaptive wavelet-neuro-fuzzy network and its learning algorithm for solving nonstationary process forecasting tasks are proposed. The learning algorithm is optimal in terms of convergence rate and allows tuning both the synaptic weights and the dilation and translation parameters of the wavelet activation functions. The simulation of the developed wavelet-neuro-fuzzy n...
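As a rough sketch of what a trainable wavelet activation looks like, assuming a Mexican-hat mother wavelet (the snippet does not say which wavelet is used), with dilation a and translation b treated as learnable parameters alongside the synaptic weights:

```python
import numpy as np

def mexican_hat(t):
    """Mexican-hat mother wavelet, a common choice for wavelet neurons."""
    return (1.0 - t**2) * np.exp(-0.5 * t**2)

def wavelet_neuron(x, w, a, b):
    """Output of a single wavelet neuron: the mother wavelet applied to the
    weighted input after translation (b) and dilation (a). Both a and b are
    trainable, alongside the synaptic weights w.
    """
    u = w @ x                        # synaptic (linear) part
    return mexican_hat((u - b) / a)  # dilated and translated wavelet activation
```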
Training predictive models with stochastic gradient descent is widespread practice in machine learning. Recent advances improve on the basic technique in two ways: adaptive learning rates are widely used for deep learning, while acceleration techniques like stochastic average gradient and variance-reduced gradient descent can achieve a linear convergence rate. We investigate the utility of both types of...
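One concrete way the two ideas can be combined is sketched below as SVRG-style variance reduction paired with an AdaGrad-style per-coordinate step size; this is an assumed pairing for illustration, not necessarily the paper's exact method.

```python
import numpy as np

def svrg_adagrad(x0, grad_i, n, n_epochs=10, m=None, lr=0.1, eps=1e-8):
    """Variance-reduced SGD (SVRG-style) with a per-coordinate adaptive
    learning rate (AdaGrad-style). Names and defaults are illustrative.

    x0           : initial parameter vector
    grad_i(x, i) : gradient of the i-th component function at x
    n            : number of component functions
    """
    x = x0.copy()
    m = m or 2 * n                                # inner-loop length
    accum = np.zeros_like(x)                      # per-coordinate second moments
    for _ in range(n_epochs):
        snapshot = x.copy()
        full_grad = np.mean([grad_i(snapshot, i) for i in range(n)], axis=0)
        for _ in range(m):
            i = np.random.randint(n)
            # variance-reduced stochastic gradient (control variate at snapshot)
            g = grad_i(x, i) - grad_i(snapshot, i) + full_grad
            accum += g * g
            x -= lr * g / (np.sqrt(accum) + eps)  # adaptive per-coordinate step
    return x
```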
The architecture of an adaptive wavelet-neuro-fuzzy network and its learning algorithm for solving nonstationary process forecasting and emulation tasks are proposed. The learning algorithm is optimal in terms of convergence rate and allows tuning both the synaptic weights and the dilation and translation parameters of the wavelet activation functions. The simulation of the developed wavelet-neuro-fuzzy ...