Search results for: weight update

Number of results: 417945

2011
Binxuan SUN Jiarong LUO Shuangbao SHU Nan YU

Approaches to combining the techniques used by ensemble learning methods are discussed. The randomness used by Bagging and Random Forests is introduced into AdaBoost to obtain robust performance in noisy conditions. When the randomness introduced into AdaBoost equals 100, the proposed algorithm becomes a Random Forest with a weight update technique. Approaches are discussed to im...

Journal: :IEEE Journal on Emerging and Selected Topics in Circuits and Systems 2021

Artificial Neural Networks (ANNs) are known as state-of-the-art techniques in Machine Learning (ML) and have achieved outstanding results in data-intensive applications such as recognition, classification, and segmentation. These networks mostly use deep convolutional and/or fully connected layers with many filters in each layer, demanding a large amount of data and tunable hyperparameters to achieve competitive...

Journal: :The Journal of Korean Institute of Communications and Information Sciences 2011

Journal: :the modares journal of electrical engineering 2003
hamidreza noorian kamal mohamedpour

The future increase of mobile communication subscribers will require a great capacity expansion of cellular systems. In order to accommodate the increasing number of subscribers, the cell size will have to be much smaller than the current size. Therefore, it is predictable that the location updating and paging procedures will produce a major part of the signaling traffic in these networks. This pape...

2011
Yuke FANG Yan FU Chongjing SUN Junlin ZHOU

From the family of corrective boosting algorithms (i.e., AdaBoost, LogitBoost) to totally corrective algorithms (i.e., LPBoost, TotalBoost, SoftBoost, ERLPBoost), we analyze these methods of sample weight updating. Corrective boosting algorithms update the sample weights according to the last hypothesis; in contrast, totally corrective algorithms update the weights with the best of all weak classifi...
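The "corrective" update contrasted in this abstract can be sketched with AdaBoost's standard rule, where each sample's weight depends only on the last hypothesis. The toy labels and predictions below are illustrative, not taken from the paper:

```python
import math

def adaboost_weight_update(weights, labels, predictions):
    """One corrective round: up-weight misclassified samples,
    down-weight correctly classified ones, then renormalize."""
    # Weighted error of the last hypothesis.
    err = sum(w for w, y, p in zip(weights, labels, predictions) if y != p)
    err = min(max(err, 1e-10), 1 - 1e-10)      # guard against 0 or 1
    alpha = 0.5 * math.log((1 - err) / err)    # hypothesis weight
    new_w = [w * math.exp(-alpha * y * p)
             for w, y, p in zip(weights, labels, predictions)]
    total = sum(new_w)
    return [w / total for w in new_w], alpha

weights = [0.25] * 4
labels      = [+1, +1, -1, -1]
predictions = [+1, -1, -1, -1]   # the last hypothesis errs on sample 2
weights, alpha = adaboost_weight_update(weights, labels, predictions)
print(weights)  # the misclassified sample now carries the largest weight
```

A totally corrective method would instead re-optimize the weights against all weak classifiers seen so far, which is the distinction the abstract draws.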

1993
Genevieve B. Orr Todd K. Leen

The rate of convergence for gradient descent algorithms, both batch and stochastic, can be improved by including in the weight update a "momentum" term proportional to the previous weight update. Several authors [1, 2] give conditions for convergence of the mean and covariance of the weight vector for momentum LMS with a constant learning rate. However, stochastic algorithms require that the learn...
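The momentum update described in this abstract can be sketched in a few lines; the learning rate, momentum value, and the quadratic objective below are illustrative choices, not from the paper:

```python
def momentum_step(w, grad, velocity, lr=0.1, momentum=0.9):
    """One momentum update: the new step is the gradient step plus a
    term proportional to the previous weight update (the velocity)."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (gradient 2w) starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(300):
    w, v = momentum_step(w, 2 * w, v)
print(w)  # w has decayed close to the minimum at 0
```

With momentum = 0 this reduces to plain gradient descent; the extra term smooths oscillations and speeds convergence along shallow directions.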

2006
Chris J.B. Macnab

This paper addresses the problem of weight drift in direct adaptive control using associative memories. Weight drift can pose significant problems for systems that exhibit underdamped oscillations. Simulations show how the robust weight update method e-modification can result in bursting, an unacceptable mode of behavior where unexpected large increases in state error occur. Performance may be ...

2014
Muhammad Hanif Md. Jashim Uddin Md Abdul Alim

In this paper, we implement the method of steepest descent in single- and multilayer feedforward artificial neural networks. In all previous works, the weight update equations for single- or multilayer feedforward artificial neural networks have been calculated by choosing a single activation function for the various processing units in the network. We first calculate the total error function ...

Journal: :JCP 2013
Dan Wang Jilan Chen Wenbing Zhao

MapReduce is a software framework for easily writing applications that process vast amounts of data on large clusters of commodity hardware. In order to achieve better task allocation and load balancing, the MapReduce work mode and the task scheduling algorithm of the Hadoop platform are analyzed in this paper. Given that the number of tasks of the smaller-weight job is mo...

Chart of the number of search results per year

Click on the chart to filter the results by publication year