Search results for: ideal batch size

Number of results: 662,202

2010
Andrea Zanella

A batch is a group of nodes that each have to transmit a single packet to a common receiver in the shortest time. Most existing batch resolution algorithms assume immediate feedback and generally neglect the feedback time, which is considered much shorter than the packet transmission time. This assumption, however, fails to hold in many practical high-rate wireless systems, with the consequenc...
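As a rough illustration of why neglecting the feedback time matters, the sketch below compares the time to resolve a batch with and without a per-slot feedback overhead. The contention model (roughly e * n slots per batch, in the style of slotted-ALOHA analyses) and all parameter values are assumptions for illustration, not the algorithm or figures from the paper.

```python
# Illustrative sketch: expected time to resolve a batch of n nodes when every
# resolution slot carries a packet transmission plus a feedback message.
# The "~e * n slots" rule is a rough ALOHA-style estimate, not the paper's analysis.

import math

def resolution_time(n_nodes, packet_time, feedback_time, slots_per_node=math.e):
    slots = slots_per_node * n_nodes            # rough number of contention slots
    return slots * (packet_time + feedback_time)

n = 50
t_ideal = resolution_time(n, packet_time=1.0, feedback_time=0.0)   # feedback neglected
t_real  = resolution_time(n, packet_time=1.0, feedback_time=0.2)   # 20% feedback overhead
print(f"neglecting feedback: {t_ideal:.1f}, with feedback: {t_real:.1f} "
      f"({100 * (t_real / t_ideal - 1):.0f}% longer)")
```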

Journal: CoRR 2017
Stanislaw Jastrzebski, Zachary Kenton, Devansh Arpit, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos J. Storkey

We study the properties of the endpoint of stochastic gradient descent (SGD). By approximating SGD as a stochastic differential equation (SDE), we consider the Boltzmann-Gibbs equilibrium distribution of that SDE under the assumption of isotropic variance in loss gradients. Through this analysis, we find that three factors – learning rate, batch size and the variance of the loss gradients – cont...
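The relationship the abstract points to is often summarized as a gradient-noise scale that behaves like the learning rate divided by the batch size. The sketch below is a Monte Carlo check of that scaling on synthetic per-example gradients; the Gaussian noise model and all constants are my assumptions, not the paper's setup.

```python
# Sketch: the variance of a mini-batch SGD step scales like (lr**2 / B) * Var(grad),
# so the noise scale that shapes the stationary distribution behaves like lr / B.
# The Gaussian per-example gradient noise is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

def sgd_step_variance(lr, batch_size, grad_var=1.0, trials=100_000):
    # Per-example gradient noise is modelled as N(0, grad_var); averaging B samples
    # and scaling by lr gives the distribution of one SGD step around its mean.
    noise = rng.normal(0.0, np.sqrt(grad_var), size=(trials, batch_size)).mean(axis=1)
    return np.var(lr * noise)

for lr, B in [(0.1, 32), (0.1, 128), (0.4, 128)]:
    print(f"lr={lr:<4} B={B:<4} step variance ~ {sgd_step_variance(lr, B):.2e} "
          f"(theory lr^2/B = {lr**2 / B:.2e})")
```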

Journal: Mathematics 2023

Various transportation services exist, such as ride-sharing or shared taxis, in which customers are served in batches of flexible size and share the fees. In this study, we conducted an equilibrium analysis of a variable batch service model in which customers who observe no waiting information (an incomplete-information setting) can strategically select a batch size to maximize their individual utilities. We formulated a three-dimensional Markov chain and created a book-type transition ...

2005
Hei-Chia Wang, Yi-Shen Chen

Batch size is an important issue in a supply chain system. It affects stock levels, cost, and service quality. This paper presents a batch-size decision method to determine the optimal batch size for a supply chain system. The goal of this method is to minimize the total cost of the supply chain at a reasonable service level. The proposed method first applies a mathematical model to find the possible...
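The paper's own model is not reproduced here, so as a stand-in the sketch below uses the classic economic-order-quantity trade-off between setup cost and holding cost, which captures the same kind of total-cost minimization over batch size. The cost figures are invented for illustration and are not the paper's data.

```python
# Illustrative only: the textbook economic-order-quantity (EOQ) trade-off between
# setup/ordering cost and holding cost, a stand-in for the kind of total-cost
# minimisation the paper describes (NOT the paper's actual model).

import math

def total_cost(batch, demand, setup_cost, holding_cost):
    # orders per period * setup cost  +  average inventory * holding cost per unit
    return (demand / batch) * setup_cost + (batch / 2) * holding_cost

def eoq(demand, setup_cost, holding_cost):
    # Closed-form minimiser of total_cost with respect to the batch size.
    return math.sqrt(2 * demand * setup_cost / holding_cost)

D, S, H = 12_000, 150.0, 2.5      # assumed annual demand, setup cost, holding cost/unit
q_star = eoq(D, S, H)
print(f"optimal batch ~ {q_star:.0f} units, "
      f"total cost ~ {total_cost(q_star, D, S, H):.0f}")
```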

Journal: IEEE Transactions on Neural Networks 1996
Tom Heskes, Wim Wiegerinck

We study and compare different neural network learning strategies: batch-mode learning, online learning, cyclic learning, and almost-cyclic learning. Incremental learning strategies require less storage capacity than batch-mode learning. However, due to the arbitrariness in the presentation order of the training patterns, incremental learning is a stochastic process; whereas batch-mode learning...
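A toy contrast between the two strategies can make the distinction concrete: batch-mode updates use the full-dataset gradient and are deterministic, while online updates depend on the (possibly shuffled) presentation order of the patterns. The 1-D least-squares problem and step size below are illustrative assumptions, not the paper's experimental setup.

```python
# Toy contrast between batch-mode and online (incremental) gradient descent on a
# 1-D least-squares problem. Batch-mode uses the full-dataset gradient, so its
# trajectory is deterministic; online learning updates per pattern, so the result
# depends on the (shuffled) presentation order.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 3.0 * x + rng.normal(scale=0.1, size=200)
lr = 0.05

def batch_mode(epochs=50):
    w = 0.0
    for _ in range(epochs):
        w -= lr * np.mean((w * x - y) * x)       # gradient over the whole training set
    return w

def online(epochs=50, shuffle=True):
    w, order = 0.0, np.arange(len(x))
    for _ in range(epochs):
        if shuffle:
            rng.shuffle(order)                   # stochastic presentation order
        for i in order:
            w -= lr * (w * x[i] - y[i]) * x[i]   # per-pattern update
    return w

print("batch-mode:", batch_mode(), " online:", online())
```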

Journal: :Curationis 1999
Z Mvo J Dick K Steyn

PURPOSE Malnutrition, presenting as obesity in women and under-nutrition in children, is a prevalent problem in the squatter communities of Cape Town. Food habits are determined by a complex matrix of economic, social and cultural factors which need to be understood by health professionals prior to the implementation of strategies to improve the nutritional status of this community. This qualit...

Journal: CoRR 2017
Aditya Devarakonda, Maxim Naumov, Michael Garland

Training deep neural networks with Stochastic Gradient Descent, or its variants, requires careful choice of both learning rate and batch size. While smaller batch sizes generally converge in fewer training epochs, larger batch sizes offer more parallelism and hence better computational efficiency. We have developed a new training approach that, rather than statically choosing a single batch siz...
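One simple way to make the batch size dynamic, in the spirit of the abstract (though not necessarily the authors' algorithm), is a schedule that starts small for fast early progress and grows the batch for better parallel efficiency later. The doubling rule and epoch thresholds below are hypothetical.

```python
# Hypothetical sketch of a growing batch-size schedule: small batches early in
# training, larger batches later. The doubling rule, cap, and thresholds are
# assumptions for illustration, not the paper's method.

def batch_size_schedule(epoch, base_batch=64, max_batch=4096, double_every=10):
    size = base_batch * (2 ** (epoch // double_every))
    return min(size, max_batch)

for epoch in range(0, 60, 10):
    print(f"epoch {epoch:2d}: batch size {batch_size_schedule(epoch)}")
```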

Journal: JNW 2014
Chao Feng, Yang Xin, Hongliang Zhu, Yixian Yang

As an effective way to protect data privacy, homomorphic encryption has become a hot research topic. Existing homomorphic schemes are not truly practical due to their huge key sizes. In this paper, we present a simple weakly homomorphic encryption scheme using only elementary modular arithmetic over the integers rather than working with ideal lattices. Compared with DGHV's constr...
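To illustrate what "weakly homomorphic with elementary modular arithmetic over the integers" can look like, here is a toy DGHV-style symmetric scheme with deliberately tiny, completely insecure parameters. It sketches the general idea only and is not the scheme proposed in this paper.

```python
# Toy DGHV-style symmetric scheme over the integers, only to illustrate homomorphic
# addition and multiplication with elementary modular arithmetic. Parameters are tiny
# and insecure, and this is NOT the scheme proposed in the paper.

import secrets

P = 10_007                              # odd secret key (would be a huge integer in practice)

def encrypt(bit):
    q = secrets.randbelow(1_000) + 1
    r = secrets.randbelow(10)           # small noise, |2r| << P
    return bit + 2 * r + P * q          # a ciphertext is just an integer

def decrypt(c):
    return (c % P) % 2

a, b = 1, 0
ca, cb = encrypt(a), encrypt(b)
print(decrypt(ca + cb), (a + b) % 2)    # adding ciphertexts -> XOR of plaintext bits
print(decrypt(ca * cb), (a * b) % 2)    # multiplying -> AND (noise grows quickly)
```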

2007
Charles D. Immanuel, Ying Wang, Nicola Bianco

In this article, the control of particle size distribution (PSD) is discussed as a means for the inferential control of the rheology of emulsion polymers. A controllability assessment is presented through a consideration of the process mechanisms to illustrate the attainability or otherwise of bimodal PSD. The suitability of a batch-to-batch iterative feedback PSD control is demonstrated, which...
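Batch-to-batch feedback control typically corrects the input profile of the next batch using the tracking error measured in the previous batch. The sketch below shows a generic iterative-learning-control style update on a toy linear plant; the plant, gain, and target profile are assumptions, not the paper's PSD controller.

```python
# Generic batch-to-batch (iterative learning control style) update: the input profile
# for run k+1 is corrected by the output error observed in run k. The linear plant and
# the learning gain are illustrative assumptions.

import numpy as np

def plant(u):
    # Toy static plant: the output responds linearly to the input profile plus a bias.
    return 0.8 * u + 0.5

target = np.array([2.0, 3.0, 2.5, 1.5])     # desired output profile over the batch
u = np.zeros_like(target)                   # initial recipe
gain = 0.9                                  # batch-to-batch learning gain

for k in range(15):
    y = plant(u)                            # run batch k with the current recipe
    u = u + gain * (target - y)             # correct the recipe for the next batch

print("final error per point:", np.round(target - plant(u), 4))
```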

Journal: CoRR 2017
Lukas Balles, Javier Romero, Philipp Hennig

Mini-batch stochastic gradient descent and variants thereof have become standard for large-scale empirical risk minimization like the training of neural networks. These methods are usually used with a constant batch size chosen by simple empirical inspection. The batch size significantly influences the behavior of the stochastic optimization algorithm, though, since it determines the variance o...
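One way to couple the batch size to the gradient variance, in the spirit of this line of work (but not necessarily the authors' exact criterion), is to estimate the variance of the mini-batch gradient and enlarge the batch whenever the relative noise exceeds a target. Everything below (noise model, threshold, doubling rule) is an assumption for illustration.

```python
# Sketch of coupling the batch size to gradient noise: estimate the per-example
# gradient variance from the current mini-batch and grow the batch until the relative
# variance of the batch-mean gradient falls below a target. The rule and threshold
# are assumptions, not the paper's exact criterion.

import numpy as np

rng = np.random.default_rng(2)

def per_example_grads(batch_size, true_grad=1.0, noise_std=4.0):
    # Stand-in for back-propagated per-example gradients on a real model.
    return true_grad + rng.normal(0.0, noise_std, size=batch_size)

def suggest_batch_size(batch_size, target_rel_var=0.1, max_batch=8192):
    g = per_example_grads(batch_size)
    mean, var = g.mean(), g.var(ddof=1)
    rel_var = var / (batch_size * mean**2 + 1e-12)   # variance of the batch mean, relative
    if rel_var > target_rel_var:
        return min(2 * batch_size, max_batch)        # too noisy -> enlarge the batch
    return batch_size

B = 32
for step in range(6):
    B = suggest_batch_size(B)
    print(f"step {step}: batch size {B}")
```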

[Chart: number of search results per year]