Search results for: SGD

Number of results: 1169

Journal: Avicenna Journal of Phytomedicine
Seyed Hadi Mousavi, Department of Pharmacology, Faculty of Medicine, Mashhad University of Medical Sciences, Mashhad, Iran; Behnaz Naghizade, Payam Noor Shargh, Tehran University of Basic Sciences, Tehran, Iran; Solmaz Pourgonabadi, Department of Pharmacology, School of Medicine, Mashhad University of Medical Sciences, Mashhad, Iran; Ahmad Ghorbani, Pharmacological Research Center of Medicinal Plants, School of Medicine, Mashhad University of Medical Sciences, Mashhad, Iran

Objective: Oxidative stress plays a key role in the pathophysiology of brain ischemia and neurodegenerative disorders. Previous studies indicated that Viola tricolor and Viola odorata are rich sources of antioxidants. This study aimed to determine whether these plants protect neurons against serum/glucose deprivation (SGD)-induced cell death in an in vitro model of ischemia and neurodegeneration...

Journal: ACM Transactions on Architecture and Code Optimization 2020

Objective: Oxidative stress is associated with the pathogenesis of brain ischemia and other neurodegenerative disorders. Previous research has shown the antioxidant activity of Viola odorata L. In this project, we studied the neuroprotective and reactive oxygen species (ROS) scavenging activities of the methanol (MeOH) extract and other fractions isolated from ...

2014
Jianhui Chen, Tianbao Yang, Shenghuo Zhu

We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD counterparts, as its iterative projection step (an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the project...
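As an illustrative sketch only (not the paper's actual algorithm, which is truncated above): a projected-SGD step with a rank-one stochastic gradient over the PSD cone could look like the following. The rank-one gradient `v v^T` and the eigenvalue-clipping projection are assumptions for the sake of a runnable example; LR-SGD's point is that the low-rank structure makes the projection cheaper than the full eigendecomposition used here.

```python
import numpy as np

def psd_project(X):
    # Project a symmetric matrix onto the PSD cone by clipping
    # negative eigenvalues (full eigendecomposition, for simplicity).
    w, V = np.linalg.eigh(X)
    return (V * np.clip(w, 0.0, None)) @ V.T

def lr_sgd_step(X, v, eta):
    # One step with a rank-one stochastic gradient v v^T, followed by
    # projection back onto the PSD cone.
    return psd_project(X - eta * np.outer(v, v))

rng = np.random.default_rng(0)
X = np.eye(3)
for _ in range(10):
    X = lr_sgd_step(X, rng.standard_normal(3), 0.1)
```

After any number of such steps, the iterate stays symmetric and positive semidefinite, which is the constraint the projection enforces.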

Journal: :CoRR 2015
Andrew J. R. Simpson

Stochastic Gradient Descent (SGD) is arguably the most popular of the machine learning methods applied to training deep neural networks (DNN) today. It has recently been demonstrated that SGD can be statistically biased so that certain elements of the training set are learned more rapidly than others. In this article, we place SGD into a feedback loop whereby the probability of selection is pro...
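The abstract is cut off before the exact selection rule, so the following is only a hedged sketch of the general idea: SGD with a feedback loop in which each training example's sampling probability is proportional to its current error, so poorly fit examples are drawn more often. The proportional-to-squared-error rule is an assumption for illustration.

```python
import numpy as np

# Toy problem: noiseless linear regression, solved by SGD whose sample
# selection is fed back from the current per-example squared errors.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                       # noiseless targets
w = np.zeros(3)
for _ in range(2000):
    errs = (X @ w - y) ** 2 + 1e-12  # small floor keeps p well-defined
    p = errs / errs.sum()            # feedback: hard examples first
    i = rng.choice(len(X), p=p)
    w -= 0.02 * 2 * (X[i] @ w - y[i]) * X[i]
```

On this realizable problem the feedback-sampled iterate converges to the true weights; note that without an importance-weight correction such sampling is biased in general.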

Journal: :IEEE journal on selected areas in information theory 2021

In this paper, we propose and analyze SQuARM-SGD, a communication-efficient algorithm for decentralized training of large-scale machine learning models over a network. Each node performs a fixed number of local SGD steps using Nesterov's momentum and then sends sparsified and quantized updates to its neighbors, regulated by a locally computable triggering criterion. We provide convergence guarantees for our general ...
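To make "sparsified and quantized updates" concrete, here is a minimal sketch of one common compression pipeline, top-k sparsification followed by scaled sign quantization. Both choices are assumptions for illustration; the paper's exact sparsifier and quantizer may differ.

```python
import numpy as np

def sparsify_topk(u, k):
    # Keep only the k largest-magnitude entries of an update vector.
    out = np.zeros_like(u)
    idx = np.argsort(np.abs(u))[-k:]
    out[idx] = u[idx]
    return out

def quantize_sign(u):
    # Replace surviving entries by their sign, scaled by the mean
    # magnitude of the nonzero entries (a crude 1-bit-style quantizer).
    scale = np.abs(u).sum() / max(np.count_nonzero(u), 1)
    return scale * np.sign(u)

u = np.array([0.9, -0.1, 0.05, -1.2, 0.3])
msg = quantize_sign(sparsify_topk(u, 2))   # what a node would transmit
```

Only two entries (and one scale) need to be communicated per update, which is the source of the bandwidth savings.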

Journal: :CoRR 2017
Daning Cheng, Shigang Li, Yunquan Zhang

Stochastic gradient descent (SGD) is a popular stochastic optimization method in machine learning. Traditional parallel SGD algorithms, e.g., SimuParallel SGD [1], often require all nodes to have the same performance or to consume equal quantities of data. However, these requirements are difficult to satisfy when the parallel SGD algorithms run in a heterogeneous computing environment; low-perf...
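A minimal simulation of the SimuParallel-style scheme mentioned above: each "node" runs plain SGD on its own equal data shard, and the resulting models are averaged once at the end. This equal-shard, equal-work setup is exactly the assumption that a heterogeneous environment breaks; the toy problem and all parameters below are illustrative.

```python
import numpy as np

# Data-parallel SGD with one-shot model averaging on a toy
# noiseless linear-regression problem.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))
w_true = np.array([0.5, -1.0, 2.0])
y = X @ w_true
models = []
for shard in np.array_split(np.arange(100), 4):   # 4 equal shards
    w = np.zeros(3)
    for _ in range(400):                          # equal local work
        i = rng.choice(shard)
        w -= 0.02 * 2 * (X[i] @ w - y[i]) * X[i]
    models.append(w)
w_avg = np.mean(models, axis=0)                   # one-shot average
```

Because every shard sees a realizable problem and does the same amount of work, the averaged model is close to the true weights; with unequal node speeds or data volumes, the per-shard models would drift apart and the plain average would degrade.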

Journal: :CoRR 2015
Shen-Yi Zhao, Wu-Jun Li

Stochastic gradient descent (SGD) and its variants have become increasingly popular in machine learning due to their efficiency and effectiveness. To handle large-scale problems, researchers have recently proposed several parallel SGD methods for multicore systems. However, existing parallel SGD methods cannot achieve satisfactory performance in real applications. In this paper, we propose a f...

2016
Bo Han, Ivor W. Tsang, Ling Chen

The convergence of Stochastic Gradient Descent (SGD) using convex loss functions has been widely studied. However, vanilla SGD methods using convex losses cannot perform well with noisy labels, which adversely affect the update of the primal variable in SGD methods. Unfortunately, noisy labels are ubiquitous in real world applications such as crowdsourcing. To handle noisy labels, in this paper...

Chart of the number of search results per year

Click on the chart to filter the results by publication year