Search results for: dichotomous coordinate descent (DCD)

Number of results: 77679

Journal: IEEE Transactions on Signal Processing, 2021

In this paper, we address the problem of extracting all super-Gaussian source signals from a linear mixture in which (i) the number of sources $K$ is less than the number of sensors $M$, and (ii) there are ...

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence, 2022

We address the poor scalability of learning algorithms for orthogonal recurrent neural networks via the use of stochastic coordinate descent on the orthogonal group, leading to a cost per iteration that increases linearly with the number of states. This contrasts with the cubic dependency typical of feasible algorithms such as Riemannian gradient descent, which prohibits large network architectures. Coordinate descent successively rotates two columns of the matr...
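To make the two-column rotation idea concrete, here is a minimal NumPy sketch (function name and the crude grid line search are our own, not the authors' update rule): one coordinate-descent step that applies a Givens rotation to a pair of columns of an orthogonal matrix, which preserves orthogonality exactly and touches only those two columns.

```python
import numpy as np

def givens_cd_step(W, i, j, loss, angles=np.linspace(-np.pi / 4, np.pi / 4, 9)):
    """One coordinate-descent step on the orthogonal group: rotate columns
    i and j of the orthogonal matrix W by the grid angle that most reduces
    `loss`. The rotation preserves W's orthogonality exactly."""
    c_i, c_j = W[:, i].copy(), W[:, j].copy()
    best_W, best_val = W, loss(W)
    for theta in angles:                     # crude stand-in for a line search
        c, s = np.cos(theta), np.sin(theta)
        W_try = W.copy()
        W_try[:, i] = c * c_i - s * c_j      # Givens rotation in the (i, j) plane
        W_try[:, j] = s * c_i + c * c_j
        val = loss(W_try)
        if val < best_val:
            best_W, best_val = W_try, val
    return best_W
```

Sweeping such steps over column pairs keeps the per-iteration matrix work linear in the state dimension (plus the cost of evaluating the loss), consistent with the scalability claim in the abstract.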

Journal: Journal of Machine Learning Research, 2010
Fang-Lan Huang Cho-Jui Hsieh Kai-Wei Chang Chih-Jen Lin

Maximum entropy (Maxent) is useful in natural language processing and many other areas. Iterative scaling (IS) methods are one of the most popular approaches to solve Maxent. With many variants of IS methods, it is difficult to understand them and see the differences. In this paper, we create a general and unified framework for iterative scaling methods. This framework also connects iterative s...
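As a concrete example of one member of this family, here is a minimal sketch of a Generalized Iterative Scaling (GIS) update for a conditional Maxent model (array shapes and names are our own; GIS assumes non-negative features whose per-example sums are bounded by a constant C, e.g. via a slack feature):

```python
import numpy as np

def gis_update(weights, feats, labels, C):
    """One GIS step for a conditional Maxent model p(y|x) ~ exp(w . f(x, y)).
    feats:  (n_samples, n_classes, n_features) feature values f(x, y) >= 0,
            with sum_j f_j(x, y) <= C for every (x, y).
    labels: (n_samples,) observed class indices."""
    n = feats.shape[0]
    scores = feats @ weights                             # (n, n_classes)
    p = np.exp(scores - scores.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)                    # model p(y|x)
    emp = feats[np.arange(n), labels].sum(axis=0)        # empirical feature counts
    model = np.einsum('nk,nkd->d', p, feats)             # expected counts under p
    eps = 1e-12                                          # guard against log(0)
    return weights + np.log((emp + eps) / (model + eps)) / C
```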

2008
Daniel M. Cer Daniel Jurafsky Christopher D. Manning

Minimum error rate training (MERT) is a widely used learning procedure for statistical machine translation models. We contrast three search strategies for MERT: Powell’s method, the variant of coordinate descent found in the Moses MERT utility, and a novel stochastic method. It is shown that the stochastic method obtains test set gains of +0.98 BLEU on MT03 and +0.61 BLEU on MT05. We also prese...
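The contrast between these search strategies can be sketched generically (this is our own toy illustration, not the Powell, Moses, or stochastic implementations evaluated in the paper): coordinate descent runs a 1-D search along each weight axis in turn, while a stochastic variant searches along random directions instead.

```python
import numpy as np

def tune_weights(score, w, n_rounds=100, grid=np.linspace(-1.0, 1.0, 21),
                 stochastic=False, seed=0):
    """Greedy line-search tuning of MT feature weights w, maximizing a
    corpus-level objective `score` (e.g., BLEU over n-best lists).
    stochastic=False: coordinate descent along the weight axes.
    stochastic=True:  line searches along random unit directions."""
    rng = np.random.default_rng(seed)
    d = len(w)
    best = score(w)
    for r in range(n_rounds):
        if stochastic:
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)          # random search direction
        else:
            u = np.zeros(d)
            u[r % d] = 1.0                  # cycle through coordinate axes
        for t in grid:                      # grid search stands in for MERT's
            cand = w + t * u                # exact piecewise-constant line search
            val = score(cand)
            if val > best:
                best, w = val, cand
    return w, best
```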

2016
Zhanxing Zhu Amos J. Storkey

We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method using adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method shares the efficiency and flexibility of block coordinate descent methods with the simplicity of prim...
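For intuition, here is a minimal sketch of a stochastic block primal-dual iteration on a bilinear saddle point (a generic PDHG-style scheme with fixed step sizes and our own names, not the adaptive updates proposed in the paper), written for the lasso saddle form min_x max_y <Ax - b, y> - ||y||^2/2 + lam*||x||_1:

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stoch_block_primal_dual(A, b, lam, blocks, iters=2000,
                            tau=0.01, sigma=0.01, seed=0):
    """blocks: list of index arrays partitioning the primal coordinates."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    x_bar = x.copy()                        # extrapolated primal point
    nb = len(blocks)
    for _ in range(iters):
        # dual ascent step: prox of ||y||^2/2 + <b, y>
        y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
        # primal descent on one randomly chosen coordinate block
        j = blocks[rng.integers(nb)]
        x_new = x.copy()
        x_new[j] = soft(x[j] - tau * (A[:, j].T @ y), tau * lam)
        # extrapolation amplified by the number of blocks
        x_bar = x_new + nb * (x_new - x)
        x = x_new
    return x
```

Because each primal step touches only one block, the block updates can be distributed across workers, which is the kind of parallel flexibility the abstract alludes to.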

2017
Shirin Elizabeth Khorsandi Emmanouil Giorgakis Hector Vilca-Melendez John O’Grady Michael Heneghan Varuna Aluvihare Abid Suddle Kosh Agarwal Krishna Menon Andreas Prachalias Parthi Srinivasan Mohamed Rela Wayel Jassem Nigel Heaton

AIM To identify objective predictive factors for donation after cardiac death (DCD) graft loss and, using those factors, to develop a donor-recipient risk stratification model that could be used to calculate a DCD risk index (DCD-RI) to help in prospective decision making on organ use. METHODS The model included objective data from a single institute's DCD database (2005-2013, n = 261). Uni...

Journal: CoRR, 2017
Pavel Dvurechensky Alexander Gasnikov Alexander Tiurin

In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial, or directional derivatives of the objective function. We introduce a unifying framework, which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on a...
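One concrete instance such a framework covers is accelerated randomized coordinate descent; the sketch below (an APPROX-style scheme with uniform sampling and an exact oracle, our own simplification rather than the paper's inexact-oracle constructions) minimizes the quadratic f(x) = x'Ax/2 - b'x:

```python
import numpy as np

def accel_rand_cd(A, b, iters=5000, seed=0):
    """Accelerated randomized coordinate descent for f(x) = 0.5 x'Ax - b'x,
    with A symmetric positive definite and uniform coordinate sampling."""
    rng = np.random.default_rng(seed)
    n = len(b)
    L = np.diag(A)                        # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    z = np.zeros(n)
    theta = 1.0 / n
    for _ in range(iters):
        y = (1.0 - theta) * x + theta * z
        i = rng.integers(n)
        g = A[i] @ y - b[i]               # i-th partial derivative at y
        dz = -g / (n * theta * L[i])
        z[i] += dz                        # coordinate step on z
        x = y
        x[i] += n * theta * dz            # x_{k+1} = y_k + n*theta*(z_{k+1} - z_k)
        # theta_{k+1} solves theta^2 = (1 - theta) * theta_k^2
        theta = (np.sqrt(theta**4 + 4.0 * theta**2) - theta**2) / 2.0
    return x
```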

Journal: Research in Developmental Disabilities, 2015
Faiçal Farhat Ines Hsairi Hamza Baiti John Cairney Radhouane Mchirgui Kaouthar Masmoudi Johnny Padulo Chahinez Triki Wassim Moalla

Children with developmental coordination disorder (DCD) have been shown to be less physically fit when compared to their typically developing peers. The purpose of the present study was to examine the relationships among body composition, physical fitness and exercise tolerance in children with and without DCD. Thirty-seven children between the ages of 7 and 9 years participated in this study. ...

2013
Yatao Bian Xiong Li Mingqi Cao Yuncai Liu

Parallel coordinate descent algorithms are emerging with the growing demand for large-scale optimization. In general, previous algorithms are usually limited by divergence under a high degree of parallelism (DOP), or need data pre-processing to avoid divergence. To better exploit parallelism, we propose a coordinate-descent-based parallel algorithm that requires no data pre-processing, termed Bundl...
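A generic sketch of the underlying tension (our own toy PCDM-style step, not the algorithm proposed in the paper): P coordinates updated synchronously from the same stale residual can overshoot, so each exact single-coordinate step is damped by 1/P.

```python
import numpy as np

def parallel_cd_step(A, b, x, P, rng):
    """One synchronous parallel coordinate-descent step for
    f(x) = 0.5 * ||Ax - b||^2: pick P coordinates, compute all P updates
    from the same (stale) residual, and damp each by 1/P so the combined
    move cannot diverge at a high degree of parallelism."""
    n = A.shape[1]
    S = rng.choice(n, size=P, replace=False)
    r = A @ x - b                          # residual shared by all P workers
    for i in S:                            # conceptually executed in parallel
        g = A[:, i] @ r                    # partial derivative w.r.t. x_i
        L = A[:, i] @ A[:, i]              # coordinate Lipschitz constant
        x[i] -= g / (L * P)                # damped exact coordinate step
    return x
```

Removing the 1/P damping recovers naive "shotgun"-style parallelism, which is exactly the regime where divergence appears as the DOP grows.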

Journal: SIAM Journal on Optimization, 2015
Cong D. Dang Guanghui Lan

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate the block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce ...
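A minimal sketch of the block-coordinate ingredient (our own illustration using entropy as the distance-generating function on simplex-constrained blocks, not the authors' exact scheme): each iteration draws a stochastic gradient, picks one block at random, and applies a mirror (multiplicative) update to that block only.

```python
import numpy as np

def sbmd_step(x, blocks, stoch_grad, gamma, rng):
    """One stochastic block mirror descent step. Each block of x lies on a
    probability simplex; the entropy prox gives a multiplicative update."""
    g = stoch_grad(x)                        # unbiased (sub)gradient estimate
    j = blocks[rng.integers(len(blocks))]    # index set of the chosen block
    x = x.copy()
    xb = x[j] * np.exp(-gamma * g[j])        # exponentiated-gradient step
    x[j] = xb / xb.sum()                     # stay on the block's simplex
    return x
```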

Chart of the number of search results per year
