Search results for: informative dropout

Number of results: 31950

Journal: Archives of Pediatrics & Adolescent Medicine 2011
Suzanne E U Kerns, Michael D Pullmann, Sarah Cusworth Walker, Aaron R Lyon, T J Cosgrove, Eric J Bruns

OBJECTIVE To determine the association between use of school-based health centers (SBHCs) and school dropout. DESIGN Quasi-experimental longitudinal analysis of a retrospective student cohort, with SBHC use as the independent variable. We statistically controlled for dropout risk and used propensity score regression adjustment to control for several factors associated with SBHC use. SETTING...
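
The propensity-score regression adjustment mentioned in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' actual analysis: it assumes a hypothetical pandas DataFrame `df` with a binary `sbhc_use` treatment column, a binary `dropout` outcome column, and numeric baseline covariates (the column names are placeholders), estimates a propensity score, and includes it as a covariate in the outcome regression.

```python
# Minimal sketch of propensity-score regression adjustment (illustrative only).
# Assumes a hypothetical DataFrame `df` with columns: sbhc_use (0/1 treatment),
# dropout (0/1 outcome), and numeric baseline covariates such as risk_score, age, gender.
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

covariates = ["risk_score", "age", "gender"]          # hypothetical covariate names

def regression_adjusted_effect(df: pd.DataFrame) -> float:
    # Step 1: model the probability of SBHC use given baseline covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["sbhc_use"])
    df = df.assign(propensity=ps_model.predict_proba(df[covariates])[:, 1])

    # Step 2: regress the dropout outcome on treatment plus the propensity score.
    X = sm.add_constant(df[["sbhc_use", "propensity"]])
    outcome_model = sm.Logit(df["dropout"], X).fit(disp=0)

    # The coefficient on sbhc_use is the adjusted association (log-odds scale).
    return outcome_model.params["sbhc_use"]
```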

Journal: CoRR 2017
Dimity Miller, Lachlan Nicholson, Feras Dayoub, Niko Sünderhauf

Dropout Variational Inference, or Dropout Sampling, has been recently proposed as an approximation technique for Bayesian Deep Learning and evaluated for image classification and regression tasks. This paper investigates the utility of Dropout Sampling for object detection for the first time. We demonstrate how label uncertainty can be extracted from a state-of-the-art object detection system v...
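
As a rough illustration of the Dropout Sampling idea, the sketch below keeps dropout active at test time and aggregates several stochastic forward passes to obtain a predictive mean and a per-class variance. It is a generic classification example in PyTorch with placeholder layer sizes, not the object-detection pipeline described in the paper.

```python
# Minimal sketch of Monte Carlo dropout sampling for predictive uncertainty.
# Generic classifier, not the object-detection system described in the paper.
import torch
import torch.nn as nn

model = nn.Sequential(                # placeholder architecture
    nn.Linear(128, 64), nn.ReLU(),
    nn.Dropout(p=0.5),                # dropout layer kept stochastic at test time
    nn.Linear(64, 10),
)

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 30):
    model.train()                     # keep dropout active (sampling mode)
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )                             # shape: (n_samples, batch, classes)
    mean = probs.mean(dim=0)          # predictive class probabilities
    var = probs.var(dim=0)            # per-class variance as an uncertainty proxy
    return mean, var

x = torch.randn(4, 128)               # dummy batch of 4 feature vectors
mean, var = mc_dropout_predict(model, x)
```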

2013
Pierre Baldi, Peter J. Sadowski

Dropout is a relatively new algorithm for training neural networks that relies on stochastically “dropping out” neurons during training to avoid the co-adaptation of feature detectors. We introduce a general formalism for studying dropout on either units or connections, with arbitrary probability values, and use it to analyze the averaging and regularizing properties of dropout in bot...
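
The distinction between dropping units and dropping individual connections can be made concrete with Bernoulli masks. The NumPy sketch below is a small illustration under assumed shapes and retain probabilities, not the paper's formalism: unit dropout zeroes whole activations, while connection dropout (DropConnect-style) zeroes individual weights.

```python
# Minimal sketch: dropout on units vs. dropout on connections (illustrative).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(16)          # activations of a layer with 16 units
W = rng.standard_normal((8, 16))     # weights of the next layer (8 outputs)

p_unit = 0.5                         # probability of dropping a unit
p_conn = 0.3                         # probability of dropping a connection

# Unit dropout: a single Bernoulli mask over the 16 input units.
unit_mask = rng.random(16) >= p_unit
y_unit = W @ (x * unit_mask) / (1.0 - p_unit)     # rescale to keep the expectation

# Connection dropout: an independent Bernoulli mask per weight entry.
conn_mask = rng.random(W.shape) >= p_conn
y_conn = (W * conn_mask) @ x / (1.0 - p_conn)
```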

2017
Yarin Gal, Jiri Hron, Alex Kendall

• Gal and Ghahramani (2015) reinterpreted dropout regularisation as approximate inference in BNNs.
• Dropout probabilities p_l are variational parameters of the approximate posterior q_θ(ω) = ∏_l q_{M_l, p_l}(W_l), where W_l = M_l · diag(z_l) and z_{l,i} ~ Bernoulli(1 − p_l) i.i.d.
• The Concrete distribution (Maddison et al.; Jang et al.) relaxes the Categorical distribution to obtain gradients with respect to the probability vector – E...
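
A rough sketch of the Concrete relaxation that makes dropout probabilities learnable is given below. This is a simplified illustration with assumed shapes and a placeholder temperature, not the authors' implementation: a Uniform sample is pushed through a sigmoid to produce a soft, differentiable dropout mask whose drop probability p can receive gradients.

```python
# Minimal sketch of a Concrete (relaxed Bernoulli) dropout mask (illustrative).
import torch

def concrete_dropout_mask(p: torch.Tensor, shape, temperature: float = 0.1,
                          eps: float = 1e-7) -> torch.Tensor:
    """Soft dropout mask; p is the (learnable) drop probability."""
    u = torch.rand(shape)                                   # u ~ Uniform(0, 1)
    drop_logit = (torch.log(p + eps) - torch.log(1 - p + eps)
                  + torch.log(u + eps) - torch.log(1 - u + eps))
    z_drop = torch.sigmoid(drop_logit / temperature)        # soft "drop" indicator
    return 1.0 - z_drop                                     # soft "keep" mask

# Usage: p is a learnable parameter, so gradients flow into the dropout rate.
p = torch.tensor(0.2, requires_grad=True)
x = torch.randn(4, 16)
x_dropped = x * concrete_dropout_mask(p, x.shape) / (1.0 - p)   # rescaled activations
x_dropped.sum().backward()            # p.grad is populated through the relaxation
```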

Journal: CoRR 2017
Konrad Zolna, Devansh Arpit, Dendi Suhubdy, Yoshua Bengio

Recurrent neural networks (RNNs) form an important class of architectures among neural networks useful for language modeling and sequential prediction. However, optimizing RNNs is known to be harder than optimizing feed-forward neural networks. A number of techniques have been proposed in the literature to address this problem. In this paper we propose a simple technique called fraternal dropout that t...
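
The core idea of fraternal dropout, as described in the abstract, can be sketched as two forward passes of the same network under different dropout masks plus a penalty that keeps their predictions close. The PyTorch snippet below is a generic illustration with placeholder sizes and a hypothetical kappa weight, not the paper's exact language-modeling setup.

```python
# Minimal sketch of a fraternal-dropout style objective (illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(                  # placeholder feed-forward stand-in for an RNN
    nn.Linear(32, 64), nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 10),
)

def fraternal_loss(x, targets, kappa: float = 0.1):
    logits_a = model(x)                 # first pass, one dropout mask
    logits_b = model(x)                 # second pass, an independent dropout mask
    ce = 0.5 * (F.cross_entropy(logits_a, targets)
                + F.cross_entropy(logits_b, targets))
    consistency = F.mse_loss(logits_a, logits_b)   # penalize mask-dependent drift
    return ce + kappa * consistency

x = torch.randn(8, 32)
targets = torch.randint(0, 10, (8,))
loss = fraternal_loss(x, targets)
loss.backward()
```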

2015
Haibing Wu, Xiaodong Gu

Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking an activation based on a multinomial distribution at training time. In light of this insight, we advoc...
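
Max-pooling dropout amounts to applying a Bernoulli mask inside each pooling region before taking the max, which induces a multinomial choice over the region's activations. The NumPy sketch below illustrates this on a single 1-D pooling window with assumed values; it is not the paper's implementation.

```python
# Minimal sketch of max-pooling dropout on one pooling region (illustrative).
import numpy as np

rng = np.random.default_rng(0)
region = np.array([0.2, 1.5, 0.7, 1.1])   # non-negative activations in one pooling window
p_drop = 0.5

# Training time: drop units inside the region, then max-pool what survives.
keep_mask = rng.random(region.shape) >= p_drop
pooled_train = (region * keep_mask).max()  # effectively a multinomial pick:
                                           # larger activations are more likely to win

# Test time: plain max-pooling over the full region.
pooled_test = region.max()
```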

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence 2021

2010
Juliana Guimarães, Breno Sampaio, Yony Sampaio

In this paper we analyze the major determinants of university enrollment and dropout in Brazil. The econometric model consists of two simultaneous equations. The first equation determines whether the student was accepted at the university or not; the second determines the student's decision to drop out of higher education or not, given that they were accepted at the university. Gender, age and marriage ...
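
The two simultaneous equations (acceptance, then dropout conditional on acceptance) resemble a selection model. The sketch below is a simplified two-stage, Heckman-style illustration on a hypothetical DataFrame, not the authors' econometric specification: a probit for acceptance, an inverse Mills ratio, and a probit for dropout on the accepted subsample.

```python
# Minimal two-stage sketch of an acceptance/dropout selection model (illustrative).
# Assumes a hypothetical DataFrame `df` with: accepted (0/1), dropout (0/1, only
# meaningful for accepted students), and numeric covariates gender, age, married.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

covariates = ["gender", "age", "married"]      # hypothetical covariate names

def two_stage_dropout_model(df: pd.DataFrame):
    # Stage 1: probit for acceptance at the university.
    X1 = sm.add_constant(df[covariates])
    accept = sm.Probit(df["accepted"], X1).fit(disp=0)

    # Inverse Mills ratio corrects for only observing dropout among the accepted.
    xb = X1.dot(accept.params)
    df = df.assign(mills=norm.pdf(xb) / norm.cdf(xb))

    # Stage 2: probit for dropout on the accepted subsample, with the correction term.
    acc = df[df["accepted"] == 1]
    X2 = sm.add_constant(acc[covariates + ["mills"]])
    drop = sm.Probit(acc["dropout"], X2).fit(disp=0)
    return accept, drop
```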

2017
Gaofeng Cheng, Vijayaditya Peddinti, Daniel Povey, Vimal Manohar, Sanjeev Khudanpur, Yonghong Yan

Long Short-Term Memory networks (LSTMs) are a component of many state-of-the-art DNN-based speech recognition systems. Dropout is a popular method to improve generalization in DNN training. In this paper we describe extensive experiments in which we investigated the best way to combine dropout with LSTMs, specifically projected LSTMs (LSTMP). We investigated various locations in the LSTM to pl...
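
One placement such experiments can compare is illustrated below: per-frame dropout applied to the output of a projected LSTM layer. The PyTorch sketch uses the built-in proj_size option and placeholder dimensions; it shows just one possible placement, not the configuration the paper found best.

```python
# Minimal sketch: one possible dropout placement around a projected LSTM (LSTMP).
import torch
import torch.nn as nn

class LSTMPWithDropout(nn.Module):
    def __init__(self, in_dim=40, hidden=512, proj=128, p_drop=0.2):
        super().__init__()
        # proj_size gives a projected LSTM: the recurrent output is projected to `proj`.
        self.lstm = nn.LSTM(in_dim, hidden, proj_size=proj, batch_first=True)
        self.drop = nn.Dropout(p_drop)       # applied per frame to the projected output
        self.out = nn.Linear(proj, 100)      # placeholder output layer

    def forward(self, x):                    # x: (batch, time, in_dim)
        h, _ = self.lstm(x)                  # h: (batch, time, proj)
        return self.out(self.drop(h))

model = LSTMPWithDropout()
logits = model(torch.randn(2, 50, 40))       # dummy batch: 2 utterances, 50 frames
```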

[Chart: number of search results per publication year]