Search results for: informative dropout
Number of results: 31,950
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail when applied to recurrent layers. Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout. This...
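The Bayesian interpretation mentioned in this abstract motivates "variational" dropout for recurrent layers: instead of resampling a fresh mask at every timestep (which is how naive dropout fails on recurrent connections), one mask is sampled per sequence and reused across all timesteps. A minimal sketch of that idea, assuming a plain tanh RNN; the function names and weight shapes here are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def rnn_step(h, x, W_h, W_x):
    # One vanilla RNN transition: h' = tanh(h W_h + x W_x)
    return np.tanh(h @ W_h + x @ W_x)

def run_rnn_variational_dropout(xs, W_h, W_x, p=0.25):
    """Run an RNN over a sequence with ONE recurrent dropout mask.

    The mask is sampled once and applied to the hidden state at
    every timestep, rather than resampled per step. Survivors are
    scaled by 1/(1-p) so the expected activation is unchanged.
    """
    hidden = W_h.shape[0]
    mask = (rng.random(hidden) >= p) / (1.0 - p)  # shared across time
    h = np.zeros(hidden)
    for x in xs:
        h = rnn_step(h * mask, x, W_h, W_x)
    return h

# Illustrative usage with random weights
T, d_in, d_h = 6, 3, 4
W_h = 0.1 * rng.normal(size=(d_h, d_h))
W_x = 0.1 * rng.normal(size=(d_in, d_h))
xs = rng.normal(size=(T, d_in))
h_final = run_rnn_variational_dropout(xs, W_h, W_x)
```

The key design choice is where the randomness lives: one sample of the mask corresponds to one sample of the (approximate posterior over) weights, which is what gives the procedure its Bayesian reading.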
A common problem encountered in statistical analysis is that of missing data, which occurs when some variables have missing values in some units. The present paper deals with the analysis of longitudinal continuous measurements with incomplete data due to non-ignorable dropout. In repeated measurements data, as one solution to such a problem, the selection model assumes a mechanism of outcome-d...
We propose a novel framework to adaptively adjust the dropout rates for the deep neural network based on a Rademacher complexity bound. The state-of-the-art deep learning algorithms impose dropout strategy to prevent feature co-adaptation. However, choosing the dropout rates remains an art of heuristics or relies on empirical grid-search over some hyperparameter space. In this work, we show the...
This study examines dropout incidence, moment of dropout, and switching behavior in organized exercise programs for seniors in the Netherlands, as determined in a prospective cohort study (with baseline measurements at the start of the exercise program and follow-up after 6 months; N = 1,725, response rate 73%). Participants were community-living individuals 50+ who participated in different fo...
The big breakthrough on the ImageNet challenge in 2012 was partially due to the ‘dropout’ technique used to avoid overfitting. Here, we introduce a new approach called ‘Spectral Dropout’ to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed ba...
Dropout training, originally designed for deep neural networks, has been successful on high-dimensional single-layer natural language tasks. This paper proposes a theoretical explanation for this phenomenon: we show that, under a generative Poisson topic model with long documents, dropout training improves the exponent in the generalization bound for empirical risk minimization. Dropout achieve...
Prior research on school dropout has often focused on stable person- and institution-level variables. In this research, we investigate longitudinally perceived stress and optimism as predictors of dropout intentions over a period of four years, and distinguish between stable and temporary predictors of dropout intentions. Findings based on a nationally representative sample of 16-20 year-olds i...
Preventing feature co-adaptation by encouraging independent contributions from different features often improves classification and regression performance. Dropout training (Hinton et al., 2012) does this by randomly dropping out (zeroing) hidden units and input features during training of neural networks. However, repeatedly sampling a random subset of input features makes training much slower...
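The mechanism this abstract describes, randomly zeroing units during training (Hinton et al., 2012), can be sketched in a few lines. This is a minimal illustration using the common "inverted dropout" scaling, which is one standard variant rather than necessarily the exact formulation of the cited work; the function name is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Randomly zero each unit with probability p (inverted dropout).

    Scaling survivors by 1/(1-p) during training keeps the expected
    activation unchanged, so no rescaling is needed at test time.
    """
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # keep each unit with prob 1-p
    return x * mask / (1.0 - p)

h = np.ones(10_000)
h_train = dropout(h, p=0.5)   # roughly half zeros, survivors scaled to 2.0
h_test = dropout(h, training=False)  # identity at test time
```

Because each forward pass sees a different random subset of features, no unit can rely on the presence of any particular other unit, which is the "feature co-adaptation" the abstract refers to.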
While MOOCs offer educational data at a new scale, many educators have been alarmed by their high dropout rates. Learners join a course with some motivation to persist for some or all of the course, but various factors, such as attrition or lack of satisfaction, can lead them to disengage or totally drop out. Educational interventions targeting such risk factors can help reduce dropout rates. H...