Search results for: mlff neural network

Number of results: 1,611,241

2018
Julius Adebayo Justin Gilmer Ian Goodfellow

Explaining the output of a complicated machine learning model like a deep neural network (DNN) is a central challenge in machine learning. Several proposed local explanation methods address this issue by identifying what dimensions of a single input are most responsible for a DNN’s output. The goal of this work is to assess the sensitivity of local explanations to DNN parameter values. Somewhat...
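The sensitivity check described above can be illustrated with a toy sketch (not the paper's experimental setup): for a linear "network", the input-gradient saliency map is just a row of the weight matrix, so comparing the map computed under trained weights against the map under re-initialized weights directly measures how much the explanation depends on parameter values. All names here (`saliency`, `W_trained`, `W_random`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def saliency(W, x):
    """Input-gradient saliency for a linear 'network' y = W @ x.

    For the top-scoring output unit, the gradient of the score with
    respect to the input is simply the corresponding row of W.
    """
    y = W @ x
    top = int(np.argmax(y))
    return W[top]  # d y[top] / d x

x = rng.normal(size=8)               # a single input
W_trained = rng.normal(size=(3, 8))  # stand-in for trained weights
W_random = rng.normal(size=(3, 8))   # re-initialized ("randomized") weights

s_trained = saliency(W_trained, x)
s_random = saliency(W_random, x)

# A parameter-sensitive explanation should change when the weights change;
# cosine similarity between the two maps quantifies that sensitivity.
cos = s_trained @ s_random / (np.linalg.norm(s_trained) * np.linalg.norm(s_random))
print(f"cosine similarity between saliency maps: {cos:.3f}")
```

A saliency method whose maps barely change under weight randomization is, by this test, not actually explaining the trained model.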

2018
Antoine Bosselut Omer Levy Ari Holtzman Corin Ennis Dieter Fox Yejin Choi

Understanding procedural language requires anticipating the causal effects of actions, even when they are not explicitly stated. In this work, we introduce Neural Process Networks to understand procedural text through (neural) simulation of action dynamics. Our model complements existing memory architectures with dynamic entity tracking by explicitly modeling actions as state transformers. The ...
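The idea of modeling actions as state transformers over tracked entities can be sketched with a deliberately simple symbolic version (a toy illustration, not the Neural Process Networks architecture; the `apply_action` helper and its effect table are invented for this sketch):

```python
# Toy illustration: each action is a state transformer that updates the
# attributes of the entity it acts on, so later steps can anticipate
# effects that the text never states explicitly.

def apply_action(state, action, entity):
    """Return a new state dict after applying `action` to `entity`."""
    effects = {
        "wash": {"clean": True},
        "chop": {"shape": "pieces"},
        "cook": {"cooked": True, "temperature": "hot"},
    }
    new_state = {e: dict(attrs) for e, attrs in state.items()}
    new_state.setdefault(entity, {})
    new_state[entity].update(effects.get(action, {}))
    return new_state

state = {"tomato": {"clean": False}}
for action, entity in [("wash", "tomato"), ("chop", "tomato"), ("cook", "tomato")]:
    state = apply_action(state, action, entity)

print(state["tomato"])
# → {'clean': True, 'shape': 'pieces', 'cooked': True, 'temperature': 'hot'}
```

The neural version replaces the hand-written effect table with learned action embeddings and continuous entity states, but the bookkeeping it performs is the same.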

2014
Jaeho Yoon Jung-Ho Kim Sung Chan Kim Jae-Bong Park Jae-Yong Lee Jaebong Kim

Suppression of bone morphogenetic protein (BMP) signaling induces neural induction in the ectoderm of developing embryos. BMP signaling inhibits neural induction via the expression of various neural suppressors. Previous research has demonstrated that the ectopic expression of dominant negative BMP receptors (DNBR) reduces the expression of target genes downstream of BMP and leads to neural ind...

Journal: Developmental Cognitive Neuroscience 2013
Yair Bar-Haim Daniel S. Pine

The seven papers appearing in the current Special Issue of Developmental Cognitive Neuroscience embrace a particularly exciting theme in both basic and clinical neuroscience. This excitement reflects the unique potential to unite diverse groups all sharing a major interest in development, creating a transformative translational developmental cognitive neuroscience. These groups include scientists inter...

Journal: Chemical Bulletin of Kazakh National University 2013

2018
Itay Hubara Elad Hoffer Daniel Soudry

Binarized Neural Networks (BNNs) have been shown to be effective in improving network efficiency during the inference phase, after the network has been trained. However, BNNs only binarize the model parameters and activations during propagations. We show there is no inherent difficulty in training BNNs using "Quantized BackPropagation" (QBP), in which we also quantize the error gradients and i...
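The forward/backward split the abstract describes can be sketched in a few lines (a minimal numpy sketch of the general idea, not the paper's QBP scheme; `binarize` and `quantize` are illustrative names, and the uniform gradient quantizer here is an assumption):

```python
import numpy as np

def binarize(w):
    """Sign binarization used in BNN forward passes: every weight is +1 or -1."""
    return np.where(w >= 0, 1.0, -1.0)

def quantize(g, bits=4):
    """Uniform quantizer for error gradients (a sketch of the
    quantized-backprop idea, not the paper's exact scheme)."""
    scale = np.max(np.abs(g)) + 1e-12
    levels = 2 ** (bits - 1) - 1
    return np.round(g / scale * levels) / levels * scale

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 3))       # real-valued latent weights
x = rng.normal(size=3)

y = binarize(W) @ x               # forward pass uses only binary weights
g_y = y - np.ones(4)              # some upstream error signal
g_W = np.outer(quantize(g_y), x)  # backward pass with quantized gradients

W -= 0.1 * g_W                    # the latent weights stay real-valued
print(binarize(W))                # inference again sees only +/-1 weights
```

The point of quantizing the gradients as well is that both propagation directions then run on low-precision arithmetic, not just the forward pass.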

Journal: International Journal of Industrial Mathematics
Z. Aghababazadeh, Department of Mathematics, Science and Research Branch, Islamic Azad University, Tehran, Iran. M. Rostamy-Malkhalifeh, Department of Mathematics, Science and Research Branch, Islamic Azad University, Tehran, Iran.

It is well known that in operations research, degeneracy can cause cycling in the network simplex algorithm, which can be prevented by maintaining strongly feasible bases at each pivot. Also, in a network consisting of n arcs and m nodes, without imposing any new conditions on the entering variable, the upper bound on consecutive degenerate pivots equals $\left(\begin{array}{c} n...

Chart of the number of search results per year
