Search results for: discrete time neural networks dnns

Number of results: 2,505,214

Journal: Journal of Neuroscience Methods, 2016
Irene Sturm, Sebastian Bach, Wojciech Samek, Klaus-Robert Müller

BACKGROUND: In cognitive neuroscience, the potential of deep neural networks (DNNs) for solving complex classification tasks is yet to be fully exploited. The most limiting factor is that DNNs, as notorious 'black boxes', do not provide insight into the neurophysiological phenomena underlying a decision. Layer-wise relevance propagation (LRP) has been introduced as a novel method to explain individual ...

Journal: Applied and Computational Harmonic Analysis, 2023

Deep neural networks (DNNs) are quantized for efficient inference on resource-constrained platforms. However, training deep learning models with low-precision weights and activations involves a demanding optimization task, which calls for minimizing a stage-wise loss function subject to a discrete set constraint. While numerous methods have been proposed, existing studies on full quantization of DNNs most...
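The discrete set constraint the abstract mentions can be illustrated with a toy straight-through-style update: the forward pass uses weights projected onto a discrete set of levels, while gradient updates are applied to full-precision "shadow" weights. This is a minimal numpy sketch under that assumption, not the method of the paper; the problem, level set, and learning rate are all illustrative.

```python
import numpy as np

def quantize(w, levels):
    """Project each weight to the nearest value in a discrete set of levels."""
    levels = np.asarray(levels)
    idx = np.abs(w[..., None] - levels).argmin(axis=-1)
    return levels[idx]

# Toy problem: fit y = X @ w with weights constrained to {-1, 0, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
w_true = np.array([1.0, -1.0, 0.0])
y = X @ w_true

levels = [-1.0, 0.0, 1.0]
w = rng.normal(scale=0.1, size=3)       # full-precision shadow weights
lr = 0.05
for _ in range(200):
    wq = quantize(w, levels)            # forward pass uses quantized weights
    grad = X.T @ (X @ wq - y) / len(X)  # gradient evaluated at quantized weights
    w -= lr * grad                      # straight-through: update shadow weights

print(quantize(w, levels))
```

On this easy toy problem the quantized weights settle on the constrained optimum; real DNN quantization faces the same discrete-set projection but a far harder loss landscape.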

2014
Raul Fernandez, Asaf Rendel, Bhuvana Ramabhadran, Ron Hoory

Deep Neural Networks (DNNs) have been shown to provide state-of-the-art performance over other baseline models in the task of predicting prosodic targets from text in a speech synthesis system. However, prosody prediction can be affected by an interaction of short- and long-term contextual factors that a static model depending on a fixed-size context window can fail to properly capture. In this...

Thesis: Ministry of Science, Research and Technology - Allameh Tabataba'i University - Faculty of Economics, 1393

Due to the extraordinarily large amount of information, the sharp daily increase in claimants for UI benefits, and serious financial constraints, handling fraud detection in order to discover, control and predict fraudulent claims is essential. We use the most appropriate data mining methodology, methods, techniques and tools to extract knowledge or insights from ...

2017
Patrick McClure, Nikolaus Kriegeskorte

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modelling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as an efficient and well-performing variational inference method for DNNs. However, sampling from ot...
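"Bernoulli dropout with sampling at prediction time" (Monte Carlo dropout) means keeping dropout active at inference and averaging several stochastic forward passes; the spread of those passes serves as an uncertainty estimate. A minimal numpy sketch, assuming a tiny illustrative two-layer network (the weights and sizes are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: 4 inputs, 8 hidden units, 3 outputs.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def forward(x, p_keep=0.8):
    """One stochastic forward pass with Bernoulli dropout on the hidden layer."""
    h = np.maximum(x @ W1, 0.0)                   # ReLU hidden layer
    mask = rng.binomial(1, p_keep, size=h.shape)  # Bernoulli keep/drop mask
    h = h * mask / p_keep                         # inverted-dropout scaling
    return h @ W2

def mc_predict(x, n_samples=100):
    """Monte Carlo dropout: average many stochastic passes at prediction time;
    the per-output standard deviation acts as an uncertainty estimate."""
    samples = np.stack([forward(x) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(1, 4))
mean, std = mc_predict(x)
print(mean.shape, std.shape)  # (1, 3) (1, 3)
```

In a real framework the same effect is achieved by leaving dropout layers in training mode during inference and repeating the forward pass.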

Journal: CoRR, 2017
Sanjay Ganapathy, Swagath Venkataramani, Balaraman Ravindran, Anand Raghunathan

Deep Neural Networks (DNNs) have advanced the state-of-the-art in a variety of machine learning tasks and are deployed in increasing numbers of products and services. However, the computational requirements of training and evaluating large-scale DNNs are growing at a much faster pace than the capabilities of the underlying hardware platforms that they are executed upon. In this work, we propose...

2014
Qingqing Wang, Shouming Zhong

Utilizing the Lyapunov functional method and combining linear matrix inequality (LMI) techniques with an integral inequality approach (IIA) to analyze the global asymptotic stability of delayed neural networks (DNNs), a new sufficient criterion ensuring the global stability of DNNs is obtained. The criteria are formulated in terms of a set of linear matrix inequalities, which can be checked efficient...

2017
Antonio Jimeno-Yepes, Jianbin Tang, Benjamin Scott Mashford

Deep Neural Networks (DNNs) achieve human-level performance in many image analytics tasks, but DNNs are mostly deployed to GPU platforms that consume a considerable amount of power. New hardware platforms using lower-precision arithmetic achieve drastic reductions in power consumption. More recently, brain-inspired spiking neuromorphic chips have achieved even lower power consumption, on the orde...
