Layer-wise Relevance Propagation for Deep Neural Network Architectures
Authors
Abstract
We present the application of layer-wise relevance propagation to several deep neural networks, such as the BVLC reference net and GoogleNet, trained on the ImageNet and MIT Places datasets. Layer-wise relevance propagation is a method for computing scores for image pixels and image regions that quantify the impact of a particular image region on the classifier's prediction for a given test image. We demonstrate the impact of different parameter settings on the resulting explanation.
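As a concrete illustration of the per-layer computation involved, the sketch below applies the widely used LRP-ε rule to a single fully connected layer. The function name `lrp_dense` and the toy dimensions are our own illustration, not the paper's released code.

```python
import numpy as np

def lrp_dense(a, W, b, R_out, eps=1e-6):
    """LRP-epsilon rule for one dense layer (illustrative sketch).

    a     : (n_in,)        activations entering the layer
    W     : (n_in, n_out)  weight matrix
    b     : (n_out,)       biases
    R_out : (n_out,)       relevance arriving from the layer above
    """
    z = a[:, None] * W                       # contributions z_ij = a_i * w_ij
    denom = z.sum(axis=0) + b                # pre-activations z_j
    denom = denom + eps * np.where(denom >= 0, 1.0, -1.0)  # epsilon stabiliser
    return (z * (R_out / denom)[None, :]).sum(axis=1)      # R_i = sum_j z_ij/z_j * R_j

# Toy usage: propagate the relevance of a 3-unit output back onto 4 inputs.
rng = np.random.default_rng(0)
a, W, b = rng.random(4), rng.standard_normal((4, 3)), rng.standard_normal(3)
R_in = lrp_dense(a, W, b, R_out=np.array([0.7, 0.2, 0.1]))
print(R_in)  # per-input relevance; its sum approximates sum(R_out)
```

Applied layer by layer from the output back to the input, this yields the pixel-wise scores described in the abstract; the bias and ε terms absorb a small amount of relevance, so conservation is approximate.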
Similar References
Layer-Wise Relevance Propagation for Neural Networks with Local Renormalization Layers
Layer-wise relevance propagation is a framework that decomposes the prediction a deep neural network computes over a sample, e.g. an image, into relevance scores for the sample's individual input dimensions, such as the subpixels of an image. While this approach can be applied directly to generalized linear mappings, product-type non-linearities are not covered. This paper proposes ...
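For a generalized linear mapping with pre-activations z_j = Σ_i a_i w_ij + b_j, the standard ε-stabilized LRP rule referred to here redistributes the relevance R_j of each output neuron to the inputs in proportion to their contributions:

```latex
R_i = \sum_j \frac{z_{ij}}{\sum_{i'} z_{i'j} + \epsilon \,\operatorname{sign}\!\big(\sum_{i'} z_{i'j}\big)}\, R_j,
\qquad z_{ij} = a_i\, w_{ij}
```

Relevance is approximately conserved, Σ_i R_i ≈ Σ_j R_j; the cited paper's contribution is extending such rules to layers, like local renormalization, that are not of this linear form.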
Beyond saliency: understanding convolutional neural networks from saliency prediction on layer-wise relevance propagation
Despite the tremendous achievements of deep convolutional neural networks (CNNs) in most computer vision tasks, understanding how they actually work remains a significant challenge. In this paper, we propose a novel two-step visualization method that aims to shed light on how deep CNNs recognize images and the objects therein. We start out with a layer-wise relevance propagation (LRP) step w...
Explaining Recurrent Neural Network Predictions in Sentiment Analysis
Recently, a technique called Layer-wise Relevance Propagation (LRP) was shown to deliver insightful explanations in the form of input space relevances for understanding feed-forward neural network classification decisions. In the present work, we extend the usage of LRP to recurrent neural networks. We propose a specific propagation rule applicable to multiplicative connections as they arise in...
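The propagation rule for multiplicative connections proposed in that line of work is simple to state: when relevance arrives at a gated product z = gate · signal, as in LSTM cells, the signal inherits all of it and the gate none, since the gate only modulates how much information passes through. A minimal sketch under that reading (names are ours):

```python
import numpy as np

def lrp_gated_product(R_out, gate, signal):
    """Relevance propagation through z = gate * signal (sketch).
    Signal-takes-all rule for multiplicative connections: the gate
    modulates but carries no content, so it receives zero relevance
    and the signal receives all of R_out."""
    R_gate = np.zeros_like(gate)
    R_signal = R_out.copy()
    return R_gate, R_signal
```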
Train Feedfoward Neural Network with Layer-wise Adaptive Rate via Approximating Back-matching Propagation
Stochastic gradient descent (SGD) has achieved great success in training deep neural networks, where the gradient is computed through backpropagation. However, the back-propagated values of different layers vary dramatically. This inconsistency of gradient magnitude across layers makes optimizing a deep neural network with a single learning rate problematic. We introduce the back-...
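The back-matching scheme itself is only sketched in this snippet, so the code below illustrates the general idea of a layer-wise adaptive rate using the simpler, well-known trust-ratio scaling (as in LARS) as a stand-in, not the paper's actual derivation: each layer's step is rescaled so its size is proportional to the size of its weights, compensating for gradients whose magnitudes differ across layers. All names are our own.

```python
import numpy as np

def layerwise_scaled_step(weights, grads, base_lr=0.1, trust=0.001):
    """One SGD step with a per-layer adaptive rate (LARS-style trust
    ratio, shown as a stand-in for the back-matching scheme).

    weights, grads : lists of per-layer arrays with matching shapes
    """
    for W, g in zip(weights, grads):
        g_norm = np.linalg.norm(g)
        if g_norm == 0:
            continue
        # Per-layer rate proportional to ||W|| / ||g||, so layers whose
        # back-propagated gradients are tiny still move at a useful scale.
        local_lr = trust * np.linalg.norm(W) / g_norm
        W -= base_lr * local_lr * g
    return weights
```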
Deep Reservoir Computing: A Critical Analysis
In this paper we propose an empirical analysis of deep recurrent neural networks (RNNs) with stacked layers. The analysis aims at studying and proposing approaches to develop and enhance multiple-timescale and hierarchical dynamics in deep recurrent architectures, within the efficient Reservoir Computing (RC) approach to RNN modeling. Results point out the actual relevance of layering and R...
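As background for the layering analysed there, a deep (stacked) echo state network updates the state of layer l with the usual leaky-integrator dynamics, driven by the previous layer's state; this is the standard formulation in the deep RC literature, with a^(l) the leak rate of layer l:

```latex
x^{(l)}(t) = \big(1 - a^{(l)}\big)\, x^{(l)}(t-1)
  + a^{(l)} \tanh\!\Big( W^{(l)}_{\mathrm{in}}\, i^{(l)}(t) + \hat{W}^{(l)} x^{(l)}(t-1) \Big),
\qquad
i^{(l)}(t) = \begin{cases} u(t), & l = 1,\\ x^{(l-1)}(t), & l > 1. \end{cases}
```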
Journal:
Volume, Issue:
Pages: -
Publication date: 2015