Markov Information Bottleneck to Improve Information Flow in Stochastic Neural Networks
Similar Resources
Layer-wise Learning of Stochastic Neural Networks with Information Bottleneck
In this paper, we present a layer-wise learning method for stochastic neural networks (SNNs) from an information-theoretic perspective. In each layer of an SNN, the compression and the relevance are defined to quantify the amount of information that the layer contains about the input space and the target space, respectively. We jointly optimize the compression and the relevance of all parameters in an SN...
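For context, the compression/relevance trade-off described above is usually made concrete through an information bottleneck objective. The generic per-layer form below is an illustrative assumption; the exact layer-wise objective in the cited paper may differ.

```latex
% Generic per-layer information bottleneck objective (illustrative;
% the cited paper's exact layer-wise formulation may differ).
% T_l is the stochastic representation at layer l, Y the target, and
% beta >= 0 trades compression against relevance.
\min_{p(t_\ell \mid t_{\ell-1})} \; I(T_{\ell-1}; T_\ell) \;-\; \beta \, I(T_\ell; Y)
```

Minimizing I(T_{l-1}; T_l) compresses what the layer keeps about its input, while maximizing I(T_l; Y) preserves what is relevant to the target.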
Information Bottleneck in Control Tasks with Recurrent Spiking Neural Networks
The nervous system encodes continuous information from the environment in the form of discrete spikes, and then decodes these to produce smooth motor actions. Understanding how spikes integrate, represent, and process information to produce behavior is one of the greatest challenges in neuroscience. Information theory has the potential to help us address this challenge. Informational analyses o...
Compressing Neural Networks using the Variational Information Bottleneck
Neural networks can be compressed to reduce memory and computational requirements, or to increase accuracy by facilitating the use of a larger base architecture. In this paper we focus on pruning individual neurons, which can simultaneously trim model size, FLOPs, and run-time memory. To improve upon the performance of existing compression algorithms, we utilize the information bottleneck princi...
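As a rough illustration of the idea, variational information bottleneck pruning methods typically attach a stochastic multiplicative gate to each neuron and penalize the gate's signal-to-noise ratio, so that uninformative neurons can be removed. The sketch below assumes PyTorch; the class and method names (IBGate, compression_penalty, prune_mask) and the pruning threshold are illustrative, not the cited paper's implementation.

```python
# Minimal sketch of information-bottleneck-style neuron pruning (assumed
# PyTorch; names and threshold are illustrative). Each neuron's output is
# scaled by a stochastic gate z ~ N(mu, sigma^2); a compression penalty
# drives uninformative gates toward zero, and low signal-to-noise neurons
# are pruned.
import torch
import torch.nn as nn

class IBGate(nn.Module):
    def __init__(self, n_neurons: int):
        super().__init__()
        self.mu = nn.Parameter(torch.ones(n_neurons))
        self.log_sigma = nn.Parameter(torch.full((n_neurons,), -3.0))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Reparameterized sample during training; deterministic mean at test time.
        if self.training:
            z = self.mu + torch.randn_like(self.mu) * self.log_sigma.exp()
        else:
            z = self.mu
        return h * z  # h: (batch, n_neurons)

    def compression_penalty(self) -> torch.Tensor:
        # log(1 + mu^2 / sigma^2) grows with the gate's signal-to-noise
        # ratio, so adding it to the task loss favors shutting gates off.
        snr = self.mu.pow(2) / self.log_sigma.exp().pow(2)
        return torch.log1p(snr).sum()

    def prune_mask(self, threshold: float = 1.0) -> torch.Tensor:
        # Keep only neurons whose gate carries signal above the threshold.
        return (self.mu.pow(2) / self.log_sigma.exp().pow(2)) > threshold
```

During training, the total loss would be the task loss plus a small coefficient times compression_penalty(); after training, prune_mask() selects the neurons to keep.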
Computational Inference of Neural Information Flow Networks
Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algor...
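To make the inference step concrete, a minimal first-order dynamic Bayesian network can model each neuron's activity at time t as depending on a small set of parent neurons at time t-1, with parent sets chosen by a penalized likelihood score. The toy sketch below (binary activity, exhaustive search, BIC penalty) is an assumption for illustration, not the algorithm in the cited paper.

```python
# Toy first-order DBN structure search for directed "information flow":
# score P(child_t | parents_{t-1}) for binary neural time series with a
# BIC-penalized Bernoulli log-likelihood (illustrative, not the cited
# paper's algorithm).
import numpy as np
from itertools import combinations

def bic_score(child: np.ndarray, parents: np.ndarray) -> float:
    """child: (T,) 0/1 array; parents: (T, k) 0/1 array."""
    x, pa = child[1:], parents[:-1]
    n, k = len(x), pa.shape[1]
    # Encode each parent configuration at t-1 as a single integer.
    configs = pa.dot(1 << np.arange(k)) if k else np.zeros(n, dtype=int)
    ll = 0.0
    for cfg in np.unique(configs):
        xs = x[configs == cfg]
        p = np.clip(xs.mean(), 1e-6, 1 - 1e-6)
        ll += (xs * np.log(p) + (1 - xs) * np.log(1 - p)).sum()
    return ll - 0.5 * (2 ** k) * np.log(n)  # one Bernoulli parameter per config

def best_parents(data: np.ndarray, child: int, max_parents: int = 2):
    """Exhaustively search small parent sets for one neuron; data: (T, n) binary."""
    others = [j for j in range(data.shape[1]) if j != child]
    candidates = [c for k in range(max_parents + 1)
                  for c in combinations(others, k)]
    return max(candidates,
               key=lambda c: bic_score(data[:, child], data[:, list(c)]))
```

Because parent configurations are tabulated rather than linearly combined, such a score can capture nonlinear interactions that purely linear methods miss, which is the motivation stated in the abstract.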
Modeling Information Flow Through Deep Neural Networks
This paper proposes a principled information-theoretic analysis of classification for deep neural network structures, e.g. convolutional neural networks (CNNs). The output of convolutional filters is modeled as a random variable Y conditioned on the object class C and the network filter bank F. The conditional entropy (CENT) H(Y|C,F) is shown in theory and experiments to be a highly compact and c...
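As a concrete (if crude) illustration of a CENT-style statistic, conditional entropy can be estimated from samples by partitioning filter responses by label and histogramming. The sketch below conditions on the class only, whereas the paper conditions on both the class C and the filter bank F; the function name and binning are illustrative assumptions.

```python
# Histogram estimate of the conditional entropy H(Y | C) of a scalar
# filter response Y given class labels C, in nats (illustrative sketch;
# the paper's CENT conditions on the filter bank F as well).
import numpy as np

def conditional_entropy(y: np.ndarray, c: np.ndarray, bins: int = 32) -> float:
    """H(Y | C) = sum_c p(c) * H(Y | C = c), estimated from samples."""
    edges = np.histogram_bin_edges(y, bins=bins)  # shared bins across classes
    h, n = 0.0, len(y)
    for label in np.unique(c):
        y_c = y[c == label]
        counts, _ = np.histogram(y_c, bins=edges)
        p = counts[counts > 0] / counts.sum()
        h += (len(y_c) / n) * -(p * np.log(p)).sum()
    return h
```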
Journal
Journal title: Entropy
Year: 2019
ISSN: 1099-4300
DOI: 10.3390/e21100976