Information Bottleneck in Deep Learning - A Semiotic Approach

Authors

Abstract


 The information bottleneck principle was recently proposed as a theory meant to explain some of the training dynamics of deep neural architectures. Via information plane analysis, patterns start to emerge in this framework, where two phases can be distinguished: fitting and compression. We take a step further and study the behaviour of the spatial entropy characterizing the layers of convolutional neural networks (CNNs), in relation to the information bottleneck theory. We observe pattern formations which resemble the compression phase. From the perspective of semiotics, also known as the study of signs and sign-using behavior, the saliency maps of the CNN’s layers exhibit aggregations: signs are aggregated into supersigns, a process called semiotic superization. Superization is characterized by a decrease of entropy and can be interpreted as a concentration of information. We discuss the results from the perspective of superization and discover very interesting analogies related to the informational adaptation of the model. As a practical application, we introduce a modification of the CNN training process: we progressively freeze the layers with small variation of their saliency map representation. Such a network can be stopped earlier without significant impact on the performance (the accuracy) of the network, connecting the evolution of the layers’ saliency maps through time with the training of the network.
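A minimal sketch of the freezing idea described above, assuming PyTorch and a simple histogram-based estimate of spatial entropy; the threshold, the binning, and the helper names are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn

def spatial_entropy(feature_map: torch.Tensor, bins: int = 32) -> float:
    """Shannon entropy of the normalized activation histogram of one layer's
    saliency/feature map (a simple stand-in for the paper's spatial entropy)."""
    hist = torch.histc(feature_map.detach().float(), bins=bins)
    p = hist / hist.sum().clamp(min=1e-12)
    p = p[p > 0]
    return float(-(p * p.log()).sum())

def freeze_stable_layers(model: nn.Module,
                         prev_entropy: dict,
                         curr_entropy: dict,
                         tol: float = 1e-2) -> dict:
    """Freeze every Conv2d layer whose entropy changed by less than `tol`
    since the previous epoch; returns the current entropies to serve as the
    next epoch's history."""
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d) and name in prev_entropy and name in curr_entropy:
            if abs(curr_entropy[name] - prev_entropy[name]) < tol:
                for param in module.parameters():
                    param.requires_grad = False
    return curr_entropy
```

In training, one would record the per-layer entropies at the end of each epoch (e.g. via forward hooks on the convolutional layers) and pass the previous and current dictionaries to `freeze_stable_layers`, so that layers whose saliency representation has stabilized stop receiving gradient updates.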


Similar articles

Deep Variational Information Bottleneck

We present a variational approximation to the information bottleneck of Tishby et al. (1999). This variational approach allows us to parameterize the information bottleneck model using a neural network and leverage the reparameterization trick for efficient training. We call this method “Deep Variational Information Bottleneck”, or Deep VIB. We show that models trained with the VIB objective ou...
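A rough sketch of the Deep VIB objective in PyTorch: an encoder outputs the mean and log-variance of a Gaussian code Z, sampled with the reparameterization trick, and the loss trades classification cross-entropy against a KL term weighted by beta. The layer sizes and the value of beta are illustrative assumptions, not the exact architecture of Alemi et al.:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeepVIB(nn.Module):
    def __init__(self, in_dim=784, code_dim=256, n_classes=10, beta=1e-3):
        super().__init__()
        # Encoder predicts mean and log-variance of the Gaussian code Z.
        self.encoder = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU(),
                                     nn.Linear(1024, 2 * code_dim))
        self.decoder = nn.Linear(code_dim, n_classes)
        self.beta = beta

    def forward(self, x, y):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        logits = self.decoder(z)
        ce = F.cross_entropy(logits, y)  # bounds the I(Z; Y) term
        # KL(q(z|x) || N(0, I)) upper-bounds the I(Z; X) term.
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1).mean()
        return ce + self.beta * kl, logits
```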


Learning Sparse Latent Representations with the Deep Copula Information Bottleneck

Deep latent variable models are powerful tools for representation learning. In this paper, we adopt the deep information bottleneck model, identify its shortcomings and propose a model that circumvents them. To this end, we apply a copula transformation which, by restoring the invariance properties of the information bottleneck method, leads to disentanglement of the features in the latent spac...


Learning Hidden Variable Networks: The Information Bottleneck Approach

A central challenge in learning probabilistic graphical models is dealing with domains that involve hidden variables. The common approach for learning model parameters in such domains is the expectation maximization (EM) algorithm. This algorithm, however, can easily get trapped in suboptimal local maxima. Learning the model structure is even more challenging. The structural EM algorithm can ad...


Applying the Information Bottleneck Approach to SRL: Learning LPAD Parameters

In this paper, we propose to apply the Information Bottleneck (IB) approach to a sub-class of Statistical Relational Learning (SRL) languages. Learning parameters in SRL dealing with domains that involve hidden variables requires the use of techniques for learning from incomplete data such as the expectation maximization (EM) algorithm. Recently, IB was shown to overcome well known problems of ...


Information and Semiosis in Living Systems: a Semiotic Approach

During the 1950s and 1960s, genetics and cell and molecular biology were swamped by terms borrowed from information theory. This ‘information talk’ still pervades these fields, including widely used terms such as ‘genetic code’, ‘messenger RNA’, ‘transcription’, ‘translation’, ‘transduction’, ‘genetic information’, ‘chemical signals’, ‘cell signaling’ etc. But, as the concept of informatio...



Journal

Journal: International Journal of Computers Communications & Control

Year: 2022

ISSN: 1841-9844, 1841-9836

DOI: https://doi.org/10.15837/ijccc.2022.1.4650