KENet: Distilling Convolutional Networks via Knowledge Enhancement


Similar articles

Distilling Model Knowledge

Top-performing machine learning systems, such as deep neural networks, large ensembles and complex probabilistic graphical models, can be expensive to store, slow to evaluate and hard to integrate into larger systems. Ideally, we would like to replace such cumbersome models with simpler models that perform equally well. In this thesis, we study knowledge distillation, the idea of extracting the...
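The knowledge-distillation idea introduced above — training a simpler student to mimic a cumbersome teacher — is commonly realized as a loss that mixes temperature-softened teacher targets with the ordinary hard-label objective. The following is a minimal illustrative sketch of that standard (Hinton-style) objective in NumPy, not the specific method of this thesis; the function and parameter names are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(z, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: cross-entropy between temperature-softened teacher
    # and student distributions, scaled by T^2 to keep gradients comparable.
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T * T
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    log_p = np.log(softmax(student_logits))
    hard = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1 - alpha) * hard
```

In practice the soft term is what transfers the teacher's "dark knowledge" (relative probabilities of wrong classes); `alpha` and `T` are tuned per task.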


Distilling Knowledge from Ensembles of Neural Networks for Speech Recognition

Speech recognition systems that combine multiple types of acoustic models have been shown to outperform single-model systems. However, such systems can be complex to implement and too resource-intensive to use in production. This paper describes how to use knowledge distillation to combine acoustic models in a way that has the best of many worlds: It improves recognition accuracy significantly,...


Distilling Knowledge from Deep Networks with Applications to Healthcare Domain

Exponential growth in Electronic Healthcare Records (EHR) has resulted in new opportunities and urgent needs for discovery of meaningful data-driven representations and patterns of diseases in Computational Phenotyping research. Deep Learning models have shown superior performance for robust prediction in computational phenotyping tasks, but suffer from the issue of model interpretability which...


Convolutional Neural Networks Analyzed via Convolutional Sparse Coding

In recent years, deep learning and in particular convolutional neural networks (CNN) have led to some remarkable results in various fields. In this scheme, an input signal is convolved with learned filters and a non-linear pointwise function is then applied on the response map. Oftentimes, an additional non-linear step, termed pooling, is applied on the outcome. The obtained result is then fed...
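The convolve → pointwise non-linearity → pool pipeline described above can be sketched in one dimension as follows. This is a minimal illustrative example (ReLU as the pointwise non-linearity, max-pooling), not the sparse-coding analysis of the cited paper; the function name is hypothetical.

```python
import numpy as np

def conv1d_relu_pool(signal, filt, pool=2):
    # 1. Convolve the input signal with a learned filter ("valid" mode).
    response = np.convolve(signal, filt, mode="valid")
    # 2. Apply a pointwise non-linearity (here ReLU) to the response map.
    activated = np.maximum(response, 0.0)
    # 3. Max-pool non-overlapping windows of size `pool`.
    n = len(activated) // pool * pool
    return activated[:n].reshape(-1, pool).max(axis=1)

# Example: an edge-detecting filter on a ramp signal.
out = conv1d_relu_pool(np.array([1., 2., 3., 4., 5.]),
                       np.array([1., -1.]), pool=2)
# → array([1., 1.])
```

In a real CNN many such filters are applied in parallel and the pooled outputs are fed to the next layer.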


Graph Convolutional Neural Networks via Scattering

We generalize the scattering transform to graphs and consequently construct a convolutional neural network on graphs. We show that under certain conditions, any feature generated by such a network is approximately invariant to permutations and stable to graph manipulations. Numerical results demonstrate competitive performance on relevant datasets.



Journal

Journal title: IFAC-PapersOnLine

Year: 2020

ISSN: 2405-8963

DOI: 10.1016/j.ifacol.2021.04.116