Model averages sharpened into Occam’s razors: Deep learning enhanced by Rényi entropy
Authors
Abstract
Ensemble methods in machine learning combine neural networks or other models to improve predictive performance. The proposed ensemble method is based on the idea of Occam's razor...
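For reference, the Rényi entropy named in the title is the standard one-parameter family; the definition below is background only and is not taken from the truncated abstract, which does not show how the paper uses it:

\[
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{x} P_X(x)^{\alpha}, \qquad \alpha > 0, \ \alpha \neq 1,
\]

which recovers the Shannon entropy \(H(X) = -\sum_{x} P_X(x) \log P_X(x)\) in the limit \(\alpha \to 1\).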
Similar resources
Smooth Entropy and Rényi Entropy
The notion of smooth entropy allows a unifying, generalized formulation of privacy amplification and entropy smoothing. Smooth entropy is a measure for the number of almost uniform random bits that can be extracted from a random source by probabilistic algorithms. It is known that the Rényi entropy of order at least 2 of a random variable is a lower bound for its smooth entropy. On the other ha...
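For context on the order-2 bound mentioned above, the Rényi entropy of order 2 (collision entropy) of a discrete random variable X with distribution P_X is the standard quantity

\[
H_2(X) = -\log \sum_{x} P_X(x)^{2},
\]

and since \(H_\alpha\) is non-increasing in \(\alpha\), every order \(\alpha \ge 2\) satisfies \(H_\alpha(X) \le H_2(X)\). This is background only, not a restatement of the result sketched above.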
Maximum Rényi Entropy Rate
Two maximization problems of Rényi entropy rate are investigated: the maximization over all stochastic processes whose marginals satisfy a linear constraint, and the Burg-like maximization over all stochastic processes whose autocovariance function begins with some given values. The solutions are related to the solutions to the analogous maximization problems of Shannon entropy rate.
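One hedged way to formalize the first of these problems, assuming the linear constraint is an expected-cost constraint on the marginals with an illustrative cost function c and budget \(\Gamma\) (the paper's exact constraint set may differ), is

\[
\sup_{\{X_k\}} \ \liminf_{n \to \infty} \frac{1}{n} H_\alpha(X_1, \dots, X_n)
\quad \text{subject to} \quad \mathbb{E}\!\left[c(X_k)\right] \le \Gamma \ \text{for every } k,
\]

with the supremum taken over all stochastic processes \(\{X_k\}\).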
Understanding Deep Learning Generalization by Maximum Entropy
Deep learning achieves remarkable generalization capability with an overwhelming number of model parameters. Theoretical understanding of deep learning generalization has received recent attention yet remains not fully explored. This paper attempts to provide an alternative understanding from the perspective of maximum entropy. We first derive two feature conditions that softmax regression strictly ap...
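As standard background for the softmax connection referred to above (not the paper's specific feature conditions), the maximum-entropy conditional distribution subject to matching the expectations of feature functions f(x, y) has the Gibbs form

\[
p(y \mid x) = \frac{\exp\!\big(w^{\top} f(x, y)\big)}{\sum_{y'} \exp\!\big(w^{\top} f(x, y')\big)},
\]

which is exactly softmax (multinomial logistic) regression; here w and f are generic illustrative symbols.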
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a generalization of Shannon entropy, with a non-logarithmic measure. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
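For reference, the conditional Shannon entropy mentioned above is

\[
H(X \mid Y) = -\sum_{x, y} P_{XY}(x, y) \log P_{X \mid Y}(x \mid y),
\]

whereas for the Rényi entropy several inequivalent conditional versions have been proposed, which is what motivates singling out a preferred definition.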
Entropy Power Inequality for the Rényi Entropy
The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the entropy for sums of independent random variables.
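For context, the classical entropy power inequality being extended states that for independent \(\mathbb{R}^n\)-valued random vectors X and Y with densities,

\[
N(X + Y) \ \ge \ N(X) + N(Y), \qquad N(X) = \frac{1}{2 \pi e} \, e^{2 h(X)/n},
\]

where h denotes differential entropy; the Rényi extension established in the cited paper is not reproduced here.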
Journal
Journal title: Communications in Statistics
Year: 2021
ISSN: 1532-415X, 0361-0926
DOI: https://doi.org/10.1080/03610926.2021.1891438