Search results for: shannon entropy

Number of results: 72543

Journal: Acta Materialia 2022

The current definition of high-entropy alloys (HEAs) is commonly based on the configurational entropy. But this quantity depends only on the chemical composition of an HEA and therefore cannot distinguish the information content that is encoded in various local atomic arrangements and is measurable by Shannon entropy theory. Here, inspired by the finding that two-dimensional (2D) Sudoku matrices exhibit a higher average than 2D random matrix cou...
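As an illustration of the distinction drawn in this abstract, a minimal sketch with hypothetical composition fractions and local-environment labels (the function names and data are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def configurational_entropy(fractions):
    """Ideal configurational entropy per atom, -sum(c_i * ln c_i) in units of k_B.
    Depends only on the overall chemical composition."""
    return -sum(c * math.log(c) for c in fractions if c > 0)

def shannon_entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of observed samples,
    e.g. labels describing local atomic arrangements."""
    counts = Counter(samples)
    total = sum(counts.values())
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# Equiatomic five-component alloy: configurational entropy is fixed by composition.
print(configurational_entropy([0.2] * 5))   # ln(5) ≈ 1.609 k_B

# Two structures with the same composition can still differ in the Shannon
# entropy of their local-environment statistics (labels are illustrative).
ordered = ["A-B", "A-B", "A-B", "A-B"]
mixed   = ["A-B", "A-C", "B-D", "C-E"]
print(shannon_entropy(ordered), shannon_entropy(mixed))   # 0.0 vs 2.0 bits
```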

Journal: Journal of Risk and Financial Management 2020

Journal: Physical Review E 2021

The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information-theoretic quantities, most notably the Shannon entropy. For this purpose, possibly the most widespread tool is the so-called plug-in estimator, whose properties in terms of bias and variance have been investigated since the first decade after the publication of S...
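A minimal sketch of the plug-in estimator referred to here: substitute the empirical frequencies into the Shannon formula (the symbolized series below is illustrative):

```python
import math
from collections import Counter

def plugin_entropy(samples, base=2):
    """Plug-in (maximum-likelihood) Shannon entropy estimate in the given base:
    substitute empirical frequencies p_i = n_i / N into H = -sum p_i log p_i."""
    counts = Counter(samples)
    n = len(samples)
    return sum((c / n) * math.log(n / c, base) for c in counts.values())

# Symbolized time series (illustrative); the plug-in estimate is biased
# downward for small samples, which is the bias discussed in the abstract.
series = [0, 1, 1, 0, 2, 1, 0, 0, 2, 1]
print(plugin_entropy(series))   # ≈ 1.52 bits
```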

Journal: ESAIM: Probability and Statistics 2019

Journal: CoRR 2008
Ping Li

Compressed Counting (CC) was recently proposed for approximating the αth frequency moments of data streams, for 0 < α ≤ 2. Under the relaxed strict-Turnstile model, CC dramatically improves the standard algorithm based on symmetric stable random projections, especially as α → 1. A direct application of CC is to estimate the entropy, which is an important summary statistic in Web/network measure...
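The abstract notes that accurate αth-moment estimates near α = 1 translate directly into entropy estimates. A minimal sketch of that reduction, assuming exactly known moments (the CC / stable-random-projection machinery that estimates F_α on a stream is not reproduced here):

```python
import math

def renyi_entropy_from_moment(freqs, alpha):
    """Rényi entropy (base 2) from the αth frequency moment F_α = sum_i f_i^α
    and the stream length F_1 = sum_i f_i:
        H_α = (1 / (1 - α)) * log2(F_α / F_1^α).
    Shannon entropy is recovered in the limit α → 1, which is why accurate
    moment estimates near α = 1 matter."""
    f1 = sum(freqs)
    f_alpha = sum(f ** alpha for f in freqs)
    return math.log2(f_alpha / f1 ** alpha) / (1 - alpha)

def shannon_entropy(freqs):
    f1 = sum(freqs)
    return sum((f / f1) * math.log2(f1 / f) for f in freqs)

freqs = [5, 3, 2, 2, 1]                    # item frequencies in a small stream
for alpha in (1.5, 1.1, 1.01):
    print(alpha, renyi_entropy_from_moment(freqs, alpha))
print("shannon", shannon_entropy(freqs))   # the α → 1 limit
```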

Journal: Physical review letters 2016
Mirjam Weilenmann Lea Kraemer Philippe Faist Renato Renner

Thermodynamic entropy, as defined by Clausius, characterizes macroscopic observations of a system based on phenomenological quantities such as temperature and heat. In contrast, information-theoretic entropy, introduced by Shannon, is a measure of uncertainty. In this Letter, we connect these two notions of entropy, using an axiomatic framework for thermodynamics [E. H. Lieb and J. Yngvason Pro...
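For reference, the two notions contrasted here, together with the standard statistical-mechanics bridge between them (textbook relations, not the axiomatic construction of the Letter):

```latex
% Clausius (phenomenological) entropy: defined through reversible heat exchange.
\[
  \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\]
% Shannon (information-theoretic) entropy of a probability distribution p.
\[
  H(p) = -\sum_{i} p_i \log p_i
\]
% The usual bridge: the Gibbs entropy of the microstate distribution, which
% reduces to the Clausius entropy for equilibrium states.
\[
  S = -k_{\mathrm{B}} \sum_{i} p_i \ln p_i
\]
```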

2016
Yuta Sakai Ken-ichi Iwata

The paper examines relationships between the Shannon entropy and the ℓα-norm for n-ary probability vectors, n ≥ 2. More precisely, we investigate the tight bounds of the ℓα-norm with a fixed Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the Shannon entropy and several information measures which are determined by the ℓα-norm. Moreover, we app...
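For reference, the quantities named in this abstract (standard definitions; the paper's tight bounds are not reproduced here):

```latex
% Shannon entropy and l_alpha-norm of an n-ary probability vector p = (p_1,...,p_n).
\[
  H(\mathbf{p}) = -\sum_{i=1}^{n} p_i \log p_i,
  \qquad
  \|\mathbf{p}\|_{\alpha} = \Bigl(\sum_{i=1}^{n} p_i^{\alpha}\Bigr)^{1/\alpha},
  \quad \alpha > 0 .
\]
% One standard link between the two: the Renyi entropy of order alpha is a
% function of the l_alpha-norm and tends to the Shannon entropy as alpha -> 1.
\[
  H_{\alpha}(\mathbf{p}) = \frac{\alpha}{1-\alpha}\,\log \|\mathbf{p}\|_{\alpha},
  \qquad
  \lim_{\alpha \to 1} H_{\alpha}(\mathbf{p}) = H(\mathbf{p}).
\]
```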

2007
John Watrous

1.2 REMARK ON INTERPRETATIONS OF THE SHANNON ENTROPY
There are standard ways to interpret the Shannon entropy. For instance, the quantity H(p) can be viewed as a measure of the amount of uncertainty in a random experiment described by the probability mass function p, or as a measure of the amount of information one gains by learning the value of such an experiment. Indeed, it is possible to sta...
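The quantity H(p) referred to in this excerpt is the standard Shannon entropy; its extreme values support the uncertainty reading:

```latex
% Shannon entropy of a probability mass function p on a finite set X (base 2 gives bits).
\[
  H(p) = -\sum_{x \in X} p(x) \log_2 p(x)
\]
% Extremes: H(p) = 0 for a deterministic outcome (no uncertainty), and
% H(p) = \log_2 |X| for the uniform distribution, the most uncertain case.
```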

Journal: Journal of Approximation Theory 2007
