Search results for: maximal entropy
Number of results: 151970
We present a construction of an entropy-preserving equivariant surjective map from the d-dimensional critical sandpile model to a certain closed, shift-invariant subgroup of T^d (the 'harmonic model'). A similar map is constructed for the dissipative abelian sandpile model and is used to prove uniqueness and the Bernoulli property of the measure of maximal entropy for that model.
Contents: 1. The Physics of Information; 2. Thermodynamics; 2.1. The laws; 2.2. Free energy; 3. Statistical mechanics; 3.1. Definitions and postulates; 3.2. A simple model system of magnetic spins; 3.3. The Maxwell-Boltzmann distribution; 3.4. Free energy revisited; 3.5. Gibbs entropy; 4. Nonlinear dynamics; 4.1. The ergodic hypothesis; 4.2. Chaos and limits to prediction; 4.3. Quantif...
For strongly positively recurrent countable state Markov shifts, we bound the distance between an invariant measure and the measure of maximal entropy in terms of the difference of their entropies. This extends an earlier result for subshifts of finite type, due to Kadyrov. We provide a similar bound for equilibrium measures of potentials, in terms of the pressure difference. For measures with nearly maximal entropy, we have new, sharp bounds. The strong positive recurrenc...
For a circle map $f\colon\mathbb{S}\to\mathbb{S}$ with zero topological entropy, we show that a non-diagonal pair $\langle x,y\rangle\in \mathbb{S}\times \mathbb{S}$ is non-separable if and only if it is an IN-pair, if and only if it is an IT-pair. We also show that if $f$ is null, then the maximal pattern entropy of every open cover is of polynomial order.
Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson distribution has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-tr...
A basic property of the entropy of a discrete random variable x is that: 0 ≤ H(x) ≤ log |X|. In fact, the entropy is maximal for the discrete uniform distribution. That is, (∀x ∈ X) p(x) = 1/|X|, in which case H(x) = log |X|. Definition 3.2 (Conditional entropy). The conditional entropy of y given x is defined as: H(y|x) = ∑_{v∈X} p_x(v) H(y|x = v) = −∑_{v∈X} p_x(v) ∑_{y∈Y} p_{y|x}(y|v) log p_{y|x}(y|v) =...
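The definitions in the snippet above are easy to check numerically. The following is a minimal Python sketch (function names are mine, not from the source) that computes Shannon entropy in bits, verifies that the uniform distribution attains the maximum log |X|, and evaluates the conditional entropy H(y|x) from the formula quoted above:

```python
import math

def entropy(p):
    # Shannon entropy H(x) = -sum_x p(x) log2 p(x), in bits.
    # Terms with p(x) = 0 contribute 0, by the convention 0 log 0 = 0.
    return -sum(px * math.log2(px) for px in p if px > 0)

def conditional_entropy(p_x, p_y_given_x):
    # H(y|x) = sum_v p_x(v) * H(y | x = v), matching the definition above;
    # p_y_given_x[v] is the conditional distribution of y given x = v.
    return sum(pv * entropy(row) for pv, row in zip(p_x, p_y_given_x))

# The uniform distribution on |X| = 4 outcomes attains the maximum log2 4 = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
assert abs(entropy(uniform) - math.log2(4)) < 1e-12

# Any non-uniform distribution on the same support has strictly smaller entropy.
skewed = [0.7, 0.1, 0.1, 0.1]
assert entropy(skewed) < entropy(uniform)

# If y is fully determined by x, every conditional distribution is degenerate
# and H(y|x) = 0.
p_x = [0.5, 0.5]
p_y_given_x = [[1.0, 0.0], [0.0, 1.0]]
assert conditional_entropy(p_x, p_y_given_x) == 0.0
```

The base of the logarithm is a free choice (the snippet leaves it unspecified); base 2 is used here so entropies come out in bits.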
In the first part of these notes we survey results on entropy for smooth systems. We emphasize questions regarding existence and uniqueness of measures of maximal entropy, changes of topological entropy under perturbations, and entropy structures for smooth systems. In the second part of these notes we review topological pressure and equilibrium states for smooth systems. We look at existence a...
Information-theoretic entropy measures are useful tools for quantifying the spreading of quantum states in phase space. In the present paper, we compare the time evolution of the joint entropy for three simple quantum systems: (i) a free Gaussian wave packet, (ii) a wave packet in a monochromatic electromagnetic field, and (iii) a wave packet tunneling through a δ barrier. As initial condition ...
We establish the existence of ergodic measures of maximal Hausdorff dimension for hyperbolic sets of surface diffeomorphisms. This is a dimension-theoretical version of the existence of ergodic measures of maximal entropy. The crucial difference is that while the entropy map is upper-semicontinuous, the map ν ↦ dim_H ν is neither upper-semicontinuous nor lower-semicontinuous. This forces us to d...
We define discretized Markov transformations and give an algorithm to compute the number of maximal-period sequences based on discretized Markov transformations. In this report, we focus on discretized dyadic transformations and define a number-theoretic function related to the number of maximal-period sequences based on these transformations. We also introduce the entropy of the...