Search results for: g entropy
Number of results: 504631
In this work, a novel approach for color image segmentation is discussed, using higher-order entropy as a textural feature to determine thresholds over a two-dimensional image histogram. The same approach is applied to achieve multi-level thresholding in both grayscale and color images. The paper discusses two methods of color image segmentation using RGB space as the standard processing...
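To make the idea of entropy-based thresholding concrete, here is a minimal sketch of the classic single-threshold Kapur method on a 1-D histogram (this is the basic Shannon-entropy variant, not necessarily the higher-order, 2-D-histogram method the abstract describes; the function name is illustrative):

```python
import numpy as np

def kapur_threshold(hist):
    """Return the threshold t that maximizes the sum of the Shannon
    entropies of the two classes (below/above t) split from hist."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue  # skip splits where one class is empty
        q0, q1 = p[:t] / p0, p[t:] / p1   # within-class distributions
        h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
        h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

On a bimodal histogram the maximizer lands between the two modes, which is what makes the criterion usable for segmentation; multi-level thresholding repeats the same optimization over several split points.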
1 Introduction Kolmogorov-Sinai entropy (K-S entropy) is used to analyze radio wave intensity time series of quasars. K-S entropy measures the rate at which the probability that trajectory points in the embedding space stay within a distance r changes as the embedding dimension changes [1]. The K-S entropy analysis method is a powerful formalism for describing the dynamics of time series [1]. m-dimensional K-S e...
Entropy measures have been extensively used to assess heart rate variability (HRV), a noninvasive marker of cardiovascular autonomic regulation. It is yet to be elucidated whether those entropy measures can sensitively respond to changes of autonomic balance and whether the responses, if there are any, are consistent across different entropy measures. Sixteen healthy subjects were enrolled in t...
In this paper we introduce the concept of an entropy operator for continuous systems of finite topological entropy. It is shown that it generates the Kolmogorov entropy as a special case. If $\phi$ is invertible, then the entropy operator is bounded, with the topological entropy of $\phi$ as its norm.
We study the Kolmogorov ε-entropy and the fractal dimension of global attractors for autonomous and nonautonomous equations of mathematical physics. We prove upper estimates for the ε-entropy and fractal dimension of the global attractors of nonlinear dissipative wave equations. Andrey Nikolaevich Kolmogorov discovered applications of notions of information theory in the theory of dynamical sys...
Gibbs’s inequality states that the differential entropy of a random variable with probability density function (pdf) f is less than or equal to its cross entropy with any other pdf g defined on the same alphabet, i.e., h(X) ≤ −E[log g(X)]. Using this inequality with a cleverly chosen g, we derive a lower bound on the smallest output entropy that can be achieved by quantizing a d-dimensional sou...
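Gibbs's inequality h(X) ≤ −E[log g(X)] is easy to verify numerically. A small Monte Carlo sketch with f = N(0, 1), whose differential entropy is ½ log(2πe), and an arbitrary Gaussian g (this particular g is illustrative, not the cleverly chosen one the abstract refers to):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)           # samples from f = N(0, 1)

def gauss_logpdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) evaluated at x."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

h_f = 0.5 * np.log(2 * np.pi * np.e)       # differential entropy of N(0, 1)
cross = -gauss_logpdf(x, 0.5, 1.5).mean()  # Monte Carlo estimate of -E[log g(X)]
assert h_f <= cross                        # Gibbs's inequality
```

Equality holds only when g coincides with f almost everywhere; here the mismatched mean and variance leave a strictly positive gap (the KL divergence from f to g).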
In this paper, we consider a security market in which two investors on different information levels maximize their expected logarithmic utility from terminal wealth. While the ordinary investor’s portfolio decisions are based on a public information flow, the insider possesses from the beginning extra information about the outcome of some random variable G, e.g., the future price of a stock. We...