Search results for: Informational Entropy
Number of results: 76,318
This paper considers the problem of estimating power consumption at the logic and register-transfer levels of a design from an information-theoretic point of view. In particular, it is demonstrated that the average switching activity in the circuit can be calculated using either entropy or informational energy averages. For control circuits and random logic, the output entropy (informational en...
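This abstract ties average switching activity to entropy and informational-energy averages. As a self-contained illustration only (not the paper's actual estimation procedure), the sketch below computes, for a single temporally independent bit with an assumed signal probability p, its Shannon entropy, its Onicescu informational energy, and the exact average switching activity 2p(1-p) that such information-theoretic estimates aim to approximate.

```python
import math

def bit_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a binary signal with probability p of being 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def informational_energy(p: float) -> float:
    """Onicescu informational energy of the same binary signal."""
    return p * p + (1 - p) * (1 - p)

def switching_activity(p: float) -> float:
    """Exact average switching activity of a temporally independent bit."""
    return 2 * p * (1 - p)

if __name__ == "__main__":
    for p in (0.1, 0.25, 0.5):
        print(f"p={p:.2f}  H={bit_entropy(p):.3f} bits  "
              f"E={informational_energy(p):.3f}  switching={switching_activity(p):.3f}")
```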
Considering that the movements of complex-system entities take place on continuous but non-differentiable curves, concepts such as non-differentiable entropy, informational non-differentiable entropy, and informational non-differentiable energy are introduced. First, the dynamics equations of the complex-system entities (Schrödinger-type or fractal hydrodynamic-type) are obtained. The la...
Informational entropy is often identified with physical entropy. This is surprising because the two quantities are defined differently and, furthermore, the former is a subjective quantity while the latter is an objective one. We describe the problems and then present a possible view that reconciles the two entropies. Informational entropy of a system is interpreted as physical entropy of a whole c...
This study attempts to extend the prevailing definition of informational entropy, where entropy relates to the amount of reduction of uncertainty or, indirectly, to the amount of information gained through measurements of a random variable. The approach adopted herein describes informational entropy not as an absolute measure of information, but as a measure of the variation of information. Thi...
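The snippet reads informational entropy as the reduction of uncertainty obtained by measuring a random variable. A minimal sketch of that standard reading, using a made-up joint distribution of a quantity X and a measurement Y (not data from the study), computes the prior uncertainty H(X), the remaining uncertainty H(X|Y), and their difference, i.e. the information gained.

```python
import math

def entropy(probs) -> float:
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) of a quantity X and a measurement Y.
joint = {
    ("x0", "y0"): 0.30, ("x0", "y1"): 0.10,
    ("x1", "y0"): 0.15, ("x1", "y1"): 0.45,
}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())                 # prior uncertainty H(X)
h_xy = entropy(joint.values())             # joint entropy H(X, Y)
h_x_given_y = h_xy - entropy(py.values())  # H(X|Y) = H(X, Y) - H(Y)
gain = h_x - h_x_given_y                   # uncertainty removed by the measurement

print(f"H(X)   = {h_x:.3f} bits")
print(f"H(X|Y) = {h_x_given_y:.3f} bits")
print(f"gain   = {gain:.3f} bits")
```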
An analysis of what is known as the interpretation of Fourier maps has been carried out from the information-theoretic point of view: determining the nature of the peaks in the map (in order to assign them a suitable scattering factor) and allocating bonds between some of the possible peak pairs. Before interpreting the map, a quantitatively measurable entropy (uncertainty, unknowingness) relating to the...
C. Oprean, C. Tănăsescu (Lucian Blaga University of Sibiu), G. Dobrotă (Constantin Brâncusi University of Târgu Jiu). This article aims at analysing whether the field of metallurgy in Romania is informationally efficient. This field is important for the real Romanian and worldwide economy, which is why the answer to this question matters: it will determine the investment behaviour o...
Starting from the structure of the C26 fullerene and by using a series of map operations, a set of hyperstructures was obtained and investigated. Several measures were computed in order to characterize the obtained hyperstructures, and the relationships between them were estimated. Images of the two hyperstructures showing the greatest informational entropy and the lowest energy, respective...
Information is taken as the primary physical entity from which probabilities can be derived. Information produced by a source is defined as the class of sources that are recoding-equivalent. Shannon entropy is one of a family of formal Rényi information measures on the space of unique sources. Each of these measures quantifies the volume of the source’s recoding-equivalence class. A space of inf...
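This snippet places Shannon entropy within the family of Rényi information measures. Independently of the paper's recoding-equivalence construction, the sketch below computes the Rényi entropy of order α for an example distribution and shows that orders near 1 approach the Shannon value.

```python
import math

def renyi_entropy(probs, alpha: float) -> float:
    """Rényi entropy of order alpha (in bits); alpha -> 1 recovers Shannon entropy."""
    if alpha == 1.0:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

dist = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.5, 0.99, 1.0, 2.0):
    print(f"alpha={alpha:<4}  H_alpha={renyi_entropy(dist, alpha):.4f} bits")
```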