Search results for: entropy production minimization

Number of results: 697066

2001
Shu-Kun Lin

Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship of entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L ( L=lnw, w is the nu...
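For context, Lewis's remark can be read through the standard correspondence between the Gibbs entropy and Shannon's missing information; the identity below is textbook material and is not taken from the truncated abstract above:

```latex
S = -k_B \sum_i p_i \ln p_i ,
\qquad
H = -\sum_i p_i \log_2 p_i ,
\qquad
S = (k_B \ln 2)\, H .
```

Because S is proportional to H, the observer's missing information about the microstate, any gain in entropy corresponds to a loss of information held about the system.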

This paper solves problems concerning condensation around a flat plate and around circular and elliptical tubes by numerical and analytical methods, and it calculates the entropy production rates. First, the problem was solved with a dynamic mesh and reasonable assumptions; it was then compared with the numerical solution, and the errors were acceptable. An additional suppor...
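The snippet does not show the working equations; in convective heat-transfer analyses of this kind, the local volumetric entropy generation rate is commonly written (following Bejan's formulation, which this paper may or may not use) as

```latex
\dot{S}'''_{\mathrm{gen}}
  = \frac{k}{T^2}\left[\left(\frac{\partial T}{\partial x}\right)^2
                      + \left(\frac{\partial T}{\partial y}\right)^2\right]
  + \frac{\mu}{T}\,\Phi ,
```

where k is the thermal conductivity, μ the dynamic viscosity, T the local temperature, and Φ the viscous dissipation function; integrating this over the condensate film gives the entropy production rate to be minimized.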

Journal: اقتصاد مقداری (Quantitative Economics) 2015

An economic system comprises different primary flows that can be captured in macroeconomic models with complex network relations. Both theoretically and empirically, the weak substitutability, or complementarity, between environmental materials such as energy and other production factors such as capital is undeniable in this system. This constitutes an effective critique of neoclassical economics. In this paper, we vi...

2003
Erik G. Miller, John W. Fisher

This paper presents a new algorithm for the independent components analysis (ICA) problem based on efficient entropy estimates. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with effi...
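For whitened data, the KL divergence between the joint distribution and the product of the marginals reduces, up to a constant, to the sum of the marginal entropies, so ICA can be performed by searching for the rotation that minimizes that sum. The sketch below illustrates this generic idea in two dimensions with a simple m-spacing entropy estimate; it is not the paper's specific estimator or search strategy, and the function names and parameters are illustrative.

```python
import numpy as np

def spacing_entropy(x, m=None):
    """Simple m-spacing estimate of differential entropy (biased, but
    adequate for comparing candidate rotations)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))
    gaps = np.maximum(x[m:] - x[:-m], 1e-12)      # m-spacings, clipped to avoid log(0)
    return float(np.mean(np.log((n / m) * gaps)))

def ica_2d_by_rotation(z, n_angles=180):
    """For whitened 2-D data z of shape (2, N), return the rotation that
    minimizes the sum of estimated marginal entropies (the ICA contrast)."""
    best_theta, best_h = 0.0, np.inf
    for theta in np.linspace(0.0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        y = np.array([[c, -s], [s, c]]) @ z
        h = spacing_entropy(y[0]) + spacing_entropy(y[1])
        if h < best_h:
            best_h, best_theta = h, theta
    c, s = np.cos(best_theta), np.sin(best_theta)
    return np.array([[c, -s], [s, c]])
```

The joint entropy of whitened data is invariant under rotation, so minimizing the sum of marginal entropies is equivalent to minimizing the estimated mutual information; practical implementations replace the grid search with smarter optimization and better entropy estimators.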

2001
Deniz Erdogmus, Jose C. Principe

We have previously proposed the use of quadratic Renyi’s error entropy with a Parzen density estimator with Gaussian kernels as an alternative optimality criterion for supervised neural network training, and showed that it produces better performance on the test data compared to the MSE. The error entropy criterion imposes the minimization of average information content in the error signal rath...
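The quadratic Renyi error-entropy criterion can be estimated directly from an error sample: with a Gaussian Parzen window, the "information potential" is the mean of pairwise Gaussian kernels and the entropy is its negative log. Below is a minimal sketch with an arbitrary kernel width sigma; the function name is illustrative.

```python
import numpy as np

def quadratic_renyi_entropy(errors, sigma=0.5):
    """Parzen (Gaussian-kernel) estimate of the quadratic Renyi entropy of an
    error sample: H2 = -log V, where V is the information potential."""
    e = np.asarray(errors, dtype=float).ravel()
    d = e[:, None] - e[None, :]                    # all pairwise differences e_i - e_j
    s2 = 2.0 * sigma ** 2                          # variance of the convolved kernel
    kern = np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    V = kern.mean()                                # information potential
    return -np.log(V)
```

Minimizing H2 is the same as maximizing V, which is largest when the errors are concentrated around a single value; unlike MSE, the criterion penalizes the spread (information content) of the error distribution rather than its squared magnitude.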

1997
Wolfram Burgard, Dieter Fox, Sebastian Thrun

Localization is the problem of determining the position of a mobile robot from sensor data. Most existing localization approaches are passive, i.e., they do not exploit the opportunity to control the robot’s effectors during localization. This paper proposes an active localization approach. The approach provides rational criteria for (1) setting the robot’s motion direction (exploration), and (...
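In belief-space terms, an active localizer scores candidate actions by the expected entropy of the belief after moving and sensing, and picks the action that reduces uncertainty most. The sketch below shows this greedy expected-entropy criterion for a discrete belief; it illustrates the general idea rather than the paper's specific formulation, and all names and matrix conventions are assumptions.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def expected_posterior_entropy(belief, action, transition, sensor):
    """Expected belief entropy after executing `action` and receiving a measurement.

    belief:     (S,) probability vector over discrete robot poses
    transition: callable mapping an action to a (S, S) matrix T[s', s] = P(s' | s, action)
    sensor:     (Z, S) matrix with sensor[z, s'] = P(z | s')
    """
    predicted = transition(action) @ belief            # motion (prediction) update
    exp_h = 0.0
    for z in range(sensor.shape[0]):
        joint = sensor[z] * predicted                   # unnormalized posterior for this z
        pz = joint.sum()                                # probability of observing z
        if pz > 0:
            exp_h += pz * entropy(joint / pz)
    return exp_h

def choose_action(belief, actions, transition, sensor):
    """Greedy active localization: pick the action with the lowest expected entropy."""
    return min(actions, key=lambda a: expected_posterior_entropy(belief, a, transition, sensor))
```

A full system would also weigh the cost of each action against its expected information gain rather than using entropy alone.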

2017
Neha Mehta

Almost every branch of medical imaging uses digital image processing to visualize and extract details from image data. Quality enhancement has therefore become increasingly important and is carried out with the help of various techniques. Enhancing an image is one of the most widely accepted ways to gain deeper knowledge of it. There are various contrast enhancement ...
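As one canonical example of the contrast-enhancement techniques the abstract refers to, global histogram equalization remaps gray levels through the normalized cumulative histogram; the sketch below assumes an 8-bit grayscale image stored as a NumPy integer array.

```python
import numpy as np

def histogram_equalization(img):
    """Global histogram equalization of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)     # gray-level histogram
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    if cdf[-1] == cdf_min:                             # constant image: nothing to equalize
        return img.copy()
    # map each gray level through the normalized cumulative histogram
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0).astype(np.uint8)
    return lut[img]
```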

Journal: CoRR 2017
Shuai Huang, Trac D. Tran

Compressive sensing relies on the sparse prior imposed on the signal to solve the ill-posed recovery problem in an under-determined linear system. The objective function that enforces the sparse prior information should be both effective and easily optimizable. Motivated by the entropy concept from information theory, in this paper we propose the generalized Shannon entropy function and Rényi e...
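The general idea of an entropy-based sparsity surrogate is that the Shannon entropy of the normalized magnitudes is small when the signal's energy sits on a few entries. The sketch below shows one common form of such a measure; the exact generalized Shannon and Rényi entropy functions proposed in the paper may differ, and the exponent p and the clipping constant are illustrative choices.

```python
import numpy as np

def shannon_entropy_sparsity(x, p=1.0, eps=1e-12):
    """Shannon entropy of the normalized |x_i|**p: small values indicate that
    the energy is concentrated on few coefficients, i.e. the signal is sparse."""
    w = np.abs(np.asarray(x, dtype=float)) ** p
    w = w / (w.sum() + eps)                     # normalize to a probability vector
    w = np.clip(w, eps, 1.0)                    # avoid log(0)
    return float(-np.sum(w * np.log(w)))
```

Using such a function in place of the l0 or l1 norm gives a scale-invariant objective that can be driven down during recovery to promote sparse solutions.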

2008
Jernej Zupanc

Recent publications have presented many successful uses of elements from information theory in adaptive systems training. Error entropy has been shown to outperform mean squared error as a cost function on many artificially generated data sets, but few applications to real-world data have been published. In this paper, we design a neural network trained with error-entropy minimization c...
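To make the criterion concrete, the sketch below plugs the quadratic Renyi error entropy (the same estimator as in the 2001 Erdogmus/Principe snippet above) into a generic optimizer for a linear model. The paper trains a neural network with gradient-based error-entropy minimization, so this is only an illustration of the loss; the model, optimizer, and kernel width are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_renyi_entropy(e, sigma=0.5):
    """Quadratic Renyi entropy of the errors via a Gaussian Parzen estimate."""
    d = e[:, None] - e[None, :]
    return -np.log(np.mean(np.exp(-d ** 2 / (4.0 * sigma ** 2))
                           / np.sqrt(4.0 * np.pi * sigma ** 2)))

def fit_linear_mee(X, y, sigma=0.5):
    """Fit y ~ X @ w by minimizing the error entropy of the residuals."""
    def loss(w):
        return quadratic_renyi_entropy(y - X @ w, sigma=sigma)
    w0 = np.linalg.lstsq(X, y, rcond=None)[0]          # MSE solution as starting point
    w = minimize(loss, w0, method="Nelder-Mead").x
    # the entropy loss is shift-invariant, so fix the output bias afterwards
    bias = float(np.mean(y - X @ w))
    return w, bias
```

Because the error-entropy criterion is invariant to adding a constant to all errors, the output bias must be set separately after training (here by centering the residuals), a standard step when using this loss.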
