Game Theory, Maximum Generalized Entropy, Minimum Discrepancy, Robust Bayes
Not recorded
Abstract
Similar resources
Game Theory, Maximum Entropy, Minimum Discrepancy and Robust Bayesian Decision Theory
We describe and develop a close relationship between two problems that have customarily been regarded as distinct: that of maximizing entropy, and that of minimizing worst-case expected loss. Using a formulation grounded in the equilibrium theory of zero-sum games between Decision Maker and Nature, these two problems are shown to be dual to each other, the solution to each providing that to the...
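The duality described above can be illustrated numerically. The following is a hedged sketch, not code from the paper: a hypothetical die-roll example with a mean constraint, where the maximum-entropy distribution is exponential-family, and because its log loss is affine in the constraint statistic, its expected log loss is the same for every distribution in the constraint set and equals its entropy, the value of the game.

```python
import math

# Illustrative example (assumed, not from the paper): die outcomes with
# the single constraint E[X] = 4.5. The maximum-entropy distribution
# under a mean constraint has the form q(x) proportional to exp(beta*x).
xs = [1, 2, 3, 4, 5, 6]
target_mean = 4.5

def tilted(beta):
    w = [math.exp(beta * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(x * pi for x, pi in zip(xs, p))

# Solve for beta by bisection so the tilted distribution matches the mean.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean(tilted(mid)) < target_mean:
        lo = mid
    else:
        hi = mid
q = tilted((lo + hi) / 2)

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def expected_log_loss(p, q):
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# Two other distributions satisfying the same mean constraint.
p1 = [0, 0, 0, 0.5, 0.5, 0]    # mean 4.5
p2 = [0, 0, 0.5, 0, 0, 0.5]    # mean 4.5

# -log q(x) is affine in x, so the expected log loss of q is constant
# over the constraint set and equals H(q): the minimax value.
print(entropy(q), expected_log_loss(p1, q), expected_log_loss(p2, q))
```

Against any other fixed forecast (e.g. the uniform distribution, with worst-case loss log 6), Nature can inflict a strictly larger expected log loss, which is the minimax sense in which the maximum-entropy distribution is robust.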
Quantum Games: Mixed Strategy Nash's Equilibrium Represents Minimum Entropy
This paper introduces Hermite polynomials into the description of quantum games. Hermite polynomials are associated with the Gaussian probability density, which represents minimum dispersion. I introduce the concept of minimum entropy as a paradigm of both Nash's equilibrium (maximum utility, MU) and Hayek equilibrium (minimum entropy, ME). The ME concept is related to Q...
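The stated association between Hermite polynomials and the Gaussian density can be checked directly: the physicists' Hermite polynomials are orthogonal under the weight exp(-x^2). A minimal sketch using NumPy's Gauss-Hermite quadrature (an illustration, not code from the paper):

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss, hermval

# Gauss-Hermite nodes/weights integrate polynomials exactly against
# the Gaussian weight exp(-x^2).
x, w = hermgauss(20)

def H(n, x):
    # Evaluate the physicists' Hermite polynomial H_n at x.
    c = np.zeros(n + 1)
    c[n] = 1.0
    return hermval(x, c)

def inner(m, n):
    # <H_m, H_n> under the weight exp(-x^2).
    return float(np.sum(w * H(m, x) * H(n, x)))

print(inner(2, 3))  # ~0: distinct polynomials are orthogonal
print(inner(3, 3))  # positive: squared norm of H_3
```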
Comparison of entropy generation minimization principle and entransy theory in optimal design of thermal systems
In this study, the relevance of the concepts of entropy generation rate, entransy theory, and generalized thermal resistance to the optimal design of thermal systems is discussed. The equations of entropy and entransy rates are compared and their implications for the optimization of conductive heat transfer are analyzed. The theoretical analyses show that based on entropy generation minimizat...
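The two quantities being compared can be made concrete for the simplest case. A hedged numeric sketch under assumed values (not taken from the paper): steady heat transfer at rate Q between two reservoirs, with entropy generation rate Q·(1/T_cold − 1/T_hot) and entransy dissipation rate taken as Q·(T_hot − T_cold), one common definition.

```python
# Assumed illustrative values, not from the paper.
Q = 100.0                     # heat transfer rate, W
T_hot, T_cold = 400.0, 300.0  # reservoir temperatures, K

# Entropy generation rate for heat flowing from T_hot to T_cold (W/K).
S_gen = Q * (1.0 / T_cold - 1.0 / T_hot)

# Entransy dissipation rate, one common definition (W*K).
G_dis = Q * (T_hot - T_cold)

print(S_gen, G_dis)
```

Note the different units (W/K versus W·K): the two objectives weight the same temperature gap differently, which is the root of the design differences the study analyzes.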
Quantum Games and Minimum Entropy
This paper analyzes Nash's equilibrium (maximum utility, MU) and its relation to the ordered state (minimum entropy, ME). I introduce the concept of minimum entropy as a paradigm of the Nash-Hayek equilibrium. The ME concept is related to Quantum Games. One question arises after completing this exercise: What do the postulates of Quantum Mechanics indicate about Game Theory and Economics? Journa...
The Minimum Information Principle for Discriminative Learning
Exponential models of distributions are widely used in machine learning for classification and modelling. It is well known that they can be interpreted as maximum entropy models under empirical expectation constraints. In this work, we argue that for classification tasks, mutual information is a more suitable information theoretic measure to be optimized. We show how the principle of minimum mu...
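Mutual information, the quantity this principle optimizes, can be computed directly for a discrete joint distribution. A minimal sketch, illustrative rather than the paper's method:

```python
import math

def mutual_information(joint):
    # I(X;Y) = sum_xy p(x,y) * log( p(x,y) / (p(x) p(y)) ) in nats,
    # for a joint distribution given as a nested list of probabilities.
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log(pxy / (px[i] * py[j]))
    return mi

print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent -> 0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # deterministic -> log 2
```

Among all joint distributions matching the empirical expectation constraints, the minimum-information principle selects the one with the smallest such value, in contrast to maximizing entropy of the conditional model.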
Publication date: 2002