Search results for: krengel entropy

Number of results: 65366

A. Aghaei, G.A. Sheikhzadeh, H.R. Ehteram, M. Hajiahmadi

Natural convection heat transfer has many applications in different fields of industry, such as cooling systems, electronic transformer devices and ventilation equipment, owing to its simple process, economic advantage, low noise and renewed retrieval. Recently, heat transfer in nanofluids has attracted attention because of their higher thermal conductivity compared with that of ordinary fluids...

Thesis: Ministry of Science, Research and Technology - Isfahan University of Technology - Department of Physics, 1385

Today, the application of data analysis is not limited to any particular discipline; it spans fields as varied as engineering, the basic sciences, medicine, and economics. Accordingly, scientists have made considerable efforts to classify physical and physiological time series and to identify their properties. In this thesis, we first review the required foundations of probability and statistics, then introduce some common methods for data processing, and finally...

H. Homaei, H. Golestanian, M. Heidari

This paper concentrates on a new procedure which experimentally recognises gear and bearing faults of a typical gearbox system using a least squares support vector machine (LSSVM). Two wavelet selection criteria, the Maximum Energy to Shannon Entropy ratio and the Maximum Relative Wavelet Energy, are used and compared to select an appropriate wavelet for feature extraction. The fault diagnosis method co...
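
The first selection criterion can be sketched numerically. The snippet below is a minimal illustration, assuming the PyWavelets package, a placeholder random signal `x`, and a hypothetical candidate list and decomposition level; it is not the paper's exact preprocessing.

```python
# Sketch: pick the wavelet maximising the Energy / Shannon-entropy ratio of
# its detail coefficients (assumed setup, not the paper's implementation).
import numpy as np
import pywt

def energy_to_entropy_ratio(x, wavelet, level=4):
    """Energy / Shannon-entropy ratio of the wavelet detail coefficients."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    c = np.concatenate([np.ravel(ci) for ci in coeffs[1:]])  # detail coefficients
    energy = np.sum(c ** 2)
    p = c ** 2 / energy                      # normalised energy distribution
    p = p[p > 0]                             # drop zeros to avoid log(0)
    entropy = -np.sum(p * np.log2(p))
    return energy / entropy

x = np.random.randn(1024)                    # placeholder for a gearbox vibration signal
candidates = ["db4", "sym5", "coif3"]        # hypothetical candidate wavelets
best = max(candidates, key=lambda w: energy_to_entropy_ratio(x, w))
print("selected wavelet:", best)
```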

Journal: پژوهشنامه کتابداری و اطلاع رسانی
اعظم بیگ لو محمدرضا داورپناه

Purpose: This study aimed to measure the information load of words in Farsi scientific texts and to determine the relationship between certain properties of the words and the information load of the texts, based on Shannon entropy. Methodology: The study was conducted through content analysis of 320 articles published in Iranian scientific-research journals in 2009, and the arti...
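
As a minimal illustration of the Shannon-entropy measure underlying the study (not the study's exact word-weighting procedure), the snippet below computes the entropy of a text's word-frequency distribution; the sample sentence is a placeholder.

```python
# Sketch: Shannon entropy of a text from its word-frequency distribution.
import math
from collections import Counter

def word_entropy(text):
    words = text.split()
    counts = Counter(words)
    n = len(words)
    # H = -sum over distinct words w of p(w) * log2 p(w)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(word_entropy("the entropy of the text measures the information load"))
```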

Journal: Pattern Recognition 1994
Chein-I Chang, Kebo Chen, Jianwei Wang, Mark L. G. Althouse

In this paper, we present a new image thresholding technique which uses the relative entropy (also known as the Kullback-Leibler discrimination distance function) as a criterion for thresholding an image. As a result, the gray level minimizing the relative entropy will be the desired threshold. The proposed relative entropy approach is different from two known entropy-based thresholding techniques,...
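
A sketch of entropy-based threshold selection is given below. Note that it implements the closely related minimum cross-entropy criterion of Li and Lee over the gray-level histogram, not the co-occurrence-based relative-entropy criterion of this paper, and the random test image is a placeholder.

```python
# Sketch: choose the gray level minimising a cross-entropy criterion between
# the image and its two-level (thresholded) approximation.
import numpy as np

def cross_entropy_threshold(image, nbins=256):
    hist, edges = np.histogram(image.ravel(), bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_eta = None, np.inf
    for k in range(1, nbins):
        h0, h1 = hist[:k], hist[k:]
        c0, c1 = centers[:k], centers[k:]
        if h0.sum() == 0 or h1.sum() == 0:
            continue
        m0 = (c0 * h0).sum() / h0.sum()      # mean gray level below the threshold
        m1 = (c1 * h1).sum() / h1.sum()      # mean gray level above the threshold
        eta = (c0 * h0 * np.log(np.maximum(c0, 1e-12) / m0)).sum() + \
              (c1 * h1 * np.log(np.maximum(c1, 1e-12) / m1)).sum()
        if eta < best_eta:
            best_eta, best_t = eta, centers[k]
    return best_t

img = np.random.rand(64, 64)                 # placeholder image with values in [0, 1]
print("selected threshold:", cross_entropy_threshold(img))
```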

Journal: CoRR 2012
Theodore P. Hill, Marco Dall'Aglio

The classical Bayesian posterior arises naturally as the unique solution of several different optimization problems, without the necessity of interpreting data as conditional probabilities and then using Bayes' Theorem. For example, the classical Bayesian posterior is the unique posterior that minimizes the loss of Shannon information in combining the prior and the likelihood distributions. The...
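
One standard variational characterisation of the Bayes posterior, sketched below, makes the first claim concrete; the paper's exact information-loss functional may differ from this form.

```latex
% Sketch: the Bayes posterior as the unique minimiser of a Shannon-information
% loss combining the prior \pi and the likelihood p(x \mid \theta).
\[
  q^{\ast} \;=\; \operatorname*{arg\,min}_{q}
  \Bigl\{ \,\mathrm{KL}\!\left(q \,\|\, \pi\right)
        \;-\; \mathbb{E}_{q}\!\left[\log p(x \mid \theta)\right] \Bigr\},
  \qquad
  q^{\ast}(\theta) \;=\;
  \frac{\pi(\theta)\, p(x \mid \theta)}
       {\int \pi(\theta')\, p(x \mid \theta')\, d\theta'} ,
\]
% i.e. the unique minimiser is exactly the posterior given by Bayes' theorem.
```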

Journal: CoRR 2017
Thomas A. Courtade, Guangyue Han, Yaochen Wu

We give a counterexample to the vector generalization of Costa’s entropy power inequality (EPI) due to Liu, Liu, Poor and Shamai. In particular, the claimed inequality can fail if the matrix-valued parameter in the convex combination does not commute with the covariance of the additive Gaussian noise. Conversely, the inequality holds if these two matrices commute. For a random vector X with dens...
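
For reference, a sketch of the original scalar-parameter form of Costa's EPI, which the generalisation above extends by replacing the scalar t with a matrix-valued parameter; here X is a random vector in R^n with density and Z is standard Gaussian noise independent of X.

```latex
% Sketch: entropy power and the convex-combination form of Costa's EPI.
\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n}, \qquad
  N\!\left(X + \sqrt{t}\,Z\right) \;\ge\; (1-t)\,N(X) \;+\; t\,N(X+Z),
  \quad t \in [0,1],
\]
% equivalently, the map t \mapsto N(X + \sqrt{t}\,Z) is concave in t.
```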

Journal: IEEE Trans. Information Theory 2014
Christoph Bunte, Amos Lapidoth

A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than ...
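
The setup can be illustrated with a toy numerical sketch: tasks are assigned (here arbitrarily at random) to 2^b fixed-length descriptions, and the ρ-th moment of the number of tasks sharing the drawn task's description is computed. All parameters below are hypothetical placeholders.

```python
# Sketch of the task-description setup: every task sharing the drawn task's
# description must be performed; compute E[(# performed tasks)^rho].
import numpy as np

rng = np.random.default_rng(0)
M, b, rho = 100, 4, 2.0                        # tasks, description bits, moment order
p = rng.dirichlet(np.ones(M))                  # task distribution (placeholder)
assignment = rng.integers(0, 2 ** b, size=M)   # description assigned to each task

bin_sizes = np.bincount(assignment, minlength=2 ** b)
performed = bin_sizes[assignment]              # tasks sharing task i's description
moment = np.sum(p * performed.astype(float) ** rho)
print(f"E[(# performed tasks)^{rho}] =", moment)
```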

Journal: IACR Cryptology ePrint Archive 2012
Benjamin Fuller, Leonid Reyzin

We investigate how information leakage reduces computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable is X from a variable Z that has true entropy) and quantity (how much true entropy is there in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of m...
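
As a point of comparison, the information-theoretic analogue of such a conditioning bound is easy to check numerically: conditioning on an event of probability p reduces the min-entropy of X by at most log2(1/p). The sketch below uses an arbitrary placeholder distribution and event, not anything taken from the paper.

```python
# Sketch: H_min(X | E) >= H_min(X) - log2(1/Pr[E]) for an event E defined on X.
import numpy as np

rng = np.random.default_rng(1)
px = rng.dirichlet(np.ones(32))               # distribution of X over 32 values (placeholder)
event = np.zeros(32, dtype=bool)
event[:10] = True                             # arbitrary event E: X falls in the first 10 values

p = px[event].sum()                           # Pr[E]
h_min = -np.log2(px.max())                    # H_min(X)
h_min_cond = -np.log2((px[event] / p).max())  # H_min(X | E)

assert h_min_cond >= h_min - np.log2(1 / p) - 1e-9
print(f"H_min(X) = {h_min:.3f}, H_min(X|E) = {h_min_cond:.3f}, log2(1/p) = {np.log2(1/p):.3f}")
```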
