Search results for: entropy estimate

Number of results: 291306

Journal: Journal of Statistical Sciences
Arezoo Habibi Rad and Naser Reza Arghami, Department of Statistics, Ferdowsi University of Mashhad, Mashhad, Iran

The estimate of entropy (sample entropy) was first introduced by Vasicek (1976). In this paper, we provide an estimate of the entropy of order statistics, which is an extension of the entropy estimate. We then present an application of the entropy estimate of order statistics as a test statistic for symmetry of a distribution versus skewness. The proposed test has been compared wi...
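
The Vasicek estimator this entry builds on replaces the density in Shannon's entropy with spacings of order statistics. A minimal sketch in Python, assuming the standard form of the estimator with the common heuristic window m ≈ √n (the function name and defaults are illustrative):

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of Shannon (differential) entropy:
    H = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with X_(j) = X_(1) for j < 1 and X_(j) = X_(n) for j > n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))       # common heuristic window size
    padded = np.concatenate(([x[0]] * m, x, [x[-1]] * m))
    spacings = padded[2 * m:] - padded[:-2 * m]  # X_(i+m) - X_(i-m), i = 1..n
    return np.mean(np.log(n / (2 * m) * spacings))

# Sanity check: the true entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.4189.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.normal(size=5000)))
```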

Thesis: Ministry of Science, Research and Technology - Isfahan University of Technology - Faculty of Mathematics, 1390

The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some variable of interest. A simple way to take a sample of size n is to let all possible samples have the same probability of being selected. This is called simple random sampling, and then all units have the same probability of being ch...
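
As a concrete illustration of the scheme described above (the toy data and function name are ours, not from the thesis), a simple random sample of size n gives the sample mean as an unbiased estimator of the population mean, and N times it as an unbiased estimator of the total:

```python
import numpy as np

def srs_estimates(population, n, seed=0):
    """Draw a simple random sample (every size-n subset equally likely,
    i.e. without replacement) and estimate the population mean and total."""
    rng = np.random.default_rng(seed)
    N = len(population)
    sample = rng.choice(population, size=n, replace=False)
    y_bar = sample.mean()        # unbiased for the population mean
    return y_bar, N * y_bar      # expansion estimator, unbiased for the total

pop = np.arange(1.0, 1001.0)     # toy population: 1, 2, ..., 1000
print(srs_estimates(pop, n=50))  # true mean 500.5, true total 500500
```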

Journal: Journal of Sciences, Islamic Republic of Iran, 2010
M. Sabbaghan

A 1993 result of J. Llibre and M. Misiurewicz (Theorem A [5]) states that if a continuous map f of a graph into itself has an s-horseshoe, then the topological entropy of f is greater than or equal to log s, that is, h(f) ≥ log s. A 1980 result of L.S. Block, J. Guckenheimer, M. Misiurewicz and L.S. Young (Lemma 1.5 [3]) states that if G is an A-graph of f, then h(G) ≤ h(f). In this pap...
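
For concreteness, a standard instance of Theorem A (our illustration, not taken from the paper):

```latex
% The full tent map  T(x) = 1 - |1 - 2x|  on [0,1] maps each of the
% intervals [0, 1/2] and [1/2, 1] onto all of [0,1], i.e. T has a
% 2-horseshoe, so Theorem A gives
\[
  h(T) \;\ge\; \log 2 ,
\]
% which is in fact an equality for the tent map.
```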

2002
Deniz Erdogmus, José Carlos Príncipe, Kenneth E. Hild

Hebbian learning is one of the mainstays of biologically inspired neural processing. Hebb's rule is biologically plausible, and it has been extensively utilized in both computational neuroscience and in unsupervised training of neural systems. In these fields, Hebbian learning became synonymous with correlation learning. But it is known that correlation is a second-order statistic of the data, s...
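
A minimal sketch of plain Hebbian (correlation) learning for a single linear neuron, with illustrative data and step size (the per-step normalization is the standard fix that keeps the plain rule from diverging):

```python
import numpy as np

# One linear neuron y = w . x trained with Hebb's rule dw = eta * y * x.
rng = np.random.default_rng(1)
X = rng.multivariate_normal([0.0, 0.0], [[3.0, 1.0], [1.0, 1.0]], size=2000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                  # neuron output
    w += eta * y * x           # Hebb: strengthen co-active connections
    w /= np.linalg.norm(w)     # normalize so the weights stay bounded

print(w)  # aligns (up to sign) with the top eigenvector of the covariance
```

The final weight vector aligns with the principal direction of the input covariance, which makes the abstract's point concrete: correlation-based Hebbian learning only extracts second-order structure of the data.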

Journal: Journal of Approximation Theory, 2001

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
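
A Monte Carlo sketch of the setting (our illustration, not the paper's method): simulate a binary first-order Markov source through a binary symmetric channel and approximate the output entropy rate by the conditional block entropy H(Y_k | Y_1 ... Y_{k-1}), which converges to the entropy rate from above as k grows:

```python
import numpy as np
from collections import Counter

p, eps, n, k = 0.3, 0.1, 200_000, 8   # chain flip prob., crossover, length, block size
rng = np.random.default_rng(0)

# Symmetric first-order Markov input: switch state with probability p.
x = np.zeros(n, dtype=int)
flips = rng.random(n) < p
for t in range(1, n):
    x[t] = x[t - 1] ^ int(flips[t])

# Binary symmetric channel: flip each symbol independently w.p. eps.
y = x ^ (rng.random(n) < eps).astype(int)

def block_entropy(seq, k):
    """Empirical Shannon entropy (bits) of overlapping k-blocks."""
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    freqs = np.array(list(counts.values()), dtype=float)
    freqs /= freqs.sum()
    return -(freqs * np.log2(freqs)).sum()

# H(k-blocks) - H((k-1)-blocks) estimates the entropy rate of Y.
print(block_entropy(y, k) - block_entropy(y, k - 1))
```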

Thesis: Ministry of Science, Research and Technology - Tarbiat Modares University - Faculty of Basic Sciences, 1391

Bekenstein and Hawking, by introducing a temperature and an entropy for every black hole and using the first law of thermodynamics for black holes, showed that this entropy varies with the event horizon area. The Bekenstein-Hawking entropy equation is valid for black holes obeying Einstein's general relativity theory. However, on the one hand, Einstein's relativity in some cases fails to explain expe...
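
The area law the abstract refers to is the Bekenstein-Hawking formula (standard form, stated here for reference):

```latex
% Black hole entropy is proportional to the horizon area A, not the volume:
\[
  S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 G \hbar}
  \;=\; \frac{k_B A}{4 \ell_P^2},
  \qquad
  \ell_P = \sqrt{\frac{G \hbar}{c^3}} \ \text{(Planck length)} .
\]
```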

M. Abbasnejad, M. Tavakoli, N. R. Arghami

In this paper, we introduce a goodness-of-fit test for exponentiality based on the Lin-Wong divergence measure. In order to estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...
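
A generic sketch of the Monte Carlo step mentioned above, assuming the usual recipe of simulating the null distribution of the statistic; `toy_stat` is a hypothetical stand-in, not the paper's Lin-Wong divergence estimate:

```python
import numpy as np

def mc_critical_value(statistic, n, alpha=0.05, reps=10_000, seed=0):
    """Upper-alpha critical value of `statistic` under the null of a
    standard exponential sample of size n, by Monte Carlo simulation."""
    rng = np.random.default_rng(seed)
    null_stats = [statistic(rng.exponential(size=n)) for _ in range(reps)]
    return np.quantile(null_stats, 1 - alpha)

def toy_stat(x):
    # Hypothetical statistic: for Exp(1), mean/median = 1/log(2).
    return abs(x.mean() / np.median(x) - 1 / np.log(2))

print(mc_critical_value(toy_stat, n=30))  # reject exponentiality above this
```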

Chart of the number of search results per year

Click on the chart to filter the results by publication year