Search results for: error probabilities

Number of results: 291823

A. Naderi, D. A. S. Fraser, Jie Su, Kexin Ji, Wei Lin,

Welch & Peers (1963) used a root-information prior to obtain posterior probabilities for a scalar parameter exponential model and showed that these Bayes probabilities had the confidence property to second order asymptotically. An important undercurrent of this indicates that the constant information reparameterization provides location model structure, for which the confidence property ...
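
As a quick illustration of the structure alluded to above (a sketch from the standard definitions, not taken from the paper): for a scalar parameter \theta with expected Fisher information i(\theta), the root-information (Jeffreys) prior and the constant-information reparameterization are

\[
\pi(\theta) \propto i(\theta)^{1/2},
\qquad
\beta(\theta) = \int^{\theta} i(t)^{1/2}\,dt,
\]

and the Fisher information for \beta is constant, so the model is approximately a location model in \beta; this is the structure under which the Bayes posterior probabilities acquire second-order confidence validity.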

Journal: Synthese 2022

A serious error in the proof of a recent characterization of the existence of full conditional probabilities invariant under symmetries is corrected.

Journal: IEEE Trans. Information Theory 1990
Mao Chao Lin

The (n, k, d ≥ 2t + 1) binary linear codes are studied, which are used for correcting error patterns of weight at most t and detecting other error patterns over a binary symmetric channel. In particular, for t = 1, it is shown that there exists one code whose probability of undetected errors is upper bounded by (n + 1)2^{-(n-k)} when used on a binary symmetric channel with transition probabili...
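
For context (a sketch of the standard setting, not the paper's specific bound): the quantity being bounded is the undetected-error probability of a linear code on a binary symmetric channel, which for pure error detection is determined by the code's weight distribution A_w. A minimal illustration in Python, assuming the weight distribution is known:

```python
def undetected_error_prob(weight_dist, n, p):
    """Probability that a BSC(p) error pattern coincides with a nonzero
    codeword and therefore passes undetected (pure-detection use)."""
    return sum(a_w * p**w * (1 - p)**(n - w)
               for w, a_w in weight_dist.items() if w > 0)

# Example: the (7, 4) Hamming code has weight distribution
# A_0 = 1, A_3 = 7, A_4 = 7, A_7 = 1.
hamming = {0: 1, 3: 7, 4: 7, 7: 1}
print(undetected_error_prob(hamming, n=7, p=0.01))
```

When the code is also used to correct all patterns of weight at most t, an error goes undetected if it lands within distance t of a wrong codeword, which changes the expression; that combined correction-and-detection setting is what the paper analyzes.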

2014
Koenraad M. R. Audenaert, Milán Mosonyi

We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ1, . . . , σr. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic erro...
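
The binary building block here is the Helstrom error probability; a minimal numerical sketch (assuming equal priors and states given as density matrices; the pairwise quantities below are the ingredients such bounds are stated in, not the paper's specific combinations):

```python
import numpy as np
from itertools import combinations

def helstrom_error(rho, sigma, p=0.5):
    """Optimal error probability for discriminating two quantum states
    rho and sigma, with prior probability p on rho."""
    diff = p * rho - (1 - p) * sigma
    trace_norm = np.abs(np.linalg.eigvalsh(diff)).sum()
    return 0.5 * (1.0 - trace_norm)

def pairwise_binary_errors(states):
    """Helstrom error for every pair of hypotheses -- the binary error
    probabilities in terms of which multi-hypothesis bounds are expressed."""
    return {(i, j): helstrom_error(states[i], states[j])
            for i, j in combinations(range(len(states)), 2)}

# Example with three qubit states.
states = [np.diag([1.0, 0.0]),
          np.diag([0.0, 1.0]),
          np.array([[0.5, 0.5], [0.5, 0.5]])]
print(pairwise_binary_errors(states))
```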

Journal: New Journal of Physics 2021

Abstract: We consider the solution of the subset sum problem on a parallel computer consisting of self-propelled biological agents moving in a nanostructured network that encodes the computing task in its geometry. We develop an approximate analytical method to analyze the effects of small errors at the nonideal junctions composing the network, using a Gaussian confidence-interval approximation of the multinomial distribution. We concretely ev...
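
The Gaussian approximation mentioned can be illustrated with a per-cell normal confidence interval for multinomial proportions (a generic sketch, not the paper's error analysis for junction networks):

```python
from math import sqrt

def multinomial_gaussian_ci(counts, z=1.96):
    """Normal-approximation confidence interval for each cell probability
    of a multinomial, given the observed counts."""
    n = sum(counts)
    intervals = []
    for c in counts:
        p_hat = c / n
        half_width = z * sqrt(p_hat * (1 - p_hat) / n)
        intervals.append((max(0.0, p_hat - half_width),
                          min(1.0, p_hat + half_width)))
    return intervals

# Example: counts of agents observed at three exits of the network.
print(multinomial_gaussian_ci([480, 310, 210]))
```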

2005
Colin Fyfe

In this paper, we show how our AI opponents learn internal representations of probabilities. We use a Bayesian interpretation of such subjectivist probabilities but do not implement full Bayesian methods of parameter estimation since we wish the AIs to be as human-like as possible. Thus the parameters of the subjectivist probabilities are learned incrementally.
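
The abstract does not give the update rule, but an incremental estimate of a subjective probability is typically an exponentially weighted running average; a minimal sketch (the learning rate alpha is an illustrative assumption):

```python
def update_probability(p_estimate, outcome, alpha=0.05):
    """Nudge a subjective probability estimate toward the observed
    binary outcome (1 if the event occurred, 0 otherwise)."""
    return p_estimate + alpha * (outcome - p_estimate)

# Example: an AI opponent revising its belief that a bluff succeeds.
p = 0.5
for outcome in [1, 1, 0, 1, 0, 0, 1]:
    p = update_probability(p, outcome)
print(round(p, 3))
```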

Journal: Numerical Methods in Engineering (Esteghlal)
G. Mirjalily, M. R. Aref, M. M. Nayebi, and M. Kahrizi

In a detection network, the final decision is made by fusing the decisions from local detectors. The objective of that decision is to minimize the final error probability. To implement an optimal fusion rule, the performance of each detector, i.e. its probability of false alarm and its probability of missed detection, as well as the a priori probabilities of the hypotheses, must be known. Howev...
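
The optimal fusion rule referred to is, in its standard form (the Chair-Varshney rule), a weighted log-likelihood-ratio vote whose weights depend on each detector's false-alarm and miss probabilities; a sketch assuming those quantities and the prior are known:

```python
from math import log

def fuse_decisions(decisions, p_fa, p_md, prior_h1=0.5):
    """Log-likelihood-ratio fusion of binary local decisions.
    decisions[i] is 1 if detector i declared H1, else 0."""
    llr = log(prior_h1 / (1 - prior_h1))
    for u, pf, pm in zip(decisions, p_fa, p_md):
        if u == 1:
            llr += log((1 - pm) / pf)   # detector voted H1
        else:
            llr += log(pm / (1 - pf))   # detector voted H0
    return 1 if llr > 0 else 0

# Example: three detectors with different operating points.
print(fuse_decisions([1, 0, 1], p_fa=[0.05, 0.1, 0.2], p_md=[0.1, 0.2, 0.1]))
```

If the local error probabilities are unknown, these weights cannot be computed directly and must be estimated.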

1998
E. Knill

There are quantum algorithms that can efficiently simulate quantum physics, factor large numbers and estimate integrals. As a result, quantum computers can solve otherwise intractable computational problems. One of the main problems of experimental quantum computing is to preserve fragile quantum states in the presence of errors. It is known that if the needed elementary operations (gates) can ...

M. Yadegari, S. A. Seyedin

One of the important challenges in graphical models is dealing with the uncertainties present in a problem. Among graphical networks, the fuzzy cognitive map can model only fuzzy uncertainty, while the Bayesian network can model only probabilistic uncertainty. Many real problems involve both fuzzy and probabilistic uncertainties. In these cases, the propo...

Chart: number of search results per year