Search results for: expectation
Number of results: 42247
We say that P[A|B] is the conditional probability of A, given B. It is important to note that the condition P[B] > 0 is crucial. When X and Y are random variables defined on the same probability space, we often want to give a meaning to the expression P[X ∈ A|Y = y], even though it is usually the case that P[Y = y] = 0. When the random vector (X, Y) admits a joint density fX,Y(x, y), and fY(y) > 0...
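The construction this snippet alludes to can be written out. Assuming, as the snippet does, that (X, Y) has joint density f_{X,Y} and that f_Y(y) > 0, the standard definition via the conditional density is:

```latex
\[
  P[X \in A \mid Y = y] \;=\; \int_A f_{X \mid Y}(x \mid y)\,dx,
  \qquad
  f_{X \mid Y}(x \mid y) \;=\; \frac{f_{X,Y}(x,y)}{f_Y(y)}.
\]
```

This sidesteps the problem that P[Y = y] = 0: the conditioning is defined through densities rather than through the elementary ratio P[A ∩ B]/P[B].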
(See e.g. [2], [6] or [18]. Of course p0(n) is not quite unique, but following common practice we will often say “the” threshold when we should really say “a.”) It follows from [8] that every {Fn} has a threshold, and that in fact (see e.g. [18, Proposition 1.23 and Theorem 1.24]) p0(n) := pc(Fn) is a threshold for {Fn}. So the quantity pc conveniently captures threshold behavior, in particular...
Common-sense physical reasoning is an essential ingredient for any intelligent agent operating in the real-world. For example, it can be used to simulate the environment, or to infer the state of parts of the world that are currently unobserved. In order to match real-world conditions this causal knowledge must be learned without access to supervised data. To solve this problem, we present a no...
We introduce a novel framework for clustering that combines generalized EM with neural networks and can be implemented as an end-to-end differentiable recurrent neural network. It learns its statistical model directly from the data and can represent complex non-linear dependencies between inputs. We apply our framework to a perceptual grouping task and empirically verify that it yields the inte...
We show that preconditioners constructed by random sampling can perform well without meeting the standard requirements of iterative methods. When applied to graph Laplacians, this leads to ultrasparsifiers that in expectation behave as the nearly-optimal ones given by [Kolla-Makarychev-Saberi-Teng STOC'10]. Combining this with the recursive preconditioning framework by [Spielman-Teng STOC'04] a...
In this paper, we use a general mathematical and experimental methodology to analyze image deconvolution. The main procedure is to take an example image, convolve it with a known Gaussian point spread function, and then develop algorithms to recover the image. We observe the deconvolution process by adding Gaussian and Poisson noise at different signal-to-noise ratios. In addition, we will describe ...
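The forward model this abstract describes (blur with a known Gaussian point spread function, then add noise) can be sketched in one dimension. Everything below is an illustrative assumption, not taken from the paper: the box-shaped test signal, the PSF width `sigma=2.0`, and the noise level `0.01`.

```python
import math
import random

def gaussian_psf(sigma, radius):
    """Discrete, truncated Gaussian kernel, normalized to sum to 1."""
    ks = [math.exp(-i * i / (2.0 * sigma * sigma)) for i in range(-radius, radius + 1)]
    total = sum(ks)
    return [k / total for k in ks]

def convolve(signal, kernel):
    """Direct 1-D convolution with zero padding at the boundaries."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - r
            if 0 <= idx < len(signal):
                acc += k * signal[idx]
        out.append(acc)
    return out

random.seed(1)
signal = [1.0 if 20 <= i < 30 else 0.0 for i in range(50)]  # toy "image": a box
psf = gaussian_psf(sigma=2.0, radius=6)                     # known Gaussian PSF
blurred = convolve(signal, psf)
noisy = [b + random.gauss(0.0, 0.01) for b in blurred]      # additive Gaussian noise
```

A recovery algorithm (e.g. Wiener or Richardson-Lucy deconvolution) would then be benchmarked on `noisy` against the known `signal`.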
A series of corrections is developed for the fixed points of Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference. These corrections can lead to improvements to the inference approximation or serve as a sanity check, indicating when EP yields unreliable results.
This paper investigates the role of resource allocation as a source of processing difficulty in human sentence comprehension. The paper proposes a simple information-theoretic characterization of processing difficulty as the work incurred by resource reallocation during parallel, incremental, probabilistic disambiguation in sentence comprehension, and demonstrates its equivalence to the theory ...
We study the query complexity of computing a function f : {0, 1}^n → R_+ in expectation. This requires the algorithm on input x to output a nonnegative random variable whose expectation equals f(x), using as few queries to the input x as possible. We exactly characterize both the randomized and the quantum query complexity by two polynomial degrees, the nonnegative literal degree and the sum-of-squ...
This note represents my attempt at explaining the EM algorithm (Hartley, 1958; Dempster et al., 1977; McLachlan and Krishnan, 1997). This is just a slight variation on Tom Minka's tutorial (Minka, 1998), perhaps a little easier (or perhaps not). It includes a graphical example to provide some intuition. 1 Intuitive Explanation of EM EM is an iterative optimization method to estimate some unknown ...
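The iterative optimization the note describes can be illustrated with a minimal EM loop for a two-component 1-D Gaussian mixture. This is a sketch under stated assumptions, not the note's own example: the component means (0 and 5), sample sizes, and iteration count are all illustrative.

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gmm(data, n_iter=50):
    """EM for a 2-component 1-D Gaussian mixture (means, variances, weights)."""
    mu = [min(data), max(data)]  # crude initialization at the data extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            w = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate parameters from responsibility-weighted data.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
            pi[k] = nk / len(data)
    return mu, var, pi

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
mu, var, pi = em_gmm(data)
```

Each iteration provably does not decrease the data log-likelihood, which is the property the tutorials cited above set out to explain.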