Search results for: jointly distributed random variables
Number of results: 835,035
The main purpose of this note is to establish some bounds in Poisson approximation for row-wise arrays of independent geometrically distributed random variables using the operator method. Some results related to random sums of independent geometrically distributed random variables are also investigated.
We consider jointly distributed random variables X and Y. After describing the Gács-Körner common information between the random variables from the viewpoint of the capacity region of the Gray-Wyner system, we propose a new notion of common information between the random variables that is dual to the Gács-Körner common information, from this viewpoint, in a well-defined sense. We characterize t...
• Proof: $X$ jointly Gaussian implies that $V = u^T X$ is Gaussian with mean $u^T\mu$ and variance $u^T\Sigma u$. Thus its characteristic function is $C_V(t) = e^{itu^T\mu}e^{-t^2 u^T\Sigma u/2}$. But $C_V(t) = E[e^{itV}] = E[e^{itu^T X}]$. If we set $t = 1$, this is $E[e^{iu^T X}]$, which is equal to $C_X(u)$. Thus, $C_X(u) = C_V(1) = e^{iu^T\mu}e^{-u^T\Sigma u/2}$. • Proof (other side): we are given that the characteristic function of $X$ is $C_X(u) = E[e^{iu^T X}] = e^{iu^T\mu}e^{-u^T\Sigma u/2}$. Consider $V = u^T X$. Thus, C...
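The identity in the snippet above, $C_X(u) = e^{iu^T\mu}e^{-u^T\Sigma u/2}$ for a jointly Gaussian $X$, can be checked numerically by comparing the closed form against a Monte Carlo estimate of $E[e^{iu^T X}]$. The parameters $\mu$, $\Sigma$, and $u$ below are hypothetical choices for illustration, not values from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example parameters (not from the source)
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
u = np.array([0.3, -0.7])

# Monte Carlo estimate of the characteristic function C_X(u) = E[exp(i u^T X)]
X = rng.multivariate_normal(mu, Sigma, size=200_000)
empirical = np.exp(1j * (X @ u)).mean()

# Closed form from the proof: exp(i u^T mu - u^T Sigma u / 2)
closed_form = np.exp(1j * (u @ mu) - (u @ Sigma @ u) / 2)

print(abs(empirical - closed_form))  # small Monte Carlo error
```

With 200,000 samples the two values agree to a few parts in a thousand; increasing the sample size shrinks the gap at the usual $1/\sqrt{n}$ rate.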
We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel problem. As a corollary, this inequality yields a generalization of the classical vector entropy-power inequality (EPI). As another corollary, this inequality sheds light on maximizing the differential entropy of the sum of two jointly distributed random variables.
We consider a two-player random bimatrix game where each player is interested in the payoffs that can be obtained with a certain confidence. The players’ payoff functions in such game-theoretic problems are defined using chance constraints. We consider the case where the entries of each player’s random payoff matrix jointly follow a multivariate elliptically symmetric distribution. We show an eq...