is an information-theoretic measure of the dissimilarity between q = (q_1, ..., q_n) and p = (p_1, ..., p_n) (H is also called cross-entropy, discrimination information, directed divergence, I-divergence, and the Kullback-Leibler number, among other terms). Various properties of relative entropy have led to its widespread use in information theory. These properties suggest that relative entropy has a role to play in...
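The defining sum for H precedes this excerpt; assuming the standard Kullback-Leibler form H(q, p) = Σ_i q_i log(q_i / p_i), a minimal sketch of the computation (the function name and the q_i = 0 convention are illustrative choices, not taken from the text):

```python
import math

def relative_entropy(q, p):
    """Relative entropy H(q, p) = sum_i q_i * log(q_i / p_i).

    Assumes q and p are probability distributions over the same n
    outcomes, with p_i > 0 wherever q_i > 0; terms with q_i = 0
    contribute zero by the usual convention 0 * log 0 = 0.
    """
    total = 0.0
    for qi, pi in zip(q, p):
        if qi > 0.0:
            total += qi * math.log(qi / pi)
    return total

# Dissimilarity is zero iff the distributions coincide,
# and strictly positive otherwise (Gibbs' inequality).
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))      # 0.0
print(relative_entropy([0.9, 0.1], [0.5, 0.5]) > 0)  # True
```

Note that H(q, p) is not symmetric in q and p, which is why the terms "directed divergence" and "discrimination information" emphasize its one-sided character.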