Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.[1]
The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as[2]: 16

$$\mathrm{H}(X,Y) = -\sum_{x\in\mathcal{X}} \sum_{y\in\mathcal{Y}} P(x,y) \log_2[P(x,y)]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y)\log_2[P(x,y)]$ is defined to be 0 if $P(x,y)=0$.
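The definition translates directly into code. Below is a minimal Python sketch; the function name `joint_entropy` and the array-based representation of the joint pmf are illustrative assumptions, not part of any standard library:

```python
import numpy as np

def joint_entropy(pxy: np.ndarray) -> float:
    """Joint Shannon entropy in bits of a joint pmf.

    pxy[i, j] is assumed to hold P(X = x_i, Y = y_j); entries must be
    nonnegative and sum to 1.
    """
    p = np.asarray(pxy, dtype=float).ravel()
    p = p[p > 0]  # drop zero cells: 0 * log2(0) is defined to be 0
    return float(-np.sum(p * np.log2(p)))

# Two independent fair coin flips: four equally likely outcomes.
pxy = np.full((2, 2), 0.25)
print(joint_entropy(pxy))  # 2.0 bits
```

Because the sum runs over all pairs $(x, y)$, the same function handles an $n$-dimensional pmf array unchanged: `ravel` flattens it into one list of joint probabilities.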
For more than two random variables $X_1,\ldots,X_n$ this expands to

$$\mathrm{H}(X_1,\ldots,X_n) = -\sum_{x_1\in\mathcal{X}_1} \cdots \sum_{x_n\in\mathcal{X}_n} P(x_1,\ldots,x_n) \log_2[P(x_1,\ldots,x_n)]$$

where $x_1,\ldots,x_n$ are particular values of $X_1,\ldots,X_n$, respectively, $P(x_1,\ldots,x_n)$ is the probability of these values occurring together, and $P(x_1,\ldots,x_n)\log_2[P(x_1,\ldots,x_n)]$ is defined to be 0 if $P(x_1,\ldots,x_n)=0$.
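As a quick sanity check of this formula, consider $n$ independent fair coin flips: each of the $2^n$ outcomes has probability $2^{-n}$, so

$$\mathrm{H}(X_1,\ldots,X_n) = -\sum_{2^n \text{ outcomes}} 2^{-n} \log_2 2^{-n} = 2^n \cdot 2^{-n} \cdot n = n \text{ bits},$$

matching the intuition that $n$ independent fair coins carry $n$ bits of uncertainty.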
Properties
Nonnegativity
The joint entropy of a set of random variables is a nonnegative number.
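Symbolically,

$$\mathrm{H}(X_1,\ldots,X_n) \geq 0,$$

since each term $-P\log_2 P$ in the sum is nonnegative for $0 \leq P \leq 1$.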
Greater than individual entropies
The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set.
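Symbolically,

$$\mathrm{H}(X_1,\ldots,X_n) \geq \max_{1 \leq i \leq n} \mathrm{H}(X_i).$$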
Less than or equal to the sum of individual entropies
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity. For two variables, this inequality is an equality if and only if $X$ and $Y$ are statistically independent.[2]: 30
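Symbolically,

$$\mathrm{H}(X_1,\ldots,X_n) \leq \mathrm{H}(X_1) + \ldots + \mathrm{H}(X_n).$$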