The conditional entropy is an entropy measure used in information theory. It measures how much entropy a random variable Y has remaining when the value of a second random variable X is completely known. It is referred to as the entropy of Y conditional on X, and is written H(Y | X). Like other entropies, the conditional entropy is measured in bits.
Given random variables X and Y with entropies H(X) and H(Y), and with a joint entropy H(X,Y), the conditional entropy of Y given X is defined as H(Y | X) = H(X,Y) − H(X). Intuitively, the combined system contains H(X,Y) bits of information. If we learn the value of X, we have gained H(X) bits of information, and the system has H(Y | X) bits remaining.
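As a rough illustration of this definition, here is a minimal Python sketch that computes H(Y | X) as H(X,Y) − H(X) from a joint distribution. The representation of the joint as a dict mapping (x, y) pairs to probabilities, and the names entropy and conditional_entropy, are illustrative choices, not part of the article.

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def conditional_entropy(joint):
    """H(Y | X) = H(X,Y) - H(X), with the joint given as {(x, y): probability}."""
    h_xy = entropy(joint)  # joint entropy H(X,Y)
    # Marginal distribution of X, obtained by summing the joint over y.
    marginal_x = {}
    for (x, _), p in joint.items():
        marginal_x[x] = marginal_x.get(x, 0.0) + p
    h_x = entropy(marginal_x)  # H(X)
    return h_xy - h_x
```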
H(Y | X) = 0 if and only if the value of Y is completely determined by the value of X. At the other extreme, H(Y | X) = H(Y) if and only if Y and X are independent random variables.
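Both extremes can be checked with the conditional_entropy sketch above on two small hypothetical joint distributions: one where Y is a copy of a fair coin flip X, and one where X and Y are independent fair coins.

```python
# Y completely determined by X (Y = X, fair coin): expect H(Y | X) = 0.
determined = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(determined))   # 0.0

# X and Y independent fair coins: expect H(Y | X) = H(Y) = 1 bit.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(conditional_entropy(independent))  # 1.0
```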