# All Science Fair Projects

## Science Fair Project Encyclopedia for Schools!


# Conditional entropy

The conditional entropy is an entropy measure used in information theory. It measures how much entropy remains in a random variable Y once the value of a second random variable X is completely known. It is referred to as the entropy of Y conditional on X, and is written H(Y | X). Like other entropies, the conditional entropy is measured in bits (assuming base-2 logarithms).

Given random variables X and Y with entropies H(X) and H(Y), and with a joint entropy H(X,Y), the conditional entropy of Y given X is defined as $H(Y|X) \equiv H(X,Y) - H(X)$. Intuitively, the combined system contains H(X,Y) bits of information. If we learn the value of X, we have gained H(X) bits of information, and the system has H(Y | X) bits remaining.
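This chain-rule definition can be checked numerically. The sketch below, using a small hypothetical joint distribution over two binary variables, computes H(X,Y) and the marginal entropy H(X) directly and takes their difference:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for X, Y each taking values in {0, 1}.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Joint entropy H(X,Y).
H_XY = entropy(joint.values())

# Marginal distribution p(x), obtained by summing out y.
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0) + p
H_X = entropy(px.values())

# Conditional entropy via the chain rule: H(Y|X) = H(X,Y) - H(X).
H_Y_given_X = H_XY - H_X
print(H_Y_given_X)
```

For this distribution, H(X,Y) = 1.75 bits and H(X) ≈ 0.811 bits, so about 0.939 bits of uncertainty about Y remain after X is learned.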

H(Y | X) = 0 if and only if the value of Y is completely determined by the value of X. At the other extreme, H(Y | X) = H(Y) if and only if Y and X are independent random variables, so that knowing X gives no information about Y.
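Both extreme cases can be illustrated with two hypothetical joint distributions, one where Y is a deterministic copy of X and one where X and Y are independent fair coin flips:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X) for a joint distribution given as {(x, y): p}."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0) + p
    return entropy(joint.values()) - entropy(px.values())

# Y completely determined by X (here Y = X): H(Y|X) = 0.
determined = {(0, 0): 0.5, (1, 1): 0.5}

# X and Y independent, each uniform on {0, 1}: H(Y|X) = H(Y) = 1 bit.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

print(conditional_entropy(determined))   # 0 bits: X fixes Y exactly
print(conditional_entropy(independent))  # 1 bit: X tells us nothing about Y
```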

In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.
