Science Fair Project Encyclopedia
This article defines some terms which characterize probability distributions of two or more variables.
Joint probability is the probability of two events occurring together. The joint probability of A and B is written P(A, B).
Marginal probability is the probability of one event, ignoring any information about the other event. Marginal probability is obtained by summing (or integrating, more generally) the joint probability over the ignored event. The marginal probability of A is written P(A), and the marginal probability of B is written P(B).
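The summing step can be made concrete with a small sketch (all numbers and names here are hypothetical): a joint distribution over two binary events A and B, with marginals obtained by summing out the other event.

```python
# Hypothetical joint distribution over two binary events A and B,
# stored as a table keyed by (a, b) outcome pairs.
joint = {
    (True, True): 0.10,   # P(A, B): both A and B occur
    (True, False): 0.20,
    (False, True): 0.30,
    (False, False): 0.40,
}

def marginal_a(a):
    # P(A): sum the joint probability over all outcomes of B.
    return sum(p for (ai, _), p in joint.items() if ai == a)

def marginal_b(b):
    # P(B): sum the joint probability over all outcomes of A.
    return sum(p for (_, bi), p in joint.items() if bi == b)
```

Note that the marginals of A and B need not be independent of one another; they simply ignore the other event.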
In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B, or vice versa, or they may happen at the same time. A may cause B, or vice versa, or they may have no causal relation at all.
If A and B are events, and P(B) > 0, then the conditional probability of A given B is defined by

P(A | B) = P(A, B) / P(B).

Equivalently, we have

P(A, B) = P(A | B) P(B).
If P(A, B) = P(A) P(B) (equivalently, P(A | B) = P(A)), then we say that A and B are independent.
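These two definitions can be sketched directly in code (the function names and the numbers below are illustrative assumptions, not part of the article):

```python
def conditional(p_joint, p_b):
    # P(A | B) = P(A, B) / P(B); undefined when P(B) = 0.
    if p_b <= 0:
        raise ValueError("P(A | B) is undefined when P(B) = 0")
    return p_joint / p_b

def independent(p_joint, p_a, p_b, tol=1e-12):
    # A and B are independent iff P(A, B) = P(A) P(B),
    # compared with a small tolerance for floating-point error.
    return abs(p_joint - p_a * p_b) <= tol

# Hypothetical numbers: P(A, B) = 0.12, P(A) = 0.3, P(B) = 0.4.
p_a_given_b = conditional(0.12, 0.4)   # equals P(A), so A and B are independent
```

Here conditioning on B does not change the probability of A, which is exactly the independence criterion.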
If B is an event and P(B) > 0, then the function Q defined by Q(A) = P(A | B) for all events A is a probability measure.
If P(B) = 0, P(A | B) is left undefined.
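As a quick numerical check of this fact, one can condition a small joint table on B and verify that the resulting Q is nonnegative and sums to 1 over the outcomes of A (the table below is a hypothetical example):

```python
# Hypothetical joint table over binary events A and B.
joint = {
    (True, True): 0.10,
    (True, False): 0.20,
    (False, True): 0.30,
    (False, False): 0.40,
}

# P(B) > 0 here, so Q(A) = P(A | B) is well defined.
p_b = sum(p for (_, b), p in joint.items() if b)

# Q assigns each outcome of A its conditional probability given B.
q = {a: joint[(a, True)] / p_b for a in (True, False)}

# Q behaves like a probability measure: nonnegative, total mass 1.
total = sum(q.values())
```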
Conditional probabilities are often easier to calculate with a tree diagram, which branches first on one event and then on the other.
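For instance (with hypothetical branch probabilities), a two-stage tree that splits first on B and then on A given B turns the calculation into multiplying along each path and summing the paths that end in A:

```python
# Branch probabilities for a two-stage tree (hypothetical numbers).
p_b = 0.4                  # first split: B vs. not-B
p_a_given_b = 0.25         # second split on the B branch
p_a_given_not_b = 0.5      # second split on the not-B branch

# Multiply along each path to get the joint probabilities...
p_a_and_b = p_b * p_a_given_b
p_a_and_not_b = (1 - p_b) * p_a_given_not_b

# ...and sum the paths ending in A to get the marginal P(A).
p_a = p_a_and_b + p_a_and_not_b
```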
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.