
Two-State System

The simplest non-trivial system that we can investigate using probability theory is one for which there are only two possible outcomes. (There would obviously be little point in investigating a one-outcome system.) Let us suppose that there are two possible outcomes to an observation made on some system, $ S$ . Let us denote these outcomes 1 and 2, and let their probabilities of occurrence be

$\displaystyle P(1) = p,$ (2.9)
$\displaystyle P(2) = q.$ (2.10)

It follows immediately from the normalization condition, Equation (2.5), that

$\displaystyle p+q=1,$ (2.11)

so $ q=1-p$. The best-known example of a two-state system is a tossed coin. The two outcomes are ``heads'' and ``tails,'' each occurring with probability $ 1/2$. So, $ p=q=1/2$ for this system.

Suppose that we make $ N$ statistically independent observations of $ S$. Let us determine the probability of $ n_1$ occurrences of the outcome 1, and $ N-n_1$ occurrences of the outcome 2, with no regard to the order of these occurrences. Denote this probability $ P_N(n_1)$. This type of calculation crops up very often in probability theory. For instance, we might want to know the probability of getting nine ``heads'' and only one ``tails'' in an experiment in which a coin is tossed ten times, or in which ten coins are tossed simultaneously.
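
Before deriving the general result, it may help to see this probability estimated numerically. The following is a minimal Python sketch (not part of the original text) that simulates the ten-toss experiment just described; the trial count is an arbitrary choice, and the exact answer follows from Equation (2.16) below.

    import random

    # Monte Carlo estimate of the probability of obtaining exactly nine
    # "heads" (outcome 1) and one "tails" (outcome 2) in ten tosses of a
    # fair coin (p = q = 1/2).  The number of trials is an arbitrary choice.
    N_TRIALS = 200_000

    hits = 0
    for _ in range(N_TRIALS):
        heads = sum(random.random() < 0.5 for _ in range(10))
        if heads == 9:
            hits += 1

    print(hits / N_TRIALS)  # should be close to 10/2**10, i.e. about 0.0098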

Consider a simple case in which there are only three observations. Let us try to evaluate the probability of two occurrences of the outcome 1, and one occurrence of the outcome 2. There are three different ways of getting this result. We could get the outcome 1 on the first two observations, and the outcome 2 on the third. Or, we could get the outcome 2 on the first observation, and the outcome 1 on the latter two observations. Or, we could get the outcome 1 on the first and last observations, and the outcome 2 on the middle observation. Writing this symbolically, we have

$\displaystyle P_3(2)= P(1\otimes 1\otimes 2 \!\mid\! 2\otimes 1\otimes 1 \!\mid\! 1\otimes 2 \otimes 1).$ (2.12)

Here, the symbolic operator $ \otimes$ stands for ``and,'' whereas the symbolic operator $ \mid\!$ stands for ``or.'' This symbolic representation is helpful because of the two basic rules for combining probabilities that we derived earlier in Equations (2.4) and (2.8):

$\displaystyle P(X\!\mid\! Y) = P(X) + P(Y),$ (2.13)
$\displaystyle P(X \otimes Y) = P(X)\,P(Y).$ (2.14)

The straightforward application of these rules gives

$\displaystyle P_3(2) = p\,p\,q + q\,p\,p + p\,q\,p = 3\,p^{\,2} q$ (2.15)

for the case under consideration.
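
As a check on this counting argument, here is a minimal Python sketch (not from the original text) that enumerates all $ 2^3$ ordered outcomes of three observations and sums the probabilities of those containing exactly two occurrences of the outcome 1; the value $ p=0.3$ is an arbitrary illustrative choice. For a fair coin ($ p=q=1/2$), Equation (2.15) gives $ 3/8$.

    from itertools import product

    # Enumerate every ordered sequence of three observations and add up the
    # probabilities of those with exactly two occurrences of outcome 1.
    # The result should equal 3 p^2 q, Equation (2.15).
    p = 0.3          # arbitrary illustrative value
    q = 1.0 - p

    total = 0.0
    for seq in product((1, 2), repeat=3):
        if seq.count(1) == 2:
            total += p ** seq.count(1) * q ** seq.count(2)

    print(total, 3 * p**2 * q)  # both give 0.189 (up to floating-point rounding)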

The probability of obtaining $ n_1$ occurrences of the outcome $ 1$ in $ N$ observations is given by

$\displaystyle P_N(n_1) = C^{\,N}_{\,n_1,\,N-n_1}\, p^{\,n_1} q^{\,N-n_1},$ (2.16)

where $ C^{\,N}_{\,n_1,\,N-n_1}$ is the number of ways of arranging two distinct sets of $ n_1$ and $ N-n_1$ indistinguishable objects. This result should, at least, seem plausible in light of the previous example. There, the probability of getting two occurrences of the outcome 1, and one occurrence of the outcome 2, was obtained by writing out all of the possible arrangements of two $ p$s (the probability of outcome 1) and one $ q$ (the probability of outcome 2), and then adding them all together.
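
To connect this formula with the coin-tossing question raised earlier, here is a minimal Python sketch (not part of the original text) that evaluates Equation (2.16). It anticipates the result of the following section, namely that $ C^{\,N}_{\,n_1,\,N-n_1} = N!/[n_1!\,(N-n_1)!]$, which is what Python's math.comb computes.

    from math import comb

    def P_N(N, n1, p):
        """Equation (2.16): probability of n1 occurrences of outcome 1
        in N statistically independent observations."""
        return comb(N, n1) * p**n1 * (1.0 - p)**(N - n1)

    # Nine "heads" and one "tails" in ten tosses of a fair coin:
    print(P_N(10, 9, 0.5))  # 10/1024, i.e. about 0.0098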

