What is probability?

What is the scientific definition of probability? Well, let us consider an observation made on a general system $S$. This can result in any one of a number of different possible outcomes. We want to find the probability of some general outcome $X$. In order to ascribe a probability, we have to consider the system as a member of a large set ${\mit\Sigma}$ of similar systems. Mathematicians have a fancy name for a large group of similar systems. They call such a group an ensemble, which is just the French for ``group.'' So, let us consider an ensemble ${\mit\Sigma}$ of similar systems $S$. The probability of the outcome $X$ is defined as the ratio of the number of systems in the ensemble which exhibit this outcome to the total number of systems, in the limit where the latter number tends to infinity. We can write this symbolically as
\begin{displaymath}
P(X) = \lim_{{\mit\Omega}({\mit\Sigma})\rightarrow\infty}\frac{{\mit\Omega}(X)}{{\mit\Omega}({\mit\Sigma})},
\end{displaymath} (1)

where ${\mit\Omega}({\mit\Sigma})$ is the total number of systems in the ensemble, and ${\mit\Omega}(X)$ is the number of systems exhibiting the outcome $X$. We can see that the probability $P(X)$ must be a number between 0 and 1. The probability is zero if no systems exhibit the outcome $X$, even when the number of systems goes to infinity. This is just another way of saying that there is no chance of the outcome $X$. The probability is unity if all systems exhibit the outcome $X$ in the limit as the number of systems goes to infinity. This is another way of saying that the outcome $X$ is bound to occur.
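As a concrete illustration of this limiting ratio, the short Python sketch below builds a growing ensemble of a simple system and computes the ratio ${\mit\Omega}(X)/{\mit\Omega}({\mit\Sigma})$ directly. The particular system (a fair six-sided die), the outcome $X$ (``the die shows a six''), and the helper name estimate_probability are hypothetical choices made purely for this sketch; the ratio should settle down near $1/6$ as the ensemble grows.

\begin{verbatim}
# Sketch of the frequency definition of probability: estimate P(X) as
# Omega(X) / Omega(Sigma) for an ensemble of fair six-sided dice, where
# the outcome X is "the die shows a six". (Illustrative example only.)
import random

def estimate_probability(ensemble_size, outcome=6):
    """Return Omega(X) / Omega(Sigma) for an ensemble of fair die rolls."""
    hits = sum(1 for _ in range(ensemble_size)
               if random.randint(1, 6) == outcome)
    return hits / ensemble_size

if __name__ == "__main__":
    for n in (100, 10_000, 1_000_000):
        print(f"Omega(Sigma) = {n:>9}:  P(X) ~ {estimate_probability(n):.4f}")
    print(f"Exact limit:  {1/6:.4f}")
\end{verbatim}

Of course, no computation can take ${\mit\Omega}({\mit\Sigma})$ all the way to infinity, so such a calculation only approximates $P(X)$; the definition proper refers to the limit itself.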

