The Bare Bones of Probability

How likely is it that an event will happen? Probabilities attempt to answer this question by assigning numbers to the likelihood of events. A probability is always a number between 0 and 1. The closer to 0 the probability, the less likely the event; the closer to 1, the more likely the event. An event with zero probability is thus regarded as impossible; an event with probability one is regarded as certain.

When an event occurs, it is one of a range of possibilities. Consider a die with six faces. The range of possibilities here is any of the die’s six faces, each of which has probability 1/6 and is thus as likely to appear as any other. This range of possibilities can be represented by the set {1, 2, 3, 4, 5, 6}. An event can be thought of as any subset of this set.
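This picture of a sample space and events as subsets can be sketched directly with Python sets; the variable and function names here are illustrative, not part of any standard library:

```python
from fractions import Fraction

# The range of possibilities for one toss of a fair six-sided die
sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event (a subset of the sample space),
    assuming all outcomes are equally likely."""
    return Fraction(len(event & sample_space), len(sample_space))

print(prob({3}))        # a single face: 1/6
print(prob({1, 3, 5}))  # the event "an odd face": 1/2
```

Using `Fraction` rather than floating point keeps the probabilities exact, which makes the arithmetic in the examples below easy to verify.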

Take, for instance, an odd number being tossed. This event corresponds to the subset {1, 3, 5}. Such an event is said to occur provided any one of its outcomes occurs, that is, if either a 1 or a 3 or a 5 is tossed. Events are thus composed of outcomes. One can think of outcomes as the “elementary particles” that make up events.

Outcomes can therefore be represented as singleton sets, that is, as sets with only one element. Thus, the outcomes associated with the event {1, 3, 5} are the singleton sets {1}, {3}, and {5}. Outcomes are sometimes also called elementary events. Events include not just outcomes but also composite events such as an odd face being tossed, which includes more than one outcome.

Probabilities obey three axioms: (1) The impossible event (an event that is logically or physically impossible) is represented by the empty set and has probability zero. (2) The necessary event (an event guaranteed to happen) is represented by the entire range of possibilities and has probability one (in this die-tossing example, {1, 2, 3, 4, 5, 6} has probability equal to one). (3) Events that are mutually exclusive have probabilities that sum together. Thus, P({1, 3, 5}) = P({1}) + P({3}) + P({5}).

Important: It follows from (2) and (3) that the probabilities of mutually exclusive and exhaustive events always sum to one. Thus P({1}) + P({2}) + P({3}) + P({4}) + P({5}) + P({6}) = P({1, 2, 3, 4, 5, 6}) = 1.
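The three axioms, and the consequence just noted, can be checked mechanically for the die example. This is a minimal sketch with illustrative names, not a general probability library:

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

# Axiom 1: the impossible event (empty set) has probability zero
assert prob(set()) == 0

# Axiom 2: the necessary event (the whole range of possibilities) has probability one
assert prob(sample_space) == 1

# Axiom 3: mutually exclusive events have probabilities that sum
assert prob({1, 3, 5}) == prob({1}) + prob({3}) + prob({5})

# Consequence of (2) and (3): exhaustive, mutually exclusive outcomes sum to one
assert sum(prob({face}) for face in sample_space) == 1
```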

Suppose an event is known to have occurred, and we then ask for the probability of another event occurring. For instance, suppose we know that an odd number has been tossed with a die (i.e., {1, 3, 5}) and we want to know the probability of tossing a 1. The probability of tossing a 1 without knowing that an odd face has been tossed is 1/6, because the range of possibilities against which the probability of this event is calculated is {1, 2, 3, 4, 5, 6}.

But when we factor in the new information that an odd number has been tossed, that changes the probability of a 1 occurring. In that case, the new information reduces our range of possibilities to {1, 3, 5}, and we must now calculate the probability of tossing a 1 in relation to this new range of possibilities. The probability of getting a 1 given that an odd face was tossed is not 1/6 but 1/3. The probability of an event given another event is called a conditional probability. We write this as P({1} | {1, 3, 5}) (note the vertical stroke between the two events, which one reads as “given that”—the probability of tossing a 1 given that an odd face was tossed).
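Conditioning can be sketched as shrinking the range of possibilities to the event that is known to have occurred. The helper below computes P(A | B) for equally likely outcomes; the names are illustrative:

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Unconditional probability under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

def cond_prob(a, b):
    """P(A | B): the probability of A computed against the
    reduced range of possibilities B."""
    return Fraction(len(a & b), len(b))

odd = {1, 3, 5}
print(prob({1}))            # unconditionally: 1/6
print(cond_prob({1}, odd))  # given an odd face: 1/3
```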

Often, adding information changes the probability of an event. But sometimes it doesn’t. What if, instead of tossing one die, we tossed two, and you learned that the first die came up an odd number? Would that change the probability of the second toss coming up a 1? No, it would not. When conditioning on one event does not change the original probability of another event, we say those events are probabilistically independent. Coin flips are likewise probabilistically independent—one flip doesn’t affect the probability of another.
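The two-dice example can be checked by enumerating all 36 ordered pairs of faces. This sketch (with illustrative names) confirms both the defining product rule for independence and the claim that conditioning on the first die leaves the second die’s probability unchanged:

```python
from fractions import Fraction
from itertools import product

# Sample space for two dice: all 36 ordered pairs of faces
pairs = set(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event under equally likely pairs."""
    return Fraction(len(event), len(pairs))

first_odd   = {(a, b) for (a, b) in pairs if a % 2 == 1}
second_is_1 = {(a, b) for (a, b) in pairs if b == 1}

# Independence: P(A and B) equals P(A) * P(B)
assert prob(first_odd & second_is_1) == prob(first_odd) * prob(second_is_1)

# Equivalently, P(second is 1 | first is odd) is still 1/6
assert Fraction(len(second_is_1 & first_odd), len(first_odd)) == Fraction(1, 6)
```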

Probabilities are interpreted in three main ways: (1) As degree of belief, measuring the strength with which one believes that an event will occur. This can be quite subjective. (2) As relative frequency, measuring the number of occurrences of an event divided by the number of opportunities for it to occur. Such probabilities are empirical. (3) As uniform uncertainty, assigning the same degree of uncertainty to each of the underlying outcomes. These are sometimes called theoretical probabilities.

Although these three ways of interpreting probability can be at odds, in some situations they can all agree. Thus, in tossing a die, one assigns uniform uncertainty to each face because of the die’s geometry and symmetry, which suggests that no face is more likely to appear than any other. At the same time, repeated tossing of the die shows that relative frequencies approximate the equal probabilities assigned by uniform uncertainty. And finally, one’s degree of belief that a given face will be tossed typically looks to these other two ways of interpretation.
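The agreement between relative frequency and uniform uncertainty can be illustrated by simulating many tosses of a fair die and comparing each face’s observed frequency with the theoretical 1/6. This is a simulation sketch (seed and sample size chosen arbitrarily):

```python
import random

random.seed(0)          # fixed seed so the run is reproducible
n = 100_000
tosses = [random.randint(1, 6) for _ in range(n)]

# Each face's relative frequency should approximate the
# theoretical probability 1/6 assigned by uniform uncertainty.
for face in range(1, 7):
    freq = tosses.count(face) / n
    print(face, round(freq, 4))
```

With 100,000 tosses, each frequency typically lands within about a hundredth of 1/6 ≈ 0.1667, which is the sense in which the empirical and theoretical interpretations converge.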

Of course, these ways of interpreting probability can also disagree. For one-time events in which no frequency data are available and where uniform uncertainty makes no sense, one’s degree of belief can be highly subjective, and the corresponding probability may convey a false sense of precision. What is the probability that Congress’s approval rating will rise in the next year? We may have a strong belief about this, but that belief will also be highly subjective.

