Module: Basic statistics
Quote of the page
All truly wise thoughts have been thought already thousands of times; but to make them truly ours, we must think them over again honestly, till they take root in our personal experience.
- Johann Wolfgang von Goethe
The theory of probability arose out of the study of gambling, as we saw earlier. Gambling is essentially a kind of risk; you risk a financial loss in the hope of a financial gain. Probability can be used to measure the risk, and to help you assess whether a particular kind of bet is worth taking. Not surprisingly, then, the most straightforward application of probability is to other forms of risk-taking behavior.
The most obvious kinds of risk-taking behavior are economic. For example, investing in the stock exchange is clearly related to gambling, in that the investor accepts a degree of financial risk in the hope of financial gain. But risk-taking is involved in virtually every kind of human activity. When you take medicine, there is always a risk of adverse effects; when you take a bus, there is always a risk that you will be injured in an accident; and so on. Probability can be used to help us make reasonable choices in the face of the inevitable risks involved in life.
The basic concept needed for the analysis of risks is expected value. For example, consider the following bet. I toss two coins, and I pay you $2 if they both show heads, but you pay me $1 if one or both show tails. The expected value of this bet is obtained by multiplying the probability of each outcome by its value to you, and then adding the results. There are four possible outcomes (HH, HT, TH and TT), each of which has a probability of 1/4. The first outcome has a positive value of $2 (you win two dollars), and the other three outcomes have a negative value of $1 (you lose a dollar). So the expected value to you of the bet is ($2 x 1/4) - ($1 x 3/4) = -$0.25.
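The calculation above can be sketched as a small helper function; the function name and the way the outcomes are grouped are just illustrative:

```python
def expected_value(outcomes):
    """Expected value of a bet: sum of (probability x value) over all outcomes.

    outcomes: list of (probability, value) pairs.
    """
    return sum(p * v for p, v in outcomes)

# The two-coin bet: win $2 on HH (probability 1/4);
# lose $1 on HT, TH or TT (combined probability 3/4).
coin_bet = [(1/4, 2.0), (3/4, -1.0)]
print(expected_value(coin_bet))  # -0.25
```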
What does this expected value mean? Clearly it doesn't mean that you should expect to lose $0.25 on the bet, since you will either win $2 or lose $1. But it tells you something about what you should expect in the long run. If you bet over and over again, you will win some and lose some, but eventually your winnings and losses will average out to a loss of $0.25 per game. In the long run, this bet will not be financially worthwhile to you (but it will be worthwhile to me!).
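This long-run behaviour can be checked with a quick simulation. A sketch (the seed is fixed only to make the run repeatable):

```python
import random

def play_once(rng):
    """One round of the two-coin bet: win $2 on double heads, else lose $1."""
    heads = sum(rng.random() < 0.5 for _ in range(2))
    return 2.0 if heads == 2 else -1.0

rng = random.Random(42)
n = 200_000
average = sum(play_once(rng) for _ in range(n)) / n
print(average)  # close to -0.25
```

Over 200,000 games the average winnings per game settle very near the expected value of −$0.25, even though no single game ever loses exactly that amount.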
A bet with an expected value of zero is called a fair bet. For example, if the above bet is modified so that I pay you $3 if both coins show heads, then it is fair. It is fair in the sense that there is no built-in bias in favor of either one of us.
The bets you can make at casinos or at the racetrack are almost never fair. The expected value of these bets is usually negative for the person making the bet, and positive for the house; that's how the house makes a profit. The expected profit for the house, expressed as a percentage of the amount bet, is called the house edge. For example, if a particular kind of bet costs $10 and has an expected payout of $9, then the house has an expected profit of $1 per bet, giving a house edge of 10%.
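The house-edge arithmetic in this example is a one-line division; as a sketch:

```python
def house_edge(cost, expected_payout):
    """House edge: expected profit for the house as a fraction of the amount bet."""
    return (cost - expected_payout) / cost

# A $10 bet with an expected payout of $9:
print(house_edge(10.0, 9.0))  # 0.1, i.e. a 10% house edge
```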
For example, in the game of roulette, a ball is thrown onto a spinning wheel on which there are a number of indentations or "pockets"; when the wheel stops, the ball comes to rest in one of the pockets. (The idea for this device is often credited to Pascal.) The pockets are numbered from 1 to 36. A bet on a single number pays 35:1, which means that a $1 bet returns $36 ($35 winnings plus your original bet). If that were the whole story, then each bet would be fair, and the expected profit for the casino would be zero.
But of course that's not the whole story; there is also a pocket on the wheel numbered "zero", so the probability of each number is 1/37, not 1/36. If you bet $1 on a single number, the expected value of the bet is ($35 x 1/37) - ($1 x 36/37) = -$0.027. In other words, the expected profit for the house is 2.7 cents for every dollar bet, giving a house edge of 2.7%.
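The same arithmetic in code, as a one-off check (single-zero wheel, so 37 equally likely pockets):

```python
# Expected value of a $1 single-number roulette bet on a single-zero wheel:
# 37 equally likely pockets, and a win pays $35 in winnings.
p_win = 1 / 37
ev = 35 * p_win - 1 * (1 - p_win)
print(ev)  # about -0.027: a house edge of 2.7%
```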
Similar considerations apply at the racetrack. The odds quoted on a horse tell you the payout for a $1 bet; for example, if a horse has odds of 8:1, and you bet on it to win, then your payout is $9 if it wins ($8 plus your original bet). If the quoted odds told you the actual chance of the horse winning, then the bet would be fair. But the quoted odds don't necessarily have anything to do with the chance that the horse will win. Instead, they are calculated based on the amount of money that has been bet on that horse.
For example, suppose a total of $1,000,000 has been bet on a given race (counting only bets to win), and suppose that $50,000 has been bet on the horse Blaise. To calculate the odds on Blaise, the house first subtracts 15% from the total bet, giving a "win pool" of $850,000. The house intends to divide this win pool among the winning bets. In other words, if Blaise wins, the house intends to pay out the $50,000 bet on Blaise plus $800,000 in winnings. Since the winnings are 16 times as big as the amount bet, the odds are set at 16:1. The odds are set in this way for each horse (and similarly for other kinds of bet), ensuring that the house edge is 15% whichever horse wins.
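The pool calculation above can be sketched as follows. This is a simplified model (real tracks also round the quoted odds, which is ignored here), but it reproduces the numbers in the Blaise example:

```python
def parimutuel_odds(total_pool, bet_on_horse, take=0.15):
    """Odds-to-1 on a horse under parimutuel betting with a fixed house take."""
    win_pool = total_pool * (1 - take)   # total to be paid out to winning bets
    winnings = win_pool - bet_on_horse   # winnings on top of the returned stakes
    return winnings / bet_on_horse       # odds expressed as "x:1"

odds = parimutuel_odds(1_000_000, 50_000)
print(round(odds, 2))  # 16.0, i.e. odds of 16:1
```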
Many books have been written on "systems" for winning at gambling games. Most of them don't work, because the house retains its edge no matter what you do. For example, whatever betting system you use when playing roulette, your expected loss is still 2.7 cents for every dollar you bet. However, for some types of game, there are ways to decrease your expected loss, and perhaps even turn it into an expected profit. In lottery games (like Mark Six), you can't increase your chance of winning, but you can increase your expected payout if you win. That's because the win pool is divided among the winning players; the fewer players who picked the same numbers as you, the more you win. If you pick unpopular numbers, the expected value of your bet goes up, and may even become positive. A similar system can be used in horse racing.
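The claim that no staking system changes the expected loss per dollar bet can be verified exactly for a short session. The sketch below enumerates every win/lose sequence of a six-spin "double after a loss" (martingale) session on an even-money single-zero roulette bet, and computes the exact expected profit per dollar staked using rational arithmetic:

```python
from fractions import Fraction
from itertools import product

P_WIN = Fraction(18, 37)  # even-money bet (e.g. red) on a single-zero wheel

def martingale(seq, base=1):
    """Profit and total amount staked for a win/lose sequence under doubling."""
    bet, profit, staked = base, 0, 0
    for win in seq:
        staked += bet
        if win:
            profit += bet
            bet = base       # reset to the base bet after a win
        else:
            profit -= bet
            bet *= 2         # double the bet after a loss
    return profit, staked

# Exact expectation over all 2**6 outcome sequences of a six-spin session.
e_profit = e_staked = Fraction(0)
for seq in product([True, False], repeat=6):
    prob = Fraction(1)
    for win in seq:
        prob *= P_WIN if win else 1 - P_WIN
    profit, staked = martingale(seq)
    e_profit += prob * profit
    e_staked += prob * staked

print(e_profit / e_staked)  # -1/37: exactly the single-bet house edge
```

However the stakes are varied, each spin still loses 1/37 of whatever is staked on it in expectation, so the ratio comes out to exactly −1/37 (about −2.7 cents per dollar).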
For a detailed account of betting systems, good and bad, see J. D. McGervey (1986), Probabilities in Everyday Life. Chicago: Nelson-Hall.