Suppose you’re a
gambler in Renaissance Italy. Well, actually, you haven’t done much gambling at
all recently. You’ve been in a bit of a slump since you lost much of your money
to your friend Gerolamo Cardano, who boasts that he can predict outcomes of
dice games based on mathematics. You yourself don’t care much for mathematics –
you’re more of a doer than a thinker – but his success really struck you.
Clearly there is more to these games than lucky numbers and divine
intervention. Might there be some way for you to make predictions about the
outcomes of dice rolls like Cardano does, but without using any sort of
mathematical formula?
You start thinking, and suddenly you have a brilliant
idea. If Cardano is correct and different dice rolls occur with a certain
frequency, you should be able to figure out that frequency by rolling dice and
seeing how often a certain roll appears. You decide to see if you can predict
the frequency of rolling a seven with a pair of dice. You toss your pair of
dice ten times and get a seven twice. Does this mean that, out of every ten
times you roll a pair of dice, two of those times their values must sum to
seven? You try it again, but this time you only get a seven once. In
frustration, you repeat this process hundreds of times over, but the number of
sevens you get for every ten rolls is always variable. Still, you realize that
getting five sevens out of ten rolls happens a lot less frequently than getting
one seven out of ten rolls. Perhaps there is still some deeper pattern at play.
You decide to combine all of your results, and you notice that the number of
times you rolled a seven divided by the total number of rolls is very close to
1/6. You continue tossing your dice, and you notice that the more tosses you
perform, the closer the ratio of seven-rolls to total rolls gets to 1/6.
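You can watch this convergence happen much faster than our gambler could. Here's a minimal sketch in Python of the same experiment; the specific batch sizes are my own arbitrary picks, not anything from the story:

```python
import random

def seven_ratio(num_rolls):
    """Roll a pair of dice num_rolls times and return the fraction of sevens."""
    sevens = 0
    for _ in range(num_rolls):
        if random.randint(1, 6) + random.randint(1, 6) == 7:
            sevens += 1
    return sevens / num_rolls

# The ratio drifts toward 1/6 ≈ 0.1667 as the number of rolls grows.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, seven_ratio(n))
```

With only ten rolls the result jumps around a lot, just as in the story, but by a hundred thousand rolls it sits very close to 1/6.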
The formal name for this very familiar principle is the
Law of Large Numbers, which states that, over a large number of trials, the
observed frequency of an event approaches its expected frequency. For example,
if you were to flip a coin ten times, it wouldn't be out of the ordinary to get
60% heads, but getting 60% heads out of a thousand coin flips would probably
cause you to doubt the fairness of that coin.
This allows us to deduce the probability of an event by “working
backwards”. One issue, however, is with the definition of “large”. Does large
mean ten thousand or ten billion? The answer is that it really depends on the
system being considered, but the number of trials is probably "large enough" if
the results fluctuate very little when that same number of trials is repeated.
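One rough way to apply that rule of thumb is to run the same batch size several times and look at how much the measured frequency wanders between repeats. A hypothetical sketch, reusing the dice example (the batch sizes and repeat count are again my own choices):

```python
import random

def seven_ratio(num_rolls):
    """Fraction of two-dice rolls that sum to seven."""
    hits = sum(random.randint(1, 6) + random.randint(1, 6) == 7
               for _ in range(num_rolls))
    return hits / num_rolls

# Repeat each batch size several times; a small spread between repeats
# suggests that batch size is "large enough" for this system.
for batch in (100, 10_000, 1_000_000):
    ratios = [seven_ratio(batch) for _ in range(5)]
    print(batch, "spread:", max(ratios) - min(ratios))
```

The spread shrinks as the batch size grows, which is exactly the behavior the rule of thumb is looking for.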
For complicated systems like Chutes and Ladders, computing
the probability first in this “empirical” fashion is a good way to confirm that
the probability determined later with mathematics is correct. One of the tasks
I had in mind at the start of this project was finding the probability of
completing a game of Chutes and Ladders in a given number of moves. I
discovered that "large" for Chutes and Ladders is in the millions, so
computing this probability by playing the game over and over again is out of
the question. Instead, this past week I wrote a program that can simulate games
of Chutes and Ladders very rapidly and report how many times a game was
completed in a given number of moves. Here are the results, shown in an Excel
graph:
The average number of moves required to finish a game came out to 39.879.
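I won't reproduce my actual program here, but a simulation along these lines can be sketched in a few lines of Python. The board layout, the six-valued spinner, and the "stay put if you overshoot square 100" rule below are my assumptions about the setup, not necessarily the exact conventions my program used:

```python
import random
from collections import Counter

# Ladder and chute jumps on the standard 100-square board (my best recollection
# of the Milton Bradley layout; swap in whatever board you're actually using).
JUMPS = {
    1: 38, 4: 14, 9: 31, 21: 42, 28: 84, 36: 44, 51: 67, 71: 91, 80: 100,  # ladders
    16: 6, 47: 26, 49: 11, 56: 53, 62: 19, 64: 60, 87: 24,
    93: 73, 95: 75, 98: 78,                                                 # chutes
}

def play_one_game():
    """Play a solo game and return the number of spins needed to reach square 100."""
    position, moves = 0, 0
    while position < 100:
        moves += 1
        spin = random.randint(1, 6)
        if position + spin <= 100:            # overshooting 100 wastes the turn
            position = JUMPS.get(position + spin, position + spin)
    return moves

# Tally how many simulated games finished in each number of moves.
counts = Counter(play_one_game() for _ in range(1_000_000))
average = sum(m * c for m, c in counts.items()) / sum(counts.values())
print("average moves:", average)
```

Different conventions for the final square (for example, counting an overshoot as a win) shift the average slightly, so treat this as an illustration of the approach rather than a reproduction of the figure above.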
Of
course, obtaining the probabilities in this fashion doesn’t reveal why the
results are what they are, which I think is the more interesting part.