Simulation 2

Pair of Dice Example

Here we'll simulate rolling a pair of 6-sided dice, keeping track of what numbers we get. So one trial = roll a pair of 6-sided dice. How often should a 7 appear (aka the ideal "expected" result)? How often should a 2 appear?


Mathematically, the 7 should appear 1/6 of the time (6 of the 36 equally likely pairs sum to 7), while the 2 should appear only 1/36 of the time. As the number of trials gets larger (100, 1000, 10000), note how the simulation produces counts which get closer to the mathematically expected results.
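The code fragments in these notes use the class's own simulation language. As a rough Python equivalent of one dice-pair experiment (using Python's `random.randint`, which, like `random(low, high)` in these notes, is inclusive on both ends):

```python
import random

trials = 10000
count7 = 0
count2 = 0
for trial in range(trials):
    # one trial = roll a pair of 6-sided dice
    roll = random.randint(1, 6) + random.randint(1, 6)
    if roll == 7:
        count7 += 1
    if roll == 2:
        count2 += 1

# count7 / trials approaches 6/36 = 1/6
# count2 / trials approaches 1/36
```

Try raising `trials` and watching the fractions settle toward 1/6 and 1/36.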

Head Count Example

One trial = flip a coin 10 times, counting the number of heads. Here we use a second, nested loop to do the 10 flips within each trial.

How often do you expect that all 10 flips are heads?
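A Python sketch of the nested-loop structure (again a rough translation of the notes' language; the inner loop does the 10 flips for one trial, and a simple list plays the role of the histogram):

```python
import random

trials = 10000
histogram = [0] * 11   # index = number of heads seen in the 10 flips
all_heads = 0
for trial in range(trials):
    heads = 0
    for flip in range(10):              # inner, nested loop: 10 coin flips
        if random.randint(0, 1) == 1:   # 1 = heads
            heads += 1
    histogram[heads] += 1
    if heads == 10:
        all_heads += 1

# all 10 heads has probability (1/2)**10 = 1/1024,
# so expect roughly 10 such trials out of 10000
```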


Heads In A Row Example

Flip a coin 10 times, but stop on the first tails. How many heads in a row do you get? ("break;" is a command to immediately exit the current loop.)


Mehran Class Bet Exercise

On certain days in Mehran's probability class, Mehran will give a student $50 if they can roll 10 6-sided dice and get a total of 25 or less, or 45 or more. What are the odds of the student winning?


  // your code here
  sum = 0;
  for (j: series(10)) {      // roll 10 dice, totaling them up
    sum = sum + random(1, 6);
  }
  if (sum <= 25 || sum >= 45) {
    histogram.add(1);        // record a win for this trial
  }
  // the odds work out to about 7.8%
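The same trial in Python, run as a full Monte Carlo experiment (a rough equivalent; the win fraction should come out near the 7.8% figure above):

```python
import random

trials = 100000
wins = 0
for trial in range(trials):
    # one trial = roll 10 six-sided dice and total them
    total = sum(random.randint(1, 6) for die in range(10))
    if total <= 25 or total >= 45:
        wins += 1

# wins / trials lands near 0.078, i.e. about 7.8%
```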

Roulette Exercise

Roulette -- the wheel with the little ball that falls into a slot, appearing in many movies (Roulette wikipedia). The slots are numbered 0..36. The simplest bet is to take 19..36 (or 1..18), which pays double if the ball falls in that range. If the ball falls in the 0 slot, then all players lose -- this provides the small house edge which is magnified greatly the more times the player plays. The effect of playing more and more times can be explored with the simulation.

Add code here so one trial is playing roulette with a starting balance of 10, betting 1 each time, and playing 50 rounds (or until out of money). For each trial, record the ending balance in the histogram. (Solution available.)


The code shows the range of outcomes playing roulette 50 times. What happens if the player plays 500 or 5000 times? Side question: if the player plays 10 times, there is an even/odd pattern in their possible balance. Why is this?

    // your code here
    if (bal == 0) {
      break;            // out of money -- stop playing
    }
    bal = bal - 1;      // place the bet
    roll = random(0, 36);
    if (roll >= 19) {   // bet is on 19..36 -- get double back on a win
      bal = bal + 2;
    }

Rickety Bridge Exercise

You are in a party of 6 on one side of a deep ravine. Pirates are making you all walk across a rickety bridge over the ravine, but only one person can go across at a time. For the 1st person to go across, there is a 1/6 chance of bridge collapse. For the 2nd person, a 2/6 chance of collapse, and so on. The 6th person has a 6/6 (100%) chance of collapse if they have to go. If the bridge collapses, nobody else has to go. If you want to avoid being on the bridge when it collapses, which person 1-6 do you want to be?


What is the best person to be? It's harder to see which is the worst person to be, but it can be done with a large number of trials.

    // your code here
    roll = random(1, 6);
    if (roll <= j) { // bridge collapses for person j
      break;
    }
    // i.e. random(1, 6) <= the person's number (1, 2, 3, ...)
    // is the probability of the bridge breaking (1/6, 2/6, 3/6, ...)
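A complete Python version of the bridge trial, counting which person is on the bridge when it collapses (a rough translation of the fragment above; note that since person 6 faces a 6/6 chance, every trial ends in a collapse for someone):

```python
import random

trials = 100000
collapsed_on = [0] * 7   # collapsed_on[k] = trials where it collapsed on person k
for trial in range(trials):
    for person in range(1, 7):
        # random(1..6) <= k happens with probability k/6
        if random.randint(1, 6) <= person:
            collapsed_on[person] += 1
            break        # nobody else has to go
```

Run it with a large `trials` count and compare the counts for persons 2 and 3 closely.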

If you are interested in probability, it is also possible to work out the exact mathematics for each person. Construct a tree of the events. From the start position, there are two branches: (a) the bridge collapses on person 1 (1/6), (b) person 1 makes it across (5/6). Then extending from branch (b) there are two branches for person 2, and so on. For the probability of a particular outcome, work from the start, multiplying together the probabilities of each branch taken. Note that for each a/b pair, the probabilities add to 1 (either a or b happens). Working out just the first couple layers of the tree, you can check your math vs. the results of the simulation.
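The tree multiplication above can be carried out exactly in a few lines of Python with `fractions.Fraction` (a sketch: `survive` accumulates the "made it across" branches, and each person's probability is that times their own "collapse" branch):

```python
from fractions import Fraction

probs = {}                 # probs[k] = exact chance it collapses on person k
survive = Fraction(1)      # chance everyone before person k made it across
for person in range(1, 7):
    probs[person] = survive * Fraction(person, 6)
    survive *= Fraction(6 - person, 6)

# person 1 -> 1/6, person 2 -> (5/6)(2/6) = 5/18,
# person 3 -> (5/6)(4/6)(3/6) = 5/18, and so on;
# all six probabilities add up to 1
```

Notice persons 2 and 3 come out exactly equal, which explains why the simulation has a hard time separating them.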

Monty Hall Problem

This is a very famous probability problem based on the Let's Make a Deal TV show. See Monty Hall Problem (wikipedia) for background. Provided for this problem is a neitherOfThese(a, b) function which, given two numbers from 1, 2, 3, returns the one of 1, 2, 3 that is not equal to either passed-in number. This function is simple, but turns out to be very useful for the Monty Hall problem.

For the standard strategy, the player keeps their original pick, and so wins 1/3 of the time. The code below simulates the standard strategy. The "switch" strategy means the player changes their pick to the door which is not their original pick and is not the door revealed by Monty. Is the switch strategy superior? The reasoning is very unintuitive; however, the simulation can quite clearly show the performance of both strategies. Add code below the standard strategy to implement/print the results of the switch strategy to answer this question.


  // change our pick to the *other* one -- not our original
  // pick and not what Monty revealed.
  pick = neitherOfThese(pick, reveal);
  if (pick == good) {
    win2++;
  }