AP Statistics Lectures

by Arnold Kling

The Expectation Operator

When you spin a dreidel, one of four things happens: you put a toothpick into the center (S), you do nothing (N), you win half the toothpicks in the center (H), or you win all the toothpicks in the center (G). For example, if you spin a dreidel with eight toothpicks in the center, the four possible outcomes are {-1, 0, 4, 8}.

If you spin a dreidel and there are eight toothpicks in the center, on average how many toothpicks will you win? Adding up the values of the four outcomes and dividing by four gives 11/4. This is the expected value of the random variable derived from spinning the dreidel with eight toothpicks in the center.
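Since the four outcomes are equally likely, this average can be checked with a short Python sketch (the outcome list is taken from the eight-toothpick example above):

```python
# Four equally likely outcomes when spinning a dreidel
# with eight toothpicks in the center: lose 1, win 0, win 4, win 8.
outcomes = [-1, 0, 4, 8]
expected_value = sum(outcomes) / len(outcomes)
print(expected_value)  # 2.75, i.e. 11/4
```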

What is the expected value if there are four toothpicks in the center?

If not all outcomes are equally likely, then the expected value must take this into account. For example, imagine a six-sided dreidel where the two additional sides both say S, meaning you lose one toothpick. Then the probability of S is 3/6 = 1/2, while the probability of N, G, or H is 1/6 each. With eight toothpicks in the center, the expected value of the random variable is:

E(X) = (1/2)(-1) + (1/6)(0) + (1/6)(4) + (1/6)(8) = 3/2

The expected value of the random variable derived from spinning the dreidel goes down if we have a dreidel that is "weighted" to come up S more often. This makes sense.
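The same weighted-average calculation can be sketched in Python (the probabilities and payoffs are those of the hypothetical six-sided dreidel above):

```python
# Weighted expected value for the six-sided dreidel:
# S (lose 1) with probability 1/2; N, H, G with probability 1/6 each.
probs  = [1/2, 1/6, 1/6, 1/6]
values = [-1, 0, 4, 8]
ev = sum(p * v for p, v in zip(probs, values))
print(ev)  # approximately 1.5 (the exact value is 3/2)
```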

In general, the expected value of a random variable, written as E(X), is equal to the weighted average of the outcomes of the random variable, where the weights are based on the probabilities of those outcomes. We can talk about E(X), E(X^{2}), and so forth.

If a is a constant, we can talk about E(X+a), E(X-a), E(aX), and so forth. If Y is another random variable, we can talk about E(X+Y), E(XY), etc.

On an old AP exam, there was a problem where you needed to find the expected cost of repairs for a computer the first year after you buy it. If the probability of needing no repairs is .7, the probability of needing a $100 repair is .2, and the probability of needing a $300 repair is .1, what is the expected cost of repair? What would be a fair price to pay for a warranty that offered free repairs in the first year?
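One way to answer is to plug the three outcomes into the same kind of weighted average; a minimal sketch:

```python
# Expected first-year repair cost from the old AP exam problem:
# no repair (prob 0.7), $100 repair (prob 0.2), $300 repair (prob 0.1).
probs = [0.7, 0.2, 0.1]
costs = [0, 100, 300]
expected_cost = sum(p * c for p, c in zip(probs, costs))
print(expected_cost)  # expected cost in dollars
```

The expected cost is $50, which is also a natural starting point for a fair warranty price, ignoring overhead and risk aversion.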

Suppose that I offer to make a bet with you. I will give you $30. Then, I will keep flipping a coin until it comes up tails. If n is the number of heads that I get before it comes up tails, you pay me $2^{n}. The payoff is a random variable. Call it X. The distribution of X is:

| number of heads (n) | payoff ($2^{n}) | probability |
|---|---|---|
| 0 | $1 | 1/2 |
| 1 | $2 | 1/4 |
| 2 | $4 | 1/8 |
| 3 | $8 | 1/16 |
| ... | ... | ... |
| n | $2^{n} | 2^{-(n+1)} |

As long as I get fewer than five heads in a row, you have to pay me less than the $30 I pay you. Does this look like an attractive game to you?

Most people would say, "Yes." However, what happens when you take the expected value of what you will have to pay?

E(X) = (1/2)($1) + (1/4)($2) + (1/8)($4) + ... = $1/2 + $1/2 + $1/2 + ...

The expected value of what you will have to pay is infinity! You really should think twice before agreeing to play this game. This example is known as the St. Petersburg Paradox.
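Each term of the series is exactly $1/2, so the partial sums grow without bound; a quick sketch:

```python
# Partial sums of the St. Petersburg expectation.
# The n-th outcome pays $2^n and occurs with probability 2^-(n+1),
# so every term contributes exactly 1/2.
def partial_expectation(num_terms):
    return sum((2 ** n) * (0.5 ** (n + 1)) for n in range(num_terms))

for t in (10, 100, 1000):
    print(t, partial_expectation(t))  # grows like t / 2
```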

Moments

E(X) is called the mean of X. Often, it is written as X̄, pronounced "X bar." The mean is also called the first moment of X. We define other moments as:

second moment: E(X - X̄)^{2}

third moment: E(X - X̄)^{3}

nth moment: E(X - X̄)^{n}

The second moment, also called the variance of X, is a measure of the spread of the distribution of X. Synonyms for spread include variability, volatility, and uncertainty.

The third moment is a measure of skewness or asymmetry in the distribution of X. For example, suppose that we hold a raffle where we sell 100 tickets for $10 each, and we give a $500 prize to the winner. The average prize is $5. However, 99 people will get less than the average, and one person will get way more than the average. That asymmetry is called skewness.
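The raffle's skewness can be seen numerically by computing the second and third moments directly (the prize and ticket numbers are from the paragraph above):

```python
# Raffle payoff: one $500 prize among 100 tickets, so a ticket
# pays $500 with probability 0.01 and $0 with probability 0.99.
probs  = [0.99, 0.01]
values = [0, 500]
mean   = sum(p * v for p, v in zip(probs, values))
second = sum(p * (v - mean) ** 2 for p, v in zip(probs, values))
third  = sum(p * (v - mean) ** 3 for p, v in zip(probs, values))
print(mean, second, third)  # the positive third moment reflects the skew
```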

Rules for the Expectation Operator

The expectation operator, E(X), takes the weighted sum of a random variable. In the case where there are two outcomes {x_{1},x_{2}}, E(X) takes p_{1}x_{1} + p_{2}x_{2}, where p_{1} and p_{2} are the respective probabilities of the two outcomes.

Here are some important rules for manipulating equations that involve E(X). In the following, assume that a and b are constants, and Y is another random variable.

E(a) = a

E(a + bX) = a + bE(X)

E(X + Y) = E(X) + E(Y)

E(bX)^{2} = E(b^{2}X^{2}) = b^{2}E(X^{2})

Using the example with two outcomes, verify these rules. Then, derive a rule for E(a+bX)^{2}.
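Here is one way to carry out that verification numerically, using a hypothetical two-outcome variable (the probabilities, values, and constants below are arbitrary choices for illustration, not from the text):

```python
# Two-outcome random variable X: value 1 with prob 0.3, value 4 with prob 0.7.
probs = [0.3, 0.7]
xs = [1, 4]
a, b = 2, 3

def E(values):
    """Weighted sum of outcome values under the fixed probabilities."""
    return sum(p * v for p, v in zip(probs, values))

EX = E(xs)
assert abs(E([a, a]) - a) < 1e-9                                 # E(a) = a
assert abs(E([a + b * x for x in xs]) - (a + b * EX)) < 1e-9     # E(a+bX)
assert abs(E([(b * x) ** 2 for x in xs])
           - b ** 2 * E([x ** 2 for x in xs])) < 1e-9            # E(bX)^2
print("rules verified")
```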

What is nice is that the expectation operator, E(), is consistent with all of the usual rules of algebra. In particular,

E(X+Y)^{2} = E(X^{2} + Y^{2} + 2XY)
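The identity can be checked numerically for two hypothetical independent random variables (the distributions below are made up for illustration):

```python
import itertools

# Independent X and Y with small discrete distributions (illustrative values).
px = {0: 0.5, 2: 0.5}
py = {1: 0.25, 3: 0.75}

# Left side: E[(X+Y)^2] over the joint distribution.
lhs = sum(px[x] * py[y] * (x + y) ** 2
          for x, y in itertools.product(px, py))

# Right side: E[X^2] + E[Y^2] + 2 E[XY].
EX2 = sum(p * x ** 2 for x, p in px.items())
EY2 = sum(p * y ** 2 for y, p in py.items())
EXY = sum(px[x] * py[y] * x * y for x, y in itertools.product(px, py))

print(lhs, EX2 + EY2 + 2 * EXY)  # both equal 14.0
```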