Expected Value

From Department of Mathematics at UTSA

Let $X$ be a (discrete) random variable with a finite number of finite outcomes $x_1, x_2, \ldots, x_k$ occurring with probabilities $p_1, p_2, \ldots, p_k$, respectively. The expectation of $X$ is defined as

$$\operatorname{E}[X] = \sum_{i=1}^{k} x_i\,p_i = x_1 p_1 + x_2 p_2 + \cdots + x_k p_k.$$

Since all probabilities $p_i$ add up to 1 ($p_1 + p_2 + \cdots + p_k = 1$), the expected value is the weighted sum of the $x_i$ values, with the probabilities $p_i$ as the weights.

If all outcomes $x_i$ are equiprobable (that is, $p_1 = p_2 = \cdots = p_k = 1/k$), then the weighted average turns into the simple average. On the other hand, if the outcomes are not equiprobable, then the simple average must be replaced with the weighted average, which takes into account the fact that some outcomes are more likely than others.
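The weighted-sum definition can be computed directly. In this sketch, the `expectation` helper and the loaded-die probabilities are illustrative choices, not part of the original article:

```python
# Expected value as a weighted sum: E[X] = sum(x_i * p_i).
def expectation(values, probs):
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

values = [1, 2, 3, 4, 5, 6]

# Equiprobable case: the weighted average reduces to the simple average.
uniform = [1 / 6] * 6
print(expectation(values, uniform))  # 3.5, same as sum(values) / 6

# Non-equiprobable case: a hypothetical loaded die that favors 6,
# so the weighted average is pulled above 3.5.
loaded = [0.1, 0.1, 0.1, 0.1, 0.1, 0.5]
print(expectation(values, loaded))   # 4.5
```

Note how the uniform case reproduces the simple average, while the loaded die shows why unequal probabilities require the weights.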

An illustration of the convergence of the sequence of averages of rolls of a die to the expected value of 3.5 as the number of rolls (trials) grows.
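The convergence pictured above can be reproduced with a short simulation; the seed and sample sizes here are arbitrary choices for illustration:

```python
import random

random.seed(1)  # fixed seed so the runs below are reproducible

# Simulate rolls of a fair six-sided die and watch the running
# average approach the expected value 3.5 as the sample grows.
rolls = [random.randint(1, 6) for _ in range(100_000)]
for n in (10, 1_000, 100_000):
    avg = sum(rolls[:n]) / n
    print(f"after {n:>6} rolls: average = {avg:.4f}")
```

The early averages fluctuate widely, but the later ones settle near 3.5, which is exactly the behavior the figure depicts.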

Examples

  • Let $X$ represent the outcome of a roll of a fair six-sided die. More specifically, $X$ will be the number of pips showing on the top face of the die after the toss. The possible values for $X$ are 1, 2, 3, 4, 5, and 6, all of which are equally likely with a probability of 1/6. The expectation of $X$ is

$$\operatorname{E}[X] = 1\cdot\frac{1}{6} + 2\cdot\frac{1}{6} + 3\cdot\frac{1}{6} + 4\cdot\frac{1}{6} + 5\cdot\frac{1}{6} + 6\cdot\frac{1}{6} = 3.5.$$
If one rolls the die $n$ times and computes the average (arithmetic mean) of the results, then as $n$ grows, the average will almost surely converge to the expected value, a fact known as the strong law of large numbers.
  • The roulette game consists of a small ball and a wheel with 38 numbered pockets around the edge. As the wheel is spun, the ball bounces around randomly until it settles down in one of the pockets. Suppose random variable $X$ represents the (monetary) outcome of a $1 bet on a single number ("straight up" bet). If the bet wins (which happens with probability 1/38 in American roulette), the payoff is $35; otherwise the player loses the bet. The expected profit from such a bet will be

$$\operatorname{E}[\text{gain from } \$1 \text{ bet}] = -\$1 \cdot \frac{37}{38} + \$35 \cdot \frac{1}{38} = -\$\frac{1}{19}.$$

That is, the bet of $1 stands to lose $\frac{1}{19}$ of a dollar on average, so its expected value is $-\$\frac{1}{19} \approx -\$0.0526$.
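The roulette calculation can be checked both exactly and by simulation. This is a minimal sketch; the seed and number of trials are arbitrary:

```python
import random

random.seed(0)  # fixed seed for reproducibility

# Exact expected profit of a $1 straight-up bet in American roulette:
# win $35 with probability 1/38, lose the $1 with probability 37/38.
exact = 35 * (1 / 38) - 1 * (37 / 38)
print(f"exact expected profit: ${exact:.4f}")   # about -$0.0526

# Monte Carlo check: average profit over many simulated bets.
trials = 200_000
total = sum(35 if random.randrange(38) == 0 else -1 for _ in range(trials))
print(f"simulated average profit: ${total / trials:.4f}")
```

The simulated average hovers near the exact value of $-\$\frac{1}{19}$, another instance of the law of large numbers at work.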


Resources

Licensing

Content obtained and/or adapted from: