
Monday, June 2, 2014

Section 2.5 - Sum Expected Values

What is the expected value of the sum $X+Y$, where $X$ and $Y$ are (not necessarily independent) random variables?

From the definition of expected value, $$E(X+Y)=\sum_x\sum_y (x+y)\,p(X=x \wedge Y=y).$$ Evaluating this sum as given requires work proportional to $n^2$; can it be evaluated more efficiently? Because $X$ and $Y$ aren't necessarily independent, the probability $p(X=x \wedge Y=y)$ can't be factored the way it was when computing the expected value of products of random variables. Are there other tricks that can simplify the evaluation?
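As a quick sketch of that direct double sum (the joint pmf values here are made up for illustration, with $X$ and $Y$ deliberately dependent):

```python
# Direct O(n^2) evaluation of E(X+Y) from a joint pmf.
# Hypothetical example: X, Y in {0, 1}, dependent (they tend to agree).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# E(X+Y) = sum over all (x, y) pairs of (x + y) * p(X=x and Y=y)
e_sum = sum((x + y) * p for (x, y), p in joint.items())
print(e_sum)  # 1.0
```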

Distribute $p$ through the sum $(x+y)$ and split the summation to get $$E(X+Y)=\sum_x\sum_y x\,p(X=x \wedge Y=y) + \sum_x\sum_y y\,p(X=x \wedge Y=y).$$ Taking the left summation, $x$ is constant in the innermost sum and can be factored out: $$\sum_x x \sum_y p(X=x \wedge Y=y).$$ The inner sum runs over all possible values of $Y$ with $X$ held at $x$, so it collapses to the marginal probability that $X$ equals $x$, written $p_X(x)$, leaving $$\sum_x x\,p_X(x),$$ which is $E(X)$.
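The collapse of the inner sum into the marginal $p_X(x)$ can be sketched numerically (reusing the same hypothetical joint pmf as an example):

```python
from collections import defaultdict

# Hypothetical dependent joint pmf for X, Y in {0, 1}.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# The inner sum over y collapses the joint pmf to the marginal p_X(x)...
p_x = defaultdict(float)
for (x, y), p in joint.items():
    p_x[x] += p

# ...leaving the single sum E(X) = sum_x x * p_X(x).
e_x = sum(x * p for x, p in p_x.items())
print(e_x)  # 0.5
```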

A similar trick applies to the right summation after the order of the two summations is swapped, and $$E(X+Y)=E(X)+E(Y).$$ A little more algebra shows the expected value is a linear operator: $$E(aX+bY)=aE(X)+bE(Y).$$
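Linearity can be checked directly against the $O(n^2)$ double sum, even with dependent variables (again using the made-up joint pmf from above and arbitrary example coefficients):

```python
# Verify E(aX + bY) = a*E(X) + b*E(Y) on a dependent joint pmf.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}
a, b = 3.0, -2.0  # arbitrary example coefficients

# Left side: the full double sum over the joint distribution.
lhs = sum((a * x + b * y) * p for (x, y), p in joint.items())

# Right side: a*E(X) + b*E(Y) computed from the same joint pmf.
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
rhs = a * e_x + b * e_y

print(lhs, rhs)  # both 0.5
```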

Because random variables are closed under addition and multiplication, and addition and multiplication are associative, the sum and product results generalize to any number of terms: $$E\Big(\sum_i X_i\Big)=\sum_i E(X_i) \qquad E\Big(\prod_i X_i\Big)=\prod_i E(X_i)$$ (the product identity, unlike the sum, still requires the $X_i$ to be independent).
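A Monte Carlo sketch makes the point that the sum identity holds even under extreme dependence (here the three variables are literal copies of one die roll, a contrived example):

```python
import random

random.seed(0)

# X1, X2, X3 are NOT independent: X2 and X3 are copies of X1.
# Linearity of expectation still gives E(X1+X2+X3) = 3 * E(die) = 10.5.
n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    x1 = random.randint(1, 6)  # one fair die roll
    x2 = x1                    # fully dependent copy
    x3 = x1                    # another copy
    total += x1 + x2 + x3

mean = total / n_trials
print(mean)  # close to 10.5
```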