Monday, June 2, 2014

Section 2.5 - Expected Values of Sums

What is the expected value of the sum \(X + Y\), where \(X\) and \(Y\) are (not necessarily independent) random variables?

From the definition of expected value, \[ E(X + Y) = \sum_x\sum_y (x + y)\,p(X = x \wedge Y = y) \] Evaluating this sum as given requires work proportional to \(n^2\) when \(X\) and \(Y\) each take on \(n\) values; can it be evaluated more efficiently? Because \(X\) and \(Y\) aren’t necessarily independent, the joint probability \(p(X = x \wedge Y = y)\) can’t be factored the way it was when computing the expected value of a product of independent random variables. Are there other tricks that can simplify the evaluation?
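To make the quadratic cost concrete, here’s a minimal sketch in Python that evaluates the double sum directly from the definition. The joint distribution is a small hypothetical example (a dictionary mapping \((x, y)\) pairs to probabilities), chosen so that \(X\) and \(Y\) are deliberately dependent:

```python
# A minimal sketch: E(X + Y) straight from the definition.
# The joint pmf below is a hypothetical example, stored as
# {(x, y): probability}; it does not factor, so X and Y are dependent.
joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}

# The double sum from the definition: one term per (x, y) pair,
# so n values for each variable cost n^2 terms of work.
e_sum = sum((x + y) * p for (x, y), p in joint.items())
print(e_sum)  # 0.1 + 0.1 + 2 * 0.5 = 1.2
```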

Multiply \(p\) through the sum \((x + y)\) and separate the summations to get \[ E(X + Y) = \sum_x\sum_y x\,p(X = x \wedge Y = y) + \sum_x\sum_y y\,p(X = x \wedge Y = y) \] In the left summation, \(x\) is constant with respect to the innermost summation and can be brought out: \[ \sum_x x \sum_y p(X = x \wedge Y = y) \] For a fixed \(x\), each joint probability factors as \(p(X = x)\,p(Y = y \mid X = x)\), and the conditional probabilities sum to one over all possible values of \(Y\); the inner sum therefore collapses to the marginal probability that \(X\) equals \(x\), \(p_X(x)\): \[ \sum_x x\,p_X(x) \] which is exactly \(E(X)\).
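The marginalization step is easy to check numerically. This sketch, reusing the same hypothetical joint table as above, sums the joint probabilities over \(y\) to recover \(p_X(x)\) and confirms that \(\sum_x x\,p_X(x)\) agrees with the left double sum evaluated term by term:

```python
# Summing the joint pmf over all y values collapses the inner sum
# to the marginal p_X(x), so the double sum reduces to E(X).
from collections import defaultdict

joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}

# Marginalize Y out: p_X(x) = sum over y of p(X = x, Y = y).
p_x = defaultdict(float)
for (x, y), p in joint.items():
    p_x[x] += p

e_x = sum(x * p for x, p in p_x.items())          # E(X) via the marginal
left = sum(x * p for (x, y), p in joint.items())  # the left double sum
print(e_x, left)  # both 0.6
```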

A similar trick applies to the right summation after the order of the two summations is swapped, so \[ E(X + Y) = E(X) + E(Y) \] A little more algebra (constants pull straight through the sums, so \(E(aX) = aE(X)\)) shows that expectation is a linear operator: \[ E(aX + bY) = aE(X) + bE(Y) \]
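Nothing in the derivation used independence, and a numerical check bears that out. This sketch evaluates \(E(aX + bY)\) directly against \(aE(X) + bE(Y)\) using the same deliberately dependent joint table as before (the values of \(a\) and \(b\) are arbitrary):

```python
# Linearity holds even though X and Y are dependent: compare
# E(aX + bY) against a*E(X) + b*E(Y) over the hypothetical joint pmf.
joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}
a, b = 3.0, -2.0

e_ax_by = sum((a * x + b * y) * p for (x, y), p in joint.items())
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())

print(e_ax_by, a * e_x + b * e_y)  # both 0.6
```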

Because random variables are closed under addition and multiplication, and both operations are associative, the sum and product results generalize to any number of terms: \[\begin{array}{rcl} E(\sum_i X_i) &=& \sum_i E(X_i) \\ E(\prod_i X_i) &=& \prod_i E(X_i) \end{array}\] The sum rule holds unconditionally, but the product rule inherits the requirement from the two-variable case that the \(X_i\) be mutually independent.
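As a sanity check on both generalizations, here’s a sketch using three independent fair dice (an assumed example, not from the text). It enumerates all \(6^3\) equally likely outcomes and confirms the sum rule and, because the dice are independent, the product rule as well:

```python
# Both generalizations checked exhaustively for three independent
# fair six-sided dice, each face with probability 1/6.
from itertools import product
from math import prod

faces = range(1, 7)
e_face = sum(faces) / 6.0  # E(X_i) = 3.5 for each die

e_sum = e_prod = 0.0
for roll in product(faces, repeat=3):  # all 6**3 equally likely outcomes
    p = (1 / 6.0) ** 3
    e_sum += sum(roll) * p
    e_prod += prod(roll) * p

print(e_sum, 3 * e_face)    # both 10.5
print(e_prod, e_face ** 3)  # both 42.875; independence is required here
```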