Friday, May 30, 2014

Section 2.5 - Product Expected Value

Given independent random variables \(X\) and \(Y\), what is the expected value of the product \(XY\)?

From the definition of expected value, \[ E(XY) = \sum_x\sum_y xy\,p(X = x, Y = y)\] Because \(X\) and \(Y\) are independent, the joint probability factors into the product of the marginal probabilities \[p(X = x, Y = y) = p_X(X = x)\,p_Y(Y = y)\] and the expected value becomes \[ E(XY) = \sum_x\sum_y xy\,p_X(X = x)\,p_Y(Y = y)\] where \(p_X\) is the probability distribution for \(X\) and similarly for \(p_Y\).
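To make the double sum concrete, here is a small Python sketch that evaluates it directly; the PMFs p_X and p_Y below are made-up toy distributions, not taken from the text:

# Brute-force E(XY): loop over every (x, y) pair, so the work is
# proportional to the product of the two domain sizes.
p_X = {1: 0.2, 2: 0.5, 3: 0.3}   # illustrative p_X(X = x)
p_Y = {1: 0.4, 4: 0.6}           # illustrative p_Y(Y = y)

E_XY = sum(x * y * px * py
           for x, px in p_X.items()
           for y, py in p_Y.items())
print(E_XY)   # 5.88 for these toy PMFs (up to floating-point rounding)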

Evaluating the expected value as given takes work proportional to \(n^2\). Can the sum be evaluated with less work? Taking a hint from the sum’s double loop structure, arrange the summands in a matrix so that the \((i, j)\)th element is \(x_iy_j\,p_X(X = x_i)\,p_Y(Y = y_j)\):
\[\begin{array}{cccc} x_{1}y_{1}p_X(1)p_Y(1) & x_{1}y_{2}p_X(1)p_Y(2) & \cdots & x_{1}y_{c}p_X(1)p_Y(c) \\ x_{2}y_{1}p_X(2)p_Y(1) & x_{2}y_{2}p_X(2)p_Y(2) & \cdots & x_{2}y_{c}p_X(2)p_Y(c) \\ \vdots & \vdots & \ddots & \vdots \\ x_{r}y_{1}p_X(r)p_Y(1) & x_{r}y_{2}p_X(r)p_Y(2) & \cdots & x_{r}y_{c}p_X(r)p_Y(c) \end{array}\] where \(p_X(i)\) and \(p_Y(j)\) are space-saving abbreviations of \(p_X(X = x_i)\) and \(p_Y(Y = y_j)\) respectively, and \(r\) and \(c\) are the sizes of the domains (outcomes) of \(X\) and \(Y\) respectively. The \(i\)th row of the matrix is \[\begin{array}{cccc} x_{i}y_{1}p_X(i)p_Y(1) & x_{i}y_{2}p_X(i)p_Y(2) & \cdots & x_{i}y_{c}p_X(i)p_Y(c) \end{array}\] The term \(x_ip_X(i)\) is constant across the row and can be factored out: \[x_ip_X(i)\left[\begin{array}{cccc} y_1p_Y(1) & y_2p_Y(2) & \cdots & y_cp_Y(c) \end{array}\right]\] Summing the factored row vector gives the expected value of \(Y\) scaled by the constant: \(x_ip_X(i)E(Y)\). A similar set of steps applied to the \(j\)th column vector factors out \(y_jp_Y(j)\) and produces \(y_jp_Y(j)E(X)\). These results can be appended to the matrix as an extra column and an extra row: \[\begin{array}{ccccc} x_{1}y_{1}p_X(1)p_Y(1) & x_{1}y_{2}p_X(1)p_Y(2) & \cdots & x_{1}y_{c}p_X(1)p_Y(c) & x_1p_X(1)E(Y) \\ x_{2}y_{1}p_X(2)p_Y(1) & x_{2}y_{2}p_X(2)p_Y(2) & \cdots & x_{2}y_{c}p_X(2)p_Y(c) & x_2p_X(2)E(Y) \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ x_{r}y_{1}p_X(r)p_Y(1) & x_{r}y_{2}p_X(r)p_Y(2) & \cdots & x_{r}y_{c}p_X(r)p_Y(c) & x_rp_X(r)E(Y) \\ y_1p_Y(1)E(X) & y_2p_Y(2)E(X) & \cdots & y_cp_Y(c)E(X) & \end{array}\] Row \(r + 1\) can be run through the same manipulation with the constant \(E(X)\) factored out to produce \(E(X)E(Y)\). Mercifully, doing the same to column \(c + 1\) with the constant \(E(Y)\) also produces \(E(X)E(Y)\).
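Continuing the Python sketch from above (same toy PMFs), the factored form needs only one pass over each distribution: compute \(E(Y)\) once, then sum \(x_ip_X(i)E(Y)\) over the rows:

# Factored evaluation: one pass over Y to get E(Y), then one pass over X.
E_Y = sum(y * py for y, py in p_Y.items())           # E(Y)
E_XY = sum(x * px * E_Y for x, px in p_X.items())    # sum_i x_i p_X(i) E(Y) = E(X) E(Y)
print(E_XY)   # matches the brute-force double sum above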

Evaluating the expected value of the product of two independent random variables therefore requires work proportional only to \(n\) (one pass over each distribution), and, in particular, the expected value of the product of two independent random variables is the product of the expected values of the individual variables: \(E(XY) = E(X)E(Y)\).
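For example, if \(X\) and \(Y\) are the faces showing on two fair dice rolled independently, then \(E(X) = E(Y) = \frac{7}{2}\), so \(E(XY) = \frac{7}{2}\cdot\frac{7}{2} = \frac{49}{4}\) without having to sum the 36 terms of the double sum.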