Given independent random variables X and Y, what is the expected value of the product XY?
From the definition of expected value,

$$E(XY) = \sum_x \sum_y x y \, p(XY = xy)$$

Because X and Y are independent, the probability of their product is the product of their probabilities,

$$p(XY = xy) = p_X(X = x) \, p_Y(Y = y)$$

and the expected value becomes

$$E(XY) = \sum_x \sum_y x y \, p_X(X = x) \, p_Y(Y = y)$$

where $p_X$ is the probability distribution for X, and similarly for $p_Y$.
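The double sum can be computed directly. A minimal sketch, assuming two small hypothetical discrete distributions (the outcome and probability values are made up for illustration):

```python
import numpy as np

# Hypothetical example distributions: outcomes and their probabilities.
x_vals = np.array([1.0, 2.0, 3.0])
p_x = np.array([0.2, 0.5, 0.3])      # p_X(X = x_i)
y_vals = np.array([4.0, 5.0])
p_y = np.array([0.6, 0.4])           # p_Y(Y = y_j)

# E(XY) by the double sum over all (x, y) outcome pairs.
e_xy = sum(x * y * px * py
           for x, px in zip(x_vals, p_x)
           for y, py in zip(y_vals, p_y))
print(e_xy)
```

The nested generators mirror the double sum: one term per (x, y) pair, each weighted by the product of the individual probabilities.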
Evaluating the expected value as given takes work proportional to $n^2$. Can the sum be evaluated with less work? Taking a hint from the sum's double-loop structure, arrange the summands in a matrix so that the $(i,j)$th element is $x_i y_j \, p_X(X = x_i) \, p_Y(Y = y_j)$:
$$\begin{bmatrix}
x_1 y_1 p_X(1) p_Y(1) & x_1 y_2 p_X(1) p_Y(2) & \cdots & x_1 y_c p_X(1) p_Y(c) \\
x_2 y_1 p_X(2) p_Y(1) & x_2 y_2 p_X(2) p_Y(2) & \cdots & x_2 y_c p_X(2) p_Y(c) \\
\vdots & \vdots & \ddots & \vdots \\
x_r y_1 p_X(r) p_Y(1) & x_r y_2 p_X(r) p_Y(2) & \cdots & x_r y_c p_X(r) p_Y(c)
\end{bmatrix}$$
where $p_X(i)$ and $p_Y(j)$ are space-saving shorthands for $p_X(X = x_i)$ and $p_Y(Y = y_j)$ respectively, and $r$ and $c$ are the sizes of the domains (sets of outcomes) of X and Y respectively. The $i$th row of the matrix is
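Because every element is a product of an X-only factor and a Y-only factor, the whole matrix is an outer product. A sketch using the same hypothetical distributions as above (example values, not from the text):

```python
import numpy as np

# Hypothetical example distributions.
x_vals = np.array([1.0, 2.0, 3.0]); p_x = np.array([0.2, 0.5, 0.3])
y_vals = np.array([4.0, 5.0]);      p_y = np.array([0.6, 0.4])

# (i, j)th element is x_i * y_j * p_X(i) * p_Y(j), so the matrix is the
# outer product of the vectors x_i p_X(i) and y_j p_Y(j).
M = np.outer(x_vals * p_x, y_vals * p_y)

# Summing every element of the matrix evaluates the double sum for E(XY).
total = M.sum()
print(total)
```

Summing all $r \times c$ elements of `M` is exactly the double-loop evaluation of $E(XY)$.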
$$\begin{bmatrix}
x_i y_1 p_X(i) p_Y(1) & x_i y_2 p_X(i) p_Y(2) & \cdots & x_i y_c p_X(i) p_Y(c)
\end{bmatrix}$$
The term $x_i p_X(i)$ is constant across the row and can be factored out:
$$x_i p_X(i) \begin{bmatrix}
y_1 p_Y(1) & y_2 p_Y(2) & \cdots & y_c p_Y(c)
\end{bmatrix}$$
Summing the factored row vector gives the expected value of Y times the factored constant: $x_i p_X(i) E(Y)$. A similar set of steps can be applied to the column vectors of the matrix. These row and column sums can be appended to the matrix:
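The row-sum claim can be checked numerically. A sketch, again assuming the hypothetical example distributions used earlier:

```python
import numpy as np

# Hypothetical example distributions.
x_vals = np.array([1.0, 2.0, 3.0]); p_x = np.array([0.2, 0.5, 0.3])
y_vals = np.array([4.0, 5.0]);      p_y = np.array([0.6, 0.4])

# Matrix whose (i, j)th element is x_i * y_j * p_X(i) * p_Y(j).
M = np.outer(x_vals * p_x, y_vals * p_y)

e_y = (y_vals * p_y).sum()    # E(Y)
row_sums = M.sum(axis=1)      # sum each row of the matrix

# Each row sum equals x_i * p_X(i) * E(Y), as the factoring argument claims.
print(np.allclose(row_sums, x_vals * p_x * e_y))  # True
```

The same check with `M.sum(axis=0)` and $E(X)$ confirms the column version of the argument.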
$$\begin{bmatrix}
x_1 y_1 p_X(1) p_Y(1) & x_1 y_2 p_X(1) p_Y(2) & \cdots & x_1 y_c p_X(1) p_Y(c) & x_1 p_X(1) E(Y) \\
x_2 y_1 p_X(2) p_Y(1) & x_2 y_2 p_X(2) p_Y(2) & \cdots & x_2 y_c p_X(2) p_Y(c) & x_2 p_X(2) E(Y) \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
x_r y_1 p_X(r) p_Y(1) & x_r y_2 p_X(r) p_Y(2) & \cdots & x_r y_c p_X(r) p_Y(c) & x_r p_X(r) E(Y) \\
y_1 p_Y(1) E(X) & y_2 p_Y(2) E(X) & \cdots & y_c p_Y(c) E(X) &
\end{bmatrix}$$
Row vector $r+1$ can be run through the same manipulations, this time with the constant $E(X)$ factored out, to produce $E(X)E(Y)$. Mercifully, doing the same to column vector $c+1$ also produces $E(X)E(Y)$.
Evaluating the expected value of the product of two independent random variables therefore requires work proportional to $n$, and, in particular, the expected value of the product of two independent random variables is the product of the expected value of each random variable:

$$E(XY) = E(X)E(Y)$$
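The $O(n^2)$ double sum and the $O(n)$ product of expectations can be compared directly. A sketch on larger, randomly generated hypothetical distributions (the distributions are made up for the check):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random discrete distributions for X and Y.
n = 100
x_vals, y_vals = rng.normal(size=n), rng.normal(size=n)
p_x = rng.random(n); p_x /= p_x.sum()    # normalize to a valid distribution
p_y = rng.random(n); p_y /= p_y.sum()

# O(n^2): the double sum over all outcome pairs.
e_xy = np.outer(x_vals * p_x, y_vals * p_y).sum()

# O(n): the product of the two expected values.
e_x_times_e_y = (x_vals * p_x).sum() * (y_vals * p_y).sum()

print(np.isclose(e_xy, e_x_times_e_y))  # True
```

Both routes give the same number; the second does it with two length-$n$ passes instead of $n^2$ multiplications.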