Monday, June 2, 2014

Section 2.5 - Sum Expected Values

What is the expected value of the sum \(X + Y\) given \(X\) and \(Y\) are (not necessarily independent) random variables?

From the definition of expected value, \[ E(X + Y) = \sum_x\sum_y (x + y)\,p(X = x \wedge Y = y) \] Evaluating this sum as given requires work proportional to \(n^2\); can it be evaluated more efficiently? Because \(X\) and \(Y\) aren’t necessarily independent, the probability \(p(X = x \wedge Y = y)\) can’t be factored the way it was when computing the expected value of random-variable products. Are there other tricks that can simplify evaluation?
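
One trick that does work: distribute \((x + y)\) across the joint probability and marginalize. This needs no independence assumption and brings the work down to something proportional to \(n\):
\[ \begin{eqnarray}
E(X + Y) & = & \sum_x x\sum_y p(X = x \wedge Y = y) + \sum_y y\sum_x p(X = x \wedge Y = y) \\
& = & \sum_x x\,p(X = x) + \sum_y y\,p(Y = y) \\
& = & E(X) + E(Y)
\end{eqnarray} \]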

Friday, May 30, 2014

Section 2.5 - Product Expected Value

Given independent random variables \(X\) and \(Y\), what is the expected value of the product \(XY\)?

From the definition of expected value, \[ E(XY) = \sum_x\sum_y xy\,p(X = x \wedge Y = y)\] Because \(X\) and \(Y\) are independent, the joint probability is the product of the individual probabilities \[p(X = x \wedge Y = y) = p_X(X = x)\,p_Y(Y = y)\] and the expected value becomes \[ E(XY) = \sum_x\sum_y xy\,p_X(X = x)\,p_Y(Y = y)\] where \(p_X\) is the probability distribution for \(X\) and similarly for \(p_Y\).

Evaluating the expected value as given takes work proportional to \(n^2\). Can the sum be evaluated with less work?
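
It can: because the factored summand separates into a function of \(x\) times a function of \(y\), the double sum splits into two single sums, each proportional to \(n\): \[ E(XY) = \Big(\sum_x x\,p_X(X = x)\Big)\Big(\sum_y y\,p_Y(Y = y)\Big) = E(X)\,E(Y) \]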

Sunday, June 19, 2011

Section 2.5 - Random Variables, Mean and the Expected Value

A random variable maps outcomes from a sample space to numbers, making the outcomes easier to manipulate. For example, there isn’t much to be done with outcomes from the coin-flip sample space { H, T }, but mapping outcomes to integers with the random variable X = { (H, 0), (T, 1) } allows mathematical manipulation. The mapping used depends on what’s being done: the previous definition of X might be useful when counting the number of tails, while the definition X = { (H, 1), (T, -1) } is better suited to keeping track of wins and losses.
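
As a concrete sketch, each mapping is just a lookup table (the flip sequence below is made up for illustration):

```python
# Two random variables over the coin-flip sample space { H, T },
# each a plain mapping from outcomes to numbers.
X_tails = {"H": 0, "T": 1}   # counts tails when summed over flips
X_wins = {"H": 1, "T": -1}   # tracks wins and losses when summed

flips = ["H", "T", "T", "H", "T"]
print(sum(X_tails[f] for f in flips))  # 3 tails
print(sum(X_wins[f] for f in flips))   # net score -1
```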

Given a random variable X over a sample space, the probability that X equals a particular value k is the sum of the probabilities of the outcomes that X maps to k: \[Pr(X = k) = \sum_{X(o) = k} Pr(o) \] For example, let the random variable X map the outcome i of a single die roll to i. The probability that X is even is \[ Pr(even(X)) = Pr(2) + Pr(4) + Pr(6) = 1/6 + 1/6 + 1/6 = 1/2\]

Computing the mean (or average, or expected value) of a random variable is a common mathematical manipulation. A random variable’s mean is the sum of the values in the range, with each value weighted by its probability: \[ \sum_{k\in range(X)} kPr(X = k) \] The mean is often represented by \(\mu\). To continue the die-roll example, X's mean is
\[ \begin{eqnarray}
& & 1Pr(X = 1) + 2Pr(X = 2) + \cdots + 6Pr(X = 6) \\
&=& 1\cdot1/6 + 2\cdot1/6 + \cdots + 6\cdot1/6 \\
&=& (1 + 2 + \cdots + 6)1/6 \\
&=& 21\cdot1/6 \\
&=& 3.5
\end{eqnarray} \]

A random variable’s mean is not necessarily a member of its range.
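
The same computation, sketched in Python with exact rationals:

```python
# Mean of the die-roll random variable X(i) = i, straight from the
# definition: sum over the range of X of k * Pr(X = k).
from fractions import Fraction

pr = {k: Fraction(1, 6) for k in range(1, 7)}  # fair die
mu = sum(k * p for k, p in pr.items())
print(mu)  # 7/2 = 3.5, which is not in the range of X
```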

Friday, June 17, 2011

Exercise 2.2-16

How many distinct three-letter combinations (“words”) can you make from the letters of Mississippi?
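
A brute-force check, treating a “word” as an ordered choice of three tiles from the eleven letters:

```python
# Distinct three-letter words from the letters of MISSISSIPPI
# (1 M, 4 I's, 4 S's, 2 P's), counted by brute force: generate every
# ordered choice of three tiles and deduplicate with a set.
from itertools import permutations

words = set(permutations("MISSISSIPPI", 3))
print(len(words))  # 53
```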

Tuesday, May 31, 2011

Exercise 2.2-3

What is the probability of cutting two cards of the same rank from a well-shuffled deck?
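
Taking “cutting two cards” to mean drawing two cards at random without replacement, which is an assumption about the procedure, exact enumeration over all two-card hands gives \(3/51 = 1/17\):

```python
# Probability that two cards drawn from a 52-card deck share a rank,
# by exact enumeration over all C(52, 2) two-card hands.
from itertools import combinations
from fractions import Fraction

deck = [(rank, suit) for rank in range(13) for suit in range(4)]
hands = list(combinations(deck, 2))
same_rank = sum(1 for a, b in hands if a[0] == b[0])
print(Fraction(same_rank, len(hands)))  # 1/17
```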

Monday, May 30, 2011

Section 2.4 - The Binomial Distribution—Bernoulli Trials

A Bernoulli trial produces an outcome which satisfies a criterion with probability \(p\), and so fails to satisfy the criterion with complement probability \(q = 1 - p\). What is the probability that \(n\) successive, independent Bernoulli trials result in exactly \(k\) satisfactory outcomes?

The compound outcome is an arrangement of \(k\) satisfactory outcomes and \(n - k\) unsatisfactory outcomes. Because the Bernoulli-trial outcomes are independent, the outcome probabilities multiply to give the compound-outcome probability, and because multiplication is commutative, any individual compound outcome has probability \(p^kq^{n-k}\). There are \(C(n, k)\) such compound outcomes, so the probability that \(n\) successive, independent Bernoulli trials result in exactly \(k\) satisfactory outcomes, \(0 \leq k \leq n\), when a satisfactory outcome has probability \(p\) is \[ b(k; n, p) = C(n, k)p^kq^{n - k}\] The compound-outcome sample space is not uniform. Even if \(p = 1/2\), \(C(n, k)\) varies with \(k\); the variation becomes more complex when \(p\neq q\).
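
A short sketch of \(b(k; n, p)\) in Python, with \(n = 10\) and \(p = 0.3\) picked arbitrarily; it also makes the non-uniformity visible:

```python
# b(k; n, p) = C(n, k) * p^k * q^(n-k), the binomial distribution.
from math import comb

def b(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
dist = [b(k, n, p) for k in range(n + 1)]
print(max(dist) / min(dist))  # far from 1: the space is not uniform
print(sum(dist))              # 1.0, up to floating-point error
```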

Sunday, May 29, 2011

Section 2.3 - Combinations

Given \(n\) distinct items, there are \(P(n, k)\) permutations of \(0 \leq k \leq n\) items. Each permutation of \(k\) items can be re-arranged \(k!\) ways. If item order doesn’t matter, there are
\[ \frac{P(n, k)}{k!} = \frac{n!}{k!(n-k)!} = C(n, k) \]
orderless permutations, or combinations. Sanity check: \(C(n, n) = n!/(n!(n - n)!) = 1/0! = 1\).

Combinations are fun to play with. First, \(C(n, k) = C(n, n - k)\). Then
\[\begin{eqnarray}
C(n, k + 1) & = & \frac{n!}{(k + 1)!(n - k - 1)!} \\
& = & \frac{n!(n - k)}{(k + 1)k!(n - k)(n - k - 1)!} \\
& = & \frac{n - k}{k + 1}C(n, k)
\end{eqnarray}\]
The ratio \((n - k)/(k + 1)\) is greater than 1 while \(k < (n - 1)/2\) and less than 1 afterwards, so the coefficients \(C(n, k)\) grow up to the middle of the range and shrink past it; the largest is at \(k = \lfloor n/2 \rfloor\) (and, for odd \(n\), equally at \(k = \lceil n/2 \rceil\)). Similarly
\[\begin{eqnarray}
C(n + 1, k) & = & \frac{(n + 1)!}{k!(n + 1 - k)!} \\
& = & \frac{(n + 1)n!}{k!(n + 1 - k)(n - k)!} \\
& = & \frac{n + 1}{n + 1 - k}C(n, k)
\end{eqnarray}\]
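
A quick spot-check of both recurrences against a library implementation of \(C(n, k)\), using exact rational arithmetic:

```python
# Verify C(n, k+1) = (n-k)/(k+1) * C(n, k) and
#        C(n+1, k) = (n+1)/(n+1-k) * C(n, k) for small n and k.
from fractions import Fraction
from math import comb

for n in range(1, 13):
    for k in range(n):
        assert comb(n, k + 1) == Fraction(n - k, k + 1) * comb(n, k)
        assert comb(n + 1, k) == Fraction(n + 1, n + 1 - k) * comb(n, k)
print("both recurrences check out")
```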