
Monday, June 2, 2014

Section 2.5 - Sum Expected Values

What is the expected value of the sum X+Y given X and Y are (not necessarily independent) random variables?

From the definition of expected value, $$E(X+Y)=\sum_x\sum_y (x+y)\,p(X=x, Y=y)$$ Evaluating this sum as given requires work proportional to $n^2$; can it be evaluated more efficiently? Because X and Y aren't necessarily independent, the probability $p(X=x, Y=y)$ can't be simplified in the same way it was when computing the expected value of random-variable products. Are there some other tricks that can simplify evaluation?
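One standard trick does apply: linearity of expectation, $E(X+Y)=E(X)+E(Y)$, holds with or without independence, and each marginal mean takes only work proportional to $n$. A minimal numeric sketch, using a hypothetical joint distribution in which Y always equals X (so they are strongly dependent):

```python
from fractions import Fraction

# Hypothetical joint distribution: a fair coin mapped to {0, 1},
# with Y defined to always equal X (maximally dependent).
joint = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}

# E(X+Y) straight from the definition: sum over all (x, y) pairs.
e_sum = sum((x + y) * p for (x, y), p in joint.items())

# Marginal expected values, each a single O(n) pass.
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())

# Linearity of expectation holds despite the dependence.
assert e_sum == e_x + e_y == 1
```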

Friday, May 30, 2014

Section 2.5 - Product Expected Value

Given independent random variables X and Y, what is the expected value of the product XY?

From the definition of expected value, $$E(XY)=\sum_x\sum_y xy\,p(XY=xy)$$ Because X and Y are independent, the probability of their product is the product of their probabilities, $$p(XY=xy)=p_X(X=x)\,p_Y(Y=y)$$ and the expected value becomes $$E(XY)=\sum_x\sum_y xy\,p_X(X=x)\,p_Y(Y=y)$$ where $p_X$ is the probability distribution for X, and similarly for $p_Y$.

Evaluating the expected value as given takes work proportional to $n^2$. Can the sum be evaluated with less work?
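For independent variables the double sum factors, since $\sum_x\sum_y xy\,p_X(x)\,p_Y(y) = \big(\sum_x x\,p_X(x)\big)\big(\sum_y y\,p_Y(y)\big) = E(X)E(Y)$, which needs only O(n) work per variable. A small check of the two computations, using two fair dice as hypothetical distributions:

```python
from fractions import Fraction
from itertools import product

# Hypothetical distributions: two independent fair dice.
p_x = {i: Fraction(1, 6) for i in range(1, 7)}
p_y = {i: Fraction(1, 6) for i in range(1, 7)}

# O(n^2): the double sum straight from the definition.
e_xy = sum(x * y * p_x[x] * p_y[y] for x, y in product(p_x, p_y))

# O(n): two marginal means, multiplied.
e_x = sum(x * p for x, p in p_x.items())
e_y = sum(y * p for y, p in p_y.items())

assert e_xy == e_x * e_y == Fraction(49, 4)
```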

Sunday, June 19, 2011

Section 2.5 - Random Variables, Mean and the Expected Value

A random variable maps outcomes from a sample space to numbers, making the outcomes easier to manipulate. For example, there isn’t much to be done with outcomes from the coin-flip sample space { H, T }, but mapping outcomes to integers with the random variable X = { (H, 0), (T,1) } allows mathematical manipulations. The mapping used depends on what’s being done. X's previous definition might be useful when counting the number of tails, while the definition X = { (H, 1), (T, -1) } is more suited for keeping track of wins and losses.

Given a random variable X over a sample space, the probability that X equals a particular value k is the sum of the probabilities of the outcomes that X maps to k: $$\Pr(X=k)=\sum_{X(o)=k}\Pr(o)$$ For example, let the random variable X map the outcome i of a single die roll to i. The probability that X is even is $$\Pr(\text{even}(X))=\Pr(2)+\Pr(4)+\Pr(6)=1/6+1/6+1/6=1/2$$

Computing the mean (or average, or expected value) of a random variable is a common mathematical manipulation. A random variable's mean is the sum of the values in the range, with each value weighted by its probability: $$\sum_{k\in\text{range}(X)} k\Pr(X=k)$$ The mean is often represented by $\mu$. To continue the die-roll example, X's mean is
$$1\cdot\Pr(X=1)+2\cdot\Pr(X=2)+\cdots+6\cdot\Pr(X=6)=1\cdot 1/6+2\cdot 1/6+\cdots+6\cdot 1/6=(1+2+\cdots+6)\cdot 1/6=21\cdot 1/6=3.5$$ A random variable's mean is not necessarily a member of its range.
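Both die-roll calculations above are easy to reproduce mechanically; a minimal sketch using exact fractions:

```python
from fractions import Fraction

# Fair die: X maps roll i to i, each outcome with probability 1/6.
dist = {i: Fraction(1, 6) for i in range(1, 7)}

# Pr(X is even) = Pr(2) + Pr(4) + Pr(6).
pr_even = sum(p for k, p in dist.items() if k % 2 == 0)
assert pr_even == Fraction(1, 2)

# The mean: each value in the range weighted by its probability.
mu = sum(k * p for k, p in dist.items())
assert mu == Fraction(7, 2)  # 3.5 - not itself a possible roll
```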

Friday, June 17, 2011

Exercise 2.2-16

How many distinct three-letter combinations (“words”) can you make from the letters of Mississippi?
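A brute-force sketch of the count: enumerate every ordered three-letter selection from the multiset of letters and deduplicate. (Casework by pattern agrees: 2 triples like "iii", 27 words with one repeated letter, 24 with all letters distinct, for 53 total.)

```python
from itertools import permutations

# Mississippi's letters as a multiset: m x1, i x4, s x4, p x2.
letters = "mississippi"

# permutations() treats equal letters in different positions as distinct,
# so collecting the tuples into a set leaves only distinct "words".
words = set(permutations(letters, 3))
assert len(words) == 53
```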

Tuesday, May 31, 2011

Exercise 2.2-3

What is the probability of cutting two cards of the same rank from a well-shuffled deck?
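A quick exhaustive check, under the assumption that two cuts of a well-shuffled deck behave like drawing two cards without replacement: whatever the first card's rank, 3 of the remaining 51 cards match it.

```python
from fractions import Fraction
from itertools import permutations

# Model the deck as (rank, suit) pairs: 13 ranks x 4 suits.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]

# All 52 * 51 ordered two-card draws; count those with matching ranks.
pairs = list(permutations(deck, 2))
matches = sum(1 for a, b in pairs if a[0] == b[0])

assert Fraction(matches, len(pairs)) == Fraction(3, 51) == Fraction(1, 17)
```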

Monday, May 30, 2011

Section 2.4 - The Binomial Distribution—Bernoulli Trials

A Bernoulli trial produces an outcome that satisfies a criterion with probability p, and so fails to satisfy the criterion with complement probability $q=1-p$. What is the probability that n successive, independent Bernoulli trials result in exactly k satisfactory outcomes?

The compound outcome is an arrangement of k satisfactory outcomes and $n-k$ unsatisfactory outcomes. Because the Bernoulli-trial outcomes are independent, the outcome probabilities are multiplied to get the compound-outcome probability, and because multiplication is commutative, any individual compound outcome has probability $p^k q^{n-k}$. There are $C(n,k)$ such compound outcomes, and so the probability that n successive, independent Bernoulli trials result in exactly $0 \le k \le n$ satisfactory outcomes, when satisfactory outcomes have probability p, is $$b(k;n,p)=C(n,k)\,p^k q^{n-k}$$ The compound-outcome sample space is not uniform. Even if $p=1/2$, $C(n,k)$ varies with k; the variation becomes more complex when $p \ne q$.
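The formula translates directly into code; a minimal sketch (the function name `bernoulli` is my own) that also illustrates the non-uniformity for $p=1/2$:

```python
from fractions import Fraction
from math import comb

def bernoulli(k, n, p):
    """b(k; n, p) = C(n, k) p^k q^(n-k): the probability of exactly
    k satisfactory outcomes in n independent Bernoulli trials."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# The distribution sums to 1 over all k, but is far from uniform:
p = Fraction(1, 2)
probs = [bernoulli(k, 10, p) for k in range(11)]
assert sum(probs) == 1
assert probs[5] == max(probs)  # C(10, k) peaks at k = n/2 when p = 1/2
```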

Sunday, May 29, 2011

Section 2.3 - Combinations

Given n distinct items, there are $P(n,k)$ permutations of $0 \le k \le n$ items. Each permutation of k items can be re-arranged $k!$ ways. If item order doesn't matter, there are
$$\frac{P(n,k)}{k!}=\frac{n!}{k!\,(n-k)!}=C(n,k)$$
orderless permutations, or combinations. Sanity check: $C(n,n)=n!/(n!\,(n-n)!)=1/0!=1$.

Combinations are fun to play with. First, $C(n,k)=C(n,n-k)$. Then
$$C(n,k+1)=\frac{n!}{(k+1)!\,(n-k-1)!}=\frac{(n-k)\,n!}{(k+1)\,k!\,(n-k)\,(n-k-1)!}=\frac{n-k}{k+1}\,C(n,k)$$
$C(n,k+1)$ is largest when $(n-k)/(k+1)$ is closest to 1, which occurs when k is closest to $(n-1)/2$. Similarly
$$C(n+1,k)=\frac{(n+1)!}{k!\,(n+1-k)!}=\frac{(n+1)\,n!}{k!\,(n+1-k)\,(n-k)!}=\frac{n+1}{n+1-k}\,C(n,k)$$
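The first ratio identity gives a factorial-free way to build a whole row of combinations; a small sketch (the helper `binomial_row` is hypothetical), checked against `math.comb`:

```python
from math import comb

def binomial_row(n):
    """Build [C(n, 0), ..., C(n, n)] with the recurrence
    C(n, k+1) = (n - k) / (k + 1) * C(n, k), starting from C(n, 0) = 1."""
    row = [1]
    for k in range(n):
        # Exact integer division: (n - k) * C(n, k) is divisible by k + 1.
        row.append(row[k] * (n - k) // (k + 1))
    return row

row = binomial_row(10)
assert row == [comb(10, k) for k in range(11)]
assert row[5] == max(row)                             # peak near k = (n-1)/2
assert all(row[k] == row[10 - k] for k in range(11))  # C(n,k) = C(n,n-k)
```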