Review sheet for test 3

Test 3 will cover the following sections.

3.2 Expectation of discrete random variables

In this section we learn the all-important formula

$$E(g(X)) = \sum_k g(k) P(X=k)$$

So to compute an expectation this way, we get our hands on the distribution and go from there. We also learn the following properties of $E(X)$:

$$E(X+Y) = E(X) + E(Y),\quad E(aX) = aE(X)$$

A very useful trick is to write the random variable whose expectation you want as a sum $X = X_1 + X_2 + \cdots + X_n$ and then find $E(X_i)$ for each $i$. Judicious use of this trick can save us a lot of hard work.
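
For example, here is a quick sketch of this trick for a binomial count of successes in $n$ independent trials with success probability $p$: write $X = X_1 + X_2 + \cdots + X_n$, where $X_i$ is the indicator of a success on trial $i$. Since $E(X_i) = 0\cdot(1-p) + 1\cdot p = p$,

$$E(X) = E(X_1) + \cdots + E(X_n) = np,$$

with no need to evaluate $\sum_k k \binom{n}{k} p^k (1-p)^{n-k}$ directly.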

Here are some problems you should be familiar with:

  1. What is the expected minimum of two dice rolls? (A worked sketch follows this list.)

  2. What are the mean and variance of an indicator random variable, i.e. one with $P(X=0)=1-P(X=1)$?

  3. What are the mean and variance of the binomial distribution?

  4. If you know the joint distribution $g(x,y) = P(X=x, Y=y)$, what is $E(XY)$?

  5. Section 3.2 exercises 7, 10, and 13; Review exercises 2 and 3.
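
For problem 1, a quick sketch using the formula above: the minimum $M$ of two fair dice has distribution $P(M=k) = \frac{(7-k)^2 - (6-k)^2}{36} = \frac{13-2k}{36}$ for $k=1,\dots,6$, so

$$E(M) = \sum_{k=1}^{6} k\,\frac{13-2k}{36} = \frac{91}{36} \approx 2.53.$$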

3.3 Standard deviation and Chebyshev’s theorem

In this section we learn formulas for the variance. We also learn that if the $X_i$ are independent, then

$$\mathrm{var}(X_1 + X_2 + \cdots + X_n) = \mathrm{var}(X_1) + \cdots + \mathrm{var}(X_n)$$

This allows us to easily figure out means and variances for sums of independent random variables, such as the sum of indicators that yields the binomial distribution.
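
Continuing the indicator sketch from Section 3.2: each indicator has $\mathrm{var}(X_i) = E(X_i^2) - E(X_i)^2 = p - p^2 = p(1-p)$, and the trials are independent, so

$$\mathrm{var}(X) = \mathrm{var}(X_1) + \cdots + \mathrm{var}(X_n) = np(1-p).$$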

We learn Chebyshev’s theorem, which tells us how $\sigma$ measures the spread of a random variable. You should know how to algebraically manipulate expressions involving absolute values, such as

$$|X - \mu| < k\sigma$$
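
For reference, Chebyshev’s theorem says that for any $k > 0$,

$$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}, \qquad \text{equivalently} \qquad P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}.$$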

There are two very important theorems mentioned in this section: the law of large numbers, which tells us the expectation is related to an average value (how?), and the central limit theorem, which tells us that $X_1 + X_2 + \cdots + X_n$ is approximately normal with mean $nE(X_1)$ and variance $n\,\mathrm{var}(X_1)$ (what assumptions?).
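
As a reminder of the shape of these statements, assuming the $X_i$ are independent and identically distributed with mean $\mu$ and finite variance $\sigma^2$:

$$\frac{X_1 + \cdots + X_n}{n} \approx \mu \quad \text{for large } n, \qquad \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}} \approx \text{standard normal}.$$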

Here are some problems you should be familiar with:

  1. What is the variance of a single die roll? Of the sum of 100 die rolls? (A worked sketch follows this list.)

  2. Exercises 8, 12, and 13; Review exercises 6 and 20.
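
For problem 1, a quick sketch: a single fair die has $E(X) = \frac{1+2+\cdots+6}{6} = \frac{7}{2}$ and $E(X^2) = \frac{1^2+2^2+\cdots+6^2}{6} = \frac{91}{6}$, so

$$\mathrm{var}(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12},$$

and for 100 independent rolls the variance of the sum is $100 \cdot \frac{35}{12} = \frac{875}{3} \approx 292$.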

3.4 Discrete distributions

In this section we looked at random variables which can take infinitely many discrete values. A good example is a random variable with the geometric distribution: $P(X=k)=q^{k-1}p$ for $k = 1, 2, 3, \dots$ You should know how to find the mean of this distribution using the power series trick.
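
As a sketch of the power series trick: differentiating the geometric series $\sum_{k \ge 0} q^k = \frac{1}{1-q}$ term by term gives $\sum_{k \ge 1} k q^{k-1} = \frac{1}{(1-q)^2}$, so

$$E(X) = \sum_{k=1}^{\infty} k\, q^{k-1} p = \frac{p}{(1-q)^2} = \frac{p}{p^2} = \frac{1}{p}.$$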

Here are some problems you should be familiar with:

  1. A flight attendant has an equal chance of going to any of the 50 states on a given flight. How many flights on average does he take before he has visited all 50 states? (A sketch follows this list.)

  2. Homework problems 5 and 10; Review exercise 15.
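
For problem 1, a sketch combining the ideas of Sections 3.2 and 3.4: once $i$ distinct states have been visited, the number of further flights needed to reach a new state is geometric with success probability $\frac{50-i}{50}$, hence has mean $\frac{50}{50-i}$. Adding these up,

$$E(\text{flights}) = \sum_{i=0}^{49} \frac{50}{50-i} = 50\left(1 + \frac{1}{2} + \cdots + \frac{1}{50}\right) \approx 225.$$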

3.5 The Poisson distribution

We didn’t do much in this section (pp. 222-224), but we already know when this distribution is used: as an approximation to the binomial distribution when $n$ is large and $p$ is small, so that $\lambda = np$ is of moderate size. I might ask a question like #2 in the homework.
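
For reference, the Poisson($\lambda$) distribution is

$$P(X=k) = e^{-\lambda}\frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \dots,$$

and its mean and variance are both equal to $\lambda$.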

3.6 Symmetry

We skipped this section. ;^(

4.1 Continuous distributions

This section deals with distributions which have a density $f(x)$. This allows one to write

$$P(a < X \leq b) = \int_a^b f(x) dx,\quad E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x) dx.$$

Examples are the uniform distribution, the normal distribution, and the exponential distribution. You should know how to do some simple integrals to make use of these formulas.
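
For example, a quick sketch with the uniform distribution on $[0,1]$, whose density is $f(x) = 1$ there:

$$E(U) = \int_0^1 x\,dx = \frac{1}{2}, \qquad E(U^2) = \int_0^1 x^2\,dx = \frac{1}{3}, \qquad \mathrm{var}(U) = \frac{1}{3} - \frac{1}{4} = \frac{1}{12},$$

which is the fact quoted in problem 2 below.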

We skipped the section on fitting a curve to empirical distributions.

Some sample problems would be

  1. If $X$ has density $f(x) = cx^3$ on $[0,3]$ and $0$ otherwise, find $E(X)$ and $\mathrm{var}(X)$.

  2. If $X$ is uniform on $[-25,25]$ and $U$ is uniform on $[0,1]$, then $X$ and $-a+bU$ have the same distribution for which $a, b$? Use your answer and the fact that $\mathrm{var}(U)=1/12$ to find $\mathrm{var}(X)$.

  3. Let the $X_i$ be independent and uniform on $[0,10]$. Use the normal approximation to find $P(X_1+X_2+\cdots+X_{100} > 543)$. (A sketch follows this list.)
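
For problem 3, a sketch: each $X_i$ has mean $5$ and variance $\frac{100}{12}$, so the sum $S$ has mean $500$, variance $\frac{10000}{12} \approx 833$, and standard deviation $\approx 28.9$. Then, with $Z$ standard normal,

$$P(S > 543) \approx P\!\left(Z > \frac{543 - 500}{28.9}\right) = P(Z > 1.49) \approx 0.07.$$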

4.2 The exponential distribution

This is the special distribution which has density $f(x) = \lambda e^{-\lambda x}$ for $x > 0$. We have the following: $P(T > t) = e^{-\lambda t}$, the memoryless property, and the relationship with the geometric distribution.
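
As a reminder, the memoryless property follows directly from $P(T > t) = e^{-\lambda t}$: for $s, t > 0$,

$$P(T > s + t \mid T > s) = \frac{P(T > s+t)}{P(T > s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P(T > t).$$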

In practice, exponential random variables are used as the inter-arrival times in a Poisson arrival process.

Some sample problems would be

  1. Homework problems 3, 4