Continuous Distributions

Uniform Distribution

X is said to have a uniform distribution on the interval [A,B] if

\[ X \sim U[A, B] \\ f(x) = \frac{1}{B-A} \text{ if } A<x<B, \; 0 \text{ otherwise} \\ E[X] = \frac{A+B}{2} \\ Var[X] = \frac{(B-A)^2}{12} \]
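We can check the mean and variance formulas by simulation in R (the values of A and B below are arbitrary choices):

A <- 2; B <- 5
x <- runif(1e5, A, B)   # 100,000 draws from U[A, B]
c(mean(x), (A + B)/2)   # sample mean vs E[X]
c(var(x), (B - A)^2/12)   # sample variance vs Var[X]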

Exponential Distribution

X is said to have an exponential distribution with rate \(\lambda\) if

\[ X \sim \text{Exp}(\lambda) \\ f(x) = \lambda \exp(-\lambda x) \text{ if } x>0, \; 0 \text{ otherwise} \\ E[X] = \frac{1}{\lambda} \\ Var[X] = \frac{1}{\lambda^2} \]
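The same kind of simulation check works here (the rate \(\lambda\) below is an arbitrary choice):

lambda <- 0.5
x <- rexp(1e5, lambda)   # 100,000 draws from Exp(lambda)
c(mean(x), 1/lambda)   # sample mean vs E[X]
c(var(x), 1/lambda^2)   # sample variance vs Var[X]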

The Gamma Distribution

Recall the gamma function:

\[ \Gamma( \alpha )=\int_0^\infty t^{\alpha-1}e^{-t} dt \]

The gamma function is famous for many things, among them the relationship

\[\Gamma(\alpha+1) = \alpha \Gamma(\alpha)\]

which implies, for any positive integer \(n\),

\[\Gamma(n) = (n-1)!\]

We also have

\[\Gamma(1/2) = \sqrt{\pi}\]
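R's built-in gamma function makes it easy to verify these identities numerically (the value of \(\alpha\) below is arbitrary):

gamma(5); factorial(4)   # Gamma(n) = (n-1)!
alpha <- 2.5
gamma(alpha + 1); alpha*gamma(alpha)   # Gamma(alpha+1) = alpha*Gamma(alpha)
gamma(1/2); sqrt(pi)   # Gamma(1/2) = sqrt(pi)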

Now X is said to have a Gamma distribution with parameters \(\alpha\) and \(\beta\) if

\[ X \sim \text{Gamma}(\alpha, \beta) \\ f(x) = \frac{1}{\Gamma(\alpha) \beta^\alpha} x^{\alpha-1} \exp(-x/\beta) \text{ if } x>0, \; 0 \text{ otherwise} \\ E[X] = \alpha\beta \\ Var[X] = \alpha\beta^2 \]

By definition we have X > 0, and so the Gamma is the basic example of a r.v. on \([0, \infty)\) or, a little more generally (using a change of variables), on any open half-infinite interval.

Note that if \(X \sim \text{Gamma}(1, \beta)\), then \(X \sim \text{Exp}(1/\beta)\).

Another important special case is if \(X \sim \text{Gamma}(n/2, 2)\), then X is called a Chi-square r.v. with n degrees of freedom, denoted by \(X \sim \chi^2(n)\).
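Both special cases are easy to check numerically, since R's dgamma uses the same shape-scale parameterization as above (the values of x, \(\beta\) and n below are arbitrary choices):

x <- 1.7; beta <- 2.3; n <- 4
dgamma(x, shape = 1, scale = beta); dexp(x, rate = 1/beta)   # Gamma(1, beta) = Exp(1/beta)
dgamma(x, shape = n/2, scale = 2); dchisq(x, df = n)   # Gamma(n/2, 2) = chi-square(n)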

There is an important connection between the Gamma and the Poisson distributions:

If \(X \sim \text{Gamma}(n, \beta)\) with \(n\) a positive integer and \(Y \sim \text{Pois}(x/\beta)\), then

\(P(X \le x) = P(Y \ge n)\)

n <- 5
beta <- 2.3
x <- 1.7
pgamma(x, n, 1/beta)   # P(X <= x); pgamma's third argument is the rate 1/beta
## [1] 0.0009986
1 - ppois(n - 1, x/beta)   # P(Y >= n) = 1 - P(Y <= n-1)
## [1] 0.0009986

Beta Distribution

X is said to have a Beta distribution with parameters \(\alpha\) and \(\beta\) if

\[ X \sim \text{Beta}(\alpha, \beta) \\ f(x) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \Gamma(\beta)} x^{\alpha-1} (1-x)^{\beta-1} \text{ if } 0<x<1, \; 0 \text{ otherwise} \\ E[X] = \frac{\alpha}{\alpha+\beta} \\ Var[X] = \frac{\alpha \beta}{(\alpha+\beta)^2(\alpha+\beta+1)} \]

By definition we have 0 < X < 1, and so the Beta is the basic example of a r.v. on [0,1] or, a little more generally (using a change of variables), on any open finite interval.

Special case: Beta(1,1) = U[0,1]
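Both the moment formulas and the special case can be checked in R (all parameter values below are arbitrary choices):

dbeta(0.3, 1, 1); dunif(0.3)   # Beta(1,1) density equals the U[0,1] density
alpha <- 2; beta <- 3
y <- rbeta(1e5, alpha, beta)
c(mean(y), alpha/(alpha + beta))   # sample mean vs E[X]
c(var(y), alpha*beta/((alpha + beta)^2*(alpha + beta + 1)))   # sample variance vs Var[X]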

Note:

  1. \(X, Y\) iid Exp\((\lambda)\), then \(X + Y \sim \text{Gamma}(2, 1/\lambda)\), not exponential (see the simulation check after this list)

  2. X, Y iid \(\chi^2(n)\), then \(X + Y\sim \chi^2(2n)\)
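Both notes can be checked by simulation (the values of \(\lambda\) and n below are arbitrary choices):

lambda <- 0.5; n <- 4
z <- rexp(1e5, lambda) + rexp(1e5, lambda)
c(mean(z), 2/lambda)   # Gamma(2, 1/lambda) has mean alpha*beta = 2/lambda
c(var(z), 2/lambda^2)   # and variance alpha*beta^2 = 2/lambda^2
w <- rchisq(1e5, n) + rchisq(1e5, n)
c(mean(w), 2*n)   # chi-square(2n) has mean 2n
c(var(w), 4*n)   # and variance 2*(2n) = 4n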

Normal (Gaussian) Distribution

X is said to have a normal distribution with mean \(\mu\) and variance \(\sigma^2\) if it has density

\[ f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp \left( -\frac{1}{2} \frac{(x-\mu)^2}{\sigma^2} \right) \]

We denote the normal distribution by \(X \sim N(\mu, \sigma)\), using the standard deviation (not the variance) as the second parameter.

Of course we have \(EX = \mu\) and \(Var(X) = \sigma^2\).

We also have the following interesting features:

  1. If \(X\sim N(\mu_X, \sigma_X)\) and \(Y \sim N(\mu_Y, \sigma_Y)\) are jointly normal with \(\rho = \text{cor}(X,Y)\), then

\(Z = X + Y \sim N\left(\mu_X + \mu_Y, \sqrt{\sigma_X^2 + \sigma_Y^2 + 2\sigma_X \sigma_Y \rho}\right)\)

(see the simulation sketch after this list)

  2. If \(X\) and \(Y\) are jointly normal and cov(X, Y) = 0, then X and Y are independent

  3. \(P(X > \mu) = P(X < \mu) = 1/2\)

  4. \(P(X > \mu+x) = P(X < \mu-x)\)
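Here is a minimal simulation sketch of item 1, building the correlated pair directly from independent standard normals (all parameter values are arbitrary choices):

mu.x <- 1; sig.x <- 2; mu.y <- -1; sig.y <- 0.5; rho <- 0.3
u <- rnorm(1e5); e <- rnorm(1e5)
x <- mu.x + sig.x*u
y <- mu.y + sig.y*(rho*u + sqrt(1 - rho^2)*e)   # cor(x, y) = rho by construction
z <- x + y
c(mean(z), mu.x + mu.y)   # sample mean vs mu_X + mu_Y
c(sd(z), sqrt(sig.x^2 + sig.y^2 + 2*sig.x*sig.y*rho))   # sample sd vs formula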

Mixture of Discrete and Continuous Distributions

Say we have a computer program that generates data as follows:

First it generates a r.v. \(Z \sim \text{Ber}(p)\). It then generates a second r.v. \(X \sim N(\mu Z, 1)\); that is, \(X \sim N(0, 1)\) if \(Z = 0\) and \(X \sim N(\mu, 1)\) if \(Z = 1\), for some \(\mu\).
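A minimal sketch of this data-generating process in R (the values of p and \(\mu\) are arbitrary choices):

p <- 0.4; mu <- 3
z <- rbinom(1e4, 1, p)   # Z ~ Ber(p)
x <- rnorm(1e4, mean = mu*z)   # X ~ N(mu*Z, 1)
hist(x, 50)   # two modes appear when mu is large enough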

Now