Problem 1

Let \(X\sim Gamma(2, 1)\). Verify that the best moment bound is smaller than the best Chernoff bound for \(P(X>10)\).

\[ \begin{aligned} &E[X^k] = \int_{0}^\infty x^k \frac{1^2}{\Gamma(2)}x^{2-1} e^{-x} dx = \\ &\Gamma(k+2)\int_{0}^\infty \frac{1}{\Gamma(k+2)}x^{k+2-1} e^{-x} dx = \\ &\Gamma(k+2)=(k+1)k\Gamma(k) \end{aligned} \]

So the moment bounds are \(E[X^k]/t^{k}=\frac{k(k+1)\Gamma(k)}{t^k}\). Let’s call this bound \(g(k)\).

\[ \begin{aligned} g(k+1) &= \frac{(k+1)(k+2)\Gamma(k+1)}{t^{k+1}} = \frac{(k+1)(k+2)k\Gamma(k)}{t\cdot t^k}=\frac{k+2}{t}g(k) \end{aligned} \] and so \(g\) is decreasing as long as \(k+2<t\) and increasing thereafter. So the minimum is at \(k=t-2=10-2=8\) (rounding to the nearest integer if \(t-2\) is not an integer), which gives

\[8\times9\times \Gamma(8)/10^8=\Gamma(10)/10^8=0.00363\]
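As a quick numerical check (not part of the derivation), we can tabulate \(g(k)=\Gamma(k+2)/10^k\) in R and confirm where the minimum occurs:

```r
# moment bound g(k) = E[X^k]/t^k = gamma(k+2)/10^k for t = 10
k <- 1:15
g <- gamma(k + 2) / 10^k
k[which.min(g)]  # k = 8
min(g)           # 0.0036288
```

Note that \(g(9)=g(8)\), since the ratio \((k+2)/t\) equals 1 exactly at \(k=8\); `which.min` reports the first of the tied values.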

For the Chernoff bound we need the moment generating function, which we found previously to be

\[\psi_X(t) = \frac1{(1-\beta t)^\alpha} = \frac1{(1-t)^2}, \qquad t<1\]

So the Chernoff bound for \(P(X>a)\) is given by \[ \begin{aligned} &P(X>a) \le \inf_{0<t<1} \left\{e^{-ta}\psi_{X}(t) \right\} = \inf_{0<t<1} \left\{\frac{e^{-ta}}{(1-t)^2} \right\} \\ &\frac{d}{dt} \left\{\frac{e^{-at}}{(1-t)^2} \right\} = \frac{-ae^{-at}(1-t)^2+2e^{-at}(1-t)}{(1-t)^4} = e^{-at}\frac{2-a(1-t)}{(1-t)^3} = 0\\ &t=\frac{a-2}{a}\\ &e^{-(\frac{a-2}{a})a} \frac1{\left(1-\frac{a-2}{a}\right)^2}=\frac{a^2e^{2-a}}{4} \end{aligned} \]

a=10
a^2*exp(2-a)/4
## [1] 0.008386566

which is indeed larger than the moment bound of 0.00363.
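We can also verify the minimizer \(t=(a-2)/a=0.8\) numerically with R's `optimize` (a cross-check, not part of the original solution):

```r
# minimize the Chernoff objective exp(-a*t)/(1-t)^2 over 0 < t < 1 for a = 10
a <- 10
res <- optimize(function(t) exp(-a * t) / (1 - t)^2, c(0, 0.99))
res$minimum    # close to (a-2)/a = 0.8
res$objective  # close to a^2*exp(2-a)/4 = 0.008386566
```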

Of course we can also use R for the exact answer:

1-pgamma(10, 2, 1)
## [1] 0.0004993992

Problem 2

Let \(X_i\sim U[-1,1]\), \(i=1,...,100\) and \(\bar{X}=\frac1{100} \sum_{i=1}^{100} X_i\). Find the best Chernoff bound for \(P(|\bar{X}|>0.20)\).

First note that \(Y_i=(X_i+1)/2\sim U[0,1]\) and because the density of \(X_i\) is symmetric around 0 we have

\[ \begin{aligned} P(|\bar{X}|>0.20)&=2P(\bar{X}>0.20) = 2P\left(\frac1{100} \sum_{i=1}^{100} X_i>0.2\right)\\ &=2P\left( \sum_{i=1}^{100} \left(2\left[(X_i+1)/2\right]-1\right)>20\right) = 2P\left( \sum_{i=1}^{100} \left[2Y_i-1\right]>20\right)\\ &=2P\left(2\sum_{i=1}^{100} Y_i-100>20\right) = 2P\left(\sum_{i=1}^{100} Y_i>60\right) \end{aligned} \]

Now

\[ \begin{aligned} &\psi_{\sum_{i=1}^{100} Y_i}(t) = \left[\psi_{Y_1}(t)\right]^{100} =[(e^t-1)/t]^{100} \\ &P(\sum_{i=1}^{100} Y_i>60) \le [(e^t-1)/t]^{100}e^{-60t} \end{aligned} \]

Now we need to find the minimum of this function:

curve( ((exp(x)-1)/x)^100*exp(-60*x), 0.5, 2)

x=seq(0.5, 2, length=1000)
y=((exp(x)-1)/x)^100*exp(-60*x)
c(x[which.min(y)], y[which.min(y)])
## [1] 1.22972973 0.00230225
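Instead of a grid search we could also minimize the objective directly with `optimize`; this is an alternative check, not the method used above:

```r
# minimize [(e^t - 1)/t]^100 * exp(-60*t) over the same interval
f <- function(t) ((exp(t) - 1) / t)^100 * exp(-60 * t)
res <- optimize(f, c(0.5, 2))
res$minimum    # about 1.23
res$objective  # about 0.0023
```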

and so the best Chernoff bound is

\[P(|\bar{X}|>0.20) \le 2[(e^{1.23}-1)/1.23]^{100}e^{-60\times1.23}=0.0046\]

A simulation confirms that the true probability is well below this bound:

x=apply(matrix(runif(100*10000,-1,1), ncol=100), 1, mean)
sum(abs(x)>0.2)/10000
## [1] 6e-04

Problem 3

Say \(X_n\sim N(0, \frac1n)\). Show that \(X_n\rightarrow 0\) in quadratic mean, in distribution and in probability using definition 3.2.6 only.

  • convergence in quadratic mean

\[ \begin{aligned} &E[X_n]=0\\ &E[(X_n-0)^2]=var(X_n) =\frac1{n}\rightarrow 0 \end{aligned} \]

  • convergence in distribution

\[ \begin{aligned} &F_n(x) =\int_{-\infty}^x \frac{\sqrt{n}}{\sqrt{2\pi}}\exp\{-\frac{n}2t^2\}dt = \\ &\int_{-\infty}^x \frac{1}{\sqrt{2\pi}}\exp\{-\frac{1}2(\sqrt{n}t)^2\}(\sqrt{n}dt) = \\ &\int_{-\infty}^{\sqrt nx} \frac{1}{\sqrt{2\pi}}\exp\{-\frac{1}2z^2\}dz = \Phi(\sqrt nx)\\ \end{aligned} \]

where \(\Phi\) is the cdf of a standard normal random variable.

Now

\[ \begin{aligned} &\lim_{n\rightarrow \infty} F_n(x) = \Phi(-\infty)=0\text{ if }x<0\\ &\lim_{n\rightarrow \infty} F_n(x) = \Phi(\infty)=1\text{ if }x>0 \end{aligned} \]

But \(F_{\delta_0}(x) = I_{[0,\infty)}(x)\), so \(\lim_{n\rightarrow \infty} F_n(x) = F_{\delta_0}(x)\) for all \(x\ne 0\), that is all \(x\) where \(F_{\delta_0}\) is continuous.
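This limit is easy to illustrate numerically by evaluating \(\Phi(\sqrt n x)\) in R for a negative and a positive \(x\):

```r
# pnorm(sqrt(n) * x) tends to 0 for x < 0 and to 1 for x > 0
n <- c(10, 100, 1000, 10000)
round(pnorm(sqrt(n) * (-0.1)), 4)  # tends to 0
round(pnorm(sqrt(n) * 0.1), 4)     # tends to 1
```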

  • convergence in probability

\[ \begin{aligned} P(|X_n-0|\ge\epsilon) &= P\left(|X_n-0|\ge\frac{\epsilon}{\sigma}\,\sigma\right) \le \left(\frac{\sigma}{\epsilon}\right)^2 = \frac1{n\epsilon^2}\rightarrow 0 \end{aligned} \]

where we use Chebyshev’s inequality with \(\sigma^2=var(X_n)=\frac1n\).
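Since here the exact tail probability is \(2(1-\Phi(\sqrt n\epsilon))\), we can compare it with the Chebyshev bound \(1/(n\epsilon^2)\) in R (using \(\epsilon=0.1\) as an illustrative choice):

```r
# exact tail probability vs Chebyshev bound for epsilon = 0.1
eps <- 0.1
n <- c(100, 1000, 10000)
exact <- 2 * (1 - pnorm(sqrt(n) * eps))
bound <- 1 / (n * eps^2)
cbind(n, exact, bound)  # both tend to 0, with exact <= bound
```

As usual, Chebyshev’s inequality is very conservative for normal tails, but it suffices to establish the convergence.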