Problem 1

Below is a sample from a Poisson distribution with rate \(\lambda\).

  1. Find the maximum likelihood estimator of \(\lambda\).

  2. Find the Bayesian estimator of \(\lambda\), based on the posterior mean, if the prior distribution is exponential with rate 1.

## x
##  5  6  7  8  9 10 11 12 13 14 15 16 19 
##  1  1  4  1  4  7  6  8  4  4  1  7  2
  1. Find the maximum likelihood estimator of \(\lambda\).

\[ \begin{aligned} &L(\lambda) =\prod_{i=1}^n f(x_i|\lambda) = \prod_{i=1}^n \frac{\lambda^{x_i}}{x_i!}e^{-\lambda}=\frac{\lambda^{\sum_{i=1}^nx_i}}{\prod_{i=1}^nx_i!}e^{-n\lambda}\\ &\log L(\lambda) =\left( \sum_{i=1}^nx_i\right) \log \lambda - \log\left(\prod_{i=1}^nx_i!\right)-n\lambda\\ &\frac{d\log L(\lambda)}{d\lambda} = \frac{\sum_{i=1}^nx_i}{\lambda}-n=0\\ &\hat{\lambda} = \frac{\sum_{i=1}^nx_i}{n}=\bar{x} \end{aligned} \]

mean(x)
## [1] 11.76
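
As a sanity check, the sample can be rebuilt from the frequency table above and the log-likelihood maximized numerically (a sketch; it assumes the table shows the complete data, and loglik is just a helper introduced here):

x = rep(c(5:16, 19), times = c(1, 1, 4, 1, 4, 7, 6, 8, 4, 4, 1, 7, 2))  # rebuild the sample from the table
loglik = function(lambda) sum(dpois(x, lambda, log = TRUE))  # Poisson log-likelihood
optimize(loglik, interval = c(1, 30), maximum = TRUE)$maximum  # agrees with mean(x) = 11.76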
  2. Find the Bayesian estimator of \(\lambda\), based on the posterior mean, if the prior distribution is exponential with rate 1.

\[ \begin{aligned} &f(\pmb{x},\lambda) = \frac{\lambda^{\sum_{i=1}^nx_i}}{\prod_{i=1}^nx_i!}e^{-n\lambda}\times e^{-\lambda} = \frac{\lambda^{\sum_{i=1}^nx_i}}{\prod_{i=1}^nx_i!}e^{-(n+1)\lambda}\\ &m(\pmb{x}) = \int_0^{\infty} f(\pmb{x},\lambda)d\lambda = \\ &\int_0^\infty \frac{\lambda^{\sum_{i=1}^nx_i}}{\prod_{i=1}^nx_i!}e^{-(n+1)\lambda}d\lambda=\\ &\frac{1}{\prod_{i=1}^nx_i!}\frac{\Gamma(1+\sum_{i=1}^nx_i)}{(n+1)^{1+\sum_{i=1}^nx_i}}\int_0^\infty\frac{(n+1)^{1+\sum_{i=1}^nx_i}}{\Gamma(1+\sum_{i=1}^nx_i)}\lambda^{(1+\sum_{i=1}^nx_i)-1}e^{-(n+1)\lambda}d\lambda=\\ &\frac{1}{\prod_{i=1}^nx_i!}\frac{\Gamma(1+\sum_{i=1}^nx_i)}{(n+1)^{1+\sum_{i=1}^nx_i}} \end{aligned} \] because the integrand is a Gamma density. Now

\[ \begin{aligned} &f(\lambda|\pmb{x}) = \frac{f(\pmb{x},\lambda)}{m(\pmb{x})} = \\ &\frac{\frac{\lambda^{\sum_{i=1}^nx_i}}{\prod_{i=1}^nx_i!}e^{-(n+1)\lambda}}{\frac{1}{\prod_{i=1}^nx_i!}\frac{\Gamma(1+\sum_{i=1}^nx_i)}{(n+1)^{1+\sum_{i=1}^nx_i}}} = \\ &\frac{(n+1)^{1+\sum_{i=1}^nx_i}}{\Gamma(1+\sum_{i=1}^nx_i)}\lambda^{(1+\sum_{i=1}^nx_i)-1}e^{-(n+1)\lambda} \end{aligned} \]

and so \(\lambda|\pmb{x}\sim \Gamma(1+\sum_{i=1}^nx_i, n+1)\). Finally

\[\hat{\lambda} = E[\lambda|\pmb{x}] = \frac{1+\sum_{i=1}^nx_i}{n+1}\]

(1+sum(x))/(length(x)+1)
## [1] 11.54902
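
Since the posterior is \(\Gamma(1+\sum_{i=1}^nx_i, n+1)\), the posterior mean can also be checked by simulating from it (a quick sketch; post is a new name and x is the sample as above):

post = rgamma(1e6, shape = 1 + sum(x), rate = length(x) + 1)  # draws from the Gamma posterior
mean(post)  # should be close to 11.549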

Problem 2

Let \((X_1, X_2, X_3)\) be a multivariate normal random vector with mean vector \((1, 1, 1)\) and variance-covariance matrix

\[ \begin{bmatrix} 1 & 0.5 & 0.1 \\ 0.5 & 2 & -0.3 \\ 0.1 & -0.3 & 3\\ \end{bmatrix} \]

Find an approximation to \(var[\frac{X_1+2X_2+3X_3}{X_1^2+X_2^2+X_3^2}]\).

\[ \begin{aligned} &h(x_1, x_2, x_3) = \frac{x_1+2x_2+3x_3}{x_1^2+x_2^2+x_3^2}\\ &h_{x_1}(x_1, x_2, x_3) = \frac{(x_1^2+x_2^2+x_3^2)-(x_1+2x_2+3x_3)2x_1}{(x_1^2+x_2^2+x_3^2)^2} \\ &h_{x_2}(x_1, x_2, x_3) = \frac{2(x_1^2+x_2^2+x_3^2)-(x_1+2x_2+3x_3)2x_2}{(x_1^2+x_2^2+x_3^2)^2} \\ &h_{x_3}(x_1, x_2, x_3) = \frac{3(x_1^2+x_2^2+x_3^2)-(x_1+2x_2+3x_3)2x_3}{(x_1^2+x_2^2+x_3^2)^2} \\ \end{aligned} \]

so

\[ \begin{aligned} &h(\mu_1,\mu_2,\mu_3) = \frac{1*1+2*1+3*1}{1^2+1^2+1^2} = 2 \\ &\mu_1^2+\mu_2^2+\mu_3^2 = 1+1+1 =3\\ &\mu_1+2\mu_2+3\mu_3 = 1+2+3 =6\\ &h_{x_1}(1,1,1) = (3-6*2)/9 = -1\\ &h_{x_2}(1,1,1) = (2*3-6*2)/9 = -2/3\\ &h_{x_3}(1,1,1) = (3*3-6*2)/9 = -1/3\\ &var[\frac{X_1+2X_2+3X_3}{X_1^2+X_2^2+X_3^2}] \approx \\ &\sum_{i=1}^3 \left[h_{x_i}(1, 1, 1)\right]^2 var[X_i]+2\sum_{i<j} \left[ h_{x_i}(1, 1, 1)h_{x_j}(1, 1, 1)\right] cov[X_i, X_j] = \\ &(-1)^2*1+(-2/3)^2*2+(-1/3)^2*3 +\\ &2\left[(-1)(-2/3)0.5+(-1)(-1/3)0.1+(-2/3)(-1/3)(-0.3)\right] = 2.82 \end{aligned} \]
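
The same first-order number can be obtained in R as the quadratic form of the gradient with the covariance matrix, and base R's deriv() can double-check the hand-computed partial derivatives (a sketch of the calculation above):

h = deriv(~ (x1 + 2*x2 + 3*x3)/(x1^2 + x2^2 + x3^2), c("x1", "x2", "x3"))  # symbolic partial derivatives of h
x1 = 1; x2 = 1; x3 = 1  # evaluate at the mean vector
grad = attr(eval(h), "gradient")  # should equal (-1, -2/3, -1/3)
S = rbind(c(1, 0.5, 0.1), c(0.5, 2, -0.3), c(0.1, -0.3, 3))
as.numeric(grad %*% S %*% t(grad))  # first-order variance approximation, about 2.82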

This approximation is not very good, presumably because the variances are large relative to the means, so the linearization of \(h\) around \((1, 1, 1)\) is inaccurate. A simulation shows the discrepancy:

library(mvtnorm)
s=rbind(c(1, 0.5, 0.1), c(0.5, 2, -0.3), c(0.1, -0.3, 3))
x=rmvnorm(1e5, c(1,1,1), s)
y=(x[,1]+2*x[,2]+3*x[,3])/(x[,1]^2+x[,2]^2+x[,3]^2)
var(y)
## [1] 1.355516