Problem 1 Say \(X_1,\dots,X_n\) is a sample from a mixture of normal distributions with density
\[f(x;\mu,\sigma)=\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x^2\}+\frac1{2\sqrt{2\pi\sigma^2}}\exp\{-\frac1{2\sigma^2} (x-\mu)^2\}\]
Let \(L(\mu,\sigma)\) be the likelihood function. Show that \(\max \{L(\mu,\sigma):-\infty<\mu<\infty;0<\sigma<\infty\}=\infty\).
(This is an example where the likelihood function is quite badly behaved.)
\[ \begin{aligned} &l(\mu,\sigma) =\log L(\mu,\sigma) = \\ &\sum_{i=1}^n\log \left(\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x_i^2\}+\frac1{2\sqrt{2\pi\sigma^2}}\exp\{-\frac1{2\sigma^2} (x_i-\mu)^2\}\right) \end{aligned} \]
Setting \(\mu=x_1\) makes the exponent of the second term in the \(i=1\) summand vanish, so
\[ \begin{aligned} &l(x_1,\sigma) = \\ &\sum_{i=2}^n\log \left(\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x_i^2\}+\frac1{2\sqrt{2\pi\sigma^2}}\exp\{-\frac1{2\sigma^2} (x_i-x_1)^2\}\right) +\\ &\log\left(\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x_1^2\}+\frac1{2\sqrt{2\pi\sigma^2}}\right) \end{aligned} \]
Now, for any constants \(c>0\) and \(t>0\),
\[\lim_{\sigma\rightarrow 0} \frac{c}{\sigma}\exp(-t/\sigma^2)=0 \quad\text{whereas}\quad \lim_{\sigma\rightarrow 0} \frac{c}{\sigma} = \infty\]
So, provided \(x_i\ne x_1\) for \(i\ge 2\), each summand of the first sum converges to the finite limit \(\log\left(\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x_i^2\}\right)\), while the last term diverges to \(\infty\). Therefore
\[\lim_{\sigma\rightarrow 0} l(x_1,\sigma) =\infty\]
and so \(\max \{L(\mu,\sigma)\}=\infty\).
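A quick numerical illustration makes the blow-up visible. This is a minimal sketch with simulated data; the data vector `x`, the mixture parameters used to generate it, and the grid of \(\sigma\) values are illustrative assumptions, not part of the problem:

```r
set.seed(1)  # illustrative sample from the mixture with mu = 2, sigma = 1
x <- c(rnorm(50), rnorm(50, mean = 2))
# log-likelihood of the two-component mixture density above
loglik <- function(mu, sigma)
  sum(log(dnorm(x)/2 + dnorm(x, mean = mu, sd = sigma)/2))
# fix mu = x[1] and shrink sigma: l(x1, sigma) grows without bound
sapply(c(1, 0.1, 0.01, 0.001), function(s) loglik(x[1], s))
```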
Problem 2 Say \(X\sim Beta(a, 1)\). Find the posterior distribution when the prior is \(\pi(a)=1/a,a>1\).
\[ \begin{aligned} &\pi(a) =1/a,\ a>1 \\ &f_X(x|a) = ax^{a-1},\ 0<x<1\\ &f(x,a)=f_X(x|a)\pi(a) = ax^{a-1}/a=x^{a-1}\\ &m(x) =\int_{-\infty}^\infty f(x,a)\,da =\int_1^\infty x^{a-1}\, da = \int_1^\infty \exp\left((a-1)\log x\right)\, da = \\ &\left.\frac{\exp((a-1)\log x)}{\log x}\right|_1^\infty = \left.\frac{x^{a-1}}{\log x}\right|_1^\infty = -\frac1{\log x}\quad (\text{since } \log x<0 \text{ for } 0<x<1,\text{ so } x^{a-1}\rightarrow 0 \text{ as } a\rightarrow\infty)\\ &f(a|x) = \frac{f(x, a)}{m(x)} = \frac{x^{a-1}}{-1/\log x} = (-\log x)\, x^{a-1}\\ &a>1,\ 0<x<1 \end{aligned} \]
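Since \(f(a|x) = (-\log x)e^{-(-\log x)(a-1)}\), the posterior is a unit-shifted exponential: \(a-1\,\vert\,x\sim \text{Exp}(\text{rate} = -\log x)\). A quick numerical check in R that it integrates to 1; the value `x = 0.5` is an arbitrary illustration:

```r
x <- 0.5  # arbitrary observed value in (0, 1)
post <- function(a) -log(x) * x^(a - 1)  # posterior density derived above
integrate(post, lower = 1, upper = Inf)  # should be (close to) 1
```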
Below is data from a geometric random variable with success probability \(p\), that is \(P(X=k)=p(1-p)^{k-1}\), \(k=1,2,...\). Find the Bayesian point estimator of \(p\) using the mean of the posterior distribution and the prior \(p\sim U[0,1]\).
x | Counts |
---|---|
1 | 64 |
2 | 20 |
3 | 9 |
4 | 5 |
5 | 2 |
Now
\[f(\pmb{x}\vert p) = \prod_{i=1}^n p(1-p)^{x_i-1} = p^n(1-p)^{\sum x_i-n}\] Let \(y=\sum x_i\), then \[ \begin{aligned} &f(\pmb{x},p) = p^n(1-p)^{y-n}I_{[0,1]}(p)\\ &m(\pmb{x}) = \int_0^1 p^n(1-p)^{y-n}dp = \\ &\frac{\Gamma(n+1)\Gamma(y-n+1)}{\Gamma(y+2)}\int_0^1 \frac{\Gamma(y+2)}{\Gamma(n+1)\Gamma(y-n+1)}p^{(n+1)-1}(1-p)^{(y-n+1)-1}dp = \\ &\frac{\Gamma(n+1)\Gamma(y-n+1)}{\Gamma(y+2)} \end{aligned} \] where the last step uses the fact that the integrand is a \(Beta(n+1, y-n+1)\) density and therefore integrates to 1.
and so the posterior distribution is
\[f(p\vert \pmb{x}) =\frac{f(\pmb{x},p)}{m(\pmb{x})} = \frac{\Gamma(y+2)}{\Gamma(n+1)\Gamma(y-n+1)}p^n(1-p)^{y-n}I_{[0,1]}(p)\]
and so \(p\vert\pmb{x}\sim Beta(n+1, \sum x_i-n+1)\), and the estimator is
\[E[p\vert\pmb{x}] = \frac{n+1}{y+2}\]
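From the table, \(n = 64+20+9+5+2 = 100\) and \(y = 1\cdot 64 + 2\cdot 20 + 3\cdot 9 + 4\cdot 5 + 5\cdot 2 = 161\), so \(E[p\vert\pmb{x}] = 101/163 \approx 0.620\), which the computation below confirms: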
x <- rep(1:5, c(64, 20, 9, 5, 2))  # expand the table counts into the data vector
round((length(x)+1)/(sum(x)+2), 3)
## [1] 0.62
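As a sanity check, we can also approximate the posterior mean by simulation, drawing directly from the \(Beta(101, 62)\) posterior; this is a minimal sketch, and the seed and sample size are arbitrary:

```r
set.seed(1)
mean(rbeta(1e5, shape1 = 101, shape2 = 62))  # Monte Carlo posterior mean, ~0.62
```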