Say \(X_1,\dots,X_n\) is a sample from a mixture of normal distributions with density
\[f(x;\mu,\sigma)=\frac1{2\sqrt{2\pi}}\exp\{-\frac12 x^2\}+\frac1{2\sqrt{2\pi\sigma^2}}\exp\{-\frac1{2\sigma^2} (x-\mu)^2\}\]
Let \(L(\mu,\sigma)\) be the likelihood function. Show that \(\sup \{L(\mu,\sigma):-\infty<\mu<\infty,\,0<\sigma<\infty\}=\infty\).
(This is an example where the likelihood function is quite badly behaved.)
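A possible line of attack, offered only as a hint (one common argument, not necessarily the intended one): take \(\mu=x_1\) and let \(\sigma\to 0\). The first mixture component keeps every factor bounded away from zero, while the factor for \(x_1\) blows up:
\[L(x_1,\sigma)=\prod_{i=1}^n f(x_i;x_1,\sigma)\ \ge\ \frac{1}{2\sqrt{2\pi\sigma^2}}\,\prod_{i=2}^n \frac{1}{2\sqrt{2\pi}}e^{-x_i^2/2}\ \longrightarrow\ \infty\quad\text{as }\sigma\to 0.\]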
Say \(X\sim \text{Beta}(a, 1)\). Find the posterior distribution when the prior is \(\pi(a)=1/a,\ a>1\).
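A brief sketch of one way the computation can go (assuming a single observation \(x\in(0,1)\)): the \(\text{Beta}(a,1)\) density is \(f(x\mid a)=a x^{a-1}\), so
\[\pi(a\mid x)\ \propto\ a x^{a-1}\cdot\frac{1}{a}\ =\ e^{(a-1)\log x},\qquad a>1.\]
Since \(\log x<0\), this is a shifted exponential: \(\pi(a\mid x)=(-\log x)\,x^{a-1}\) for \(a>1\), that is, \(a-1\mid x\sim\text{Exponential}(\text{rate}=-\log x)\).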
Below is data from a geometric random variable with success probability \(p\), that is, \(P(X=k)=p(1-p)^{k-1}\), \(k=1,2,\dots\). Find the Bayesian point estimator of \(p\) using the mean of the posterior distribution and the prior \(p\sim U[0,1]\).
| x | Counts |
|---|---|
| 1 | 64 |
| 2 | 20 |
| 3 | 9 |
| 4 | 5 |
| 5 | 2 |
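A minimal numerical sketch, assuming the standard conjugate calculation: with a \(U[0,1]\) prior the posterior is proportional to \(p^n(1-p)^{S-n}\), where \(n\) is the sample size and \(S=\sum x_i\), i.e. a \(\text{Beta}(n+1,\,S-n+1)\) distribution with mean \((n+1)/(S+2)\). The snippet below simply evaluates that mean from the table counts.

```python
# Hedged sketch: posterior-mean estimate of p for geometric data with a Uniform[0,1] prior.
# Likelihood: prod p(1-p)^(x_i - 1) = p^n (1-p)^(S - n), where S = sum of observations.
# With a flat prior, the posterior is Beta(n + 1, S - n + 1), whose mean is (n + 1) / (S + 2).

counts = {1: 64, 2: 20, 3: 9, 4: 5, 5: 2}    # value -> frequency, taken from the table

n = sum(counts.values())                     # number of observations
S = sum(k * c for k, c in counts.items())    # sum of all observations

posterior_mean = (n + 1) / (S + 2)
print(f"n = {n}, S = {S}, posterior mean = {posterior_mean:.4f}")
```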