Say we have a single observation \(x\) from a random variable \(X\) with a Pois(\(\lambda+b\)) distribution, where \(b\) is known.
\[ \begin{aligned} &f(x|\lambda) = \frac{(\lambda+b)^x}{x!}e^{-\lambda-b}\\ &l(\lambda) =\log f(x|\lambda)=x\log(\lambda+b)-\log(x!)-\lambda-b \\ &\frac{dl}{d\lambda} =\frac{x}{\lambda+b}-1=0 \\ &\hat{\lambda}=x-b \end{aligned} \]
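As a quick numerical check of the MLE (a minimal sketch; the values x = 23 and b = 5.6 are the ones used in the examples below), we can maximize the log-likelihood directly:

# Sketch: numerical check of the MLE lambda-hat = x - b.
# x = 23 and b = 5.6 match the examples below.
x <- 23; b <- 5.6
loglik <- function(l) x*log(l+b) - lfactorial(x) - l - b
optimize(loglik, interval=c(0, 100), maximum=TRUE)$maximum
# should be numerically close to x - b = 17.4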
Say we test \(H_0:\lambda=\lambda_0\) vs \(H_a:\lambda\ne\lambda_0\). Then the LRT statistic is
\[ \begin{aligned} \lambda(x)&=2\left(l(\hat{\lambda})-l(\lambda_0)\right) \\ &=2\left(x\log(x-b+b)-\log(x!)-(x-b)-b\right) \\ &\quad -2\left(x\log(\lambda_0+b)-\log(x!)-\lambda_0-b\right) \\ &=2\left(x\log(x)-x-x\log(\lambda_0+b)+\lambda_0+b\right) \end{aligned} \]
We reject the null if \(\lambda(x)>cr\), where \(cr=qchisq(1-\alpha,1)\), so the acceptance region is \(\{x: \lambda(x)<cr\}\). To invert the test we need to find all the values of \(\lambda_0\) that lead to acceptance:
ci.a <- function(x, b, alpha=0.05) {
  cr <- qchisq(1-alpha, 1)
  # Negative of the LRT statistic, up to the constant 2b. The constant
  # cancels below because we compare y to max(y)-cr, which is equivalent
  # to requiring lambda(x) < cr.
  lrt <- function(l, x, b) {
    (-2)*(x*log(x)-x-x*log(l+b)+l)
  }
  l <- seq(0, 3*(x-b), length=1000)
  y <- lrt(l, x, b)
  # y is maximized at the MLE x-b; find where y crosses max(y)-cr
  # on each side of it
  y1 <- y[l<x-b]
  l1 <- l[l<x-b]
  left <- l1[which.min(abs(y1-max(y1)+cr))]
  y1 <- y[l>x-b]
  l1 <- l[l>x-b]
  right <- l1[which.min(abs(y1-max(y1)+cr))]
  plot(l, y, type="l")
  abline(h=max(y)-cr)
  abline(v=c(left, right))
  round(c(left, right), 2)
}
ci.a(23, 5.6)
## [1] 9.25 28.11
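The grid search is only as accurate as the grid. As a cross-check, here is a minimal sketch (the helper name ci.a.root is mine, not part of the text) that finds the two crossings of \(\lambda(x)=cr\) with uniroot; it should agree with ci.a up to the grid resolution:

# Sketch: invert the LRT with uniroot instead of a grid search.
ci.a.root <- function(x, b, alpha=0.05) {
  cr <- qchisq(1-alpha, 1)
  # the LRT statistic derived above, as a function of lambda_0
  lrt <- function(l) 2*(x*log(x)-x-x*log(l+b)+l+b)
  # lrt - cr changes sign exactly once on each side of the MLE x-b
  left <- uniroot(function(l) lrt(l)-cr, c(0, x-b))$root
  right <- uniroot(function(l) lrt(l)-cr, c(x-b, 10*(x-b)))$root
  round(c(left, right), 2)
}
ci.a.root(23, 5.6)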
Let’s draw the LRT statistic as a function of \(x\):
lrt <- function(x, l, b) {
  # LRT statistic as a function of the observation x, for fixed lambda
  # and b (the constant 2b is again dropped; it only shifts the curve)
  2*(x*log(x)-x-x*log(l+b)+l)
}
x <- 0:60   # x = 0 gives NaN (0*log(0)) and is omitted from the plot
y <- lrt(x, 23, 5.6)
plot(x, y)
We see that “lrt is small” if and only if \(cr_1<x<cr_2\). Using \(\alpha/2\) on each side we find
\[ \begin{aligned} &\frac\alpha2 = P(X\le cr_1) = ppois(cr_1, \lambda+b)\\ &\frac\alpha2 = P(X> cr_2) = 1- ppois(cr_2, \lambda+b)\\ \end{aligned} \] These equations need to be solved numerically:
ci.b <- function(x, b, alpha=0.05) {
  l <- seq(0, 3*(x-b), length=1000)
  # P(X <= x) as a function of the hypothesized lambda; it is
  # decreasing in l
  y <- ppois(x, l+b)
  # lower limit: lambda with ppois(x, l+b) = 1-alpha/2
  left <- l[which.min(abs(y-(1-alpha/2)))]
  # upper limit: lambda with ppois(x, l+b) = alpha/2
  right <- l[which.min(abs(y-alpha/2))]
  round(c(left, right), 2)
}
ci.b(23, 5.6)
## [1] 9.77 28.90
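In fact, the grid search can be avoided here. Using the standard identity \(ppois(x, \mu) = 1-pgamma(\mu, x+1)\), the two equations above have closed-form solutions in terms of gamma quantiles; this minimal check should reproduce the interval from ci.b up to the grid resolution:

# Sketch: closed-form solution of the same two equations, using
# ppois(x, mu) = 1 - pgamma(mu, x+1):
#   ppois(x, l+b) = 1 - alpha/2  =>  l = qgamma(alpha/2, x+1) - b
#   ppois(x, l+b) = alpha/2      =>  l = qgamma(1-alpha/2, x+1) - b
x <- 23; b <- 5.6; alpha <- 0.05
round(qgamma(c(alpha/2, 1-alpha/2), x+1) - b, 2)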
For a Bayesian solution we can use the flat prior \(\pi(\lambda)=1\) on \((0,\infty)\). The marginal distribution of \(X\) is then
\[ \begin{aligned} &f(x|\lambda) = \frac{(\lambda+b)^x}{x!}e^{-\lambda-b}\\ &m(x) = \int_0^\infty \frac{(\lambda+b)^x}{x!}e^{-\lambda-b} d\lambda = \\ &\int_0^\infty \frac{1}{x!}\sum_{k=0}^x\left({x\choose k}\lambda^k b^{x-k} \right) e^{-\lambda-b} d\lambda = \\ &\sum_{k=0}^x b^{x-k}e^{-b}\int_0^\infty \frac{1}{(x-k)!k!}\lambda^k e^{-\lambda} d\lambda = \\ &\sum_{k=0}^x \frac{b^{x-k}}{(x-k)!}e^{-b}\int_0^\infty \frac{1}{\Gamma(k+1)}\lambda^{k+1-1} e^{-\lambda} d\lambda = \\ &\sum_{k=0}^x\frac{ b^{x-k}}{(x-k)!}e^{-b}=e^{-b}\sum_{i=0}^x\frac{ b^{i}}{i!} \end{aligned} \]
Now for \(x\) not too small, \(e^{-b}\sum_{i=0}^x\frac{ b^{i}}{i!}\approx e^{-b}e^b=1\), so the posterior distribution is essentially the same as the likelihood, and the Bayesian credible interval is (nearly) the same as the 95% confidence interval found in part b.
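As a sanity check of this claim, here is a minimal sketch that computes the equal-tail credible interval by normalizing the posterior on a grid; it should essentially reproduce the interval from ci.b:

# Sketch: equal-tail 95% credible interval under the flat prior,
# found by normalizing the posterior f(x|lambda) on a grid.
x <- 23; b <- 5.6; alpha <- 0.05
l <- seq(0, 60, length=10000)
post <- dpois(x, l+b)            # proportional to the posterior
cdf <- cumsum(post)/sum(post)    # grid approximation of the posterior cdf
round(c(l[which.min(abs(cdf - alpha/2))],
        l[which.min(abs(cdf - (1-alpha/2)))]), 2)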