Let \(X\) be a random variable with density \(f(x)=xe^{-x^2}\), \(x>0\). Find the variance of \(X\).
First note that
\[\int_0^{\infty} xe^{-x^2}dx = -\frac12 e^{-x^2}\vert_0^{\infty}=\frac12\]
so \(f\) as given integrates to \(\frac12\) and is not normalized; the properly normalized density is \(f(x)=2xe^{-x^2}\), \(x>0\).
Next
\[ \begin{aligned} E[X] &=\int_0^{\infty} x\cdot 2x e^{-x^2}\,dx =\int_0^{\infty} 2x^2 e^{-x^2}\,dx \\ &=\int_0^{\infty} (\sqrt2 x)^2 e^{-\frac12(\sqrt2x)^2} \frac1{\sqrt{2}}(\sqrt{2}\,dx)\qquad (t=\sqrt2x;\;dt=\sqrt2\,dx)\\ &=\frac1{\sqrt2}\int_0^{\infty} t^2 e^{-\frac12t^2}\,dt =\frac1{2\sqrt2}\int_{-\infty}^{\infty} t^2 e^{-\frac12t^2}\,dt \\ &=\frac{\sqrt \pi}{2}\int_{-\infty}^{\infty}\frac1{\sqrt{2\pi}}\, t^2 e^{-\frac12t^2}\,dt = \frac{\sqrt \pi}{2}\\ \end{aligned} \]
where the last integral is \(E[Z^2]=1\) for \(Z\sim N(0,1)\). Next, by integration by parts,
\[ \begin{aligned} E[X^2] &=\int_0^{\infty} x^2\cdot 2x e^{-x^2}\,dx \\ &= 2x^2\left(-\frac12 e^{-x^2}\right)\Big|_0^{\infty}- \int_0^{\infty} 4x\left(-\frac12 e^{-x^2}\right)dx \\ &= 0+\int_0^{\infty} 2x e^{-x^2}\,dx \\ &= -e^{-x^2}\Big|_0^{\infty}=1 \end{aligned} \]
and so
\[var(X)=E[X^2]-E[X]^2=1-\left(\frac{\sqrt \pi}{2}\right)^2 = 1-\frac{\pi}{4}\approx0.215\]
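As a sanity check, we can simulate from the normalized density \(2xe^{-x^2}\): if \(U\sim\text{Exp}(1)\), then \(\sqrt{U}\) has cdf \(1-e^{-x^2}\) and hence exactly this density. A Python sketch (illustrative only, not part of the solution):

```python
import math
import random

random.seed(1)

# If U ~ Exp(1), then X = sqrt(U) has cdf P(X <= x) = 1 - e^{-x^2},
# i.e. density 2x e^{-x^2} on x > 0.
n = 200_000
xs = [math.sqrt(random.expovariate(1.0)) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

print(round(mean, 2))  # theory: sqrt(pi)/2 ~ 0.89
print(round(var, 2))   # theory: 1 - pi/4 ~ 0.21
```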
Let \(X\sim U[0,1]\) and \(Y|X=x\sim U[0,x]\), \(0<y<x\). Find the covariance of \(X\) and \(Y\).
First we have \(E[X]=\frac12\). Then
\[ \begin{aligned} f_{X,Y}(x,y) &= f_X(x)f_{Y|X=x}(y|x) = 1\times \frac1x;\quad 0<y<x<1\\ f_Y(y) &=\int_{-\infty}^\infty f_{X,Y}(x,y)\, dx =\int_y^1 \frac1x\, dx = \log x\Big|_y^1=-\log y;\quad 0<y<1\\ \end{aligned} \]
so
\[ \begin{aligned} E[Y] &=\int_0^1 y (-\log y)\, dy= -\left(\frac12 y^2\log y -\frac14y^2\right)\Big|_0^1 =\frac14 \end{aligned} \]
\[ \begin{aligned} E[XY] &=\int_0^1\int_y^1 xy\,\frac1x\, dx\, dy = \int_0^1 y\left(\int_y^1 1\, dx\right) dy \\ &= \int_0^1 y\left(1-y\right) dy = \left(\frac{y^2}2-\frac{y^3}3\right)\Big|_0^1 = \frac16 \end{aligned} \]
or, more easily, using the formula for conditional expectations together with \(E[Y\vert X]=X/2\):
\[ \begin{aligned} E[XY] &= E\{E[XY\vert X]\} =E\{X\,E[Y\vert X]\}=E[X\cdot X/2]\\ &=E[X^2]/2 = \left(var(X)+E[X]^2\right)/2 = \left(\frac1{12}+\left(\frac12\right)^2\right)/2 = \frac16\\ \end{aligned} \]
and so finally
\[cov(X,Y)=E[XY]-E[X]E[Y]=\frac16-\frac12\cdot\frac14=\frac{1}{24}\]
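The hierarchical setup is easy to simulate directly, which gives a quick numerical check of \(E[Y]=\frac14\) and \(cov(X,Y)=\frac1{24}\). A Python sketch:

```python
import random

random.seed(1)

# Draw X ~ U[0,1], then Y | X = x ~ U[0,x].
n = 200_000
xs = [random.random() for _ in range(n)]
ys = [random.uniform(0.0, x) for x in xs]

ex = sum(xs) / n
ey = sum(ys) / n
exy = sum(x * y for x, y in zip(xs, ys)) / n
cov = exy - ex * ey

print(round(ey, 2))   # theory: E[Y] = 1/4
print(round(cov, 3))  # theory: cov(X,Y) = 1/24 ~ 0.042
```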
We have a sample \(X_1,..,X_n\), iid \(U[0,\theta]\):
0.005, 0.126, 0.582, 0.778, 1.109, 2.495, 2.610, 4.595, 7.926, 8.594
\[ \begin{aligned} f(x_1,..,x_n|\theta) &= \prod_{i=1}^n f_X(x_i|\theta) =\prod_{i=1}^n\frac1{\theta}I_{[0,\theta]}(x_i)=\frac1{\theta^n}I_{[0,\theta]}(\max\{x_i\})\\ L(\theta|\pmb{x})&=\frac1{\theta^n}I_{[\max\{x_i\},\infty)}(\theta) \end{aligned} \]
so the likelihood function is 0 for \(\theta<\max\{x_i\}\), jumps to \(\frac1{(\max\{x_i\})^n}\) at \(\theta=\max\{x_i\}\), and then decreases. Therefore the mle is \(\hat\theta=\max\{x_i\}=8.594\).
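Reading the mle off the data is a one-liner; a Python sketch with the sample from above:

```python
# The sample from above; for U[0, theta] the likelihood is theta^{-n}
# on [max(x_i), infinity), so the mle is the sample maximum.
xs = [0.005, 0.126, 0.582, 0.778, 1.109, 2.495, 2.610, 4.595, 7.926, 8.594]
mle = max(xs)
print(mle)  # 8.594
```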
From probability theory we know that if \(Y_1,..,Y_n\sim U[0,1]\) and independent, then \(\max\{Y_i\}\sim \text{Beta}(n,1)\). But if \(X_i\sim U[0,\theta]\), then \(Y_i=X_i/\theta\sim U[0,1]\), and so \(M/\theta\sim \text{Beta}(n,1)\), where \(M=\max\{X_i\}\).
Note that the density of a Beta(n,1) is given by \(g(x)=nx^{n-1}\), \(0<x<1\), and so the cdf is \(G(x)=x^{n}\), \(0<x<1\).
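The cdf \(G(x)=x^n\) of the maximum is easy to verify by simulation. A Python sketch (the choice \(n=10\), \(x=0.8\) is arbitrary):

```python
import random

random.seed(0)

# Empirically check G(x) = x^n for the maximum of n iid U[0,1] draws.
n = 10
x = 0.8
trials = 100_000
hits = sum(max(random.random() for _ in range(n)) <= x
           for _ in range(trials))

print(round(hits / trials, 2))  # theory: 0.8**10 ~ 0.107
```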
We wish to test \(H_0:\theta=\theta_0\) vs \(H_a:\theta>\theta_0\), so it makes sense to reject the null hypothesis if \(M\) is large. Therefore
\[ \begin{aligned} &\alpha = P(\text{reject }H_0\vert H_0\text{ true}) = \\ &P(M>c\vert \theta=\theta_0) = \\ &P(M/\theta_0>c/\theta_0\vert \theta=\theta_0) = \\ &1-G(c/\theta_0)=1-(c/\theta_0)^n\\ &1-\alpha=(c/\theta_0)^n\\ &c=\theta_0(1-\alpha)^{1/n} \end{aligned} \]
With \(\theta_0=10\), \(n=10\) and \(\alpha=0.05\) the critical value is
10*(1-0.05)^(1/10)
## [1] 9.948838
and so we reject the null hypothesis if the maximum is larger than 9.95. Here \(M=8.594<9.95\), and so we fail to reject the null hypothesis.
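A simulation under the null confirms that the rejection rule \(M>c\) has the stated size. A Python sketch (illustrative, mirroring the R computation above):

```python
import random

random.seed(2)

theta0, n, alpha = 10.0, 10, 0.05
c = theta0 * (1 - alpha) ** (1 / n)  # critical value, ~ 9.9488

# Under H0 (theta = theta0) the rejection probability should be near alpha.
trials = 100_000
rejections = sum(
    max(random.uniform(0.0, theta0) for _ in range(n)) > c
    for _ in range(trials)
)

print(round(c, 4))                    # 9.9488
print(round(rejections / trials, 3))  # should be close to alpha = 0.05
```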