Problem 1

Let X be a random variable with \(P(X=0)=P(X=1)=1/2\), and let Y be a random variable with conditional density \(f_{Y|X=x}(y|x) = cy^x(1-y)\), \(0<y<1\). Find the mean and variance of Y using theorem 1.9.6.

\[ \begin{aligned} &E[Y^k|X=x]=\int_0^1 y^kcy^x(1-y) dy = \\ &c\int_0^1 y^{x+k} -y^{x+k+1} dy = \\ &c\left[\frac1{x+k+1}y^{x+k+1} -\frac1{x+k+2}y^{x+k+2}\right]_0^1=\\ &c\left(\frac1{x+k+1} -\frac1{x+k+2}\right) = \\ &\frac{c}{(x+k+1)(x+k+2)} \end{aligned} \] Setting \(k=0\), the conditional density has to integrate to 1, so

\[ \begin{aligned} &c = (x+1)(x+2) \\ &E[Y|X=x] = \frac{(x+1)(x+2)}{(x+2)(x+3)}=\frac{x+1}{x+3}\\ &E[Y^2|X=x] = \frac{(x+1)(x+2)}{(x+3)(x+4)}\\ &var(Y|X=x) = \frac{(x+1)(x+2)}{(x+3)(x+4)}-\left( \frac{x+1}{x+3}\right)^2 =\\ &\frac{x+1}{x+3}\left[\frac{x+2}{x+4}-\frac{x+1}{x+3} \right]=\\ &\frac{2(x+1)}{(x+3)^2(x+4)} \end{aligned} \]
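
As a quick sanity check, \(f_{Y|X=x}\) is the Beta\((x+1,2)\) density, so for \(x=1\) the conditional moments can be simulated directly:

y1=rbeta(1e5, 2, 2)                  # Y | X=1 ~ Beta(2, 2)
c(mean(y1), (1+1)/(1+3))             # simulated vs (x+1)/(x+3)
c(var(y1), 2*(1+1)/((1+3)^2*(1+4)))  # simulated vs 2(x+1)/((x+3)^2(x+4))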

\[ \begin{aligned} &E[Y] = E\{E[Y|X]\} = E\left\{\frac{X+1}{X+3}\right\} =\\ &\frac{0+1}{0+3}\frac12+\frac{1+1}{1+3}\frac12 = \frac16+\frac14=\frac{5}{12}=0.4167\\ &\\ &var(Y) = E\{var(Y|X)\}+var\{E[Y|X]\} = \\ &E\left\{\frac{2(X+1)}{(X+3)^2(X+4)}\right\}+var\left\{\frac{X+1}{X+3}\right\} = \\ &E\left\{\frac{2(X+1)}{(X+3)^2(X+4)}\right\}+E\left\{\left(\frac{X+1}{X+3}\right)^2\right\} - (\frac{5}{12})^2 \\ &E\left\{\frac{2(X+1)}{(X+3)^2(X+4)}\right\} = \\ &\frac{2}{3^24}\frac12+\frac{4}{4^25}\frac12=\frac{1}{3^22^2}+\frac1{2^35}=\frac{19}{3^22^35}\\ &E\left\{\left(\frac{X+1}{X+3}\right)^2\right\}=(\frac13)^2\frac12+(\frac24)^2\frac12 =\frac{13}{3^22^3}\\ &var(Y)=\frac{19}{3^22^35}+\frac{13}{3^22^3}-(\frac{5}{12})^2=\\ &\frac{19}{3^22^35}+\frac{13}{3^22^3}-\frac{25}{3^22^4}=\frac{43}{3^22^45}=0.05972 \end{aligned} \]
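
These values can be checked with a short simulation, drawing Y from the conditional Beta\((x+1,2)\) distribution: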

n=1e4
x=sample(0:1, size=n, replace=TRUE)  # X is 0 or 1 with probability 1/2 each
y=rbeta(n, x+1, 2)                   # Y | X=x ~ Beta(x+1, 2)
c(mean(y), 5/12)                     # simulated vs exact E[Y]
## [1] 0.4181207 0.4166667
c(var(y), 43/9/16/5)                 # simulated vs exact var(Y) = 43/720
## [1] 0.06008744 0.05972222

Problem 2

Let X be a random variable with moment generating function \(\psi\). Assuming the conditions of theorem 1.10.5 hold, show that

\[\frac{d^2 \log \psi(t) }{dt^2}\Big|_{t=0}=var(X)\] Use this to find the variance of \(X\sim Geom(p)\).

\[ \begin{aligned} &\frac{d^2 \log \psi(t) }{dt^2} = \\ &\frac{d}{dt}\left\{\frac{d}{dt} \log \psi(t)\right\} = \\ &\frac{d}{dt}\frac{\psi'(t)}{\psi(t)} = \\ &\frac{ \psi''(t)\times \psi(t)-\left(\psi'(t)\right)^2}{\psi(t)^2} \\ &\\ &\frac{d^2 \log \psi(t) }{dt^2}|_{t=0} = \\ &\frac{E[X^2]E[X^0]-(E[X^1])^2}{E[e^0]^2}=\\ &E[X^2]-(E[X])^2=var(X) \end{aligned} \]
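
As a numerical illustration, take \(X\sim Exp(1)\) (an arbitrary choice), so \(\psi(t)=1/(1-t)\) and \(var(X)=1\); a central-difference approximation of the second derivative of \(\log\psi(t)\) at \(t=0\) recovers the variance:

psi=function(t) 1/(1-t)                        # mgf of Exp(1)
h=1e-4
(log(psi(h))-2*log(psi(0))+log(psi(-h)))/h^2   # ~ 1 = var(X)
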
\(X\sim Geom(p)\), so by 1.10.3 \(\psi(t)=\frac{pe^t}{1-(1-p)e^t}\). Therefore

\[ \begin{aligned} &\log \psi(t) = \log p +t-\log\left(1-(1-p)e^t\right)\\ &\frac{d}{dt} \log \psi(t) = 1+\frac{(1-p)e^t}{1-(1-p)e^t}\\ &\frac{d^2}{dt^2} \log \psi(t) = \frac{(1-p)e^t[1-(1-p)e^t]-(1-p)e^t[-(1-p)e^t]}{[1-(1-p)e^t]^2}\\ &\frac{d^2}{dt^2} \log \psi(t)|_{t=0} = \frac{(1-p)[1-(1-p)]-(1-p)[-(1-p)]}{[1-(1-p)]^2}=\\ &\frac{(1-p)p+(1-p)^2}{p^2}=\frac{(1-p)[p+(1-p)]}{p^2}=\frac{1-p}{p^2} \end{aligned} \]
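
A quick simulation check (with p = 0.3, chosen arbitrarily; R's rgeom counts failures before the first success, which shifts the mean but not the variance):

p=0.3; n=1e5
x=rgeom(n, p)              # failures before the first success
c(var(x), (1-p)/p^2)       # simulated vs exact variance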

Problem 3

Let X be a random variable with density \(f(x)=3x^2\), \(0<x<1\). Find the moment generating function of X.

\[ \begin{aligned} &\psi(t) =\int_0^1 e^{tx} 3x^2 dx=\\ &3x^2 \frac1te^{tx}\Big|_0^1-\int_0^1 6x\frac1te^{tx} dx = \\ &\frac{3e^{t}}{t}-\frac{6}{t}\int_0^1 xe^{tx} dx = \\ &\frac{3e^{t}}{t}-\frac{6}{t}\left[x\frac{1}{t}e^{tx}\Big|_0^1-\int_0^1 \frac1te^{tx} dx \right] = \\ &\frac{3e^{t}}{t}-\frac{6}{t}\left[\frac{1}{t}e^{t}-\frac1t\int_0^1 e^{tx} dx \right] = \\ &\frac{3e^{t}}{t}-\frac{6}{t}\left[\frac{1}{t}e^{t}-\frac1{t^2} e^{tx}\Big|_0^1 \right] = \\ &\frac{3e^{t}}{t}-\frac{6}{t}\left[\frac{1}{t}e^{t}-\frac1{t^2} (e^{t}-1) \right] = \\ &\frac{3e^{t}}{t}-\frac{6e^{t}}{t^2}+\frac{6(e^{t}-1)}{t^3} \end{aligned} \] for \(t\ne0\), with \(\psi(0)=1\).
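
A quick numerical check of this formula at t = 0.7 (an arbitrary point):

t0=0.7
c(integrate(function(x) exp(t0*x)*3*x^2, 0, 1)$value,
  3*exp(t0)/t0 - 6*exp(t0)/t0^2 + 6*(exp(t0)-1)/t0^3)   # numerical vs closed form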

Problem 4

  1. Find a condition on the moment generating function of a random variable X so that X and Y=-X have the same distribution. Give an example.

\[ \begin{aligned} &\psi_Y(t) = \\ &E[e^{tY}]=E[e^{t(-X)}] = \\ &E[e^{(-t)X}] = \\ &\psi_X(-t) \\ \end{aligned} \]

For X and \(Y=-X\) to have the same distribution we need \(\psi_X(-t)=\psi_X(t)\), so the mgf has to be an even function in a neighborhood of 0. For example, say \(X\sim U[-1,1]\); then

\[ \begin{aligned} &\psi_X(t) = E[e^{tX}] = \\ & \int_{-\infty}^{\infty} e^{tx}\frac12 I_{-1,1}(x) dx = \\ &\frac12 \int_{-1}^1 e^{tx}dx = \\ &\frac12 \frac1te^{tx}|_{-1}^1 = \\ &\frac{e^{t}-e^{-t}}{2t}\\ &\\ &\psi_X(-t)=\frac{e^{-t}-e^{-(-t)}}{2(-t)}=\frac{e^{t}-e^{-t}}{2t}=\psi_X(t) \end{aligned} \]
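
A numerical check that this mgf is indeed even (test points chosen arbitrarily):

psi=function(t) (exp(t)-exp(-t))/(2*t)   # mgf of U[-1,1], t != 0
t0=c(0.5, 1, 2)
rbind(psi(t0), psi(-t0))                 # the two rows agree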

  2. Find a condition on the moment generating function of a random variable X so that X and Y=1-X have the same distribution. Give an example.

\[ \begin{aligned} &\psi_Y(t) = \\ &E[e^{tY}]=E[e^{t(1-X)}] = \\ &e^tE[e^{(-t)X}] = \\ &e^t\psi_X(-t) \\ \end{aligned} \]

So the condition is \(\psi_X(t)=e^t\psi_X(-t)\) in a neighborhood of 0. For example, say \(X\sim U[0,1]\); then

\[ \begin{aligned} &\psi_X(t) = E[e^{tX}] = \\ & \int_{-\infty}^{\infty} e^{tx}I_{0,1}(x) dx = \\ &\int_{0}^1 e^{tx}dx = \\ &\frac1te^{tx}|_{0}^1 = \frac{e^{t}-1}{t}\\ &\\ &e^t\psi_X(-t)=e^t\frac{e^{-t}-1}{-t}=\\ &e^t\frac{1-e^{-t}}{t}=\frac{e^{t}-1}{t}=\psi_X(t) \end{aligned} \]
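
A numerical check that \(\psi_X(t)=e^t\psi_X(-t)\) here (test points chosen arbitrarily):

psi=function(t) (exp(t)-1)/t             # mgf of U[0,1], t != 0
t0=c(0.5, 1, 2)
rbind(psi(t0), exp(t0)*psi(-t0))         # the two rows agree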