Let \(\pmb{Z}=\begin{pmatrix} Z_1 & ... & Z_p \end{pmatrix}'\) be a vector of independent standard normal random variables, that is, each \(Z_i\) has density
\[f_{Z_i}(x)=\frac1{\sqrt{2\pi}}\exp\{- x^2/2\}\]
Then we have
\[ \begin{aligned} &f_{\pmb{Z}}(z_1,..,z_p) =\prod_{i=1}^p f_{Z_i}(z_i) =\\ &\prod_{i=1}^p \frac1{\sqrt{2\pi}}\exp\{- z_i^2/2\} = \\ &\frac1{(\sqrt{2\pi})^p}\exp\{- \sum_{i=1}^p z_i^2/2\} = \\ &(2\pi)^{-p/2}\exp\{-\pmb{z}'\pmb{z}/2\} \end{aligned} \]
A random vector with this density is said to have a multivariate normal distribution with mean vector \(\pmb{0}\) and covariance matrix \(\pmb{I}\).
Notation: \(\pmb{Z}\sim N_p(\pmb{0},\pmb{I})\)
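We can check this density formula numerically in R (a minimal sketch; the point z is an arbitrary choice): the joint density should equal the product of the univariate densities.

z = c(0.3, -1.2, 0.7)             # an arbitrary point in R^3
p = length(z)
(2*pi)^(-p/2)*exp(-sum(z^2)/2)    # joint density formula above
prod(dnorm(z))                    # product of univariate densities, same value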
Let \(\pmb{\Sigma}^{1/2}\) be a symmetric square root matrix as defined before, and let \[\pmb{X}=\pmb{\Sigma}^{1/2}\pmb{Z}+\pmb{\mu}\]
then \(E[\pmb{X}]=\pmb{\mu}\) and \(cov(\pmb{X})=\pmb{\Sigma}\).
proof
\[E[\pmb{X}]=E[\pmb{\Sigma}^{1/2}\pmb{Z}+\pmb{\mu}]=\pmb{\Sigma}^{1/2}E[\pmb{Z}]+\pmb{\mu}=\pmb{\Sigma}^{1/2}\pmb{0}+\pmb{\mu}=\pmb{\mu}\]
\[cov(\pmb{X})=cov(\pmb{\Sigma}^{1/2}\pmb{Z}+\pmb{\mu})=\pmb{\Sigma}^{1/2}cov(\pmb{Z})(\pmb{\Sigma}^{1/2})'=\pmb{\Sigma}^{1/2}\pmb{I}\pmb{\Sigma}^{1/2}=\pmb{\Sigma}\]
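Here is a minimal R sketch of this construction (the values of Sigma and mu are arbitrary examples): we build the symmetric square root from the spectral decomposition and check the mean vector and covariance matrix by simulation.

Sigma = rbind(c(2, 1), c(1, 3))
mu = c(1, 2)
e = eigen(Sigma)
# symmetric square root: P diag(sqrt(lambda)) P'
Sig.half = e$vectors %*% diag(sqrt(e$values)) %*% t(e$vectors)
z = matrix(rnorm(2e5), ncol=2)          # rows are Z ~ N_2(0, I)
# rows are X = Sigma^{1/2} Z + mu (Sig.half is symmetric, so z %*% Sig.half works)
x = sweep(z %*% Sig.half, 2, mu, "+")
round(colMeans(x), 2)                   # should be close to mu
round(cov(x), 2)                        # should be close to Sigma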
The density of \(\pmb{X}\) is given by
\[f_{\pmb{X}}(\pmb{x})=(2\pi)^{-p/2}\vert \pmb{\Sigma}\vert^{-1/2}\exp\{-(\pmb{x}-\pmb{\mu})' \pmb{\Sigma}^{-1}(\pmb{x}-\pmb{\mu})/2\}\]
proof
By the change of variables formula from calculus we have the following: say \(\pmb{x}=\pmb{\Sigma}^{1/2}\pmb{z}+\pmb{\mu}\), then
\[f_{\pmb{X}}(\pmb{x})=f_{\pmb{Z}}(\pmb{z})\text{abs}(\vert \pmb{\Sigma}^{-1/2}\vert)\]
\(\vert \pmb{\Sigma}^{-1/2}\vert\) is called the Jacobian of the transformation.
Now \(\pmb{\Sigma}^{-1/2}\) is positive definite, so the Jacobian is also positive and we have
\[f_{\pmb{X}}(\pmb{x})=f_{\pmb{Z}}(\pmb{z})\vert \pmb{\Sigma}^{-1/2}\vert\]
\(\pmb{x}=\pmb{\Sigma}^{1/2}\pmb{z}+\pmb{\mu}\) implies \(\pmb{z}=\pmb{\Sigma}^{-1/2}(\pmb{x}-\pmb{\mu})\)
Finally, using \(\vert \pmb{\Sigma}^{-1/2}\vert = \vert \pmb{\Sigma}\vert^{-1/2}\),
\[ \begin{aligned} &f_{\pmb{X}}(\pmb{x})=f_{\pmb{Z}}(\pmb{z})\vert \pmb{\Sigma}\vert^{-1/2} = \\ &(2\pi)^{-p/2}\exp\{-\pmb{z}'\pmb{z}/2\}\vert \pmb{\Sigma}\vert^{-1/2} = \\ &(2\pi)^{-p/2}\vert \pmb{\Sigma}\vert^{-1/2}\exp\{-\left(\pmb{\Sigma}^{-1/2}(\pmb{x}-\pmb{\mu})\right)'\left(\pmb{\Sigma}^{-1/2}(\pmb{x}-\pmb{\mu})\right)/2\} = \\ &(2\pi)^{-p/2}\vert \pmb{\Sigma}\vert^{-1/2}\exp\{-\left(\pmb{x}-\pmb{\mu}\right)'\pmb{\Sigma}^{-1/2}\pmb{\Sigma}^{-1/2}\left(\pmb{x}-\pmb{\mu}\right)/2\} = \\ &(2\pi)^{-p/2}\vert \pmb{\Sigma}\vert^{-1/2}\exp\{-\left(\pmb{x}-\pmb{\mu}\right)'\pmb{\Sigma}^{-1}\left(\pmb{x}-\pmb{\mu}\right)/2\} \end{aligned} \]
Notation: \(\pmb{X}\sim N_p(\pmb{\mu},\pmb{\Sigma})\)
Let \(\pmb{X}\sim N(\pmb{\mu}, \pmb{\Sigma})\) and let \(\pmb{Z}=\pmb{\Sigma}^{-1/2}(\pmb{X}-\pmb{\mu})\), then \(\pmb{Z}\sim N(\pmb{0}, \pmb{I})\)
Let p=1, then
\(\Sigma=[a]\), and \(x' \Sigma x = ax^2 \ge 0\) for all \(x\) iff \(a \ge 0\).
\(|\Sigma|=a, \Sigma^{-1}=1/a\), and
\[ \begin{aligned} &f_X(x) = (2\pi)^{-1/2}\vert \Sigma\vert^{-1/2}\exp\{-\left(x-\mu\right)'\Sigma^{-1}\left(x-\mu\right)/2\} = \\ &\frac1{\sqrt{2\pi}}a^{-1/2}\exp\{-\left(x-\mu\right)\frac1a\left(x-\mu\right)/2\} = \\ &\frac1{\sqrt{2\pi a}}\exp\{-\frac{(x-\mu)^2}{2a}\} \end{aligned} \]
so for \(p=1\) the multivariate normal density reduces to the familiar \(N(\mu, a)\) density.
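We can also check this reduction against the mvtnorm package (a minimal sketch; the values 1.5, 0.7 and 2 are arbitrary):

library(mvtnorm)
# the multivariate density with a 1x1 covariance matrix [a]
dmvnorm(1.5, mean=0.7, sigma=matrix(2, 1, 1))
dnorm(1.5, mean=0.7, sd=sqrt(2))    # same value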
The moment generating function of a random vector \(\pmb{X}\) is defined by
\[\psi(\pmb{t})=E\left[e^{\pmb{t}'\pmb{X}} \right]\]
Let \(Z\sim N(0,1)\), then
\[ \begin{aligned} &\psi(t) =E\left[e^{tZ} \right]= \\ &\int_{-\infty}^\infty e^{tz} \frac1{\sqrt{2\pi}}\exp\{-\frac{z^2}{2}\} dz = \\ &\int_{-\infty}^\infty \frac1{\sqrt{2\pi}}\exp\{tz-\frac{z^2}{2}\} dz = \\ &\int_{-\infty}^\infty \frac1{\sqrt{2\pi}}\exp\{-\frac12 (z^2 -2tz+t^2) +t^2/2\} dz = \\ &\exp\{t^2/2\}\int_{-\infty}^\infty \frac1{\sqrt{2\pi}}\exp\{-\frac12 (z-t)^2\} dz = \\ &\exp\{t^2/2\} \\ \end{aligned} \]
because the integral is over the density of a N(t,1) random variable and therefore equal to 1.
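A quick numerical check of this mgf in R (a minimal sketch; t0 = 0.8 is an arbitrary argument):

t0 = 0.8
# E[exp(t0*Z)] by numerical integration vs the closed form
integrate(function(z) exp(t0*z)*dnorm(z), -Inf, Inf)$value
exp(t0^2/2)    # same value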
Let \(X\sim N(\mu,\sigma^2)\), then
\[ \begin{aligned} &\psi_{X}(t) = E\left[e^{tX} \right]=\\ &E\left[e^{t(\sigma Z+\mu)} \right]= \\ &e^{\mu t}E\left[e^{(\sigma t) Z} \right]= \\ &e^{\mu t}\psi_{Z}(\sigma t) =\\ &e^{\mu t}\exp\{(\sigma t)^2/2\} =\\ &\exp\{\sigma^2 t^2/2+\mu t\} \end{aligned} \]
Say \(\pmb{X}\sim N_p(\pmb{\mu}, \pmb{\Sigma})\), then
\[\psi(\pmb{t})=\exp\{\pmb{t}'\pmb{\mu}+\frac12\pmb{t}'\pmb{\Sigma}\pmb{t}\}\]
proof
Let \(\pmb{Z}\sim N(\pmb{0}, \pmb{I})\), then \(\pmb{t}'\pmb{Z}\sim N(\pmb{t}'\pmb{0},\pmb{t}'\pmb{I}\pmb{t})=N(0,\pmb{t}'\pmb{t})\). Let the random variable \(U\sim N(0, \pmb{t}'\pmb{t})\), then
\[ \begin{aligned} &\psi_{\pmb{Z}}(\pmb{t}) = E\left[e^{\pmb{t}'\pmb{Z}}\right] = \\ &E\left[e^{1\cdot U}\right] = \psi_U(1) = \\ &e^{(\pmb{t}'\pmb{t})1^2/2} =e^{\pmb{t}'\pmb{t}/2} \end{aligned} \] and so
\[ \begin{aligned} &\psi_{\pmb{X}}(\pmb{t}) = E\left[e^{\pmb{t'X}}\right] = \\ &E\left[e^{\pmb{t}'(\pmb{\Sigma}^{1/2}\pmb{Z}+\pmb{\mu})}\right] = \\ &E\left[e^{(\pmb{t}'\pmb{\Sigma}^{1/2})\pmb{Z}}\right]e^{\pmb{t'\mu}} = \\ &e^{(\pmb{t}'\pmb{\Sigma}^{1/2})(\pmb{t}'\pmb{\Sigma}^{1/2})'/2}e^{\pmb{t'\mu}} = \\ &e^{\pmb{t}'\pmb{\Sigma}^{1/2}\pmb{\Sigma}^{1/2}\pmb{t}/2}e^{\pmb{t'\mu}} = \\ &\exp\{\pmb{t}'\pmb{\mu}+\frac12\pmb{t}'\pmb{\Sigma}\pmb{t}\} \end{aligned} \]
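We can verify this formula by Monte Carlo (a sketch; mu, Sigma and the argument tt are arbitrary example values):

library(mvtnorm)
mu = c(1, 2)
Sigma = rbind(c(1, 0.4), c(0.4, 1))
tt = c(0.3, -0.5)
x = rmvnorm(1e6, mu, Sigma)
mean(exp(x %*% tt))                           # estimates E[exp(t'X)]
exp(sum(tt*mu) + sum(tt*(Sigma %*% tt))/2)    # the formula, same value up to simulation error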
Recall two properties of moment generating functions:

i. the mgf (if it exists in a neighborhood of \(\pmb{0}\)) determines the distribution uniquely.

ii. \(\pmb{X}\) and \(\pmb{Y}\) are independent if and only if the joint mgf factors: \(\psi_{(\pmb{X},\pmb{Y})}(\pmb{t}_x, \pmb{t}_y)=\psi_{\pmb{X}}(\pmb{t}_x)\psi_{\pmb{Y}}(\pmb{t}_y)\).
Say \(\pmb{X}\sim N(\pmb{\mu}, \pmb{\Sigma})\), \(\pmb{a}\) and \(\pmb{b}\) are vectors of constants and \(\pmb{A}\) a matrix of constants. Then

i. \(\pmb{a}'\pmb{X}\sim N(\pmb{a}'\pmb{\mu}, \pmb{a}'\pmb{\Sigma}\pmb{a})\)

ii. \(\pmb{A}\pmb{X}\sim N(\pmb{A}\pmb{\mu}, \pmb{A}\pmb{\Sigma}\pmb{A}')\)

iii. \(\pmb{X}+\pmb{b}\sim N(\pmb{\mu}+\pmb{b}, \pmb{\Sigma})\)
proof
First, for \(Y=\pmb{a}'\pmb{Z}\):
\[ \begin{aligned} &\psi_Y(t) =E[e^{t\pmb{a}'\pmb{Z}}]=E[e^{(t\pmb{a})'\pmb{Z}}] =\\ &\psi_{\pmb{Z}}(t\pmb{a}) = e^{(t\pmb{a})'(t\pmb{a})/2} =\\ &e^{(\pmb{a}'\pmb{a})t^2/2} \\ \end{aligned} \]
so \(\pmb{a}'\pmb{Z}\sim N(0, \pmb{a}'\pmb{a})\).
Similarly, for \(Y=\pmb{a}'\pmb{X}\):
\[ \begin{aligned} &\psi_Y(t) =E[e^{t\pmb{a}'\pmb{X}}]=\\ &E[e^{(t\pmb{a})'\pmb{X}}] =\psi_{\pmb{X}}(t\pmb{a}) = \\ &\exp\{{(t\pmb{a})'\pmb{\mu}+(t\pmb{a})'\pmb{\Sigma}(t\pmb{a})/2} \}=\\ &\exp\{{(\pmb{a}'\pmb{\mu})t+(\pmb{a}'\pmb{\Sigma}\pmb{a})t^2/2} \}\\ \end{aligned} \]
and this is the mgf of a normal random variable with mean \(\pmb{a}'\pmb{\mu}\) and variance \(\pmb{a}'\pmb{\Sigma}\pmb{a}\), so i follows from the uniqueness property.
ii and iii are done similarly.
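Here is a simulation check of i (a minimal sketch; mu, Sigma and a are arbitrary example values): the sample mean and variance of \(\pmb{a}'\pmb{X}\) should match \(\pmb{a}'\pmb{\mu}\) and \(\pmb{a}'\pmb{\Sigma}\pmb{a}\).

library(mvtnorm)
mu = c(1, 2)
Sigma = rbind(c(1, 0.4), c(0.4, 1))
a = c(2, -1)
y = rmvnorm(1e5, mu, Sigma) %*% a
c(mean(y), sum(a*mu))              # sample vs theoretical mean a'mu = 0
c(var(y), sum(a*(Sigma %*% a)))    # sample vs theoretical variance a'Sigma a = 3.4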
i. The marginal distribution of any subvector of a multivariate normal random vector is again multivariate normal.

ii. \(X_i\sim N(\mu_i, \sigma_{ii})\)
proof
Say we want to find the marginal distribution of a subvector \(\pmb{Y}\) of \(\pmb{X}\). Then there exists a matrix \(\pmb{A}\) (of 0’s and 1’s) such that \(\pmb{AX}=\pmb{Y}\), and the result follows from (5.2.8ii).
ii is a direct consequence of i.
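A quick simulation check of ii (a sketch with an arbitrary trivariate normal): the first component should be \(N(\mu_1, \sigma_{11})\).

library(mvtnorm)
mu = c(1, 2, 3)
vc = rbind(c(1, 0.4, 0.6), c(0.4, 1, -0.2), c(0.6, -0.2, 1))
x1 = rmvnorm(1e5, mu, vc)[, 1]    # sample from the first marginal
c(mean(x1), var(x1))              # should be close to mu_1 = 1 and sigma_11 = 1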
If \(\pmb{V}= \begin{pmatrix} \pmb{X} \\ \pmb{Y} \end{pmatrix}\) is \(N(\pmb{\mu}, \pmb{\Sigma})\), then \(\pmb{X}\) and \(\pmb{Y}\) are independent if and only if \(\pmb{\Sigma}_{xy}=\pmb{O}\)
proof
Suppose \(\pmb{\Sigma}_{xy}=\pmb{O}\). Then
\[ \pmb{\Sigma} = \begin{pmatrix} \Sigma_{xx} & \pmb{O} \\ \pmb{O} & \Sigma_{yy} \end{pmatrix} \]
and so
\[ \begin{aligned} &\pmb{t}'\pmb{\mu}+\frac12\pmb{t}'\pmb{\Sigma}\pmb{t} = \\ &(\pmb{t}'_x, \pmb{t}'_y)\begin{pmatrix}\pmb{\mu}_x\\\pmb{\mu}_y\end{pmatrix}+\frac12(\pmb{t}'_x, \pmb{t}'_y) \begin{pmatrix} \Sigma_{xx} & \pmb{O} \\ \pmb{O} & \Sigma_{yy} \end{pmatrix}\begin{pmatrix}\pmb{t}_x\\\pmb{t}_y\end{pmatrix} = \\ &\pmb{t}'_x\pmb{\mu}_x +\pmb{t}'_y \pmb{\mu}_y+ \frac12\pmb{t}'_x\pmb{\Sigma}_{xx}\pmb{t}_x + \frac12\pmb{t}'_y\pmb{\Sigma}_{yy}\pmb{t}_y \\ &\psi_{\pmb{V}}(\pmb{t}) = \exp\{\pmb{t}'_x\pmb{\mu}_x+ \frac12\pmb{t}'_x\pmb{\Sigma}_{xx}\pmb{t}_x\}\exp\{\pmb{t}'_y\pmb{\mu}_y+ \frac12\pmb{t}'_y\pmb{\Sigma}_{yy}\pmb{t}_y\} \end{aligned} \]
which is the product of two mgf’s of multivariate normals, and therefore \(\pmb{X}\) and \(\pmb{Y}\) are independent. Conversely, if \(\pmb{X}\) and \(\pmb{Y}\) are independent, then \(\pmb{\Sigma}_{xy}=cov(\pmb{X},\pmb{Y})=\pmb{O}\) always.
If \(\pmb{X}\sim N(\pmb{\mu}, \pmb{\Sigma})\), then \(X_i\perp X_j\) iff \(\sigma_{ij}=0\)
If \(\pmb{X}\sim N(\pmb{\mu}, \pmb{\Sigma})\) and if \(cov(\pmb{A}\pmb{X}, \pmb{B}\pmb{X})=\pmb{A}\pmb{\Sigma}\pmb{B}'=\pmb{O}\) then \(\pmb{A}\pmb{X}\perp \pmb{B}\pmb{X}\).
If \(\pmb{V}= \begin{pmatrix} \pmb{X} \\ \pmb{Y} \end{pmatrix}\) is \(N(\pmb{\mu}, \pmb{\Sigma})\) and \(\pmb{\Sigma}_{xy}\ne\pmb{O}\), then the conditional distribution of \(\pmb{Y}\vert \pmb{X}=\pmb{x}\) is multivariate normal with
\[ \begin{aligned} &E[ \pmb{Y}\vert \pmb{X}=\pmb{x} ] =\pmb{\mu}_y+\pmb{\Sigma}_{yx}\pmb{\Sigma}^{-1}_{xx}(\pmb{x}-\pmb{\mu}_x) \\ &cov(\pmb{Y}\vert \pmb{X}=\pmb{x}) =\pmb{\Sigma}_{yy}- \pmb{\Sigma}_{yx}\pmb{\Sigma}^{-1}_{xx}\pmb{\Sigma}_{xy} \end{aligned} \]
proof
We have
\[f_{\pmb{Y}\vert \pmb{X}=\pmb{x}}(\pmb{y}\vert \pmb{x}) =\frac{f(\pmb{x}, \pmb{y})}{f_{\pmb{X}}(\pmb{x})}\]
and this ratio can be evaluated directly from the definitions.
Say \(\begin{pmatrix} X \\ Y \end{pmatrix}\) is bivariate normal with mean vector \((0\text{ } 0)'\) and covariance matrix \(\Sigma=\begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\). So
\[ \begin{aligned} &det(\Sigma)=1-\rho^2 \\ &\Sigma^{-1}=\frac{1}{1-\rho^2}\begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix}\\ &(x\,\, y)\frac{1}{1-\rho^2}\begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix}\begin{pmatrix}x \\y\end{pmatrix} =\\ &\frac{1}{1-\rho^2}(x\,\, y)\begin{pmatrix} x -\rho y \\ -\rho x+ y \end{pmatrix} =\\ &\frac{1}{1-\rho^2} \left(x^2 -2\rho xy+y^2\right) \end{aligned} \] The marginal of \(X\) is given by
\[f_X(x) = (2\pi)^{-1/2}\exp\{-\frac12 x^2\}\]
\[ \begin{aligned} &f_{Y|X=x}(y|x) =\frac{f(x,y)}{f_X(x)} = \\ &\frac{(2\pi\sqrt{1-\rho^2})^{-1}\exp\{-\frac{1}{2(1-\rho^2)} \left(x^2 -2\rho xy+y^2\right)\}}{(2\pi)^{-1/2}\exp\{-\frac12 x^2\}} = \\ &(2\pi(1-\rho^2))^{-1/2}\exp \left\{ -\frac12 \left[ \frac{1}{1-\rho^2} (x^2 -2\rho xy+y^2) - x^2 \right] \right\} = \\ &\frac1{\sqrt{2\pi(1-\rho^2)}}\exp \left\{ -\frac1{2(1-\rho^2)} \left[ x^2 -2\rho xy+y^2 - (1-\rho^2)x^2 \right] \right\} = \\ &\frac1{\sqrt{2\pi(1-\rho^2)}}\exp \left\{ -\frac1{2(1-\rho^2)} \left[ y^2-2\rho xy+\rho^2x^2 \right] \right\} = \\ &\frac1{\sqrt{2\pi(1-\rho^2)}}\exp \left\{ -\frac1{2(1-\rho^2)} (y - \rho x)^2 \right\} \\ \end{aligned} \]
and so \(Y|X=x\sim N(\rho x, 1-\rho^2)\).
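We can check this by simulation (a minimal sketch; rho = 0.7 and the slice around x = 0.5 are arbitrary choices): conditioning on a thin slice should give a sample mean close to \(\rho x\) and a sample variance close to \(1-\rho^2\).

library(mvtnorm)
rho = 0.7
xy = rmvnorm(1e6, c(0, 0), rbind(c(1, rho), c(rho, 1)))
y = xy[abs(xy[, 1]-0.5) < 0.05, 2]    # y values with x close to 0.5
c(mean(y), rho*0.5)                   # conditional mean vs rho*x
c(var(y), 1-rho^2)                    # conditional variance vs 1-rho^2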
Say \[ \begin{pmatrix} X \\ Y \end{pmatrix} \sim N \left( \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \begin{pmatrix} 1 & 0.4 \\ 0.4 & 1 \\ \end{pmatrix} \right) \]
We want to find \(E[X|Y=2.3]\)
\[ \begin{aligned} &E[ \pmb{X}\vert \pmb{Y}=y] =\mu_x+\pmb{\Sigma}_{xy}\pmb{\Sigma}^{-1}_{yy}(y-\mu_y) \\ &\mu_y = 2 \\ &\pmb{\Sigma}_{xy}= 0.4 \\ &\pmb{\Sigma}_{yy} = 1 \\ &\pmb{\Sigma}_{yy}^{-1} = 1 \\ &E[X\vert Y=2.3] =\\ &1+0.4 \times 1 (2.3-2) = 1.12 \end{aligned} \]
Let’s see whether we can verify that with R
library(mvtnorm)
mu=c(1, 2)
vc=rbind(c(1, 0.4), c(0.4, 1))
# generate a large sample from the bivariate normal
xy=rmvnorm(1e5, mu, vc)
# keep the x values whose y coordinate is close to 2.3
x=xy[ abs(xy[,2]-2.3)<0.1, 1]
length(x)
## [1] 7710
mean(x)
## [1] 1.137246
and this is indeed close to the theoretical value of 1.12.
Say \[ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \sim N \left( \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}, \begin{pmatrix} 1 & 0.4 & 0.6 \\ 0.4 & 1 & -0.2 \\ 0.6 & -0.2 & 1\\ \end{pmatrix} \right) \]
Find \(E[X|Y=2.3, Z= 2.8]\)
\[ \begin{aligned} &E[ X\vert Y=y, Z=z ] =\mu_x+\pmb{\Sigma}_{x(yz)}\pmb{\Sigma}^{-1}_{(yz)(yz)}\begin{pmatrix} y-\mu_y \\ z-\mu_z \end{pmatrix} \\ &(\mu_y \,\, \mu_z)'=(2 \,\, 3)' \\ &\pmb{\Sigma}_{x(yz)}=(0.4 \,\, 0.6) \\ &\pmb{\Sigma}_{(yz)(yz)} = \begin{pmatrix} 1 &-0.2 \\ -0.2 &1 \end{pmatrix} \\ &\pmb{\Sigma}_{(yz)(yz)}^{-1} =\frac{1}{0.96} \begin{pmatrix} 1 &0.2 \\ 0.2 &1 \end{pmatrix} \\ &E[ X\vert Y=2.3, Z=2.8 ] =\\ &1+(0.4 \,\, 0.6)\frac{1}{0.96} \begin{pmatrix} 1 &0.2 \\ 0.2 &1 \end{pmatrix} \begin{pmatrix} 2.3-2 \\ 2.8-3 \end{pmatrix} = \\ &1+\frac{1}{0.96}(0.4 \,\, 0.6) \begin{pmatrix} 0.26 \\ -0.14 \end{pmatrix} = \\ &1+ 0.02/0.96 = 1.021\\ \end{aligned} \]
Let’s see whether we can verify that with R
library(mvtnorm)
mu=c(1, 2, 3)
vc=rbind(c(1, 0.4, 0.6), c(0.4, 1, -0.2), c(0.6, -0.2, 1) )
# generate a large sample from the trivariate normal
xyz=rmvnorm(1e5, mu, vc)
round(cor(xyz), 3)
## [,1] [,2] [,3]
## [1,] 1.000 0.403 0.600
## [2,] 0.403 1.000 -0.197
## [3,] 0.600 -0.197 1.000
# keep the observations with z close to 2.8, then the x values with y close to 2.3
xy=xyz[ abs(xyz[,3]-2.8)<0.075, 1:2]
x=xy[ abs(xy[,2]-2.3)<0.075, 1]
length(x)
## [1] 351
mean(x)
## [1] 1.030432
again close to the theoretical value of 1.021.
Let \(\pmb{V}\sim N_{p+q}(\pmb{\mu},\pmb{\Sigma} )\) and let \(\pmb{V}\) be partitioned as follows:
\(\pmb{V} = \begin{pmatrix} \pmb{X} \\ \pmb{Y} \end{pmatrix}\)
\(\pmb{\mu} = \begin{pmatrix} \pmb{\mu}_x \\ \pmb{\mu}_y \end{pmatrix}\)
\(\pmb{\Sigma} = \begin{pmatrix} \pmb{\Sigma}_{xx} & \pmb{\Sigma}_{xy} \\ \pmb{\Sigma}_{yx} & \pmb{\Sigma}_{yy} \end{pmatrix}\)
Denote the covariances of the conditional distribution of \(\pmb{X}\) given \(\pmb{Y}\) by
\[\sigma_{ij\cdot rs..q}\]
where \(X_i,X_j\) are two of the variables in \(\pmb{X}\) and \(V_r,V_s,..,V_q\) are the variables in \(\pmb{Y}\) being conditioned on. For example, if \(\pmb{X}=(V_1,..,V_4)'\) and \(\pmb{Y}=(V_5,..,V_9)'\), then \(\sigma_{23\cdot 56789}\) is the covariance between \(V_2\) and \(V_3\) in the conditional distribution of \(V_1,..,V_4\) given \(V_5,..,V_9\).
The partial correlation coefficient \(\rho_{ij\cdot rs..q}\) is defined in the usual way:
\[\rho_{ij\cdot rs..q}=\frac{\sigma_{ij\cdot rs..q}}{\sqrt{\sigma_{ii\cdot rs..q}\sigma_{jj\cdot rs..q}}}\]
Say \(\pmb{V}\) is a multivariate normal random variable with covariance matrix
\[ \pmb{\Sigma} = \begin{pmatrix} 10 & 0 & 1 & -2 \\ 0 & 5 & 3 & -2 \\ 1 & 3 & 4 & 1 \\ -2 & -2 & 1 & 6 \end{pmatrix} \]
and we use the partition \(\begin{pmatrix} \pmb{X} \\ \pmb{Y} \end{pmatrix}\) with
\[ \pmb{\Sigma} = \begin{pmatrix} 10 & 0 & | & 1 & -2 \\ 0 & 5 & | &3 & -2 \\ - & -& | & - & - \\ 1 & 3 & | &4 & 1 \\ -2 & -2 & | & 1 & 6 \end{pmatrix} \]
\[ \begin{aligned} &cov(\pmb{X}|\pmb{Y}) = \pmb{\Sigma}_{xx}-\pmb{\Sigma}_{xy}\pmb{\Sigma}^{-1}_{yy}\pmb{\Sigma}_{yx} \\ &\pmb{\Sigma}^{-1}_{yy} = \begin{pmatrix} 4 & 1 \\ 1 & 6 \end{pmatrix}^{-1} = \frac1{23} \begin{pmatrix} 6 & -1 \\ -1 & 4 \end{pmatrix}\\ &\pmb{\Sigma}^{-1}_{yy}\pmb{\Sigma}_{yx} =\frac1{23}\begin{pmatrix} 6 & -1 \\ -1 & 4 \end{pmatrix} \begin{pmatrix} 1 & 3 \\ -2 & -2 \end{pmatrix} = \frac1{23}\begin{pmatrix} 8 & 20 \\ -9 & -11 \end{pmatrix}\\ &\pmb{\Sigma}_{xy}\pmb{\Sigma}^{-1}_{yy}\pmb{\Sigma}_{yx}= \begin{pmatrix} 1 & -2 \\ 3 & -2 \end{pmatrix} \frac1{23}\begin{pmatrix} 8 & 20 \\ -9 & -11 \end{pmatrix}= \frac1{23}\begin{pmatrix} 26 & 42 \\ 42 & 82 \end{pmatrix}\\ &cov(\pmb{X}|\pmb{Y}) = \begin{pmatrix} 10 & 0 \\ 0 & 5 \end{pmatrix}- \frac1{23}\begin{pmatrix} 26 & 42 \\ 42 & 82 \end{pmatrix}= \frac1{23}\begin{pmatrix} 204 & -42 \\ -42 & 33 \end{pmatrix} \approx \begin{pmatrix} 8.870 & -1.826 \\ -1.826 & 1.435 \end{pmatrix} \end{aligned} \]
We can also use R:
A=matrix(c(10,0,1,-2,0,5,3,-2,1,3,4,1,-2,-2,1,6), 4, 4)
round(solve(A[3:4, 3:4]), 3)
##        [,1]   [,2]
## [1,]  0.261 -0.043
## [2,] -0.043  0.174
round(solve(A[3:4, 3:4])%*%A[3:4, 1:2], 3)
##        [,1]   [,2]
## [1,]  0.348  0.870
## [2,] -0.391 -0.478
round(A[1:2, 3:4]%*%solve(A[3:4, 3:4])%*%A[3:4, 1:2], 3)
##       [,1]  [,2]
## [1,] 1.130 1.826
## [2,] 1.826 3.565
round(A[1:2, 1:2]-A[1:2, 3:4]%*%solve(A[3:4, 3:4])%*%A[3:4, 1:2], 3)
##        [,1]   [,2]
## [1,]  8.870 -1.826
## [2,] -1.826  1.435
So now, in terms of \(\pmb{V}=(V_1,..,V_4)'\):
\[\rho_{12\cdot 34}=\frac{\sigma_{12\cdot 34}}{\sqrt{\sigma_{11\cdot 34}\sigma_{22\cdot 34}}} = \frac{-1.826}{\sqrt{8.870\times 1.435}} = -0.51\]
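And again in R, reusing the matrix A from above:

# conditional covariance matrix of (V1, V2) given (V3, V4)
cc = A[1:2, 1:2]-A[1:2, 3:4]%*%solve(A[3:4, 3:4])%*%A[3:4, 1:2]
cc[1, 2]/sqrt(cc[1, 1]*cc[2, 2])    # about -0.51, matching the hand calculation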