A matrix is a rectangular array of numbers or variables. We use uppercase boldface letters to represent matrices. All elements of the matrices considered here are real numbers or variables representing real numbers. Here is an example of a \(3\times 2\) matrix:
\[ \pmb{A} = \left(a_{ij}\right) = \begin{pmatrix} 0.5 & 12 \\ 0.8 & 9\\ -0.1 & 14 \\ \end{pmatrix} \]
A vector is an \(n\times 1\) matrix:
\[ \pmb{x} = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ \end{pmatrix} \]
A matrix of dimension \(1\times 1\) is called a scalar.
Two matrices \(\pmb{A}\) and \(\pmb{B}\) are equal if they have the same dimension and \(a_{ij}=b_{ij}\) for all \(i\) and \(j\).
The transpose \(\pmb{A}'\) of a matrix \(\pmb{A}\) is the matrix with rows and columns exchanged:
\[ \pmb{A}' = \left(a_{ji}\right) = \begin{pmatrix} 0.5 & 0.8 & -0.1 \\ 12 & 9 & 14 \\ \end{pmatrix} \]
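In R the transpose is computed with t():

A <- matrix(c(0.5, 0.8, -0.1, 12, 9, 14), 3, 2)
t(A)  # the 2x3 matrix A' shown above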
An \(n\times m\) matrix \(\pmb{A}\) is called square if \(n=m\).
A matrix \(\pmb{A}\) is called symmetric if \(\pmb{A}= \pmb{A}'\).
The diagonal of a matrix \(\pmb{A}\) consists of the elements \(a_{ii}\). A matrix is called diagonal if \(a_{ij}=0\) for all \(i\ne j\).
A matrix \(\pmb{A}\) with \(a_{ii}=1\) for all \(i\) and \(a_{ij}=0\) for all \(i\ne j\) is called an identity matrix and is denoted by \(\pmb{I}\).
A matrix is called upper triangular if \(a_{ij}=0\) for all \(i>j\) and lower triangular if \(a_{ij}=0\) for all \(i<j\).
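These special matrices are easy to construct in R, for example:

diag(3)               # 3x3 identity matrix
diag(c(1, 4, 9))      # diagonal matrix with diagonal elements 1, 4, 9
M <- matrix(1:9, 3, 3)
M[lower.tri(M)] <- 0  # zero out the elements below the diagonal
M                     # now upper triangular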
A vector of 1’s is denoted by \(\pmb{j}\):
\[ \pmb{j} = \begin{pmatrix} 1 \\ 1 \\ 1 \\ \end{pmatrix} \]
A square matrix of 1’s is denoted by \(\pmb{J}\):
\[ \pmb{J} = \begin{pmatrix} 1 &1&1 \\ 1 &1&1 \\ 1 &1&1 \\ \end{pmatrix} \]
A vector and a matrix of 0’s are denoted by
\[ \pmb{0} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ \end{pmatrix} \]
and \[ \pmb{O} = \begin{pmatrix} 0 &0&0 \\ 0 &0&0 \\ 0 &0&0 \\ \end{pmatrix} \]
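In R these can be created with rep() and matrix():

j <- rep(1, 3)        # vector of 1's
J <- matrix(1, 3, 3)  # matrix of 1's
zero <- rep(0, 3)     # vector of 0's
O <- matrix(0, 3, 3)  # matrix of 0's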
Let \(\pmb{A}\) be an \(n\times m\) matrix and \(\pmb{B}\) an \(m\times k\) matrix; then the product \(\pmb{C}=\pmb{AB}\) is the \(n\times k\) matrix defined by
\[c_{ij}=\sum_{l=1}^m a_{il}b_{lj}\]
Let \(\pmb{a} = (a_1,\dots,a_n)'\) and \(\pmb{b} = (b_1,\dots,b_n)'\); then \(\pmb{a}'\pmb{a} = \sum_{i=1}^n a_{i}^2\) and \(\pmb{a}'\pmb{b} = \sum_{i=1}^n a_{i}b_{i}\).
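In R these inner products can be computed with sum() or crossprod():

a <- c(1, 2, 3)
b <- c(4, 5, 6)
sum(a^2)         # a'a = 14
sum(a * b)       # a'b = 32
crossprod(a, b)  # t(a) %*% b, returned as a 1x1 matrix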
As a numerical example of matrix multiplication:
\[ \begin{pmatrix} 0.5 & 12 \\ 0.8 & 9\\ -0.1 & 14 \\ \end{pmatrix} \begin{pmatrix} 1 & 2 & 5 & -2.5 & 7 \\ 0.6 & 0.8 &6 &0 & -2.7 \end{pmatrix}=\\ \begin{pmatrix} 7.7 & 10.6 & 74.5 & -1.25& -28.9 \\ 6.2 & 8.8 & 58.0 &-2.00& -18.7\\ 8.3& 11.0& 83.5& 0.25& -38.5 \end{pmatrix} \]
In R, matrix multiplication is done with the %*% operator:
A <- matrix(c(0.5, 0.8, -0.1, 12, 9, 14), 3, 2)  # matrix() fills column by column
B <- matrix(c(1, 0.6, 2, 0.8, 5, 6, -2.5, 0, 7, -2.7), 2, 5)
A %*% B
## [,1] [,2] [,3] [,4] [,5]
## [1,] 7.7 10.6 74.5 -1.25 -28.9
## [2,] 6.2 8.8 58.0 -2.00 -18.7
## [3,] 8.3 11.0 83.5 0.25 -38.5
In general \(\pmb{AB}\ne \pmb{BA}\).
proof
For both \(\pmb{AB}\) and \(\pmb{BA}\) to be defined, \(\pmb{A}\) must be \(n\times m\) and \(\pmb{B}\) must be \(m\times n\). Then \(\pmb{AB}\) is \(n\times n\) while \(\pmb{BA}\) is \(m\times m\), so the two products can only be equal if both matrices are square. Even then they need not be equal; say
\[ \begin{pmatrix} 5 & 12 \\ 8 & 9\\ \end{pmatrix} \begin{pmatrix} 0 & 2 \\ 1 & 0 \\ \end{pmatrix} = \begin{pmatrix} 12& 10 \\ 9& 16 \\ \end{pmatrix} \]
but
\[ \begin{pmatrix} 0 & 2 \\ 1 & 0 \\ \end{pmatrix} \begin{pmatrix} 5 & 12 \\ 8 & 9\\ \end{pmatrix} = \begin{pmatrix} 16& 18 \\ 5& 12 \\ \end{pmatrix} \]
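The counterexample is easily checked in R:

A <- matrix(c(5, 8, 12, 9), 2, 2)
B <- matrix(c(0, 1, 2, 0), 2, 2)
A %*% B  # first product above
B %*% A  # second product, clearly different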
Let \(\pmb{x}\) be a vector, then the Euclidean distance or length of the vector is defined by
\[\sqrt{\pmb{x'x}}=\sqrt{\sum_{i=1}^n x_i^2}\]
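In R:

x <- c(3, 4, 12)
sqrt(sum(x^2))       # 13
sqrt(c(t(x) %*% x))  # same value, written as sqrt(x'x)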
Let \(\pmb{A}\) be an \(n\times m\) matrix and \(\pmb{j}\) the \(m\times 1\) vector of 1’s; then \(\pmb{Aj}\) is the \(n\times 1\) vector of row sums:
\[ \pmb{Aj}= \begin{pmatrix} \sum\limits_{j=1}^m a_{1j} \\ \sum\limits_{j=1}^m a_{2j} \\ \vdots\\ \sum\limits_{j=1}^m a_{nj} \\ \end{pmatrix} \]
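A quick check in R:

A <- matrix(c(0.5, 0.8, -0.1, 12, 9, 14), 3, 2)
j <- rep(1, ncol(A))
A %*% j     # row sums: 12.5, 9.8, 13.9
rowSums(A)  # same values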
\(\pmb{(AB)'}=\pmb{B'A'}\)
proof
Let \(\pmb{C}=\pmb{AB}\). Now
\[ \begin{aligned} ((\pmb{AB})')_{ij}= (\pmb{C}')_{ij}=c_{ji} &= \sum_{l=1}^m a_{jl}b_{li} \\ &= \sum_{l=1}^m b_{li}a_{jl} \\ &= \sum_{l=1}^m (\pmb{B}')_{il}(\pmb{A}')_{lj} \\ &= (\pmb{B}'\pmb{A}')_{ij} \end{aligned} \]
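A quick numerical check in R with the matrices from above:

A <- matrix(c(0.5, 0.8, -0.1, 12, 9, 14), 3, 2)
B <- matrix(c(1, 0.6, 2, 0.8, 5, 6, -2.5, 0, 7, -2.7), 2, 5)
all.equal(t(A %*% B), t(B) %*% t(A))  # TRUE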
Partitioned Matrix
We can partition a matrix into submatrices as follows:
\[ \pmb{A} = \begin{pmatrix} \pmb{A_{11}} & \pmb{A_{12}} \\ \pmb{A_{21}} & \pmb{A_{22}} \\ \end{pmatrix} \]
\[ \pmb{A} = \begin{pmatrix} 1 & 3 & 1 & 0 \\ 0 & 4 & 5 &1 \\ 2 & 1 & 3 &7 \\ 1 & 1 & 5 &3 \\ \end{pmatrix} = \begin{pmatrix} \pmb{A_{11}} & \pmb{A_{12}} \\ \pmb{A_{21}} & \pmb{A_{22}} \\ \end{pmatrix} \]
where
\[ \pmb{A_{11}} = \begin{pmatrix} 1 & 3 \\ 0 & 4 \end{pmatrix} \\ \pmb{A_{12}} = \begin{pmatrix} 1 & 0 \\ 5 &1 \\ \end{pmatrix} \\ \pmb{A_{21}} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \\ \end{pmatrix}\\ \pmb{A_{22}} = \begin{pmatrix} 3 &7 \\ 5 &3 \\ \end{pmatrix} \]
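In R a partitioned matrix can be assembled from its blocks with rbind() and cbind(), and a block can be recovered by indexing:

A11 <- matrix(c(1, 0, 3, 4), 2, 2)
A12 <- matrix(c(1, 5, 0, 1), 2, 2)
A21 <- matrix(c(2, 1, 1, 1), 2, 2)
A22 <- matrix(c(3, 5, 7, 3), 2, 2)
A <- rbind(cbind(A11, A12), cbind(A21, A22))  # the 4x4 matrix above
A[1:2, 3:4]  # recovers the block A12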
If \(\pmb{A}\) is a symmetric \(n\times n\) matrix and \(\pmb{a}\), \(\pmb{x}\), \(\pmb{y}\) are \(n\times 1\) vectors, then
\[\pmb{a}'\pmb{x}=\sum_{i=1}^n a_i x_i\]
is called a linear form,
\[\pmb{x}'\pmb{A}\pmb{x}=\sum_{i=1}^n \sum_{j=1}^n a_{ij} x_i x_j\]
is called a quadratic form, and
\[\pmb{x}'\pmb{A}\pmb{y}=\sum_{i=1}^n \sum_{j=1}^n a_{ij} x_i y_j\]
is called a bilinear form.
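These forms are ordinary matrix products and can be evaluated directly in R:

A <- matrix(c(2, 1, 1, 3), 2, 2)  # symmetric
x <- c(1, 2)
y <- c(3, -1)
t(x) %*% A %*% x  # quadratic form x'Ax = 18
t(x) %*% A %*% y  # bilinear form x'Ay = 5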
A set of vectors \(\pmb{a}_1,\dots,\pmb{a}_n\) is called linearly dependent if there exist scalars \(c_1,\dots,c_n\) (not all 0) such that
\[c_1\pmb{a}_1+\dots+c_n \pmb{a}_n=\pmb{0}\]
If no such coefficients \(c_1,\dots,c_n\) can be found, the vectors are called linearly independent.
The \(\text{rank}\) of a matrix \(\pmb{A}\) is the number of linearly independent columns of \(\pmb{A}\).
An \(n\times p\) matrix \(\pmb{A}\) with \(n<p\) is said to be of full rank if \(\text{rank}(\pmb{A})=n\). A full-rank square matrix is called nonsingular.
\(\text{rank}(\pmb{AB}) \le \min \{\text{rank}(\pmb{A});\text{rank}(\pmb{B}) \}\)
\(\text{rank}(\pmb{AA'})=\text{rank}(\pmb{A'A})=\text{rank}(\pmb{A})\)
proof omitted
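A common way to compute the rank in R is via the QR decomposition:

A <- matrix(c(1, 2, 3, 2, 4, 6, 1, 0, 1), 3, 3)
qr(A)$rank           # 2: the second column is twice the first
qr(t(A) %*% A)$rank  # also 2, as stated above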
A nonsingular matrix \(\pmb{A}\) has a unique inverse \(\pmb{A}^{-1}\) such that
\[\pmb{A}\pmb{A}^{-1} = \pmb{A}^{-1}\pmb{A} = \pmb{I}\]
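In R the inverse of a nonsingular matrix is computed with solve():

A <- matrix(c(5, 8, 12, 9), 2, 2)
Ainv <- solve(A)
A %*% Ainv  # the 2x2 identity matrix (up to rounding)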