Subsection 2.3.5 Special Square Matrices

Special square matrices are square matrices with particular properties that make them useful in various applications. Several of them are discussed here: the identity matrix, the diagonal matrix, the symmetric matrix, the skew-symmetric matrix, and others.

Subsubsection 2.3.5.1 Unit or Identity Matrix

This is a square matrix in which all the diagonal elements are equal to 1, and all the other elements are 0. It is denoted by I.
\begin{equation*} I=[\delta_{ij}], \qquad \delta_{ij} = \begin{cases} 1, & \text{if } i = j\\ 0, & \text{otherwise} \end{cases} \end{equation*}
where \(\delta_{ij}\) is the Kronecker delta, and for any square matrix \(A\) of the same order, \(AI = IA = A \text{.}\) A \(3\times 3\) unit matrix is shown as
\begin{equation*} I = \begin{bmatrix} 1 & 0 & 0\\0 & 1 & 0\\0 & 0 & 1 \end{bmatrix} \end{equation*}
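As a quick numerical illustration, the defining property \(AI = IA = A\) can be checked with NumPy (a minimal sketch; the \(3\times 3\) matrix below is an arbitrary sample, not taken from the text):

```python
import numpy as np

# np.eye(3) builds the 3x3 unit matrix; A is an arbitrary sample matrix.
I = np.eye(3)
A = np.array([[2.0, 1.0, 0.0],
              [3.0, 5.0, 4.0],
              [1.0, 0.0, 6.0]])

print(np.allclose(A @ I, A))   # True
print(np.allclose(I @ A, A))   # True
```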

Subsubsection 2.3.5.2 Diagonal Matrix

This is a square matrix in which all the non-diagonal elements are 0; the diagonal elements can be any numbers, including 0. It is denoted by \(D\text{.}\) A diagonal matrix can be used, for example, to represent a system of linear equations whose coefficient matrix is diagonal, so that each equation involves only one unknown. A diagonal matrix is shown as
\begin{equation*} D = \begin{bmatrix} a & 0 & 0\\0 & b & 0\\0 & 0 & c \end{bmatrix} \end{equation*}
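In NumPy, such a matrix can be built directly from its diagonal entries (a sketch; the values below merely stand in for \(a\text{,}\) \(b\text{,}\) \(c\)):

```python
import numpy as np

# Sample values standing in for the diagonal entries a, b, c.
D = np.diag([4.0, -1.0, 0.0])
print(D)
# [[ 4.  0.  0.]
#  [ 0. -1.  0.]
#  [ 0.  0.  0.]]
```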

Subsubsection 2.3.5.3 The Inverse, Singular, and Non-Singular Matrices

The inverse or reciprocal of a square matrix \(A\) is the matrix \(A^{-1}\) defined by the relation \(AA^{-1}=A^{-1}A=I\text{.}\) A non-square matrix does not have an inverse. Some square matrices also have no inverse; these are called singular or noninvertible matrices, and those which have an inverse are called invertible or non-singular matrices.
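The distinction can be seen numerically: the determinant of a singular matrix is zero, and inverting it fails. A minimal sketch (both matrices are sample values chosen here):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 1.0]])   # det = 1 - 6 = -5, so A is non-singular
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # det = 0 (second row is twice the first), singular

print(np.linalg.det(A))                               # -5.0 (up to rounding)
print(np.allclose(np.linalg.inv(A) @ A, np.eye(2)))   # True: A^{-1}A = I
# np.linalg.inv(B) raises numpy.linalg.LinAlgError: Singular matrix
```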

Subsubsection 2.3.5.4 Cofactor Matrix, \(A^{c}\)

The cofactor of an element in a given square matrix is the determinant formed by deleting the row and column containing that element, taken with the proper sign: the determinant is preceded by a plus or minus sign according as the sum of the row number and column number of the element is even or odd. Following this process, we obtain as many cofactors as there are elements in the given matrix. The matrix formed by these cofactors is called the cofactor matrix, \(A^{c} = [A_{ij}]\text{,}\) where \(A_{ij}\) denotes the cofactor of \(a_{ij}\text{.}\) For example: if
\begin{equation*} A=\begin{bmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{bmatrix}; \end{equation*}
then,
\begin{equation*} A^{c}=\begin{bmatrix} A_{11} & A_{12} & A_{13}\\ A_{21} & A_{22} & A_{23}\\ A_{31} & A_{32} & A_{33} \end{bmatrix} \end{equation*}
where,
\begin{equation*} A_{11}= (-1)^{1+1}{\begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix}}, \quad A_{12}= (-1)^{1+2}{\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix}}, \end{equation*}
\begin{equation*} A_{13}= (-1)^{1+3}{\begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix}}, \end{equation*}
\begin{equation*} A_{21}= (-1)^{2+1}{\begin{vmatrix} a_{12} & a_{13} \\ a_{32} & a_{33} \end{vmatrix}}, \quad A_{22}= (-1)^{2+2}{\begin{vmatrix} a_{11} & a_{13} \\ a_{31} & a_{33} \end{vmatrix}}, \end{equation*}
\begin{equation*} A_{23}= (-1)^{2+3}{\begin{vmatrix} a_{11} & a_{12} \\ a_{31} & a_{32} \end{vmatrix}}, \end{equation*}
\begin{equation*} A_{31}= (-1)^{3+1}{\begin{vmatrix} a_{12} & a_{13} \\ a_{22} & a_{23} \end{vmatrix}}, \quad A_{32}= (-1)^{3+2}{\begin{vmatrix} a_{11} & a_{13} \\ a_{21} & a_{23} \end{vmatrix}}, \end{equation*}
\begin{equation*} A_{33}= (-1)^{3+3}{\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}}. \end{equation*}
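The construction above translates into a short routine (a sketch; `cofactor_matrix` is a name chosen here, not a library function):

```python
import numpy as np

def cofactor_matrix(A):
    """Cofactor matrix A^c: the (i, j) entry is (-1)**(i + j) times the
    determinant of A with row i and column j deleted (rows and columns
    are counted from zero here, from one in the text)."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C
```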

Subsubsection 2.3.5.5 Adjoint of a Matrix, \(\hat{A}\)

The adjoint or adjugate of a matrix is the transpose of its cofactor matrix, i.e., \(adj A=\hat{A}=A^{ct}\text{,}\) e.g.
\begin{equation*} A= \begin{pmatrix} 1 & 3\\2 & 1 \end{pmatrix}, \quad A^{c}= \begin{pmatrix} 1 & -2\\-3 & 1 \end{pmatrix}, \end{equation*}
\begin{equation*} A^{ct}= \begin{pmatrix} 1 & -3\\-2 & 1 \end{pmatrix} = \hat{A} \end{equation*}
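The worked example can be reproduced directly; for a \(2\times 2\) matrix each minor is a single entry, so the cofactors can be written out by hand (a sketch):

```python
import numpy as np

# For a 2x2 matrix the cofactors are single entries:
# A11 = a22, A12 = -a21, A21 = -a12, A22 = a11.
A = np.array([[1.0, 3.0],
              [2.0, 1.0]])
cof = np.array([[ A[1, 1], -A[1, 0]],
                [-A[0, 1],  A[0, 0]]])
adj_A = cof.T          # adjugate = transpose of the cofactor matrix
print(adj_A)
# [[ 1. -3.]
#  [-2.  1.]]
```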

Properties.

The matrices \(A\) and \(\hat{A}\) commute, and their product is a scalar matrix whose diagonal elements are \(\vert A \vert \text{,}\) i.e., \(A\cdot (adj A) = (adj A)\cdot A = |A|I\text{,}\) where \(I\) is a unit matrix. Multiplying this relation on the left by \(A^{-1}\) and dividing by \(|A|\) gives
\begin{equation*} A^{-1} = \frac{adj A}{|A|} = \frac{A^{ct}}{|A|}, \end{equation*}
where \(A\) is a non-singular matrix, i.e., \(\vert A \vert \neq 0\text{.}\)
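Both properties can be confirmed on the same \(2\times 2\) example (a sketch comparing the adjugate formula with NumPy's built-in inverse):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 1.0]])
adj_A = np.array([[ 1.0, -3.0],
                  [-2.0,  1.0]])      # the adjugate computed above
det_A = np.linalg.det(A)              # -5

print(np.allclose(A @ adj_A, det_A * np.eye(2)))      # True: A(adj A) = |A|I
print(np.allclose(adj_A @ A, det_A * np.eye(2)))      # True: (adj A)A = |A|I
print(np.allclose(adj_A / det_A, np.linalg.inv(A)))   # True: A^{-1} = adj A/|A|
```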

Subsubsection 2.3.5.6 Self-Adjoint Matrix

If \(adj A=A\text{,}\) then the matrix \(A\) is called a self-adjoint matrix, e.g.
\begin{equation*} A=\begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix}, \quad A^{c} = \begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix}, \end{equation*}
and
\begin{equation*} A^{ct}=\hat{A}=\begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix} = A. \end{equation*}

Subsubsection 2.3.5.7 Symmetric Matrix

If \(A^{t}=A\text{,}\) then the matrix \(A\) is called a symmetric matrix, e.g.
\begin{equation*} A=\begin{pmatrix} 0 & 1\\ 1 & 0 \end{pmatrix}, \quad A^{t}=\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix} =A \end{equation*}

Subsubsection 2.3.5.8 Antisymmetric (Skew) Matrix

If \(A^{t}=-A\text{,}\) then the matrix \(A\) is called an antisymmetric matrix, e.g.
\begin{equation*} A= \begin{pmatrix} 0 & -i\\i & 0 \end{pmatrix}, \quad A^{t} = \begin{pmatrix} 0 & i\\-i & 0 \end{pmatrix} = -A. \end{equation*}
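Both the symmetric example above and this antisymmetric one can be checked with a simple transpose comparison (a sketch; NumPy's `.T` is the transpose):

```python
import numpy as np

S = np.array([[0, 1],
              [1, 0]])     # the symmetric example
K = np.array([[0, -1j],
              [1j,  0]])   # the antisymmetric example

print(np.array_equal(S.T, S))    # True: S^t = S
print(np.array_equal(K.T, -K))   # True: K^t = -K
```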

Subsubsection 2.3.5.9 Hermitian Matrix

If \(A^{\dagger}=A\text{,}\) or equivalently \(a_{ij}=\bar{a}_{ji}\text{,}\) for a square matrix, then the matrix \(A\) is said to be a Hermitian matrix, e.g.
\begin{equation*} A=\begin{pmatrix} 0 & -i\\i & 0 \end{pmatrix}, \quad A^{*}= \begin{pmatrix} 0 & i\\-i & 0 \end{pmatrix}, \end{equation*}
\begin{equation*} A^{\dagger}=(A^{*})^{t}= \begin{pmatrix} 0 & -i\\i & 0 \end{pmatrix} = A. \end{equation*}
If \(A^{\dagger}=-A\text{,}\) then \(A\) is known as a skew Hermitian matrix.

Subsubsection 2.3.5.10 Unitary Matrix

A square matrix \(A\) is said to be a unitary matrix if \(AA^{\dagger}=I\text{,}\) e.g.
\begin{equation*} A=\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}, \quad A^{*}=\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}, \quad (A^{*})^{t} = A^{\dagger} = \begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix} \end{equation*}
and
\begin{equation*} AA^{\dagger}= \begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix} = \begin{pmatrix} 0+1 & 0+0\\0+0 & 1+0 \end{pmatrix} = \begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix}=I \end{equation*}
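The same check in NumPy (a sketch):

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])
print(np.allclose(A @ A.conj().T, np.eye(2)))   # True: AA^dagger = I, unitary
```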

Subsubsection 2.3.5.11 Orthogonal Matrix

A square matrix \(A\) is said to be an orthogonal matrix if \(AA^{t}=I\text{,}\) e.g.
\begin{equation*} A=\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}, \quad A^{t}=\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}, \end{equation*}
and
\begin{equation*} AA^{t} = \begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1\\1 & 0 \end{pmatrix} = \begin{pmatrix} 0+1 & 0+0\\0+0 & 1+0 \end{pmatrix} = \begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix}=I. \end{equation*}
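And the orthogonality check (a sketch; for a real matrix such as this one, \(A^{\dagger}=A^{t}\text{,}\) so orthogonal and unitary coincide):

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])
print(np.allclose(A @ A.T, np.eye(2)))   # True: AA^t = I, orthogonal
```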