
Section 2.6 Examples

Example 2.6.1.

Show that each eigen vector of a square matrix corresponds to only one eigen value, and that a single eigen value of such a matrix may correspond to one or more eigen vectors.
Solution.
Let us assume that there exist two distinct eigen values \(\lambda_{1}\) and \(\lambda_{2}\) corresponding to a given eigen vector \(X\) of a square matrix \(A\text{.}\) Then, we have
\begin{equation*} AX=\lambda_{1}X, \quad AX=\lambda_{2}X \end{equation*}
on subtracting, we get -
\begin{equation*} (\lambda_{1}-\lambda_{2})X=0 \end{equation*}
Since \(\lambda_{1}\) and \(\lambda_{2}\) are distinct,
\begin{equation*} (\lambda_{1}-\lambda_{2}) \neq 0 \end{equation*}
hence,
\begin{equation*} X=0 \end{equation*}
This contradicts the assumption that \(X\) is a non-zero vector. Hence, corresponding to an eigen vector \(X\text{,}\) there is only one eigen value of the square matrix \(A\text{.}\) Again, if \(\lambda\) be the eigen value of \(A\text{,}\) then the corresponding characteristic (eigen) vector \(X\) is given by \(AX=\lambda X\text{.}\)
Let \(k\) be any non - zero scalar, then \(A(kX) = \lambda (kX)\text{.}\) Thus, \(kX\) is also an eigen vector of \(A\) corresponding to the same eigen value \(\lambda\text{.}\)
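Both facts can be checked numerically. The sketch below is an addition (not part of the original text) and assumes NumPy is available; it uses the matrix of Example 2.6.4 below, whose eigen vector for \(\lambda = 1\) is \((1,-2)^{t}\text{.}\)

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 2.0]])
X = np.array([1.0, -2.0])   # eigen vector of A for lam = 1 (Example 2.6.4)
lam = 1.0
k = 5.0                     # any non-zero scalar

# A(kX) = lam(kX): kX is an eigen vector for the same eigen value lam.
assert np.allclose(A @ (k * X), lam * (k * X))
```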

Example 2.6.2.

The eigen values of an orthogonal matrix have unit modulus.
Solution.
Let \(A\) be an orthogonal matrix, so that \(A^{'}A=AA^{'}=I\text{.}\) Now, from the eigen value equation, we have
\begin{equation} AX=\lambda X \tag{2.6.1} \end{equation}
on taking the transpose on both sides of eqn. (2.6.1), we get -
\begin{equation} (AX)' =(\lambda X)'\tag{2.6.2} \end{equation}
on multiplying eqns. (2.6.1) and (2.6.2), we get -
\begin{align*} (AX)' (AX) \amp =(\lambda X)' (\lambda X)\\ \text{or,} \quad (X' A')(AX) \amp =\lambda^{2} X' X \\ \text{or,} \quad X'(A' A)X \amp =\lambda^{2} X' X \\ \text{or,} \quad X' X \amp =\lambda^{2} X' X \\ \text{or,} \quad (1-\lambda^{2})X' X \amp = 0 \end{align*}
Since \(X\) is an eigen vector, \(X \neq 0\) and consequently \(X' X \neq 0\text{.}\) Hence, we get -
\begin{equation*} (1-\lambda^{2}) = 0 \end{equation*}
or,
\begin{equation*} \lambda = \pm 1. \hspace{5pt} \text{Proved.} \end{equation*}
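As a numerical illustration (an addition, assuming NumPy is available), the reflection matrix below is orthogonal and its eigen values indeed have unit modulus:

```python
import numpy as np

Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])               # a reflection: Q' Q = I

assert np.allclose(Q.T @ Q, np.eye(2))   # Q is orthogonal
eigvals = np.linalg.eigvals(Q)           # eigen values are +1 and -1
assert np.allclose(np.abs(eigvals), 1.0)
```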

Example 2.6.3.

The eigen values of a unitary matrix have unit modulus.
Solution.
Let \(A\) be a unitary matrix. Hence
\begin{equation} A^{\dagger}A = AA^{\dagger} =I \tag{2.6.3} \end{equation}
If \(\lambda\) is an eigen value of the matrix \(A\) and \(X\) is the corresponding eigen vector, then from the eigen value equation,
\begin{equation} AX=\lambda X \tag{2.6.4} \end{equation}
on taking the transpose conjugate of eqn. (2.6.4), we get -
\begin{equation*} (AX)^{\dagger}=(\lambda X)^{\dagger} \end{equation*}
or,
\begin{equation} X^{\dagger}A^{\dagger}= \bar{\lambda}X^{\dagger} \tag{2.6.5} \end{equation}
on multiplying eqns. (2.6.4) and (2.6.5), we get -
\begin{equation*} (X^{\dagger}A^{\dagger})(AX)= (\bar{\lambda}X^{\dagger})(\lambda X) \end{equation*}
or,
\begin{equation*} X^{\dagger}(A^{\dagger}A)X= (\lambda\bar{\lambda})(X^{\dagger} X) \end{equation*}
or,
\begin{equation*} X^{\dagger}X = (\lambda\bar{\lambda})(X^{\dagger} X) \end{equation*}
\begin{equation} (1-\lambda\bar{\lambda})X^{\dagger} X =0 \tag{2.6.6} \end{equation}
Since \(X\) is an eigen vector, it is a non-zero vector. Hence, \(X^{\dagger} X \neq 0\text{.}\) Therefore, from eqn. (2.6.6), we get -
\begin{equation*} 1-\lambda\bar{\lambda} = 0 \end{equation*}
\begin{equation*} \Rightarrow \lambda\bar{\lambda} = 1 \end{equation*}
or,
\begin{equation*} |\lambda|^{2} = 1 \end{equation*}
\begin{equation*} \therefore \vert\lambda \vert = 1 \hspace{3pt} \text{Proved.} \end{equation*}
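A quick numerical check (an addition, assuming NumPy is available) with a simple unitary matrix, whose eigen values are \(\pm i\text{:}\)

```python
import numpy as np

U = np.array([[0.0, 1.0j],
              [1.0j, 0.0]])                    # U^dagger U = I, so U is unitary

assert np.allclose(U.conj().T @ U, np.eye(2))
eigvals = np.linalg.eigvals(U)                 # eigen values are +i and -i
assert np.allclose(np.abs(eigvals), 1.0)       # each has unit modulus
```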

Example 2.6.4.

Find the eigen values and eigen vector for the matrix,
\begin{equation*} A=\begin{pmatrix} 3 & 1\\2 & 2 \end{pmatrix} \end{equation*}
Solution.
To find the eigen value, the secular equation is given by
\begin{equation*} \vert A-\lambda I\vert=0 \end{equation*}
That is,
\begin{equation*} \begin{vmatrix} \begin{pmatrix} 3 & 1\\2 & 2 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0\\0 & 1 \end{pmatrix} \end{vmatrix} = 0 \end{equation*}
or,
\begin{equation*} \begin{vmatrix} 3-\lambda & 1\\ 2 & 2-\lambda \end{vmatrix} =0 \end{equation*}
\begin{align*} \text{or,} \quad (3-\lambda)(2-\lambda)-2 \amp = 0\\ \text{or,} \quad 6-2\lambda -3 \lambda + \lambda^{2} - 2 \amp =0\\ \text{or,} \quad \lambda^{2} -5\lambda +4 \amp =0\\ \text{or,} \quad (\lambda -1)(\lambda -4) \amp =0 \\ \Rightarrow \lambda \amp = 1, 4 \end{align*}
Hence the eigen values are \(\lambda_{1} = 1\text{,}\) and \(\lambda_{2} = 4\text{.}\)
The eigen vector associated with \(\lambda_{1}\) is given by
\begin{equation*} (A-\lambda I)X =0\text{.} \end{equation*}
i.e.,
\begin{equation*} \begin{bmatrix} 3-1 & 1\\2 & 2-1 \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} 0\\0 \end{bmatrix} \quad \text{for}\quad \lambda = 1 \end{equation*}
\begin{equation*} \text{or,} \quad \begin{bmatrix} 2 & 1\\2 & 1 \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} 0\\0 \end{bmatrix} \end{equation*}
This gives
\begin{equation*} 2x_{1} + x_{2} = 0 \Rightarrow x_{1} = -x_{2}/2 \end{equation*}
The second row gives the same equation again, so there is only one independent relation,
or,
\begin{equation*} \frac{x_{1}}{1} = -\frac{x_{2}}{2} = k_{1} \quad \text{(say)} \end{equation*}
Hence the required eigen vector corresponding to the eigen value \(\lambda_{1} = 1\text{,}\) is
\begin{equation*} X= \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} k_{1}\\-2k_{1} \end{bmatrix} = \begin{bmatrix} 1\\-2 \end{bmatrix} \end{equation*}
by choosing \(k_{1}=1\text{.}\)
Similarly, eigen vector for eigen value \(\lambda_{2} =4\) can also be found.
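The result can be double-checked numerically; the sketch below is an addition (assuming NumPy is available):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 2.0]])

eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort(eigvals.real), [1.0, 4.0])

X1 = np.array([1.0, -2.0])          # eigen vector found for lambda_1 = 1
assert np.allclose(A @ X1, 1.0 * X1)
```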

Example 2.6.5.

Find the eigen values and associated eigen vector for the matrix
\begin{equation*} A = \begin{bmatrix} 2 & -1 & 1\\-1 & 2 & -1\\1 & -1 & 2 \end{bmatrix} \end{equation*}
Solution.
We know that the secular equation is \(|A-\lambda I|=0,\) i.e.,
\begin{equation*} \begin{vmatrix} 2-\lambda & -1 & 1 \\ -1 & 2-\lambda & -1 \\ 1 & -1 & 2-\lambda \end{vmatrix} =0 \end{equation*}
or,
\begin{equation*} (2-\lambda)[(2-\lambda)(2-\lambda)-1]-1[-1+1(2-\lambda)] +1[1-(2-\lambda)]=0 \end{equation*}
or,
\begin{equation*} (2-\lambda)[4-4\lambda + \lambda^{2}-1]-1[-1+2-\lambda] +1[1-2+\lambda]=0 \end{equation*}
or,
\begin{equation*} (2-\lambda)[ \lambda^{2}-4\lambda +3]-(1-\lambda) +(\lambda -1)=0 \end{equation*}
or,
\begin{equation*} 2 \lambda^{2}- \lambda^{3}-8\lambda +4 \lambda^{2}+6 - 3 \lambda-1+\lambda +\lambda -1=0 \end{equation*}
or,
\begin{equation*} 6 \lambda^{2}- \lambda^{3}-9\lambda +4=0 \end{equation*}
or,
\begin{equation*} \lambda^{3}- 6 \lambda^{2}+ 9\lambda - 4=0 \end{equation*}
By inspection, \(\lambda = 1\) satisfies the above equation. Hence we can write the above equation as
\begin{equation*} \lambda^{3}- \lambda^{2} -5\lambda^{2}+ 5\lambda +4\lambda - 4=0 \end{equation*}
or,
\begin{equation*} \lambda^{2}(\lambda-1)-5\lambda (\lambda-1) + 4(\lambda-1)=0 \end{equation*}
or,
\begin{equation*} (\lambda-1)(\lambda^{2}-5\lambda + 4)=0\quad \Rightarrow \lambda = 1,1,4 \end{equation*}
Hence the eigen values are \(\lambda_{1} = 1\) (repeated twice) and \(\lambda_{2} = 4\text{.}\) The eigen vector associated with \(\lambda_{1} = 1\) is given by \((A-\lambda I)X = 0\text{,}\) i.e.,
\begin{equation*} \begin{bmatrix} 1 & -1 & 1 \\ -1 & 1 & -1 \\ 1 & -1 & 1 \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2}\\x_{3} \end{bmatrix} = \begin{bmatrix} 0\\0\\0 \end{bmatrix} \end{equation*}
this gives
\begin{equation*} x_{1} - x_{2}+x_{3}=0 \end{equation*}
or,
\begin{equation*} -x_{1} + x_{2} - x_{3}=0 \end{equation*}
and
\begin{equation*} x_{1} - x_{2}+x_{3}=0 \end{equation*}
All three equations reduce to the single relation
\begin{equation*} x_{1} - x_{2}+x_{3}=0\text{.} \end{equation*}
One equation in three unknowns leaves two parameters free, so the repeated eigen value \(\lambda_{1}=1\) has two linearly independent eigen vectors, for example
\begin{equation*} X_{1} = \begin{bmatrix} 1\\1\\0 \end{bmatrix} \quad \text{and} \quad X_{1}' = \begin{bmatrix} 0\\1\\1 \end{bmatrix}. \end{equation*}
The eigen vector corresponding to the root, \(\lambda_{2}=4\) is given by
\begin{equation*} \begin{bmatrix} 2-4 & -1 & 1 \\ -1 & 2-4 & -1 \\ 1 & -1 & 2-4 \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2}\\x_{3} \end{bmatrix}=\begin{bmatrix} 0\\0\\0 \end{bmatrix}. \end{equation*}
or,
\begin{equation*} \begin{bmatrix} -2 & -1 & 1 \\ -1 & -2 & -1 \\ 1 & -1 & -2 \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2}\\x_{3} \end{bmatrix}=\begin{bmatrix} 0\\0\\0 \end{bmatrix} \end{equation*}
This gives
\begin{align*} -2x_{1} - x_{2}+x_{3}\amp =0 \\ \text{or,} \quad -x_{1} -2 x_{2} - x_{3} \amp =0\\ \text{and} \quad x_{1} - x_{2}-2x_{3} \amp =0 \end{align*}
On solving these equations, we get -
\begin{equation*} \frac{x_{1}}{4-1} = \frac{x_{2}}{-1-2} =\frac{x_{3}}{+1+2} = k_{2} \quad \text{(say)} \end{equation*}
or,
\begin{equation*} \frac{x_{1}}{3} = \frac{x_{2}}{-3} =\frac{x_{3}}{3} = k_{2} \end{equation*}
Hence an eigen vector corresponding to the eigen value, \(\lambda_{2}=4\) will be
\begin{equation*} X_{2} = \begin{bmatrix} x_{1}\\x_{2}\\x_{3} \end{bmatrix}=3k_{2}\begin{bmatrix} 1\\-1\\1 \end{bmatrix}. \end{equation*}
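The eigen values and the eigen vector for \(\lambda_{2}=4\) can be verified numerically; a short sketch (an addition, assuming NumPy is available):

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  1.0],
              [-1.0,  2.0, -1.0],
              [ 1.0, -1.0,  2.0]])

eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort(eigvals.real), [1.0, 1.0, 4.0])

X2 = np.array([1.0, -1.0, 1.0])      # eigen vector for lambda_2 = 4
assert np.allclose(A @ X2, 4.0 * X2)
```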

Example 2.6.6.

Find the eigen values and eigen vectors of the given Hermitian matrix
\begin{equation*} A = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix}. \end{equation*}
Solution.
The characteristic equation is \(|A-\lambda I| = 0\text{,}\) i.e.,
\begin{equation*} \begin{vmatrix} -\lambda & 1 & 1 \\1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{vmatrix} = 0 \end{equation*}
\begin{align*} \text{or,} \quad -\lambda(\lambda^{2}-1) +1(1+\lambda)+1(1+\lambda)\amp =0 \\ \text{or,} \quad -\lambda(\lambda+1)(\lambda -1)+2(\lambda+1)\amp =0 \\ \text{or,} \quad (\lambda+1)[-\lambda(\lambda -1)+2] \amp =0 \\ \text{or,} \quad (\lambda+1)[-\lambda^{2}+\lambda +2] \amp =0 \end{align*}
\begin{equation*} \text{or,} \quad \lambda =-1,-1,2 \end{equation*}
The eigen vector corresponding to eigen value \(\lambda_{1}=-1\) is given by \((A-\lambda I)X = 0\text{.}\)
\begin{equation*} \begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} \begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \end{equation*}
Each row gives the same equation,
\begin{equation*} x_{1} + x_{2}+ x_{3} =0 \end{equation*}
so the repeated eigen value \(\lambda_{1}=-1\) has two linearly independent eigen vectors, for example
\begin{equation*} X_{1} = \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} \quad \text{and} \quad X_{1}' = \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix}. \end{equation*}
The eigen vector corresponding to eigen value \(\lambda_{2}=2\) is given by
\begin{equation*} \begin{bmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{bmatrix} \begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \end{equation*}
\begin{align*} -2x_{1} + x_{2}+ x_{3}\amp =0\\ \text{or,} \quad x_{1} -2x_{2} + x_{3} \amp =0 \\ \text{and} \quad x_{1} + x_{2}-2x_{3}\amp =0 \end{align*}
on solving these we get -
\begin{equation*} \frac{x_{1}}{4-1} = \frac{x_{2}}{1+2} = \frac{x_{3}}{1+2} = k_{2} \quad \text{(say)} \end{equation*}
or,
\begin{equation*} \frac{x_{1}}{3} = \frac{x_{2}}{3} = \frac{x_{3}}{3} = k_{2} \end{equation*}
or,
\begin{equation*} X_{2}=\begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix} = \begin{bmatrix} 3k_{2} \\ 3k_{2} \\ 3k_{2} \end{bmatrix} = 3k_{2} \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \end{equation*}
where \(k_{2}\) may have any non-zero value. For \(3k_{2}=1\text{,}\) we have \(X_{2}=(1,1,1)^{t}\text{.}\) The squared norm of this vector is
\begin{equation*} X_{2}^{\dagger} X_{2} = \begin{bmatrix} 1 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1\\1\\1 \end{bmatrix} = 1+1+1 = 3 \end{equation*}
\(\therefore\) Normalized eigen vector corresponding to \(\lambda_{2}=2\) is
\begin{equation*} \hat{X}= \frac{X}{\parallel X \parallel} = \frac{1}{\sqrt{3}} \begin{bmatrix} 1\\1\\1 \end{bmatrix}. \end{equation*}
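The eigen values and the normalized eigen vector can be confirmed numerically; the following is an addition (assuming NumPy is available):

```python
import numpy as np

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort(eigvals.real), [-1.0, -1.0, 2.0])

X_hat = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)  # normalized, for lambda = 2
assert np.allclose(A @ X_hat, 2.0 * X_hat)
assert np.isclose(np.linalg.norm(X_hat), 1.0)
```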

Example 2.6.7.

Find the eigen values and eigen vectors of
\begin{equation*} A=\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \end{equation*}
Solution.
The characteristic equation is \(|A-\lambda I| = 0\) or,
\begin{equation*} \begin{vmatrix} \cos\theta - \lambda & -\sin\theta \\ \sin\theta & \cos\theta -\lambda \end{vmatrix}=0 \end{equation*}
\begin{align*} \text{or,} \quad (\cos\theta - \lambda)^{2}+\sin^{2}\theta \amp =0 \\ \text{or,} \quad \cos^{2}\theta - 2\lambda\cos\theta+\lambda^{2}+\sin^{2}\theta \amp =0\\ \text{or,} \quad 1-2\lambda \cos\theta+\lambda^{2} \amp =0\\ \text{or,} \quad \lambda^{2}-2\lambda \cos\theta +1 \amp =0 \end{align*}
\begin{equation*} \text{or,} \quad \lambda = \frac{2\cos\theta \pm \sqrt{4\cos^{2}\theta -4}}{2} \end{equation*}
\begin{equation*} = \cos\theta \pm i\sin\theta = e^{\pm i\theta} \end{equation*}
\begin{equation*} \therefore \quad \lambda_{1}=e^{i\theta}\quad \text{and}\quad \lambda_{2}=e^{-i\theta} \end{equation*}
The eigen vector corresponding to eigen value \(\lambda_{1}=e^{i\theta}\) is given by \((A-\lambda I)X=0\text{,}\) i.e.,
\begin{equation*} \begin{bmatrix} \cos\theta - e^{i\theta} & -\sin\theta \\ \sin\theta & \cos\theta -e^{i\theta} \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} 0\\0 \end{bmatrix} \end{equation*}
or,
\begin{equation*} \begin{bmatrix} \cos\theta - \cos\theta - i\sin\theta & -\sin\theta \\ \sin\theta & \cos\theta - \cos\theta - i\sin\theta \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} 0\\0 \end{bmatrix} \end{equation*}
or,
\begin{equation*} \begin{bmatrix} - i\sin\theta & -\sin\theta \\ \sin\theta & - i\sin\theta \end{bmatrix} \begin{bmatrix} x_{1}\\x_{2} \end{bmatrix} = \begin{bmatrix} 0\\0 \end{bmatrix} \end{equation*}
\begin{equation*} \text{or,} \hspace{3pt} -ix_{1}\sin\theta - x_{2}\sin\theta =0 \quad \Rightarrow x_{1}=ix_{2} \end{equation*}
\begin{equation*} \text{and} \hspace{6pt} x_{1}\sin\theta -ix_{2}\sin\theta =0 \quad \Rightarrow x_{1}=ix_{2}. \end{equation*}
Putting \(x_{2}=c_{1}\) gives \(x_{1}=ic_{1}\text{.}\) Hence,
\begin{equation} X_{1}=c_{1}\begin{bmatrix} i\\1 \end{bmatrix}, \tag{2.6.7} \end{equation}
similarly,
\begin{equation} X_{2}=c_{1}\begin{bmatrix} -i\\1 \end{bmatrix} \quad \text{for}\quad \lambda_{2}=e^{-i\theta}\tag{2.6.8} \end{equation}
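A numerical check (an addition, assuming NumPy is available) of the eigen values \(e^{\pm i\theta}\) and the eigen vector \((i,1)^{t}\) for an arbitrary test angle:

```python
import numpy as np

theta = 0.7                          # an arbitrary test angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.sort_complex(np.linalg.eigvals(A))
assert np.allclose(eigvals, [np.exp(-1j * theta), np.exp(1j * theta)])

X1 = np.array([1j, 1.0])             # eigen vector for lambda_1 = e^{i theta}
assert np.allclose(A @ X1, np.exp(1j * theta) * X1)
```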

Example 2.6.8.

Prove that
\begin{equation*} \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}^{n}=\begin{bmatrix} \cos n\theta & -\sin n\theta \\ \sin n\theta & \cos n\theta \end{bmatrix} \end{equation*}
Solution.
Let,
\begin{equation*} A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \end{equation*}
The characteristic equation is \(|A-\lambda I|=0\text{,}\) which gives the eigen values \(\lambda_{1} = e^{i\theta},\) \(\lambda_{2} = e^{-i\theta}\text{.}\) The corresponding eigen vector associated with \(\lambda_{1} = e^{i\theta}\) and \(\lambda_{2} = e^{-i\theta}\) are given by
\begin{equation*} X_{1}= \begin{bmatrix} i\\1 \end{bmatrix}, \end{equation*}
and
\begin{equation*} X_{2}= \begin{bmatrix} 1\\i \end{bmatrix} \end{equation*}
by setting \(c_{1}=1\) in eqn. (2.6.7) and taking \(i\) times the vector of eqn. (2.6.8) (any non-zero multiple of an eigen vector is again an eigen vector). Now a matrix that diagonalizes \(A\) is
\begin{equation*} P = \begin{bmatrix} X_{1} & X_{2} \end{bmatrix} = \begin{bmatrix} i & 1 \\ 1 & i \end{bmatrix} \end{equation*}
and
\begin{equation*} P^{-1} =\frac{\text{adj}\, P}{|P|} =-\frac{1}{2}\begin{bmatrix} i & -1 \\ -1 & i \end{bmatrix} \end{equation*}
Hence a diagonal matrix is \(D =P^{-1}AP\)
\begin{equation*} \text{or,} \quad D = -\frac{1}{2} \begin{bmatrix} i & -1 \\ -1 & i \end{bmatrix} \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} i & 1 \\ 1 & i \end{bmatrix} \end{equation*}
\begin{equation*} =\begin{bmatrix} e^{i\theta} & 0 \\ 0 & e^{-i\theta} \end{bmatrix} \end{equation*}
\begin{equation*} \therefore D^{n}= \begin{bmatrix} e^{in\theta} & 0 \\ 0 & e^{-in\theta} \end{bmatrix} \end{equation*}
But,
\begin{equation*} D^{n} = (P^{-1}AP)^{n} = (P^{-1}AP)(P^{-1}AP) \cdots (P^{-1}AP) \end{equation*}
\begin{equation*} = P^{-1}A^{n}P \Rightarrow A^{n}=PD^{n}P^{-1} \end{equation*}
\begin{equation*} \therefore A^{n} = -\frac{1}{2}\begin{bmatrix} i & 1 \\ 1 & i \end{bmatrix} \begin{bmatrix} e^{in\theta} & 0 \\ 0 & e^{-in\theta} \end{bmatrix} \begin{bmatrix} i & -1 \\ -1 & i \end{bmatrix} \end{equation*}
or,
\begin{equation*} \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}^{n} = \begin{bmatrix} \cos n\theta & -\sin n \theta \\ \sin n\theta & \cos n\theta \end{bmatrix} \end{equation*}
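The diagonalization argument can be verified numerically. The sketch below (an addition, assuming NumPy is available) compares the direct matrix power with the route \(A^{n} = P D^{n} P^{-1}\) for arbitrary \(\theta\) and \(n\text{:}\)

```python
import numpy as np

theta, n = 0.3, 5
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

expected = np.array([[np.cos(n * theta), -np.sin(n * theta)],
                     [np.sin(n * theta),  np.cos(n * theta)]])
assert np.allclose(np.linalg.matrix_power(A, n), expected)

# P has the eigen vectors (i,1)^t and (1,i)^t as columns; D holds e^{+-i theta}
P = np.array([[1j, 1.0],
              [1.0, 1j]])
D = np.diag([np.exp(1j * theta), np.exp(-1j * theta)])
An_diag = P @ np.linalg.matrix_power(D, n) @ np.linalg.inv(P)
assert np.allclose(An_diag, expected)
```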

Example 2.6.9.

Examine whether the transformation is orthogonal and find its inverse.
\begin{align*} y_{1}\amp = \frac{1}{\sqrt{2}}x_{1} + \frac{1}{\sqrt{2}}x_{3}, \\ y_{2} \amp = x_{2}, \\ \text{and} \quad y_{3} \amp = -\frac{1}{\sqrt{2}}x_{1} + \frac{1}{\sqrt{2}}x_{3} \end{align*}
Solution.
This set of equations is equivalent to the single matrix equation
\begin{equation*} Y=AX \end{equation*}
i.e.,
\begin{equation*} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ -\frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{bmatrix} \begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix} \end{equation*}
For orthogonal matrix \(A^{t}A = I\text{,}\) or,
\begin{equation*} \begin{bmatrix} \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{bmatrix} \begin{bmatrix} \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ -\frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = I \end{equation*}
Hence the transformation is orthogonal.
Now, \(X= A^{-1}Y\text{,}\) where \(A^{-1} = \frac{\text{adj}\, A}{|A|} = A^{t}\) since \(A\) is orthogonal. Hence,
\begin{equation*} \begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{bmatrix} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \end{bmatrix} \end{equation*}
Thus the inverse transformation equation is
\begin{equation*} x_{1} = \frac{1}{\sqrt{2}}y_{1} - \frac{1}{\sqrt{2}}y_{3}, \end{equation*}
\begin{equation*} x_{2} = y_{2}, \end{equation*}
and
\begin{equation*} x_{3} = \frac{1}{\sqrt{2}}y_{1} + \frac{1}{\sqrt{2}}y_{3} \end{equation*}
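A numerical check (an addition, assuming NumPy is available) that the transformation matrix is orthogonal and that its inverse is its transpose:

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)
A = np.array([[ s,   0.0, s  ],
              [ 0.0, 1.0, 0.0],
              [-s,   0.0, s  ]])

assert np.allclose(A.T @ A, np.eye(3))       # A^t A = I: A is orthogonal
assert np.allclose(np.linalg.inv(A), A.T)    # so A^{-1} = A^t
```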

Example 2.6.10.

If the matrices \(A\) and \(B\) are Hermitian and the matrices \(C\) and \(D\) are unitary, then show that
  1. \(C^{-1}AC\) is Hermitian,
  2. \(C^{-1}DC\) is unitary,
  3. \(i(AB-BA)\) is Hermitian.
Solution.
We know that, if \(C^{\dagger}=C^{-1}\text{,}\) then \(C\) is unitary, and if \(A^{\dagger} = A\text{,}\) then \(A\) is Hermitian.
  1. Since \(C\) is unitary, let \(M = C^{-1}AC = C^{\dagger}AC\text{.}\) For \(M\) to be Hermitian, we must have \(M^{\dagger} = M\text{.}\) Now,
    \begin{equation*} M^{\dagger} = \left[C^{\dagger}AC\right]^{\dagger} = C^{\dagger}A^{\dagger}\left(C^{\dagger}\right)^{\dagger} = C^{\dagger}A^{\dagger}C = C^{\dagger}AC = C^{-1}AC = M, \end{equation*}
    since \(A^{\dagger} = A\text{.}\)
    \(\therefore C^{-1}AC\) is a Hermitian matrix.
  2. Let
    \begin{equation*} M= C^{-1}DC = C^{\dagger}DC \end{equation*}
    taking conjugate, we have
    \begin{equation*} M^{*}= \left[C^{\dagger}DC\right]^{*} = C^{\dagger *}D^{*}C^{*} = C^{t}D^{*}C^{*} \end{equation*}
    taking transpose, we get -
    \begin{equation*} \left(M^{*}\right)^{t} = M^{\dagger}= \left[C^{t}D^{*}C^{*}\right]^{t} = C^{*t}D^{*t}\left(C^{t}\right)^{t} = C^{\dagger}D^{\dagger}C \end{equation*}
    Now,
    \begin{equation*} M^{\dagger} M = (C^{\dagger}D^{\dagger}C)(C^{\dagger}DC) = C^{\dagger}D^{\dagger}C C^{\dagger}DC = C^{\dagger}D^{\dagger}DC = C^{\dagger}C = I \end{equation*}
    \(\therefore C^{-1}DC\) is unitary.
  3. Let \(M = i(AB-BA)\text{.}\) Taking the conjugate transpose, \(M^{\dagger} = -i\left[(AB)^{\dagger}-(BA)^{\dagger}\right] = -i\left[B^{\dagger}A^{\dagger}-A^{\dagger}B^{\dagger}\right] = -i(BA-AB) = i(AB-BA) = M\text{.}\) Hence \(i(AB-BA)\) is Hermitian.
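Part 3 can be illustrated numerically; the sketch below is an addition (assuming NumPy is available) and builds random Hermitian matrices via the symmetrization \((Z+Z^{\dagger})/2\text{:}\)

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    # (Z + Z^dagger)/2 is Hermitian for any square complex Z
    Z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (Z + Z.conj().T) / 2

A = random_hermitian(3)
B = random_hermitian(3)
M = 1j * (A @ B - B @ A)

# M^dagger = M: i(AB - BA) is Hermitian
assert np.allclose(M.conj().T, M)
```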