
Section 2.4 Examples

Example 2.4.1.

If \(A\) and \(B\) are Hermitian, show that \(AB+BA\) is Hermitian and \(AB-BA\) is skew Hermitian.
Solution.
\(\because\) \(A\) and \(B\) are Hermitian, we have
\begin{equation*} A^{\dagger}=A \quad \text{and} \quad B^{\dagger}=B. \end{equation*}
Hence,
\begin{equation*} \left(AB+BA\right)^{\dagger} = (AB)^{\dagger} + (BA)^{\dagger} \end{equation*}
\begin{equation*} = B^{\dagger}A^{\dagger}+A^{\dagger}B^{\dagger} =BA+AB=AB+BA, \end{equation*}
i.e., \(AB+BA\) is Hermitian.
Again,
\begin{equation*} \left(AB-BA\right)^{\dagger} = (AB)^{\dagger} - (BA)^{\dagger} \end{equation*}
\begin{equation*} = B^{\dagger}A^{\dagger}-A^{\dagger}B^{\dagger} =BA-AB =-(AB-BA), \end{equation*}
i.e., \(AB-BA\) is skew-Hermitian.
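As a quick numerical cross-check (a sketch only; the use of NumPy and random Hermitian matrices is our own choice, not part of the text):

import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    # Build a Hermitian matrix as M + M^dagger.
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return M + M.conj().T

A = random_hermitian(3)
B = random_hermitian(3)

S = A @ B + B @ A   # expected Hermitian
K = A @ B - B @ A   # expected skew-Hermitian
print(np.allclose(S, S.conj().T))    # True: S^dagger = S
print(np.allclose(K, -K.conj().T))   # True: K^dagger = -K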

Example 2.4.2.

Show that the inverse of a matrix is unique.
Solution.
Let \(A\) be a square matrix and suppose that the matrices \(B\) and \(C\) are both inverses of \(A\text{,}\) so that \(AB=BA=I\) and also \(AC =CA=I\text{.}\) Now,
\begin{equation*} CAB=C(AB) = CI =C \quad \text{also}, \quad CAB = (CA)B=IB = B, \end{equation*}
That is,
\begin{equation*} CAB=B=C. \end{equation*}
Hence \(B=C\text{,}\) which implies that the inverse of a matrix is unique, i.e., there exists only one inverse matrix for a given matrix.

Example 2.4.3.

If \(A\) is a non-singular matrix, then the transpose (conjugate transpose) of its inverse is the inverse of its transpose (conjugate transpose),
i.e., \((A^{-1})^{t} = (A^{t})^{-1}\) and \((A^{-1})^{\dagger}=(A^{\dagger})^{-1}\text{.}\)
Solution.
We know that -
\begin{equation*} AA^{-1}=A^{-1}A=I \end{equation*}
\begin{equation*} \text{or,}\quad (AA^{-1})^{t}=(A^{-1}A)^{t}=(I)^{t}=I \end{equation*}
\begin{equation*} \text{or,}\quad (A^{-1})^{t}A^{t} = A^{t}(A^{-1})^{t} = I \end{equation*}
from which it follows that \((A^{-1})^{t}\) is the inverse of \(A^{t}\text{,}\)
i.e., \((A^{t})^{-1}=(A^{-1})^{t}\text{.}\) Proved. The result for the conjugate transpose can be proved similarly.
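A minimal numerical check of both identities (our own sketch with NumPy; a random complex matrix is generically non-singular):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# (A^{-1})^t == (A^t)^{-1}
print(np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T)))                # True
# (A^{-1})^dagger == (A^dagger)^{-1}
print(np.allclose(np.linalg.inv(A).conj().T, np.linalg.inv(A.conj().T)))  # True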

Example 2.4.4.

If \(A\) and \(B\) are any two square matrices of the same order, then show that
\begin{equation*} Tr(A+B)=TrA+TrB. \end{equation*}
Solution.
Let \(A\) and \(B\) be two matrices of the same order \(n\text{.}\) Then, \(Tr(A+B)\) = sum of the diagonal elements of the matrix \((A+B)\)
\begin{equation*} = \sum\limits_{i=1}^{n}\left(a_{ii}+b_{ii}\right) =\sum\limits_{i=1}^{n}a_{ii}+\sum\limits_{i=1}^{n}b_{ii}=Tr(A)+Tr(B). \end{equation*}
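The linearity of the trace is easy to confirm numerically (a sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Tr(A+B) == Tr(A) + Tr(B)
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True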

Example 2.4.5.

Show that a real matrix is unitary if and only if it is orthogonal.
Solution.
If \(A\) is a real matrix, then \(A^{\dagger}=A^{t}\text{.}\) For \(A\) to be unitary, we have -
\begin{equation*} A^{\dagger}A=I \end{equation*}
\begin{equation*} \text{or,}\quad A^{t}A=I. \end{equation*}
\(\therefore\) \(A\) is orthogonal. [Or, \(A^{t}=A^{-1}\text{,}\) i.e., for a matrix to be orthogonal, its transpose must coincide with its inverse.]
Conversely, if \(A\) is orthogonal, then
\begin{equation*} A^{t}A=I \end{equation*}
\begin{equation*} \text{or},\quad A^{\dagger}A =I \quad [\because A \text{ is real}, \ A^{\dagger}=A^{t}] \end{equation*}
\(\therefore\) \(A\) is unitary.
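For a concrete illustration (our own sketch; the rotation matrix is a standard example of a real orthogonal matrix):

import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a real rotation matrix

# Orthogonality (A^t A = I) and unitarity (A^dagger A = I) coincide for real A,
# since conjugation does nothing to real entries.
print(np.allclose(A.T @ A, np.eye(2)))          # True
print(np.allclose(A.conj().T @ A, np.eye(2)))   # True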

Example 2.4.6.

Show that under a unitary transformation, an orthonormal system of basis vectors is transformed into another orthonormal system.
Solution.
Let the transformed basis vectors be \(\psi^{'}_{i}=\sum\limits_{k}\gamma_{ki}\psi_{k}\text{,}\) where \(\gamma\) is the transformation matrix. For the new system to be orthonormal, we require
\begin{equation*} \delta_{ij}=\psi^{'*}_{i}\cdot \psi^{'}_{j} = \sum\limits_{k} \gamma^{*}_{ki}\psi^{*}_{k}\cdot \sum\limits_{l} \gamma_{lj}\psi_{l} = \sum\limits_{kl}\gamma_{ki}^{*}\gamma_{lj}\,\psi^{*}_{k}\cdot\psi_{l} \end{equation*}
\begin{equation*} = \sum\limits_{kl}\gamma_{ki}^{*}\gamma_{lj}\delta_{kl} = \sum\limits_{k}\gamma_{ki}^{*}\gamma_{kj} = \sum\limits_{k}(\gamma^{\dagger})_{ik} \gamma_{kj}, \end{equation*}
that is,
\begin{equation*} \delta_{ij}=\left(\gamma^{\dagger}\gamma\right)_{ij}. \end{equation*}
Thus \(\gamma^{\dagger}\gamma = I\text{,}\) i.e., \(\gamma\) must be a unitary matrix.
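A numerical illustration (our own sketch; a unitary \(\gamma\) is obtained here from a QR decomposition):

import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
gamma, _ = np.linalg.qr(M)     # Q-factor of a complex matrix is unitary

psi = np.eye(3)                # columns: an orthonormal basis
psi_new = psi @ gamma          # psi'_i = sum_k gamma_{ki} psi_k

gram = psi_new.conj().T @ psi_new    # inner products psi'^*_i . psi'_j
print(np.allclose(gram, np.eye(3)))  # True: the new basis is orthonormal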

Example 2.4.7.

Show that a Hermitian matrix remains Hermitian under unitary transformation.
Solution.
A unitary transformation is effected by a unitary matrix \(U\text{,}\) which satisfies
\begin{equation} UU^{\dagger}=U^{\dagger}U =I \tag{2.4.1} \end{equation}
and transforms a matrix \(A\) into \(A^{'} = U^{\dagger}AU\text{.}\) If \(A\) is Hermitian, then \(A^{\dagger}=A\text{.}\) Taking the conjugate transpose of the transformed matrix, we get
\begin{equation*} \left(A^{'}\right)^{\dagger} = \left(U^{\dagger}AU\right)^{\dagger} = U^{\dagger}A^{\dagger}\left(U^{\dagger}\right)^{\dagger} = U^{\dagger}A^{\dagger}U = U^{\dagger}AU = A^{'}. \end{equation*}
This equality shows that a Hermitian matrix remains Hermitian under a unitary transformation.
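Numerically (our own sketch; a unitary \(U\) is again drawn from a QR decomposition):

import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = M + M.conj().T             # a Hermitian matrix
U, _ = np.linalg.qr(M)         # a unitary matrix

H_new = U.conj().T @ H @ U     # the transformed matrix A' = U^dagger A U
print(np.allclose(H_new, H_new.conj().T))  # True: still Hermitian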

Example 2.4.8.

Show that the length of a real vector is preserved under orthogonal transformation.
Solution.
Let \(X\) and \(Y\) be real column vectors with components \(x_{i}\) and \(y_{i}\text{,}\) related as
\begin{equation} Y=AX \tag{2.4.2} \end{equation}
where \(A\) is a transformation matrix. For an orthogonal transformation, we have -
\begin{equation} Y^{'}Y=(AX)^{'}(AX) = X^{'}A^{'}AX =X^{'}X \tag{2.4.3} \end{equation}
Since \(A\) is considered to be orthogonal, i.e., \(A^{'}A =I\text{.}\)
Therefore, from eqn. (2.4.3), we get -
\begin{equation*} \sum\limits_{i=1}^{n}y^{2}_{i} = \sum\limits_{i=1}^{n}x^{2}_{i} \end{equation*}
where,
\begin{equation} y_{i}=\sum\limits_{j=1}^{n}a_{ij}x_{j}; \tag{2.4.4} \end{equation}
\begin{equation*} Y^{'} =\begin{bmatrix} y_{1} & y_{2} & \cdots & y_{n} \end{bmatrix}; \end{equation*}
\begin{equation*} \text{and}\quad Y=\begin{bmatrix} y_{1}\\ y_{2}\\ \vdots\\ y_{n} \end{bmatrix}. \end{equation*}
Note: in case \(X\) and \(Y\) are complex, the orthogonality condition becomes
\begin{equation*} AA^{\dagger} = A^{\dagger}A = I, \end{equation*}
i.e., \(A\) is a unitary matrix. A real unitary matrix is an orthogonal matrix. Eqn. (2.4.3) shows that the norm of a real vector remains invariant under an orthogonal transformation, i.e.,
\begin{equation*} \Vert X \Vert = \Vert AX \Vert \end{equation*}
\begin{equation*} [\because \quad \Vert X \Vert = \sqrt{\sum\limits_{i}x^{2}_{i}} \quad \text{and}\quad \Vert Y \Vert = \sqrt{\sum\limits_{i}y^{2}_{i}}]. \end{equation*}
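A quick numerical check of the norm invariance (our own sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(5)
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # Q-factor of a real matrix is orthogonal

X = rng.standard_normal(3)
Y = A @ X
print(np.isclose(np.linalg.norm(Y), np.linalg.norm(X)))  # True: ||AX|| = ||X||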

Example 2.4.9.

Show that the eigen values of a Hermitian operator are real and that eigen vectors belonging to different eigen values are orthogonal. Or: the eigen values of a Hermitian matrix are all real.
Solution.
Let \(\lambda_{i}\) and \(\lambda_{j}\) be two eigen values and \(X_{i}\) and \(X_{j}\) be two eigen vectors of Hermitian matrix (operator) \(H\text{.}\) Then
\begin{align} HX_{i} \amp = \lambda_{i}X_{i} \tag{2.4.5}\\ HX_{j} \amp = \lambda_{j}X_{j} \tag{2.4.6} \end{align}
premultiplying eqn. (2.4.5) by \(X^{\dagger}_{j}\) and eqn. (2.4.6) by \(X^{\dagger}_{i}\text{,}\) respectively, we get -
\begin{align} X^{\dagger}_{j} HX_{i} \amp = \lambda_{i}X^{\dagger}_{j}X_{i} \tag{2.4.7}\\ \text{and}\quad X^{\dagger}_{i} HX_{j} \amp = \lambda_{j}X^{\dagger}_{i}X_{j} \tag{2.4.8} \end{align}
taking conjugate transpose of eqn. (2.4.8), we get -
\begin{equation*} \left(X^{\dagger}_{i} HX_{j}\right)^{\dagger}=\left(\lambda_{j} X_{i}^{\dagger}X_{j}\right)^{\dagger} \end{equation*}
[for Hermitian matrix \(H^{\dagger} =H\)] or,
\begin{equation} X^{\dagger}_{j} HX_{i} = \bar{\lambda}_{j}X^{\dagger}_{j}X_{i} \tag{2.4.9} \end{equation}
equating eqns. (2.4.7) and (2.4.9), we get -
\begin{align} \lambda_{i}X^{\dagger}_{j}X_{i} \amp = \bar{\lambda}_{j}X^{\dagger}_{j}X_{i} \tag{2.4.10}\\ \left(\lambda_{i} - \bar{\lambda}_{j}\right)X^{\dagger}_{j}X_{i} \amp = 0\tag{2.4.11} \end{align}
Case I: Let \(i=j\text{,}\) then \(\left(\lambda_{i} - \bar{\lambda}_{i}\right)=0\) [\(\because X^{\dagger}_{i}X_{i} \neq 0\text{,}\) as \(X_{i}\) is a non-zero vector].
\begin{equation*} \therefore \quad \lambda_{i} = \bar{\lambda}_{i} \end{equation*}
i.e., \(\lambda_{i}\) is real for each \(i\text{.}\)
Case II: Let \(i\neq j\text{.}\) Since the eigen values are real, \(\bar{\lambda}_{j}=\lambda_{j}\text{,}\) and for distinct eigen values \(\lambda_{i}-\bar{\lambda}_{j}\neq 0\text{.}\)
\begin{equation*} \therefore \quad X^{\dagger}_{j}X_{i} = 0. \end{equation*}
which means eigen vectors for different eigen values of \(H\) are orthogonal.
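Both conclusions can be verified numerically (a sketch; np.linalg.eigh is NumPy's eigensolver for Hermitian matrices):

import numpy as np

rng = np.random.default_rng(6)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = M + M.conj().T                      # a Hermitian matrix

lam, X = np.linalg.eigh(H)              # eigen values and eigen vectors (as columns)
print(np.allclose(lam.imag, 0))         # True: eigen values are real
print(np.allclose(X.conj().T @ X, np.eye(3)))  # True: eigen vectors are orthonormal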

Example 2.4.10.

Show that the trace and determinant of a matrix are invariant under a similarity transformation.
Solution.
Let \(X\) and \(Y\) be two vectors relative to the basis of the unprimed system, represented by \(X'\) and \(Y'\) relative to the basis of a primed system, so that
\begin{equation} X=PX'; \quad Y=PY' \tag{2.4.12} \end{equation}
where \(P\) is transformation matrix. Suppose \(X\) and \(Y\) are themselves related by the transformation matrix \(A\) in the unprimed system, so that
\begin{equation} X=AY\tag{2.4.13} \end{equation}
Now, from eqns. (2.4.12) and (2.4.13), we have
\begin{align} X' \amp = P^{-1}X = P^{-1}AY = P^{-1}APY' \equiv BY' \tag{2.4.14}\\ \text{where} \quad B \amp = P^{-1}AP \tag{2.4.15} \end{align}
This implies that
\begin{equation} A = PBP^{-1}.\tag{2.4.16} \end{equation}
i.e., the matrices of transformation for the same two vectors in the two coordinate systems are related by \(B = P^{-1}AP\) and \(A = PBP^{-1}\text{.}\) Hence \(A\) and \(B\) are said to be related by a similarity transformation. From eqn. (2.4.15), we have
\begin{equation*} Tr(B)=\sum\limits_{i}B_{ii}=\sum\limits_{i}(P^{-1}AP)_{ii} =\sum\limits_{i}\sum\limits_{jk}P^{-1}_{ij}A_{jk}P_{ki} \end{equation*}
\begin{equation*} = \sum\limits_{j}\sum\limits_{k}\left[\sum\limits_{i}P_{ki}P^{-1}_{ij}\right]A_{jk} \end{equation*}
\begin{equation*} \left[\because \quad Tr(AB)=\sum\limits_{i}(AB)_{ii}=\sum\limits_{i}\sum\limits_{j}a_{ij}b_{ji}\right] \end{equation*}
after rearranging the order of the factors,
\begin{equation*} =\sum\limits_{j}\sum\limits_{k}\left[PP^{-1}\right]_{kj}A_{jk} = \sum\limits_{j}\left[A\,PP^{-1}\right]_{jj}= \sum\limits_{j}A_{jj} = Tr(A) \end{equation*}
i.e., the trace is invariant under a similarity transformation. Also,
\begin{equation*} |B| = \det B = |P^{-1}AP| = |P^{-1}||A||P| = |A| \end{equation*}
[\(\because |P^{-1}||P| = 1\)] proved.
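Both invariances are easy to confirm numerically (our own sketch; a random \(P\) is generically invertible):

import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))
B = np.linalg.inv(P) @ A @ P            # similarity transformation B = P^{-1} A P

print(np.isclose(np.trace(B), np.trace(A)))             # True
print(np.isclose(np.linalg.det(B), np.linalg.det(A)))   # True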

Example 2.4.11.

Show that diagonalizing matrix of a Hermitian matrix is unitary.
Solution.
Let \(H\) be a Hermitian matrix and \(R\) be its diagonalizing matrix, then
\begin{equation} R^{-1}HR=diag\left(\lambda_{1}, \lambda_{2}, \lambda_{3}, \cdots, \lambda_{n} \right)\tag{2.4.17} \end{equation}
where \(\lambda_{1}{,} \lambda_{2}{,} \lambda_{3}{,} \cdots{,} \lambda_{n}\) are the eigen values of \(H\) and are all real.
Taking transposed conjugate on both sides of eqn. (2.4.17), we have
\begin{align*} \left(R^{-1}HR\right)^{\dagger} \amp =\left[diag\left(\lambda_{1}, \lambda_{2}, \lambda_{3}, \cdots, \lambda_{n} \right)\right]^{\dagger} = diag\left(\lambda_{1}, \lambda_{2}, \lambda_{3}, \cdots, \lambda_{n} \right)\\ \text{or,}\quad R^{\dagger}H(R^{-1})^{\dagger} \amp = diag\left(\lambda_{1}, \lambda_{2}, \lambda_{3}, \cdots, \lambda_{n} \right) = R^{-1}HR \end{align*}
which gives
\begin{equation*} R^{\dagger} = R^{-1} \end{equation*}
\begin{equation*} \text{or,}\quad RR^{\dagger} = RR^{-1} = I \end{equation*}
\(\therefore\) \(R\) is a unitary matrix.
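Numerically (a sketch; np.linalg.eigh returns the diagonalizing matrix of a Hermitian matrix with orthonormal columns):

import numpy as np

rng = np.random.default_rng(8)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = M + M.conj().T

lam, R = np.linalg.eigh(H)              # columns of R diagonalize H
D = R.conj().T @ H @ R                  # equals R^{-1} H R when R is unitary
print(np.allclose(D, np.diag(lam)))               # True: diagonal, real eigen values
print(np.allclose(R.conj().T @ R, np.eye(3)))     # True: R^dagger = R^{-1}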

Example 2.4.12.

Use the matrix method to find the voltage drop across the resistor \(R_{6}\text{.}\) In the given figure, \(R_{1} = 1\Omega \text{,}\) \(R_{2} = 2\Omega\text{,}\) \(R_{3}=3\Omega\text{,}\) \(R_{4} = 4\Omega\text{,}\) \(R_{5}=5\Omega\text{,}\) \(R_{6}=6\Omega\text{,}\) \(E_{1} =5\) volt, and \(E_{2} =10\) volt.
Solution.
Let \(I_{1}{,}\) \(I_{2}{,}\) \(I_{3}\) be the currents flowing in the loops as shown in Figure 2.4.13. Now using Kirchhoff’s voltage law, \(\sum IR = \sum V\text{,}\) from loop 1, we have
Figure 2.4.13. Circuit diagram for Example 2.4.12.
\begin{align*} 1.I_{1}+2.I_{1}+3.I_{1}-3.I_{2} \amp = 5 \\ \text{or,}\quad 6.I_{1}-3.I_{2} + 0.I_{3} \amp = 5 \end{align*}
From loop 2,
\begin{align*} 5. I_{2}+3.I_{2}+4.I_{2}-3.I_{1}+5.I_{3} \amp = 10 \\ \text{or,}\quad -3.I_{1}+12.I_{2} +5.I_{3} \amp = 10 \end{align*}
From loop 3,
\begin{align*} 6.I_{3}+5.I_{3}+5.I_{2} \amp = 5 \\ \text{or,}\quad 0.I_{1}+5.I_{2} +11.I_{3} \amp = 5 \end{align*}
Hence in the matrix form
\begin{equation*} \begin{bmatrix} 6 & -3 & 0 \\ -3 & 12 & 5\\ 0 & 5 & 11 \end{bmatrix} \begin{bmatrix} I_{1}\\ I_{2}\\I_{3} \end{bmatrix} = \begin{bmatrix} 5\\ 10\\5 \end{bmatrix} \end{equation*}
The determinant of the given matrix is
\begin{equation*} D=\begin{vmatrix} 6 & -3 & 0 \\ -3 & 12 & 5\\ 0 & 5 & 11 \end{vmatrix} \end{equation*}
\begin{equation*} = 6(12\times 11-5\times 5)+3(-3\times 11-0\times 5) +0(-3\times 5-0\times 12) = 543. \end{equation*}
\begin{equation*} \therefore \quad I_{1} = \frac{\begin{vmatrix} 5 & -3 & 0 \\ 10 & 12 & 5\\ 5 & 5 & 11 \end{vmatrix}}{D} = 1.455 A \end{equation*}
\begin{equation*} I_{2} = \frac{\begin{vmatrix} 6 & 5 & 0 \\ -3 & 10 & 5\\ 0 & 5 & 11 \end{vmatrix}}{D} = 1.243 A \end{equation*}
\begin{equation*} I_{3} = \frac{\begin{vmatrix} 6 & -3 & 5 \\ -3 & 12 & 10\\ 0 & 5 & 5 \end{vmatrix}}{D} = -0.1105 A \end{equation*}
The \(-ve\) sign indicates that the current \(I_{3}\) actually flows in the direction opposite to that assumed. Hence, the potential drop across \(R_{6} = 6\times 0.1105 = 0.663\) volt.
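The same loop equations can be solved directly (our own cross-check using NumPy's linear solver):

import numpy as np

# Coefficient matrix and source vector of the three loop equations.
R = np.array([[ 6., -3.,  0.],
              [-3., 12.,  5.],
              [ 0.,  5., 11.]])
E = np.array([5., 10., 5.])

I = np.linalg.solve(R, E)
print(I)              # approx [ 1.455  1.243 -0.110] ampere
print(6 * abs(I[2]))  # approx 0.663 volt, the drop across R6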