Next: Principal Axes of Rotation Up: Rigid Body Rotation Previous: Rotational Kinetic Energy

Matrix Eigenvalue Theory

It is time to review a little matrix theory. Suppose that ${\bf A}$ is a real symmetric matrix of dimension $n$. It follows that ${\bf A}^\ast = {\bf A}$ and ${\bf A}^T = {\bf A}$, where $~^\ast$ denotes a complex conjugate, and $~^T$ denotes a transpose. Consider the matrix equation
\begin{displaymath}
{\bf A} \,{\bf x} = \lambda\,{\bf x}.
\end{displaymath} (472)

Any column vector ${\bf x}$ which satisfies the above equation is called an eigenvector of ${\bf A}$. Likewise, the associated number $\lambda$ is called an eigenvalue of ${\bf A}$. Let us investigate the properties of the eigenvectors and eigenvalues of a real symmetric matrix.

Equation (472) can be rearranged to give

\begin{displaymath}
({\bf A} - \lambda\,{\bf 1})\,{\bf x} = {\bf0},
\end{displaymath} (473)

where ${\bf 1}$ is the unit matrix. The above matrix equation is essentially a set of $n$ homogeneous simultaneous algebraic equations for the $n$ components of ${\bf x}$. A well-known property of such a set of equations is that it only has a non-trivial solution when the determinant of the associated matrix vanishes. Hence, a necessary condition for the above set of equations to have a non-trivial solution is that
\begin{displaymath}
\vert{\bf A} - \lambda\,{\bf 1}\vert = 0.
\end{displaymath} (474)

The above formula is essentially an $n$th-order polynomial equation for $\lambda$. We know that such an equation has $n$ (possibly complex) roots. Hence, we conclude that there are $n$ eigenvalues, and $n$ associated eigenvectors, of the $n$-dimensional matrix ${\bf A}$.
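This is easily checked numerically. The sketch below (using NumPy, with a hypothetical $2\times 2$ real symmetric matrix not taken from the text) computes the eigenvalues and confirms that each one makes the determinant $\vert{\bf A} - \lambda\,{\bf 1}\vert$ vanish:

```python
import numpy as np

# A hypothetical 2x2 real symmetric matrix: A == A.T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Its characteristic polynomial is lambda^2 - 4*lambda + 3 = 0,
# whose roots are the two eigenvalues.
eigenvalues = np.sort(np.linalg.eigvals(A).real)

# Each eigenvalue makes the determinant |A - lambda*1| vanish.
for lam in eigenvalues:
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9
```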

Let us now demonstrate that the $n$ eigenvalues and eigenvectors of the real symmetric matrix ${\bf A}$ are all real. We have

\begin{displaymath}
{\bf A}\,{\bf x}_i = \lambda_i\,{\bf x}_i,
\end{displaymath} (475)

and, taking the transpose and complex conjugate,
\begin{displaymath}
{\bf x}_i^{\ast\,T}\,{\bf A} = \lambda_i^{\,\ast}\,{\bf x}_i^{\ast\,T},
\end{displaymath} (476)

where ${\bf x}_i$ and $\lambda_i$ are the $i$th eigenvector and eigenvalue of ${\bf A}$, respectively. Left multiplying Equation (475) by ${\bf x}_i^{\ast\,T}$, we obtain
\begin{displaymath}
{\bf x}_i^{\ast\,T} {\bf A}\,{\bf x}_i = \lambda_i\,{\bf x}_i^{\ast\,T}{\bf x}_i.
\end{displaymath} (477)

Likewise, right multiplying Equation (476) by ${\bf x}_i$, we get
\begin{displaymath}
{\bf x}_i^{\ast\,T}\,{\bf A}\,{\bf x}_i = \lambda_i^{\,\ast}\,{\bf x}_i^{\ast\,T}{\bf x}_i.
\end{displaymath} (478)

The difference of the previous two equations yields
\begin{displaymath}
(\lambda_i - \lambda_i^{\,\ast})\,{\bf x}_i^{\ast\,T} {\bf x}_i = 0.
\end{displaymath} (479)

It follows that $\lambda_i=\lambda_i^{\,\ast}$, since ${\bf x}_i^{\ast\,T}{\bf x}_i$ (which is ${\bf x}_i^{\,\ast}\cdot{\bf x}_i$ in vector notation) is positive definite for any non-zero ${\bf x}_i$. Hence, $\lambda_i$ is real. It immediately follows that ${\bf x}_i$ can be chosen to be real.
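The reality of the eigenvalues can be illustrated numerically. In the sketch below (using NumPy; the random matrix is a hypothetical example, not from the text), a general eigensolver is applied to a symmetrized real matrix, and the imaginary parts of the computed eigenvalues all vanish:

```python
import numpy as np

# Build a real symmetric matrix by symmetrizing a random real matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2.0  # A == A.T by construction

# A general eigensolver may return complex eigenvalues in general,
# but for a real symmetric matrix the imaginary parts are all zero.
eigenvalues = np.linalg.eigvals(A)
max_imag = float(np.max(np.abs(np.imag(eigenvalues))))
```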

Next, let us show that two eigenvectors corresponding to two different eigenvalues are mutually orthogonal. Let

\begin{displaymath}
{\bf A}\,{\bf x}_i = \lambda_i\,{\bf x}_i,
\end{displaymath} (480)
\begin{displaymath}
{\bf A}\,{\bf x}_j = \lambda_j\,{\bf x}_j,
\end{displaymath} (481)

where $\lambda_i\neq \lambda_j$. Taking the transpose of the first equation and right multiplying by ${\bf x}_j$, and left multiplying the second equation by ${\bf x}_i^T$, we obtain
\begin{displaymath}
{\bf x}_i^T\,{\bf A}\,{\bf x}_j = \lambda_i\,{\bf x}_i^T{\bf x}_j,
\end{displaymath} (482)
\begin{displaymath}
{\bf x}_i^T\,{\bf A}\,{\bf x}_j = \lambda_j\,{\bf x}_i^T{\bf x}_j.
\end{displaymath} (483)

Taking the difference of the above two equations, we get
\begin{displaymath}
(\lambda_i-\lambda_j)\,{\bf x}_i^T{\bf x}_j = 0.
\end{displaymath} (484)

Since, by hypothesis, $\lambda_i\neq \lambda_j$, it follows that ${\bf x}_i^T{\bf x}_j = 0$. In vector notation, this is the same as ${\bf x}_i \cdot{\bf x}_j=0$. Hence, the eigenvectors ${\bf x}_i$ and ${\bf x}_j$ are mutually orthogonal.
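This orthogonality can be checked numerically. The sketch below (NumPy; the $3\times 3$ symmetric matrix is a hypothetical example with three distinct eigenvalues) uses `numpy.linalg.eigh`, which is specialized for symmetric matrices, and verifies that the matrix ${\bf V}$ whose columns are the eigenvectors satisfies ${\bf V}^T{\bf V} = {\bf 1}$:

```python
import numpy as np

# A hypothetical real symmetric matrix with distinct eigenvalues 1, 3, 5.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# eigh is specialized for symmetric matrices; the columns of V are
# the eigenvectors, returned in order of ascending eigenvalue.
eigenvalues, V = np.linalg.eigh(A)

# Eigenvectors belonging to distinct eigenvalues are mutually orthogonal,
# so the Gram matrix V^T V is the identity.
gram = V.T @ V
```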

Suppose that $\lambda_i=\lambda_j=\lambda$. In this case, we cannot conclude that ${\bf x}_i^T{\bf x}_j = 0$ by the above argument. However, it is easily seen that any linear combination of ${\bf x}_i$ and ${\bf x}_j$ is an eigenvector of ${\bf A}$ with eigenvalue $\lambda$. Hence, it is possible to define two new eigenvectors of ${\bf A}$, with the eigenvalue $\lambda$, which are mutually orthogonal. For instance,

\begin{displaymath}
{\bf x}_i' = {\bf x}_i,
\end{displaymath} (485)
\begin{displaymath}
{\bf x}_j' = {\bf x}_j - \left(\frac{{\bf x}_i^T{\bf x}_j}{{\bf x}_i^T{\bf x}_i}\right) {\bf x}_i.
\end{displaymath} (486)

It should be clear that this argument can be generalized to deal with any number of eigenvalues which take the same value.
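The construction of Equations (485) and (486) is a Gram-Schmidt step: the component of ${\bf x}_j$ along ${\bf x}_i$ is projected out. A minimal sketch (NumPy; the degenerate matrix and the two non-orthogonal eigenvectors are hypothetical examples, not from the text):

```python
import numpy as np

# A trivially degenerate matrix: every non-zero vector is an
# eigenvector with eigenvalue 2.
A = 2.0 * np.eye(3)

# Two non-orthogonal eigenvectors sharing the same eigenvalue.
x_i = np.array([1.0, 1.0, 0.0])
x_j = np.array([1.0, 0.0, 1.0])

# Equations (485)-(486): keep x_i, and subtract from x_j its
# projection onto x_i.
x_i_new = x_i
x_j_new = x_j - ((x_i @ x_j) / (x_i @ x_i)) * x_i

# x_j_new is still an eigenvector of A with eigenvalue 2,
# and is now orthogonal to x_i_new.
dot = float(x_i_new @ x_j_new)
```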

In conclusion, a real symmetric $n$-dimensional matrix possesses $n$ real eigenvalues, with $n$ associated real eigenvectors, which are, or can be chosen to be, mutually orthogonal.

Richard Fitzpatrick 2011-03-31