Matrix eigenvalue theory
Suppose that $\mathbf{A}$ is a real symmetric square matrix of dimension $n$. It follows that $\mathbf{A}^{\ast} = \mathbf{A}$ and $\mathbf{A}^{T} = \mathbf{A}$, where $\ast$ denotes a complex conjugate, and $T$ denotes a transpose.
Consider the matrix equation
$$\mathbf{A}\,\mathbf{x} = \lambda\,\mathbf{x}. \tag{A.144}$$
Any column vector $\mathbf{x}$ that satisfies this equation is called an eigenvector of $\mathbf{A}$. Likewise, the associated number $\lambda$ is called an eigenvalue of $\mathbf{A}$ (Gradshteyn and Ryzhik 1980c). Let us investigate the properties of the eigenvectors and eigenvalues of a real symmetric matrix.
Equation (A.144) can be rearranged to give
$$(\mathbf{A} - \lambda\,\mathbf{1})\,\mathbf{x} = \mathbf{0}, \tag{A.145}$$
where $\mathbf{1}$ is the unit matrix. The preceding matrix equation is essentially a set of $n$ homogeneous simultaneous algebraic equations for the $n$ components of $\mathbf{x}$.
A well-known property of such a set of equations is that it only has a nontrivial solution when the determinant of the associated matrix is zero (Gradshteyn and Ryzhik 1980c). Hence, a necessary condition for the preceding set of equations to have a nontrivial solution is that
$$|\mathbf{A} - \lambda\,\mathbf{1}| = 0, \tag{A.146}$$
where $|\cdot|$ denotes a determinant. This formula is essentially an $n$th-order polynomial equation for $\lambda$. We know that such an equation has $n$ (possibly complex) roots. Hence, we conclude that there are $n$ eigenvalues, and $n$ associated eigenvectors, of the $n$-dimensional matrix $\mathbf{A}$.
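The correspondence between the roots of the determinantal equation (A.146) and the eigenvalues can be checked numerically. The following sketch (using NumPy; the particular $2\times 2$ matrix is an arbitrary choice, not from the text) compares the roots of the characteristic polynomial with the output of a standard symmetric eigensolver.

```python
import numpy as np

# A small real symmetric matrix (chosen arbitrarily for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) returns the coefficients of the characteristic polynomial
# det(lambda*1 - A); here it is lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)
roots = np.roots(coeffs)          # the n (here, 2) roots

# NumPy's symmetric eigensolver should return the same values.
eigvals = np.linalg.eigvalsh(A)

print(np.sort(roots))             # [1. 3.]
print(np.sort(eigvals))           # [1. 3.]
```

Both routes give the eigenvalues $\lambda = 1$ and $\lambda = 3$, as expected from the polynomial $\lambda^{2} - 4\lambda + 3 = (\lambda-1)(\lambda-3)$.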
Let us now demonstrate that the eigenvalues and eigenvectors of the real symmetric matrix $\mathbf{A}$ are all real. We have
$$\mathbf{A}\,\mathbf{x}_i = \lambda_i\,\mathbf{x}_i, \tag{A.147}$$
and, taking the transpose and complex conjugate,
$$\mathbf{x}_i^{\ast T}\,\mathbf{A} = \lambda_i^{\ast}\,\mathbf{x}_i^{\ast T}, \tag{A.148}$$
where $\mathbf{x}_i$ and $\lambda_i$ are the $i$th eigenvector and eigenvalue of $\mathbf{A}$, respectively. Left multiplying Equation (A.147) by $\mathbf{x}_i^{\ast T}$, we obtain
$$\mathbf{x}_i^{\ast T}\,\mathbf{A}\,\mathbf{x}_i = \lambda_i\,\mathbf{x}_i^{\ast T}\,\mathbf{x}_i. \tag{A.149}$$
Likewise, right multiplying Equation (A.148) by $\mathbf{x}_i$, we get
$$\mathbf{x}_i^{\ast T}\,\mathbf{A}\,\mathbf{x}_i = \lambda_i^{\ast}\,\mathbf{x}_i^{\ast T}\,\mathbf{x}_i. \tag{A.150}$$
The difference of the previous two equations yields
$$(\lambda_i - \lambda_i^{\ast})\,\mathbf{x}_i^{\ast T}\,\mathbf{x}_i = 0. \tag{A.151}$$
It follows that $\lambda_i = \lambda_i^{\ast}$, because $\mathbf{x}_i^{\ast T}\,\mathbf{x}_i$ (which is $|\mathbf{x}_i|^{2}$ in vector notation) is real and positive definite. Hence, $\lambda_i$ is real. It immediately follows that $\mathbf{x}_i$ is real.
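This reality property can be verified numerically. The sketch below (the random seed and matrix size are arbitrary choices) builds a real symmetric matrix and feeds it to a general eigensolver that makes no symmetry assumption; the imaginary parts of the computed eigenvalues vanish, as the argument above predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# B + B^T is symmetric for any real square matrix B.
B = rng.standard_normal((5, 5))
A = B + B.T

# np.linalg.eig makes no symmetry assumption, so in general it may return
# complex eigenvalues; for a real symmetric A they must all be real.
eigvals, eigvecs = np.linalg.eig(A)

print(np.max(np.abs(np.asarray(eigvals).imag)))   # 0.0 (up to round-off)
```

A general real matrix, by contrast, can easily have complex-conjugate pairs of eigenvalues; the symmetry of $\mathbf{A}$ is what forces them onto the real axis.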
Next, let us show that two eigenvectors corresponding to two different eigenvalues are mutually orthogonal. Let
$$\mathbf{A}\,\mathbf{x}_i = \lambda_i\,\mathbf{x}_i, \tag{A.152}$$
$$\mathbf{A}\,\mathbf{x}_j = \lambda_j\,\mathbf{x}_j, \tag{A.153}$$
where $\lambda_i \neq \lambda_j$. Taking the transpose of the first equation and right multiplying by $\mathbf{x}_j$, and left multiplying the second equation by $\mathbf{x}_i^{T}$, we obtain
$$\mathbf{x}_i^{T}\,\mathbf{A}\,\mathbf{x}_j = \lambda_i\,\mathbf{x}_i^{T}\,\mathbf{x}_j, \tag{A.154}$$
$$\mathbf{x}_i^{T}\,\mathbf{A}\,\mathbf{x}_j = \lambda_j\,\mathbf{x}_i^{T}\,\mathbf{x}_j. \tag{A.155}$$
Taking the difference of the preceding two equations, we get
$$(\lambda_i - \lambda_j)\,\mathbf{x}_i^{T}\,\mathbf{x}_j = 0. \tag{A.156}$$
Because, by hypothesis, $\lambda_i \neq \lambda_j$, it follows that $\mathbf{x}_i^{T}\,\mathbf{x}_j = 0$. In vector notation, this is the same as $\mathbf{x}_i \cdot \mathbf{x}_j = 0$. Hence, the eigenvectors $\mathbf{x}_i$ and $\mathbf{x}_j$ are mutually orthogonal.
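The orthogonality of the eigenvectors can also be checked numerically. In the sketch below (again with an arbitrary random symmetric matrix), the eigenvectors returned as the columns of $V$ satisfy $\mathbf{x}_i^{T}\,\mathbf{x}_j = 0$ for $i \neq j$, so $V^{T} V$ is the unit matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B + B.T                       # real symmetric

# eigh is NumPy's solver for symmetric matrices; the columns of V
# are the eigenvectors x_i, returned with unit length.
w, V = np.linalg.eigh(A)

# x_i^T x_j vanishes for i != j, so V^T V should equal the unit matrix.
gram = V.T @ V
print(np.allclose(gram, np.eye(4)))   # True
```

Because the columns also have unit norm, the eigenvectors form an orthonormal set, and $V$ is an orthogonal matrix.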
Suppose that $\lambda_i = \lambda_j$. In this case, we cannot conclude that $\mathbf{x}_i^{T}\,\mathbf{x}_j = 0$ by the preceding argument. However, it is easily seen that any linear combination of $\mathbf{x}_i$ and $\mathbf{x}_j$ is an eigenvector of $\mathbf{A}$ with eigenvalue $\lambda_i$. Hence, it is possible to define two new eigenvectors of $\mathbf{A}$, with the eigenvalue $\lambda_i$, that are mutually orthogonal. For instance,
$$\mathbf{x}_i' = \mathbf{x}_i, \tag{A.157}$$
$$\mathbf{x}_j' = \mathbf{x}_j - \left(\frac{\mathbf{x}_i^{T}\,\mathbf{x}_j}{\mathbf{x}_i^{T}\,\mathbf{x}_i}\right)\mathbf{x}_i. \tag{A.158}$$
It should be clear that this argument can be generalized to deal with any
number of eigenvalues that take the same value.
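This orthogonalization step for a repeated eigenvalue can be sketched numerically as follows (the matrix and the two non-orthogonal eigenvectors are arbitrary choices for illustration, not from the text).

```python
import numpy as np

# A matrix with the eigenvalue 2 repeated twice (chosen for illustration).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Two independent, non-orthogonal eigenvectors for the eigenvalue 2.
x_i = np.array([1.0, 0.0, 0.0])
x_j = np.array([1.0, 1.0, 0.0])

# Gram-Schmidt step described in the text:
#   x_i' = x_i,  x_j' = x_j - (x_i^T x_j / x_i^T x_i) x_i.
x_i_new = x_i
x_j_new = x_j - (x_i @ x_j) / (x_i @ x_i) * x_i

print(x_i_new @ x_j_new)                       # 0.0 (now orthogonal)
print(np.allclose(A @ x_j_new, 2.0 * x_j_new)) # True (still an eigenvector)
```

The second printout confirms the key point of the argument: subtracting a multiple of $\mathbf{x}_i$ from $\mathbf{x}_j$ leaves the result inside the eigenspace, because any linear combination of degenerate eigenvectors is again an eigenvector with the same eigenvalue.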
In conclusion, a real symmetric $n$-dimensional matrix $\mathbf{A}$ possesses $n$ real eigenvalues, with $n$ associated real eigenvectors, that are, or can be chosen to be, mutually orthogonal.