Orthogonal matrices can be considered as basis transformations in $\mathbb{R}^n$. Therefore, considering the matrix as a conglomeration of unit orthogonal column vectors, we obtain, for some orthogonal matrix $Q$,
$$Q = \begin{bmatrix} q_1 & q_2 & \cdots & q_n \end{bmatrix},$$

$$q_i^\top q_j = \delta_{ij},$$

$$Q^\top Q = \begin{bmatrix} q_i^\top q_j \end{bmatrix}_{1 \le i,j \le n},$$

$$Q^\top Q = I.$$

The final equation implies a nifty equation,

$$Q^\top = Q^{-1}.$$
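As a concrete illustration (my example, not from the post), the $2 \times 2$ rotation matrix for an arbitrary angle exhibits these identities: its columns are orthonormal, so $Q^\top Q = I$ and $Q^\top = Q^{-1}$. A minimal pure-Python sketch, with a hypothetical angle $\theta = 0.7$:

```python
import math

theta = 0.7  # arbitrary example angle
# The standard 2x2 rotation matrix, viewed as columns q_1, q_2.
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
cols = [[Q[0][0], Q[1][0]],
        [Q[0][1], Q[1][1]]]

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

# Q^T Q has entries q_i^T q_j, which should be the Kronecker delta.
QtQ = [[dot(cols[i], cols[j]) for j in range(2)] for i in range(2)]
for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert abs(QtQ[i][j] - expected) < 1e-12
print("Q^T Q = I, so Q^T = Q^{-1}")
```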
This last equation has interesting consequences. Suppose $Q$ does not have $1$ as an eigenvalue. Then, $I - Q$ must be invertible, and the following formula holds. Write

$$I - Q^\top = Q^\top (Q - I) = -Q^\top (I - Q),$$

so that $(I - Q^\top)^{-1} = -(I - Q)^{-1} Q$, and thus

$$(I - Q)^{-1} + (I - Q^\top)^{-1} = (I - Q)^{-1}(I - Q) = I. \tag{1}$$
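Equation (1) can be sanity-checked numerically. The sketch below (my own, with a hypothetical angle $\theta = 1$) uses a $2 \times 2$ rotation, which is orthogonal and has no eigenvalue $1$ for angles that are not multiples of $2\pi$:

```python
import math

def sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def inv2(A):
    # cofactor formula for a 2x2 inverse
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

I = [[1.0, 0.0], [0.0, 1.0]]
theta = 1.0  # any angle that is not a multiple of 2*pi
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# (I - Q)^{-1} + (I - Q^T)^{-1} should equal I, by equation (1).
lhs = add(inv2(sub(I, Q)), inv2(sub(I, transpose(Q))))
for i in range(2):
    for j in range(2):
        assert abs(lhs[i][j] - I[i][j]) < 1e-12
print("equation (1) verified")
```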
We apply this formula to prove the following proposition.
Proposition. Let $R$ be a reflection in the form of $R = I - 2vv^\top$ for some unit vector $v$. Also, set $Q$ to be some orthogonal matrix that does not have $1$ as an eigenvalue. Then, $RQ$ must have an eigenvalue of $1$.
Proof. It suffices to show that there exists a vector $x$ that is nonzero and satisfies $RQx = x$, or equivalently $Qx = Rx$ (since $R^{-1} = R$). Rewriting this condition, we obtain

$$(I - Q)x = 2(v^\top x)\,v.$$

We wish to use equation (1), and a reasonable guess for $x$ would be $x = (I - Q)^{-1} v$, which is nonzero since $v$ is. Then, it suffices to show that

$$2\,v^\top (I - Q)^{-1} v = 1,$$

or even $v^\top (I - Q)^{-1} v = \frac{1}{2}$. This can be directly proved by left multiplying $v^\top$ and right multiplying $v$ to equation (1) and comparing the resulting bilinear forms: this yields $v^\top (I - Q)^{-1} v + v^\top (I - Q^\top)^{-1} v = v^\top v = 1$, and the two terms on the left coincide, since each is a scalar and the transpose of the other. □
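The proof can be illustrated numerically (an example of mine, not part of the argument): take a $2 \times 2$ rotation $Q$, a reflection $R = I - 2vv^\top$, and check that $x = (I - Q)^{-1}v$ is indeed fixed by $RQ$. The angle and the vector $v$ below are arbitrary choices:

```python
import math

def matvec(A, x):
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

def inv2(A):
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]

theta = 1.0  # Q has no eigenvalue 1 since theta is not a multiple of 2*pi
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

v = [3 / 5, 4 / 5]  # a unit vector; R = I - 2 v v^T reflects across the line orthogonal to v
R = [[1 - 2 * v[0] * v[0],    -2 * v[0] * v[1]],
     [   -2 * v[1] * v[0], 1 - 2 * v[1] * v[1]]]

ImQ = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(2)] for i in range(2)]
x = matvec(inv2(ImQ), v)       # the candidate eigenvector from the proof
RQx = matvec(R, matvec(Q, x))  # apply Q, then R

# RQ x = x: eigenvalue 1, as the proposition claims.
assert all(abs(RQx[i] - x[i]) < 1e-9 for i in range(2))
print("RQ fixes x")
```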
An implication of this proposition is that a composition of a reflection and an absolute rotation must always fix some vector in $\mathbb{R}^n$.
As a sidenote, I want to comment that the object $(I - Q)^{-1}$ is an important matrix in linear algebra. If $Q$ has a spectral radius strictly less than $1$, the proposed inverse can be expanded by the Neumann expansion $(I - Q)^{-1} = \sum_{k=0}^{\infty} Q^k$. In the case where $Q$ is orthogonal, the spectral radius is exactly $1$. Equation (1) is nice since it provides control over this inverse. Also, this object can be considered as an analogue of taking the power series of $\frac{1}{1 - x}$ around $x = 0$
. If $x$ does not give a fruitful expansion, repeat for $x^{-1}$; it is only reasonable to do so.
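To illustrate the Neumann expansion (my example; the matrix below is an arbitrary choice with eigenvalues $0.5$ and $0.2$, hence spectral radius below $1$), partial sums of $\sum_k Q^k$ converge to $(I - Q)^{-1}$:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1.0, 0.0], [0.0, 1.0]]
Q = [[0.4, 0.1],
     [0.2, 0.3]]  # eigenvalues 0.5 and 0.2, so the spectral radius is 0.5 < 1

# Exact inverse of I - Q via the 2x2 cofactor formula.
A = [[I[i][j] - Q[i][j] for j in range(2)] for i in range(2)]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
exact = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

# Neumann partial sum I + Q + Q^2 + ... + Q^100.
partial, power = I, I
for _ in range(100):
    power = matmul(power, Q)
    partial = [[partial[i][j] + power[i][j] for j in range(2)] for i in range(2)]

for i in range(2):
    for j in range(2):
        assert abs(partial[i][j] - exact[i][j]) < 1e-9
print("Neumann series converges to (I - Q)^{-1}")
```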