
Section 5.3 The LU and QR methods

We start by applying the LU decomposition to the evaluation of eigenvalues. The main idea of this application is the following result: if \(A\) is invertible, then for any matrix \(B\) the products \(AB\) and \(BA\) have the same eigenvalues.
This is an immediate consequence of Theorem 5.1.2 since
\begin{equation*} AB = A\cdot BA\cdot A^{-1}. \end{equation*}
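As a quick numerical sanity check, here is a hedged Octave sketch (the matrix is an arbitrary illustrative choice, not taken from the text) verifying that, when \(A=LU\text{,}\) the products \(LU\) and \(UL\) have the same eigenvalues:

A = [4 1 2; 1 3 0; 2 0 5];
[L, U] = lu(A);    % A = L*U (L may be a permuted lower triangular matrix)
sort(eig(L*U))     % eigenvalues of A
sort(eig(U*L))     % the same values, since U*L = L^{-1} * A * L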

The QR method uses the QR decomposition of a matrix to find all the eigenvalues of a given matrix whose eigenvalues are all real and non-zero.

Anyone who has studied linear algebra knows what a QR decomposition of a matrix \(A\) is, even without knowing it explicitly. It is a decomposition of \(A\) coming from the Gram-Schmidt method of orthonormalizing a basis of vectors. The idea is to consider the columns \(v_i\) of \(A\) as a basis of vectors (since \(A\) is invertible, all its columns are linearly independent) and to use Gram-Schmidt to orthonormalize them into vectors \(e_i\text{.}\)

The orthonormalized vectors form the columns of an orthogonal matrix \(Q\) (namely, \(Q^{-1}\) is equal to its transpose \(Q^*\)), and \(Q^*A\) is an upper triangular matrix, usually denoted by \(R\text{,}\) so that \(A=QR\text{.}\)
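For concreteness, here is a hedged Octave sketch of this construction (the function name gs_qr is a made-up illustration, not a listing from the text):

function [Q, R] = gs_qr(A)
  % Gram-Schmidt QR decomposition of an invertible square matrix A.
  n = size(A, 1);
  Q = zeros(n);
  for i = 1:n
    v = A(:, i);
    for j = 1:i-1
      v = v - (A(:, i)' * Q(:, j)) * Q(:, j);  % subtract components along earlier e_j
    end
    Q(:, i) = v / norm(v);                     % normalize to get e_i
  end
  R = Q' * A;                                  % upper triangular up to rounding
end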

Let us see this in detail, performing calculations in \(D_3\text{,}\) starting with the \(2\times2\) matrix
\begin{equation*} \begin{pmatrix}1&2\cr 3&4\cr\end{pmatrix}. \end{equation*}
Then
\begin{equation*} v_1 = (1,3),\;\;v_2 = (2,4) \end{equation*}
\begin{equation*} e_1 = \frac{v_1}{\|v_1\|} = \frac{(1,3)}{\|(1,3)\|} \simeq (0.316 , 0.949 ) \end{equation*}
and
\begin{equation*} e_2 = \frac{v_2-\langle v_2,e_1\rangle e_1}{\|v_2-\langle v_2,e_1\rangle e_1\|} \simeq \frac{(2,4) - 4.43(0.316 , 0.949)}{\|(2,4) - 4.43(0.316 , 0.949)\|} \simeq (0.948,-0.319). \end{equation*}
Notice that
\begin{equation*} \|e_1\|\simeq1.00, \; \langle e_1,e_2\rangle\simeq0.00, \; \|e_2\|\simeq1.00. \end{equation*}
Hence
\begin{equation*} Q = \begin{pmatrix}0.316&\phantom{-}0.948\cr 0.949&-0.319\cr\end{pmatrix}. \end{equation*}
Now we can find \(R=Q^*A\text{.}\) Let us do the math in Octave:
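Here is a minimal sketch of this computation in Octave, done at full machine precision rather than in \(D_3\) (the variable names are illustrative):

A  = [1 2; 3 4];
v1 = A(:, 1);  v2 = A(:, 2);
e1 = v1 / norm(v1);              % first orthonormal vector
w2 = v2 - (v2' * e1) * e1;       % remove the component of v2 along e1
e2 = w2 / norm(w2);              % second orthonormal vector
Q  = [e1 e2]
R  = Q' * A                      % upper triangular up to rounding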
To keep our implementation of the QR algorithm light, rather than writing our own general QR decomposition code, we will use MATLAB's built-in one:
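A minimal sketch of its usage, with our \(2\times2\) example as the test matrix:

A = [1 2; 3 4];
[Q, R] = qr(A)     % Q orthogonal, R upper triangular, with A = Q*R
norm(A - Q*R)      % of the order of machine precision

Note that the built-in routine may return \(Q\) and \(R\) with some columns and rows of opposite sign compared with the Gram-Schmidt computation above; both are valid QR decompositions.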
We are now ready to go over the QR method to evaluate eigenvalues. Given a matrix \(A\text{,}\) we find its QR decomposition
\begin{equation*} A=Q_0R_0\text{.} \end{equation*}
We set
\begin{equation*} A_0=R_0Q_0 \end{equation*}
and we find its QR decomposition
\begin{equation*} A_0=Q_1R_1. \end{equation*}
Then we set
\begin{equation*} A_1=R_1Q_1 \end{equation*}
and we find its QR decomposition, and we keep repeating this algorithm.

Notice that all matrices of this sequence have the same set of eigenvalues. For instance,
\begin{equation*} A_0=R_0Q_0=Q^*_0Q_0R_0Q_0=Q^*_0AQ_0, \end{equation*}
namely \(A_0\) is similar to \(A\text{.}\) Moreover, the matrices \(A_k\) converge to an upper triangular matrix, whose diagonal entries are then necessarily the eigenvalues of \(A\text{.}\)

The QR algorithm is implemented in the code below. In the last line we call MATLAB's own method to evaluate eigenvalues to double check that our implementation works fine.
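Here is a hedged sketch of such an implementation in Octave; the test matrix, the tolerance, and the iteration cap are illustrative choices:

% QR iteration: A_{k+1} = R_k * Q_k, where A_k = Q_k * R_k.
A  = [4 1 2; 1 3 0; 2 0 5];      % sample matrix with real eigenvalues
Ak = A;
for k = 1:200
  [Q, R] = qr(Ak);               % QR decomposition of the current iterate
  Ak = R * Q;                    % next iterate, similar to A
  if norm(tril(Ak, -1)) < 1e-10  % stop once Ak is numerically upper triangular
    break
  end
end
sort(diag(Ak))                   % approximate eigenvalues of A
sort(eig(A))                     % MATLAB's own eigenvalues, for comparison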