Section 5.3 The QR decomposition
Anyone who has studied linear algebra knows what a QR decomposition of a matrix \(A\) is, even without knowing it explicitly. It is a decomposition of \(A\) coming from the Gram-Schmidt method of orthonormalizing a basis of vectors. The idea is to consider each column \(i\) of \(A\) as a vector \(v_i\) (since \(A\) is invertible, all its columns are linearly independent) and to use Gram-Schmidt to orthonormalize them. The components of the orthonormalized vectors fill up an orthogonal matrix \(Q\) (namely, \(Q^{-1}\) is equal to its transpose \(Q^*\)), and \(Q^*A\) is an upper triangular matrix usually denoted by \(R\text{,}\) so that \(A=QR\text{.}\) Let us see this in detail, starting with the \(2\times2\) matrix below and performing calculations in \(D_3\).
\begin{equation*}
\begin{pmatrix}1&2\cr 3&4\cr\end{pmatrix}.
\end{equation*}
Then
\begin{equation*}
v_1 = (1,3),\;\;v_2 = (2,4)
\end{equation*}
\begin{equation*}
e_1 = \frac{v_1}{\|v_1\|} = \frac{(1,3)}{\|(1,3)\|} \simeq (0.316, 0.949)
\end{equation*}
and
\begin{equation*}
e_2 = \frac{v_2-\langle v_2,e_1\rangle e_1}{\|v_2-\langle v_2,e_1\rangle e_1\|} \simeq \frac{(2,4) - 4.43(0.316, 0.949)}{\|(2,4) - 4.43(0.316, 0.949)\|} \simeq (0.948,-0.319).
\end{equation*}
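Here is a minimal NumPy sketch of these two Gram-Schmidt steps (NumPy is an assumption here; only SciPy is used later in this section). Since it works in full double precision rather than in \(D_3\text{,}\) the last digits differ slightly from the hand computation above.
\begin{verbatim}
import numpy as np

# Columns of A as the vectors v1 and v2
v1 = np.array([1.0, 3.0])
v2 = np.array([2.0, 4.0])

# First step: normalize v1
e1 = v1 / np.linalg.norm(v1)

# Second step: subtract from v2 its component along e1, then normalize
w = v2 - np.dot(v2, e1) * e1
e2 = w / np.linalg.norm(w)

print(np.round(e1, 3))  # [0.316 0.949]
print(np.round(e2, 3))  # [ 0.949 -0.316]
\end{verbatim}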
Notice that
\begin{equation*}
\|e_1\|\simeq1.00,
\;
\langle e_1,e_2\rangle\simeq0.00,
\;
\|e_2\|\simeq1.00.
\end{equation*}
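We can check these three values numerically from the rounded \(D_3\) components (again a NumPy sketch):
\begin{verbatim}
import numpy as np

# e1 and e2 as computed above in D_3 arithmetic
e1 = np.array([0.316, 0.949])
e2 = np.array([0.948, -0.319])

print(np.linalg.norm(e1))  # 1.0002...  ~ 1.00
print(np.dot(e1, e2))      # -0.0032... ~ 0.00
print(np.linalg.norm(e2))  # 1.0002...  ~ 1.00
\end{verbatim}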
Hence
\begin{equation*}
Q = \begin{pmatrix}0.316&\phantom{-}0.948\cr 0.949&-0.319\cr\end{pmatrix}.
\end{equation*}
Now we can find \(R=Q^*A\text{.}\) Let us move the math to Python: to keep our implementation of the QR algorithm light, rather than writing our own general QR decomposition code, we will use SciPy's.
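Here is a minimal sketch based on \verb|scipy.linalg.qr|. SciPy computes the decomposition via Householder reflections, so the signs of the columns of \(Q\) (and of the corresponding rows of \(R\)) may come out flipped with respect to the Gram-Schmidt construction above; the product \(QR\) is the same.
\begin{verbatim}
import numpy as np
from scipy.linalg import qr

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Q is orthogonal, R is upper triangular, and A = Q R
Q, R = qr(A)

print(np.round(Q, 3))      # signs may differ from the Gram-Schmidt Q
print(np.round(R, 3))
print(np.round(Q @ R, 3))  # reproduces A
\end{verbatim}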