# Spectral Theorem

##### Linear Algebra

We have studied eigendecomposition in depth earlier.

For symmetric matrices, eigendecomposition is special. In this article, we provide a comprehensive overview of the Spectral Theorem, a guarantee about the decomposition of symmetric matrices.

## Prerequisites

To understand the Spectral Theorem, we recommend familiarity with eigendecomposition and orthogonal matrices.

First get acquainted with those concepts before proceeding.

## Spectral Theorem

If $\mA = \mA^T$, i.e., $\mA$ is symmetric, then its eigenvalues are real and its eigenvectors can be chosen to be orthonormal. Collecting these eigenvectors as the columns of $\mQ$ makes $\mQ$ an orthogonal matrix, so $\mQ^T = \mQ^{-1}$ and

$$\mA = \mQ \mLambda \mQ^T$$

In other words, the eigendecomposition of a symmetric matrix leads to an orthogonal matrix.
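We can verify this numerically. The sketch below uses NumPy's `eigh`, which is specialized for symmetric matrices; the matrix values are illustrative, chosen only to be symmetric.

```python
import numpy as np

# A small symmetric matrix (values chosen arbitrarily for illustration)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is specialized for symmetric/Hermitian matrices:
# it returns real eigenvalues and orthonormal eigenvectors
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Q is orthogonal: its transpose is its inverse
print(np.allclose(Q.T @ Q, np.eye(2)))   # True

# A is recovered as Q @ Lam @ Q.T
print(np.allclose(Q @ Lam @ Q.T, A))     # True
```

Note that the general-purpose `np.linalg.eig` would also work here, but `eigh` guarantees real eigenvalues and an orthonormal eigenvector matrix, exactly as the Spectral Theorem promises.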

The next demo is a replica of the previous transformation recovery demo, but with a symmetry constraint on $\mA$. Check out the first step, $\mQ^T \vx$. Since $\mQ$ is orthogonal, and so is its inverse, there is no stretching or shrinking in the first step, only rotation. Any stretching or shrinking happens in the second step, $\mLambda \mQ^T \vx$.

Check also that the eigenvectors (eig row 1 and eig row 2) are always orthonormal.
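Both observations can be checked numerically. In this sketch (the symmetric matrix is an arbitrary example), the columns of $\mQ$ are orthonormal, and multiplying by $\mQ^T$ preserves vector lengths, confirming that the first step is a pure rotation/reflection.

```python
import numpy as np

# An arbitrary symmetric example matrix
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
_, Q = np.linalg.eigh(A)

# The eigenvectors (columns of Q) are orthonormal:
# unit length, and mutually perpendicular
print(np.allclose(np.linalg.norm(Q, axis=0), 1.0))  # True
print(np.isclose(Q[:, 0] @ Q[:, 1], 0.0))           # True

# Because Q is orthogonal, Q.T only rotates/reflects:
# lengths are preserved, so no stretching in the first step
x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q.T @ x), np.linalg.norm(x)))  # True
```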

## Eigendecomposition recovery demo

Check out this interactive demo to understand the Spectral Theorem visually. The column on the right shows the space transformed by $\mA$. The bottom row shows the same transformation, arrived at by sequentially multiplying with the eigendecomposition components. It is presented in the following order: $\vx \rightarrow \mQ^{-1}\vx \rightarrow \mLambda\mQ^{-1}\vx \rightarrow \mQ \mLambda \mQ^{-1} \vx$.

Note how the first transformation $\mQ^{-1}\vx$ rotates the space into an orientation that is amenable to the appropriate stretch or shrinkage by $\mLambda$ in the second step. In the final step, the transformation matrix $\mQ$, being the inverse of the matrix applied in the first step, reverses that rotation.
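The rotate-stretch-rotate-back sequence can be traced one factor at a time. A minimal sketch, with an arbitrary symmetric matrix and vector:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # arbitrary symmetric example
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

x = np.array([1.0, 2.0])

# Apply the decomposition one factor at a time
step1 = Q.T @ x          # rotate into the eigenbasis (Q^{-1} = Q^T)
step2 = Lam @ step1      # stretch/shrink along the coordinate axes
step3 = Q @ step2        # rotate back to the original orientation

# The sequence reproduces the direct transformation A @ x
print(np.allclose(step3, A @ x))  # True
```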

Try to modify the matrix such that one of the elements of the diagonal matrix $\mLambda$ becomes almost zero. Notice how the space gets squashed along that direction. Also, observe what happens when you end up with a zero eigenvalue in the $\mLambda$ matrix: the transformation collapses the plane onto a line, and $\mA$ becomes singular.
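We can construct this degenerate case directly. The sketch below builds a symmetric matrix with one eigenvalue exactly zero (the rotation angle and remaining eigenvalue are arbitrary) and confirms the resulting matrix is singular and rank-deficient.

```python
import numpy as np

# Build A = Q @ diag(2, 0) @ Q.T for some rotation Q
# (angle and nonzero eigenvalue chosen arbitrarily)
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Lam = np.diag([2.0, 0.0])
A = Q @ Lam @ Q.T

# A zero eigenvalue makes the matrix singular:
# the whole plane is squashed onto a single line
print(np.isclose(np.linalg.det(A), 0.0))  # True
print(np.linalg.matrix_rank(A))           # 1
```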

## Where to next?

In this article, we have seen the decomposition guaranteed for symmetric matrices. To really build intuition about what such a decomposition means geometrically, we need to understand the effect of multiplying by each type of matrix. We present this in matrix as a transformer.

You may also choose to explore other advanced topics in linear algebra.

Already feeling like an expert in linear algebra? Move on to other advanced topics in mathematics or machine learning.