The good, the bad, and the ugly matrix
You will notice in most machine learning literature that the authors squeal with delight when they are working with certain matrices.
Why? Because, as you have seen so far, certain operations, especially inversion, become possible, or at least easier, for some types of matrices.
First, let's talk about the ugliest of all.
A rectangular matrix that is also rank-deficient.
We cannot invert it.
Eigendecomposition is not defined for it.
The only thing we can always do is take its singular value decomposition.
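To make this concrete, here is a minimal NumPy sketch (the matrix `A` is an invented example): inversion is not even defined for a rectangular matrix, yet the SVD always exists.

```python
import numpy as np

# A 3x2 rectangular matrix whose second column is a multiple of the first,
# so it is rank-deficient (rank 1) on top of being non-square.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# Inversion is undefined for non-square matrices: this raises LinAlgError.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("inv failed:", e)

# The SVD, however, always exists: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print("singular values:", s)           # the second one is numerically zero
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True: SVD reconstructs A
```

Note how the rank deficiency shows up in the SVD itself: one singular value is (numerically) zero.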
Among bad matrices, there are levels of bad.
The worst are those that are singular, even if they are square or symmetric.
The better ones are non-singular matrices that are also symmetric or at least square.
They are invertible and also have an eigendecomposition.
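A small NumPy sketch of this, using an arbitrary symmetric non-singular matrix: both the inverse and a real eigendecomposition exist.

```python
import numpy as np

# An arbitrary symmetric, non-singular matrix (determinant = 11).
S = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Invertible: the inverse exists and round-trips to the identity.
S_inv = np.linalg.inv(S)
print(np.allclose(S @ S_inv, np.eye(2)))  # True

# Symmetric matrices have a real eigendecomposition: S = Q @ diag(w) @ Q.T,
# with orthogonal Q. eigh exploits the symmetry.
w, Q = np.linalg.eigh(S)
print(np.allclose(Q @ np.diag(w) @ Q.T, S))  # True
```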
The best matrices are the ones whose structure makes them easiest to work with.
Identity, diagonal, orthogonal, and positive definite matrices are the best because of their rich properties and well-developed theory.
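To see why these are so pleasant, here is a hedged NumPy sketch (the specific matrices are invented for illustration): a diagonal matrix inverts entry-wise, an orthogonal matrix inverts by transposition, and a positive definite matrix admits a Cholesky factorization.

```python
import numpy as np

# Diagonal: inversion is just reciprocals of the diagonal entries.
D = np.diag([2.0, 5.0])
print(np.allclose(np.linalg.inv(D), np.diag([0.5, 0.2])))  # True

# Orthogonal: a rotation matrix; its inverse is simply its transpose.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(np.linalg.inv(Q), Q.T))  # True

# Positive definite: Cholesky succeeds (it raises LinAlgError otherwise),
# giving a cheap, stable factorization P = L @ L.T.
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
L = np.linalg.cholesky(P)
print(np.allclose(L @ L.T, P))  # True
```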
So, whenever you encounter a matrix, be on the lookout for some of these properties.
If you are not dealing with a desirable one, it may be worth the extra effort to bring it closer to a better one.
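One common trick of this kind, sketched here with NumPy (the damping value `lam = 1e-3` is an arbitrary choice), is ridge-style regularization: adding a small multiple of the identity to a singular symmetric matrix makes it positive definite, and hence invertible.

```python
import numpy as np

# A singular symmetric matrix: its rows are linearly dependent.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(A))  # 1, so A is singular

# Nudge it toward a better-behaved matrix: A + lam * I shifts every
# eigenvalue up by lam, so the result is positive definite here.
lam = 1e-3
A_reg = A + lam * np.eye(2)
A_inv = np.linalg.inv(A_reg)  # now succeeds
print(np.allclose(A_reg @ A_inv, np.eye(2)))  # True
```

This is exactly the trade made in ridge regression: a slightly biased but well-conditioned matrix in exchange for a guaranteed inverse.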
Check out the next chart, which shows a rough matrix hierarchy identifying which matrices are invertible, symmetric, and positive definite.
It is not a Venn diagram in the true sense; it just shows containment. For example, all orthogonal matrices are invertible.