Linear Algebra was initially developed to solve systems of linear equations. In this article, we will try to understand the rank of a matrix in this context.
To understand matrix rank, we recommend familiarity with systems of linear equations and linear combinations of vectors. Follow the links above to first get acquainted with those concepts.
We saw in the in-depth article on systems of linear equations that not every set of equations can be solved. Here are the obvious problem cases.
Even if you have as many equations as unknowns, a solution is not guaranteed.
To be able to solve equations in \( n \) unknown variables, you need exactly \( n \) independent and consistent equations.
To understand this further, try our interactive demo for solving linear equations.
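As a quick illustration of why matching counts of equations and unknowns is not enough, consider a small numerical sketch (this uses NumPy, which is our choice here and not part of the article itself; the system shown is a made-up example):

```python
import numpy as np

# Two equations in two unknowns, but the second equation's left side
# is exactly twice the first's, so the equations are not independent:
#   x + y  = 2
#   2x + 2y = 5
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([2.0, 5.0])

try:
    np.linalg.solve(A, b)
    print("unique solution found")
except np.linalg.LinAlgError:
    # The coefficient matrix is singular: no unique solution exists
    print("no unique solution: the equations are not independent")
```

Even though we had two equations for two unknowns, the solver fails because the equations carry redundant (and here, contradictory) information.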
We know that systems of linear equations can be represented in a concise form as a matrix-vector product.
$$ \mA \vx = \vb $$
where \( \mA \) is a matrix whose rows contain the coefficients of the variables in the corresponding equations, \( \vx \) is the vector of variables, and \( \vb \) is the output vector.
Note that the number of entries in \( \vb \) equals the number of rows of \( \mA \), and the number of columns of \( \mA \) equals the number of variables in \( \vx \).
Gaussian elimination translates directly to matrices as simple row operations: multiply a row by an appropriate scalar and subtract it from another row to eliminate a variable, then continue to the solution as we demonstrated earlier.
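The row operations can be sketched in a few lines of NumPy (an assumption of ours for illustration; the \( 2 \times 2 \) system below is a made-up example):

```python
import numpy as np

# Example system:
#   2x +  y = 5
#    x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Row operation: subtract (A[1,0] / A[0,0]) * row 0 from row 1
# to eliminate x from the second equation
factor = A[1, 0] / A[0, 0]
A[1, :] -= factor * A[0, :]
b[1] -= factor * b[0]

# The system is now upper triangular; back-substitute
y = b[1] / A[1, 1]
x = (b[0] - A[0, 1] * y) / A[0, 0]
print(x, y)  # x = 1.0, y = 3.0
```

Each row operation preserves the solution set, which is why elimination followed by back-substitution recovers the answer.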
What about the independence requirements we mentioned earlier? They still apply, and can now be stated more succinctly.
To solve equations in \( n \) variables, you need \( n \) linearly independent rows and \( n \) linearly independent columns in the matrix \( \mA \).
We have explained linear independence in our article on linear combinations of vectors.
The number of linearly independent rows of a matrix is known as the row rank of the matrix.
Conversely, since each column is also a vector, the number of linearly independent columns of a matrix is known as the column rank of the matrix.
A fundamental result of linear algebra states that the row rank and column rank of any matrix are always equal. This common number of independent rows or columns is simply referred to as the rank of the matrix. The proof is out of scope here, but it is worth checking out.
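You can observe this equality numerically. The sketch below uses NumPy's `np.linalg.matrix_rank` (our choice of tool, not something the article prescribes) on a made-up non-square matrix; transposing swaps rows and columns, yet the rank is unchanged:

```python
import numpy as np

# A 3x2 matrix: 3 rows, 2 columns
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# matrix_rank computes the rank numerically (via the SVD).
# Row rank of A is the column rank of A.T, and vice versa.
print(np.linalg.matrix_rank(A))    # rank of A
print(np.linalg.matrix_rank(A.T))  # same value
```

Both calls return 2 here: the two columns are linearly independent, and only two of the three rows can be.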
So, getting back to solving linear equations, to solve \( \mA \vx = \vb \), where \( \vx \in \real^n \), \( \mA \) needs to have \( n \) linearly independent rows. In other words, \( \text{rank}(\mA) = n \).
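In practice, this condition can be checked before attempting a solve. A minimal sketch, assuming NumPy (the matrix is a made-up example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

n = A.shape[1]  # number of unknowns
# Solve only when rank(A) = n, i.e. the rows are independent
if np.linalg.matrix_rank(A) == n:
    x = np.linalg.solve(A, b)
    print(x)
```

Here the rank is 2, matching the two unknowns, so a unique solution exists.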
Here are some interesting facts about matrix rank.
$$ \text{rank}(\mA) \le \min(m, n), \quad \mA \in \real^{m \times n} $$
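The bound is easy to verify numerically. A quick check, assuming NumPy (the random matrix shape is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))  # m = 2 rows, n = 4 columns

r = np.linalg.matrix_rank(A)
# Only 2 rows exist, so at most 2 of them can be linearly independent,
# no matter how many columns the matrix has
assert r <= min(A.shape)
print(r)
```

Intuitively, with only \( m \) rows available, you can never have more than \( m \) independent ones, and likewise for columns.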
If every row of a matrix is linearly independent of the rest, then it is called a full row rank matrix. Similarly, if every column of a matrix is linearly independent of the rest, then it is referred to as a full column rank matrix.
A matrix is said to be full rank if it has the maximum achievable rank for a matrix of its size, limited only by the number of rows or columns. In other words, a full rank matrix has rank equal to the smaller of its number of rows and number of columns.
A matrix that is not full rank is considered rank deficient.
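The distinction is easy to see numerically. A small sketch, assuming NumPy (both matrices are made-up examples):

```python
import numpy as np

# Full rank: the two rows (and columns) are independent
full = np.array([[1.0, 0.0],
                 [0.0, 1.0]])

# Rank deficient: row 2 is exactly 2 * row 1
deficient = np.array([[1.0, 2.0],
                      [2.0, 4.0]])

print(np.linalg.matrix_rank(full))       # 2 = min(2, 2): full rank
print(np.linalg.matrix_rank(deficient))  # 1 < min(2, 2): rank deficient
```

The second matrix contains only one "direction" of information, even though it holds four numbers.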
We know from our previous discussion of Gaussian elimination that we need a full rank matrix to arrive at a unique solution.
With a sound understanding of matrix rank, you are ready to explore other advanced topics in linear algebra.
Already feeling like an expert in linear algebra? Move on to other advanced topics in mathematics or machine learning.
Help us create more engaging and effective content and keep it free of paywalls and advertisements!
Please share your comments, questions, encouragement, and feedback.