# Matrix rank

##### Linear Algebra

Linear Algebra was initially developed to solve systems of linear equations. In this article, we will try to understand the rank of a matrix in this context.

## Prerequisites

To understand matrix rank, we recommend familiarity with systems of linear equations and the linear combination of vectors.

Follow the links to those articles to first get acquainted with the corresponding concepts.

## Systems of linear equations: A recap

We saw in the in-depth article on systems of linear equations that not every given set of equations can be solved. Here are the obvious outliers:

• If the number of unknowns is more than the number of equations
• If the number of equations is more than the number of unknowns

Even if you have the same number of unknowns as the number of equations, a solution is not guaranteed.

To be able to solve equations in $n$ unknown variables, you need exactly $n$ independent and consistent equations.

To understand this further, try our interactive demo for solving linear equations.
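The requirement of $n$ independent, consistent equations can also be checked numerically. Below is a minimal sketch using NumPy (an assumption; the article itself uses an interactive demo, not code): two independent equations in two unknowns yield a unique solution, while a dependent pair does not.

```python
import numpy as np

# Two independent, consistent equations in two unknowns:
#   x + 2y = 5
#   3x -  y = 1
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])
b = np.array([5.0, 1.0])
solution = np.linalg.solve(A, b)  # unique solution: x = 1, y = 2

# Two dependent equations (the second is twice the first):
#   x + 2y = 5
#   2x + 4y = 10
A_dep = np.array([[1.0, 2.0],
                  [2.0, 4.0]])
# Only one independent equation for two unknowns: no unique solution.
unique = np.linalg.matrix_rank(A_dep) == A_dep.shape[1]
```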

## A concise representation for systems of linear equations

$$\mA \vx = \vb$$

where $\mA$ is a matrix such that the elements of each row correspond to the coefficients of the variables in the corresponding equation, $\vx$ is the vector of variables, and $\vb$ is the output vector.

It should be noted that the number of results in $\vb$ is the same as the number of rows of $\mA$. Also, the number of columns of $\mA$ is the same as the number of variables in $\vx$.

Gaussian elimination can easily be restated for matrices as simple row operations: multiply a row by an appropriate scalar and subtract it from another row to eliminate a variable, then continue to the solution as we demonstrated earlier.
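The row operations above can be sketched directly on the augmented matrix $[\mA \mid \vb]$. This is a minimal NumPy illustration (NumPy and the particular $2 \times 2$ system are our own choices, not from the article):

```python
import numpy as np

# Augmented matrix [A | b] for the system:
#   2x +  y = 5
#    x + 3y = 10
M = np.array([[2.0, 1.0, 5.0],
              [1.0, 3.0, 10.0]])

# Row operation: eliminate x from row 2 by subtracting
# (1/2) * row 1 from row 2.
M[1] -= (M[1, 0] / M[0, 0]) * M[0]

# Back-substitution: row 2 now reads (5/2) y = 15/2, so y = 3.
y = M[1, 2] / M[1, 1]
# Row 1 reads 2x + y = 5, so x = (5 - y) / 2 = 1.
x = (M[0, 2] - M[0, 1] * y) / M[0, 0]
```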

What about the independence requirements we mentioned earlier? They still apply, and can now be stated more succinctly.

To solve equations in $n$ variables, you need $n$ linearly independent rows and $n$ linearly independent columns in the matrix $\mA$.

We have explained linear independence in linear combination of vectors.

### Matrix rank

The number of linearly independent rows of a matrix is known as the row rank of the matrix.

Conversely, since each column is also a vector, the number of linearly independent columns of a matrix is known as the column rank of the matrix.

A fundamental result of linear algebra states that the row rank and column rank of any matrix are always equal. This common number of independent rows or columns is simply referred to as the rank of the matrix. The proof is out of scope here, but it is worth checking out.

So, getting back to solving linear equations, to solve $\mA \vx = \vb$, where $\vx \in \real^n$, $\mA$ needs to have $n$ linearly independent rows. In other words, $\textbf{rank} ( \mA ) = n$.
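As a quick numerical check (a sketch using NumPy, which the article does not itself rely on), the rank of a matrix equals the rank of its transpose, reflecting the equality of row rank and column rank, and a rank of $n$ guarantees a unique solution:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Row rank and column rank are always equal, so NumPy exposes a
# single rank; in particular rank(A) == rank(A^T).
rank_A = np.linalg.matrix_rank(A)
rank_AT = np.linalg.matrix_rank(A.T)

# Since rank(A) == n == 3, the system A x = b has a unique solution.
x = np.linalg.solve(A, np.array([1.0, 2.0, 3.0]))
```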

## Properties of matrix rank

Some interesting facts about matrix rank.

• If a matrix has any nonzero element, then its rank is at least $1$.
• A zero matrix, with all elements equal to $0$, has rank $0$.
• The maximum possible rank of a matrix is the lesser of its number of rows and its number of columns.

$$\text{rank}(\mA) \le \min(m, n), \quad \mA \in \real^{m \times n}$$
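This bound is easy to observe numerically. A short NumPy sketch (our own illustrative matrix, not from the article):

```python
import numpy as np

# A 2x4 matrix can have at most min(2, 4) = 2 independent
# rows or columns, so its rank is at most 2.
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [5.0, 6.0, 7.0, 8.0]])
r = np.linalg.matrix_rank(A)
bound = min(A.shape)
```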

## Fullness and deficiency

If every row of a matrix is linearly independent of the rest, then it is called a full row rank matrix. Similarly, if every column of a matrix is linearly independent of the rest, then it is referred to as a full column rank matrix.

A matrix is said to be full rank if it has the maximum achievable rank for a matrix of its size, that is, its rank equals the lesser of its number of rows and columns. In particular, a full-rank square matrix has rank equal to both its number of rows and its number of columns.

A matrix that is not full rank is said to be rank deficient.

We know from our previous discussion on Gaussian elimination that we need a full-rank matrix to arrive at a solution.
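To tie this back to solving equations, here is a hedged NumPy sketch (the matrices and the use of NumPy are our own choices): a full-rank square matrix yields a unique solution, while a rank-deficient one makes the solver fail.

```python
import numpy as np

full = np.array([[1.0, 2.0],
                 [3.0, 4.0]])       # rank 2: full rank
deficient = np.array([[1.0, 2.0],
                      [2.0, 4.0]])  # rank 1: rank deficient

is_full_rank = np.linalg.matrix_rank(full) == min(full.shape)
is_deficient = np.linalg.matrix_rank(deficient) < min(deficient.shape)

# Only the full-rank matrix yields a unique solution; for the
# singular (rank-deficient) matrix, the solver raises LinAlgError.
try:
    np.linalg.solve(deficient, np.array([1.0, 1.0]))
    solvable = True
except np.linalg.LinAlgError:
    solvable = False
```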

## Where to next?

With a sound understanding of matrix rank, you are ready to explore other advanced topics in linear algebra.

Already feeling like an expert in linear algebra? Move on to other advanced topics in mathematics or machine learning.