# Rank–nullity theorem

In mathematics, the **rank–nullity theorem** of linear algebra, in its simplest form, states that the rank and the nullity of a matrix add up to the number of columns of the matrix. Specifically, if *A* is an *m*-by-*n* matrix (with *m* rows and *n* columns) over some field, then^{[1]}

- rank(*A*) + nullity(*A*) = *n*.
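The statement is easy to check numerically. A minimal sketch using NumPy (the matrix values are purely illustrative): the nullity is computed as the number of columns minus the number of nonzero singular values.

```python
import numpy as np

# Illustrative 3x4 matrix: m = 3 rows, n = 4 columns.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],   # a multiple of row 1
              [1., 0., 1., 0.]])

m, n = A.shape
rank = np.linalg.matrix_rank(A)

# Nullity = dimension of the null space = n minus the number of
# (numerically) nonzero singular values.
s = np.linalg.svd(A, compute_uv=False)
nullity = n - int(np.sum(s > 1e-10))

print(rank, nullity, n)  # rank + nullity == n
```

Here rank(*A*) = 2 and nullity(*A*) = 2, which indeed sum to *n* = 4.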

This applies to linear maps as well. Let *V* and *W* be vector spaces over some field and let *T* : *V* → *W* be a linear map. Then the rank of *T* is the dimension of the image of *T* and the nullity of *T* is the dimension of the kernel of *T*, so we have

- rank(*T*) + nullity(*T*) = dim(*V*),

or, equivalently,

- dim(im *T*) + dim(ker *T*) = dim(*V*).
One can refine this statement (via the splitting lemma or the below proof) to be a statement about an isomorphism of spaces, not just dimensions.

More generally, one can consider the image, kernel, coimage, and cokernel, which are related by the fundamental theorem of linear algebra.

## Proofs

We give two proofs. The first proof uses notations for linear transformations, but can be easily adapted to matrices by writing *T*(**x**) = **Ax**, where **A** is *m* × *n*. The second proof looks at the homogeneous system **Ax** = **0** associated with an *m* × *n* matrix **A** of rank *r* and shows explicitly that there exists a set of *n* − *r* linearly independent solutions that span the null space of **A**. These proofs are also available in the book by Banerjee and Roy (2014).^{[2]}

**First proof:** Suppose **u**_{1}, ..., **u**_{m} forms a basis of ker *T*. We can extend this to form a basis of *V*: **u**_{1}, ..., **u**_{m}, **w**_{1}, ..., **w**_{n}. Since the dimension of ker *T* is *m* and the dimension of *V* is *m* + *n*, it suffices to show that the dimension of the image of *T* (im *T*) is *n*.

Let us see that *T***w**_{1}, ..., *T***w**_{n} is a basis of im *T*. Let *v* be an arbitrary vector in *V*. There exist unique scalars *a*_{1}, ..., *a*_{m}, *b*_{1}, ..., *b*_{n} such that:

- *v* = *a*_{1}**u**_{1} + ⋯ + *a*_{m}**u**_{m} + *b*_{1}**w**_{1} + ⋯ + *b*_{n}**w**_{n},

and hence, since *T***u**_{i} = **0** for each *i*,

- *T*(*v*) = *b*_{1}*T***w**_{1} + ⋯ + *b*_{n}*T***w**_{n}.

Thus, *T***w**_{1}, ..., *T***w**_{n} spans im *T*.

Now, we need only show that this list is not redundant; that is, that *T***w**_{1}, ..., *T***w**_{n} are linearly independent. We can do this by showing that a linear combination of these vectors is zero if and only if the coefficient on each vector is zero. Let:

- *c*_{1}*T***w**_{1} + ⋯ + *c*_{n}*T***w**_{n} = **0**, so that *T*(*c*_{1}**w**_{1} + ⋯ + *c*_{n}**w**_{n}) = **0**

and therefore *c*_{1}**w**_{1} + ⋯ + *c*_{n}**w**_{n} ∈ ker *T*. Then, since the **u**_{i} span ker *T*, there exists a set of scalars *d*_{i} such that:

- *c*_{1}**w**_{1} + ⋯ + *c*_{n}**w**_{n} = *d*_{1}**u**_{1} + ⋯ + *d*_{m}**u**_{m}.

But, since **u**_{1}, ..., **u**_{m}, **w**_{1}, ..., **w**_{n} form a basis of *V*, all *c*_{i} and *d*_{i} must be zero. Therefore, *T***w**_{1}, ..., *T***w**_{n} is linearly independent and indeed a basis of im *T*. This proves that the dimension of im *T* is *n*, as desired.

In more abstract terms, the map *T* : *V* → im *T* *splits*.
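The construction in this proof can be mirrored numerically: a kernel basis of a concrete matrix is extended to a basis of the domain via the SVD, and the images of the extension vectors are checked to form a basis of the image. A minimal sketch, assuming NumPy, with an illustrative matrix:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 0., 1.]])     # T(x) = A x, a linear map R^3 -> R^2

n = A.shape[1]
r = np.linalg.matrix_rank(A)

# In the full SVD, the last n - r rows of Vt span ker T and the first r
# rows span its orthogonal complement, extending a kernel basis to R^n.
_, _, Vt = np.linalg.svd(A)
U_ker = Vt[r:].T                 # u_1, ..., u_m  (here m = n - r)
W = Vt[:r].T                     # w_1, ..., w_r, the extension vectors

print(np.allclose(A @ U_ker, 0))          # kernel vectors map to 0
print(np.linalg.matrix_rank(A @ W) == r)  # T(w_j) form a basis of im T
```

The orthogonal split provided by the SVD is one concrete choice of the basis extension used in the proof.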

**Second proof:** Let **A** be an *m* × *n* matrix with *r* linearly independent columns (i.e. rank of **A** is *r*). We will show that: (i) there exists a set of *n* − *r* linearly independent solutions to the homogeneous system **Ax** = **0**, and (ii) that every other solution is a linear combination of these *n* − *r* solutions. In other words, we will produce an *n* × (*n* − *r*) matrix **X** whose columns form a basis of the null space of **A**.

Without loss of generality, assume that the first *r* columns of **A** are linearly independent. So, we can write **A** = [**A**_{1}:**A**_{2}], where **A**_{1} is *m* × *r* with *r* linearly independent column vectors and **A**_{2} is *m* × (*n* − *r*), each of whose *n* − *r* columns are linear combinations of the columns of **A**_{1}. This means that **A**_{2} = **A**_{1}**B** for some *r* × (*n* − *r*) matrix **B** (see rank factorization) and, hence, **A** = [**A**_{1}:**A**_{1}**B**]. Let **X** = [−**B** ; **I**_{n−r}] (the block −**B** stacked above the block **I**_{n−r}), where **I**_{n−r} is the (*n* − *r*) × (*n* − *r*) identity matrix. We note that **X** is an *n* × (*n* − *r*) matrix that satisfies

- **AX** = [**A**_{1}:**A**_{1}**B**][−**B** ; **I**_{n−r}] = −**A**_{1}**B** + **A**_{1}**B** = **0**.
Therefore, each of the *n* − *r* columns of **X** is a particular solution of **Ax** = **0**. Furthermore, the *n* − *r* columns of **X** are linearly independent because **Xu** = **0** implies **u** = **0**:

- **Xu** = [−**B** ; **I**_{n−r}]**u** = [−**Bu** ; **u**] = **0** implies **u** = **0**.

Therefore, the column vectors of **X** constitute a set of *n* − *r* linearly independent solutions for **Ax** = **0**.

We next prove that *any* solution of **Ax** = **0** must be a linear combination of the columns of **X**. For this, let **u** = [**u**_{1} ; **u**_{2}] be any vector such that **Au** = **0**, where **u**_{1} has *r* entries and **u**_{2} has *n* − *r* entries. Note that since the columns of **A**_{1} are linearly independent, **A**_{1}**x** = **0** implies **x** = **0**. Therefore,

- **0** = **Au** = [**A**_{1}:**A**_{1}**B**][**u**_{1} ; **u**_{2}] = **A**_{1}**u**_{1} + **A**_{1}**Bu**_{2} = **A**_{1}(**u**_{1} + **Bu**_{2})

implies **u**_{1} + **Bu**_{2} = **0**, so **u**_{1} = −**Bu**_{2} and hence **u** = [−**Bu**_{2} ; **u**_{2}] = **Xu**_{2}.
This proves that any vector **u** that is a solution of **Ax** = **0** must be a linear combination of the *n* − *r* special solutions given by the columns of **X**. And we have already seen that the columns of **X** are linearly independent. Hence, the columns of **X** constitute a basis for the null space of **A**. Therefore, the nullity of **A** is *n* − *r*. Since *r* equals rank of **A**, it follows that rk(**A**) + nul(**A**) = *n*. QED.
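The second proof is constructive and easy to replay in code. A minimal sketch, assuming NumPy, with illustrative choices of **A**_{1} and **B**:

```python
import numpy as np

# Build A = [A1 : A1 B] with r = 2 independent leading columns.
A1 = np.array([[1., 0.],
               [0., 1.],
               [1., 1.]])            # m x r, columns independent
B = np.array([[2., 3.],
              [4., 5.]])             # r x (n - r)
A = np.hstack([A1, A1 @ B])          # 3 x 4 matrix of rank 2

n_minus_r = B.shape[1]
X = np.vstack([-B, np.eye(n_minus_r)])   # X = [-B ; I], blocks stacked

print(np.allclose(A @ X, 0))                  # every column solves Ax = 0
print(np.linalg.matrix_rank(X) == n_minus_r)  # columns are independent
```

The identity block at the bottom of **X** is what forces its columns to be linearly independent, exactly as in the proof.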

## Reformulations and generalizations

This theorem is a statement of the first isomorphism theorem of algebra for the case of vector spaces; it generalizes to the splitting lemma.

In more modern language, the theorem can also be phrased as follows: if

- 0 → *U* → *V* → *R* → 0

is a short exact sequence of vector spaces, then

- dim(*U*) + dim(*R*) = dim(*V*).

Here *R* plays the role of im *T* and *U* is ker *T*, i.e. the sequence 0 → ker *T* → *V* → im *T* → 0 is exact.

In the finite-dimensional case, this formulation is susceptible to a generalization: if

- 0 → *V*_{1} → *V*_{2} → ... → *V*_{r} → 0

is an exact sequence of finite-dimensional vector spaces, then

- dim(*V*_{1}) − dim(*V*_{2}) + ⋯ + (−1)^{r+1} dim(*V*_{r}) = 0.^{[3]}
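One way to see this identity: applying rank–nullity to each map *f*_{i} : *V*_{i} → *V*_{i+1} in the sequence and using exactness (ker *f*_{i} = im *f*_{i−1}) makes the alternating sum telescope. A sketch of the computation:

```latex
% For each map f_i : V_i \to V_{i+1}, rank-nullity gives
%   \dim V_i = \dim \ker f_i + \dim \operatorname{im} f_i,
% and exactness gives \ker f_i = \operatorname{im} f_{i-1}.
\sum_{i=1}^{r} (-1)^i \dim V_i
  = \sum_{i=1}^{r} (-1)^i \bigl( \dim \operatorname{im} f_{i-1}
                               + \dim \operatorname{im} f_i \bigr) = 0,
% since consecutive terms cancel in pairs, with
% \operatorname{im} f_0 = 0 and \operatorname{im} f_r = 0.
```

For *r* = 3 this recovers the short-exact-sequence statement above.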

The rank–nullity theorem for finite-dimensional vector spaces may also be formulated in terms of the *index* of a linear map. The index of a linear map *T* : *V* → *W*, where *V* and *W* are finite-dimensional, is defined by

- index *T* = dim(ker *T*) − dim(coker *T*).

Intuitively, dim(ker *T*) is the number of independent solutions *x* of the equation *Tx* = 0, and dim(coker *T*) is the number of independent restrictions that have to be put on *y* to make *Tx* = *y* solvable. The rank–nullity theorem for finite-dimensional vector spaces is equivalent to the statement

- index *T* = dim(*V*) − dim(*W*).

We see that we can easily read off the index of the linear map *T* from the involved spaces, without any need to analyze *T* in detail. This effect also occurs in a much deeper result: the Atiyah–Singer index theorem states that the index of certain differential operators can be read off the geometry of the involved spaces.

## Notes

- ↑ Meyer (2000), page 199.
- ↑ Banerjee, Sudipto; Roy, Anindya (2014), *Linear Algebra and Matrix Analysis for Statistics*, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
- ↑ Zaman, Ragib. "Dimensions of vector spaces in an exact sequence". *Mathematics Stack Exchange*. Retrieved 27 October 2015.

## References

- Banerjee, Sudipto; Roy, Anindya (2014), *Linear Algebra and Matrix Analysis for Statistics*, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
- Meyer, Carl D. (2000), *Matrix Analysis and Applied Linear Algebra*, SIAM, ISBN 978-0-89871-454-8.