
First, we claim that $\{Av_{k+1}, \cdots, Av_n\}$ is a linearly independent set of vectors. To
see this, suppose that $\alpha_{k+1}, \cdots, \alpha_n \in \mathbb{F}$ are such that
\[
\alpha_{k+1} A v_{k+1} + \cdots + \alpha_n A v_n = 0\,.
\]

Then $A(\alpha_{k+1} v_{k+1} + \cdots + \alpha_n v_n) = 0$, which implies that $\alpha_{k+1} v_{k+1} + \cdots + \alpha_n v_n \in \mathrm{null}(A)$.
Since $\{v_1, \cdots, v_k\}$ is a basis of $\mathrm{null}(A)$, there exist $\alpha_1, \cdots, \alpha_k \in \mathbb{F}$ such that
\[
\alpha_1 v_1 + \cdots + \alpha_k v_k = \alpha_{k+1} v_{k+1} + \cdots + \alpha_n v_n\,.
\]

By the linear independence of $\{v_1, \cdots, v_n\}$, we must have $\alpha_1 = \cdots = \alpha_n = 0$, which shows
the linear independence of $\{Av_{k+1}, \cdots, Av_n\}$.

Let $w \in R(A)$. Then $w = Av$ for some $v \in \mathbb{F}^n$. Since $\{v_1, \cdots, v_n\}$ is a basis of $\mathbb{F}^n$, there
exist $\beta_1, \cdots, \beta_n \in \mathbb{F}$ such that $v = \beta_1 v_1 + \cdots + \beta_n v_n$. As a consequence, by the fact that
$Av_j = 0$ for $1 \le j \le k$,
\[
w = Av = A(\beta_1 v_1 + \cdots + \beta_n v_n) = \beta_1 A v_1 + \cdots + \beta_n A v_n = \beta_{k+1} A v_{k+1} + \cdots + \beta_n A v_n\,;
\]
thus $w$ can be written as a linear combination of $\{Av_{k+1}, \cdots, Av_n\}$. $\square$
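
As a quick numerical illustration of the construction above (a sketch that is not part of the original text, assuming $\mathbb{F} = \mathbb{R}$ and the availability of NumPy/SciPy; the matrix \texttt{A} and the greedy basis-extension step are ad hoc choices for this example), one can compute a basis of $\mathrm{null}(A)$, extend it to a basis of $\mathbb{R}^n$, and verify that the images of the added vectors are linearly independent and span $R(A)$:

\begin{verbatim}
import numpy as np
from scipy.linalg import null_space

# Ad hoc 3x3 example with a nontrivial null space (rank 2, nullity 1).
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])

N = null_space(A)              # columns v_1, ..., v_k: a basis of null(A)
n, k = A.shape[1], N.shape[1]

# Extend {v_1, ..., v_k} to a basis {v_1, ..., v_n} of R^n by appending
# standard basis vectors whenever they enlarge the span.
basis = [N[:, j] for j in range(k)]
for e in np.eye(n):
    if np.linalg.matrix_rank(np.column_stack(basis + [e])) > len(basis):
        basis.append(e)
V = np.column_stack(basis)     # columns v_1, ..., v_n form a basis of R^n

# The images A v_{k+1}, ..., A v_n are linearly independent and span R(A).
images = A @ V[:, k:]
assert np.linalg.matrix_rank(images) == n - k   # linear independence
assert np.linalg.matrix_rank(A) == n - k        # rank(A) + nullity(A) = n
print("nullity(A) =", k, " rank(A) =", np.linalg.matrix_rank(A))
\end{verbatim}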

Theorem 1.48. The rank of a matrix is the same as the rank of its transpose. In other
words, for a given matrix the row rank equals the column rank.

Proof. Let $A$ be an $m \times n$ matrix, and let $(\cdot,\cdot)_{\mathbb{F}^n}$, $(\cdot,\cdot)_{\mathbb{F}^m}$ be the standard inner products on $\mathbb{F}^n$
and $\mathbb{F}^m$, respectively. Then Proposition 1.43 implies that
\[
y \in R(A)^{\perp} \Leftrightarrow (y, Ax)_{\mathbb{F}^m} = 0 \text{ for all } x \in \mathbb{F}^n \Leftrightarrow (A^{\mathrm{T}} y, x)_{\mathbb{F}^n} = 0 \text{ for all } x \in \mathbb{F}^n
\Leftrightarrow A^{\mathrm{T}} y = 0 \Leftrightarrow y \in \mathrm{null}(A^{\mathrm{T}})\,.
\]

In other words, $R(A)^{\perp} = \mathrm{null}(A^{\mathrm{T}})$. Since the column rank of $A$ is the dimension of $R(A)$,
we must have
\[
\mathrm{nullity}(A^{\mathrm{T}}) = \dim\big(\mathrm{null}(A^{\mathrm{T}})\big) = \dim\big(R(A)^{\perp}\big) = m - \text{the column rank of } A\,.
\]

On the other hand, Theorem 1.47 implies that

\[
\mathrm{rank}(A^{\mathrm{T}}) + \mathrm{nullity}(A^{\mathrm{T}}) = m\,;
\]
thus $\mathrm{rank}(A^{\mathrm{T}})$, which is the column rank of $A^{\mathrm{T}}$ and hence the row rank of $A$, equals
the column rank of $A$. $\square$
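
To double-check Theorem 1.48 and the identity $R(A)^{\perp} = \mathrm{null}(A^{\mathrm{T}})$ numerically (again a sketch outside the original text, assuming $\mathbb{F} = \mathbb{R}$ and NumPy/SciPy; the $4 \times 3$ matrix \texttt{A} is an arbitrary illustrative choice), one can verify that $\mathrm{rank}(A) = \mathrm{rank}(A^{\mathrm{T}})$ and that $\mathrm{nullity}(A^{\mathrm{T}}) = m - \mathrm{rank}(A)$:

\begin{verbatim}
import numpy as np
from scipy.linalg import null_space

# Ad hoc 4x3 example (m = 4, n = 3) of rank 2.
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 3., 1.],
              [2., 5., 1.]])

m, n = A.shape
r = np.linalg.matrix_rank(A)

# Theorem 1.48: the row rank equals the column rank.
assert r == np.linalg.matrix_rank(A.T)

# R(A)^perp = null(A^T) and nullity(A^T) = m - rank(A).
N = null_space(A.T)             # columns: a basis of null(A^T)
assert N.shape[1] == m - r
assert np.allclose(A.T @ N, 0)  # the columns of N are annihilated by A^T ...
assert np.allclose(N.T @ A, 0)  # ... hence orthogonal to every column of A
print("rank(A) =", r, " rank(A^T) =", np.linalg.matrix_rank(A.T),
      " nullity(A^T) =", N.shape[1])
\end{verbatim}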