Vector Analysis

§1.8 Matrix Diagonalization and the Jordan Forms
Let $\dim(V) = n$ and let $\mathcal{B}$ be a basis of $V$. If $\lambda \in \mathbb{F}$ is an eigenvalue of $L \in \mathscr{B}(V)$, then there exists a non-zero vector $v \in V$ such that
$$[L]_{\mathcal{B}}[v]_{\mathcal{B}} = [Lv]_{\mathcal{B}} = \lambda [v]_{\mathcal{B}}\,;$$
thus the matrix representation $[L]_{\mathcal{B}}$ of $L$ satisfies that $[L]_{\mathcal{B}} - \lambda I_n$ is singular (not invertible). Therefore, $\det([L]_{\mathcal{B}} - \lambda I_n) = 0$, which motivates the following
Definition 1.93. Let $A \in \mathcal{M}(n, n; \mathbb{F})$ be an $n \times n$ matrix over the scalar field $\mathbb{F}$. An eigenvalue of $A$ is a scalar $\lambda \in \mathbb{F}$ such that $\det(A - \lambda I_n) = 0$.
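As a concrete numerical illustration of this definition (the matrix below is an example of mine, not one from the text), one can verify that each eigenvalue $\lambda$ of a matrix $A$ makes $A - \lambda I_n$ singular:

```python
import numpy as np

# A sample 2x2 symmetric matrix (chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Its characteristic polynomial is det(A - lambda*I) = lambda^2 - 4*lambda + 3,
# with roots lambda = 1 and lambda = 3.
eigenvalues = np.linalg.eigvals(A)

# Definition 1.93: each eigenvalue satisfies det(A - lambda*I_n) = 0.
for lam in eigenvalues:
    print(lam, np.linalg.det(A - lam * np.eye(2)))
```

The determinants printed are zero up to floating-point round-off, confirming that $A - \lambda I_2$ is singular at each eigenvalue.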
Theorem 1.94. Let $L \in \mathscr{B}(\mathbb{F}^n)$ be symmetric. Then $\sigma(L) \subseteq \mathbb{R}$.

Proof. Let $\lambda \in \sigma(L)$, and let $v$ be an eigenvector associated with $\lambda$. Then
$$\lambda (v,v)_{\mathbb{F}^n} = (\lambda v, v)_{\mathbb{F}^n} = (Lv, v)_{\mathbb{F}^n} = (v, L^{*}v)_{\mathbb{F}^n} = (v, Lv)_{\mathbb{F}^n} = (v, \lambda v)_{\mathbb{F}^n} = \bar{\lambda}(v,v)_{\mathbb{F}^n}.$$
Since $v \neq 0$, we have $(v,v)_{\mathbb{F}^n} = \|v\|_{\mathbb{F}^n}^2 \neq 0$; thus $\lambda = \bar{\lambda}$, which implies that $\lambda \in \mathbb{R}$. $\square$
Lemma 1.95. Let $L \in \mathscr{B}(\mathbb{F}^n)$ be symmetric, and let $(\cdot,\cdot)_{\mathbb{F}^n}$ be the standard inner product on $\mathbb{F}^n$. Then the two numbers
$$ m \equiv \inf_{\|u\|_{\mathbb{F}^n}=1} (Lu, u)_{\mathbb{F}^n} \qquad\text{and}\qquad M \equiv \sup_{\|u\|_{\mathbb{F}^n}=1} (Lu, u)_{\mathbb{F}^n} $$
belong to $\sigma(L)$.
Proof. Suppose that $M \notin \sigma(L)$. Let $[u,v] = (Mu - Lu, v)_{\mathbb{F}^n}$. Then $[\cdot,\cdot]$ is an inner product on $\mathbb{F}^n$; thus the Cauchy–Schwarz inequality (Proposition 1.16) implies that
$$ \big|[u,v]\big| \le \big|[u,u]\big|^{1/2}\big|[v,v]\big|^{1/2}\,. $$
By Theorem 1.25, we find that
$$ \|Mu - Lu\|_{\mathbb{F}^n} = \sup_{\|v\|_{\mathbb{F}^n}=1} \big|(Mu - Lu, v)_{\mathbb{F}^n}\big| = \sup_{\|v\|_{\mathbb{F}^n}=1} \big|[u,v]\big| \le \sup_{\|v\|_{\mathbb{F}^n}=1} \big|[u,u]\big|^{1/2}\big|[v,v]\big|^{1/2} $$
$$ \le (M - m)^{1/2}(Mu - Lu, u)_{\mathbb{F}^n}^{1/2} \qquad \forall\, u \in \mathbb{F}^n\,, \tag{1.5} $$
where we use the fact that $\sup\limits_{\|v\|_{\mathbb{F}^n}=1} \big|[v,v]\big|^{1/2} = (M - m)^{1/2}$ to conclude the last inequality.
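The quantities $m$ and $M$ in Lemma 1.95 can be explored numerically by sampling the quadratic form $(Lu,u)_{\mathbb{F}^n}$ over random unit vectors; in the symmetric case they coincide with the extreme eigenvalues of $L$ (a minimal sketch with a randomly generated matrix of my own choosing, not an example from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

# A random real symmetric matrix L = (B + B^T)/2.
B = rng.standard_normal((5, 5))
L = (B + B.T) / 2

# Sample the quadratic form (Lu, u) over many random unit vectors u.
U = rng.standard_normal((5, 100_000))
U /= np.linalg.norm(U, axis=0)
rayleigh = np.einsum('ij,ij->j', U, L @ U)

m_hat, M_hat = rayleigh.min(), rayleigh.max()

# Lemma 1.95: m and M belong to sigma(L); here they are the extreme eigenvalues,
# so every sampled value of (Lu, u) lies between min(sigma(L)) and max(sigma(L)).
eigs = np.sort(np.linalg.eigvalsh(L))
print(eigs[0], m_hat, M_hat, eigs[-1])
```

Random sampling only approximates the infimum and supremum from inside, so `m_hat` and `M_hat` lie within $[\min\sigma(L), \max\sigma(L)]$ and approach the endpoints as more unit vectors are drawn.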