Question from a proof in Axler 2nd Ed, 'Linear Algebra Done Right'

In summary, the conversation revolves around the proof of TH 5.13 in the 2nd edition of Linear Algebra Done Right, which differs from the proof in the 4th edition. The proof involves a linear operator ##T## on a finite dimensional complex vector space ##V## and an eigenvalue ##\lambda## of ##T##. The subspace ##U## is the range of ##T-\lambda I##, and a basis ##\{ u_1,u_2,\dots,u_m, v_1,v_2,\dots,v_k\}## of ##V## is used in which the matrix of ##T-\lambda I## restricted to ##U## is upper triangular. The identity ##Tv_j = (T-\lambda I)v_j + \lambda v_j## then shows that ##Tv_j## lies in the span of ##u_1,\dots,u_m,v_j##, which is what makes the matrix of ##T## upper triangular.
  • #1
Stephen Tashi
Science Advisor
TL;DR Summary
A question about the final step in a proof (by induction) that for each linear operator on a finite dimensional complex vector space there is a basis in which its matrix is upper triangular.
My question is motivated by the proof of TH 5.13 on p 84 in the 2nd edition of Linear Algebra Done Right. (This proof differs from the one in the 4th edition, which is online at https://linear.axler.net/index.html, chapter 5.)

In the proof we arrive at the following situation:
##T## is a linear operator on a finite dimensional complex vector space ##V## and ##\lambda## is an eigenvalue of ##T##. The subspace ##U## is the range of the linear operator ##T - \lambda I##. The set of vectors ##\{ u_1,u_2,\dots,u_m, v_1,v_2,\dots,v_k\}## is a basis for ##V## such that ##\{ u_1,u_2,\dots,u_m\}## is a basis for ##U## and such that the matrix of the operator ##T - \lambda I## restricted to ##U## is upper triangular in that basis.

We have the identity ##Tv_j = (T - \lambda I) v_j + \lambda v_j##. Since ##(T-\lambda I) v_j \in U##, this exhibits ##Tv_j## as the sum of two vectors: the first is a linear combination of the ##u_i##, and the second is ##\lambda v_j##.
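To spell the identity out in coordinates (the coefficients ##b_{i,j}## below are names I am introducing, not Axler's notation): writing ##(T-\lambda I)v_j = b_{1,j}u_1 + \cdots + b_{m,j}u_m##, the identity becomes
$$Tv_j = b_{1,j}u_1 + \cdots + b_{m,j}u_m + \lambda v_j,$$
so the coordinate column of ##Tv_j## in the basis ##\{u_1,\dots,u_m,v_1,\dots,v_k\}## has the ##b_{i,j}## in the first ##m## rows, ##\lambda## in row ##m+j##, and zeros in all remaining rows.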

My question (taking, for example, ##\dim U = m = 2## and ##\dim V = 5##): does this show that the matrix of ##T## has the form
##\begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & a_{1,4} & a_{1,5} \\ 0 & a_{2,2} & a_{2,3} & a_{2,4} & a_{2,5} \\ 0 & 0 & \lambda & 0 & 0 \\ 0 & 0 & 0 & \lambda & 0 \\ 0 & 0 & 0 & 0 & \lambda \end{pmatrix} ## ?

This is not how Axler ends the proof. He makes the less detailed observation that ##Tv_j \in \operatorname{Span} \{u_1,\dots,u_m,v_1,\dots,v_j\}## for ##j = 1,\dots,k##. That property characterizes an upper triangular matrix by a previously proved theorem, TH 5.12.
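As a sanity check on the conjectured form, here is a minimal numerical sketch (my own, not from the book): it hides a 5##\times##5 operator of exactly this shape behind a random change of basis, recovers ##U = \operatorname{range}(T-\lambda I)##, extends a basis of ##U## to a basis of ##V##, and verifies the two lower blocks. The example matrix, seed, and tolerance are arbitrary choices.

```python
import numpy as np

lam = 2.0
# A 5x5 operator already in the conjectured form (m = 2, k = 3),
# conjugated by a random change of basis to hide the structure.
A = np.array([[1., 5., 1., 2., 3.],
              [0., 3., 4., 5., 6.],
              [0., 0., lam, 0., 0.],
              [0., 0., 0., lam, 0.],
              [0., 0., 0., 0., lam]])
rng = np.random.default_rng(0)
S = rng.standard_normal((5, 5))
T = S @ A @ np.linalg.inv(S)          # the same operator in a scrambled basis

# u_1, ..., u_m: an orthonormal basis of U = range(T - lam*I), via the SVD.
M = T - lam * np.eye(5)
left, sing, _ = np.linalg.svd(M)
m = int(np.sum(sing > 1e-10))         # numerical rank; m = 2 here
U = left[:, :m]

# Extend to a basis of V: QR of [U | I] keeps span(U) in the first m columns.
B, _ = np.linalg.qr(np.hstack([U, np.eye(5)]))

# Matrix of T in the basis B (B is orthogonal, so B.T inverts it).
T_B = B.T @ T @ B
k = 5 - m
print(np.allclose(T_B[m:, :m], 0))                 # lower-left block is 0
print(np.allclose(T_B[m:, m:], lam * np.eye(k)))   # lower-right block is lam*I_k
```

Only the two lower blocks are checked here: making the upper-left ##m \times m## block upper triangular is exactly what the induction hypothesis supplies, and this basis of ##U## makes no attempt at that.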
 
  • #2
Pardon me, I did not try to read Axler, but this theorem seems offhand to have a trivial proof: just take ##v_1## as an eigenvector for ##T##, then mod out ##V## by the space spanned by ##v_1##, and take ##v_2## to be a vector representing an eigenvector for the induced map on ##V/\langle v_1\rangle##. Then take ##v_3## to be a vector representing an eigenvector for the induced map on ##V/\langle v_1,v_2\rangle##, and so on. I.e., at each stage ##v_k## is an eigenvector mod the previous vectors, so ##T(v_k) - c_k v_k## is a linear combination of ##v_1,\dots,v_{k-1}##. Is this nonsense?
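For what it's worth, here is a rough numerical rendering of that induction (my own sketch: I identify each quotient ##V/\langle v_1,\dots,v_j\rangle## with the orthogonal complement of the span, so the induced map becomes the compression of ##T## to that complement):

```python
import numpy as np

def triangularize(T):
    # At stage j, pick an eigenvector of the map induced on V/<v_1,...,v_j>,
    # represented by compressing T to the orthogonal complement of the span.
    n = T.shape[0]
    T = T.astype(complex)             # eigenvectors exist: C is algebraically closed
    vs = []
    for j in range(n):
        if vs:
            V = np.column_stack(vs)
            Q, _ = np.linalg.qr(np.hstack([V, np.eye(n)]))
            W = Q[:, j:]              # orthonormal basis of the complement of span(v_1..v_j)
        else:
            W = np.eye(n, dtype=complex)
        induced = W.conj().T @ T @ W  # matrix of the induced map on the quotient
        _, vecs = np.linalg.eig(induced)
        vs.append(W @ vecs[:, 0])     # lift an eigenvector of the quotient map back to V
    return np.column_stack(vs)        # basis in which T is upper triangular

# Usage: any complex square matrix works.
rng = np.random.default_rng(1)
T = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = triangularize(T)
upper = np.linalg.inv(B) @ T @ B
print(np.allclose(np.tril(upper, -1), 0))   # True: strictly lower part vanishes
```

Each ##T(v_k)## differs from ##c_k v_k## by something in ##\langle v_1,\dots,v_{k-1}\rangle##, which is exactly the upper triangular condition.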

But I see your question is otherwise. I think the answer to it is yes.
 
  • #3
Here's my discomfort with the zeroes: if all those zeroes appear, why do we ever need generalized eigenvectors? This is just an intuitive discomfort; I haven't figured out whether a defective matrix couldn't have all those zeroes too.
 
  • #4
I see why my intuition is wrong. For example, the coordinate column of ##Tv_1## is ##\begin{pmatrix} a_{1,3} \\ a_{2,3} \\ \lambda \\ 0 \\ 0 \end{pmatrix}##, which isn't equal to ##\lambda v_1##. So ##v_1## isn't necessarily an eigenvector.
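To make this concrete, here is a small check (my own example, not from the thread) using the 5##\times##5 shape from post #1, with ##a_{1,1} = \lambda## and a nonzero ##a_{1,3}##: the zero pattern survives, yet ##\lambda## is defective, so generalized eigenvectors are still needed.

```python
import numpy as np

lam = 2.0
# The shape from post #1 (m = 2, k = 3); U = range(A - lam*I) = span(e1, e2),
# and the restriction to U is upper triangular with diagonal (lam, 3).
A = np.array([[lam, 5., 1., 0., 0.],
              [0.,  3., 0., 0., 0.],
              [0.,  0., lam, 0., 0.],
              [0.,  0., 0., lam, 0.],
              [0.,  0., 0., 0., lam]])

v1 = np.array([0., 0., 1., 0., 0.])   # the basis vector v_1 (third column)
print(A @ v1)                         # [1. 0. 2. 0. 0.], not lam * v1

# lam appears 4 times on the diagonal (algebraic multiplicity 4), but its
# eigenspace is only 3-dimensional, so A is defective at lam.
geo = 5 - np.linalg.matrix_rank(A - lam * np.eye(5))
print(geo)                            # 3 < 4
```

So the zeroes in the question are compatible with a defective ##\lambda##: the coupling entries ##a_{i,m+j}##, together with ##\lambda## also appearing in the upper-left block, are where a nontrivial Jordan structure can hide.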
 
