Advanced Linear Algebra - Lecture 25: Schur Triangularization

12,879 views

Nathaniel Johnston

4 years ago

We learn about Schur triangularization, which tells us how simple we can make matrices under unitary similarity transformations. We then use it to prove that the determinant of a matrix is the product of its eigenvalues, and the trace of a matrix is the sum of its eigenvalues.
Textbook: www.njohnston.ca/publications/...
Blank course notes (lectures 25-29): www.njohnston.ca/ala_week7.pdf
Annotated course notes (lectures 25-29): www.njohnston.ca/ala_week7_ann...
Please leave a comment below if you have any questions, comments, or corrections.
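A quick numerical check of the two facts from the description (determinant = product of the eigenvalues, trace = their sum), sketched in Python; the random matrix and the use of SciPy's scipy.linalg.schur are illustrative choices, not taken from the lecture:

import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Schur triangularization: A = U T U*, with U unitary and T upper triangular.
T, U = schur(A, output="complex")

# The diagonal entries of T are the eigenvalues of A.
eigs = np.diag(T)

# det(A) equals the product of the eigenvalues; tr(A) equals their sum.
print(np.allclose(np.linalg.det(A), np.prod(eigs)))   # True
print(np.allclose(np.trace(A), np.sum(eigs)))         # True

# Sanity check that the decomposition reconstructs A.
print(np.allclose(A, U @ T @ U.conj().T))             # True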

Comments: 27
@serajabusalah3393 • 1 day ago
I don't know how this man ISN'T FAMOUS! Great job, Mr. Nathaniel. Keep it going!!!
@mr-vj6do • 2 years ago
Very clear and rigorous. Really well done!
@caseyj1144 • 3 years ago
This is such an amazing explanation! Thank you 🙏🏾🙏🏾
@user-tq1re1es6x • 7 months ago
You are such a brilliant teacher!
@boeingb0y0rushia19 • 1 year ago
Bro is a legend. He can rap through the whole lecture and a non-native speaker like me can still understand the content.
@hamzaahmedmir • 3 years ago
Thank you so much for this.
@Noah-jz3gt • 1 year ago
Great! Thanks a lot for the amazingly clear and well-explained lecture!
@user-uc6qp6cp4x • 1 year ago
Marvelous explanation!
@elotimmi4942 • 2 months ago
He looks like Alexander McQueen!! Love the lectures.
@MathForLife • 2 years ago
Amazing video! Thank you!
@parthagrawal2177 • 3 years ago
Thanks for the video
@advancedappliedandpuremath • 1 year ago
Great explanation, Sir, thanks.
@abbeleon • 11 months ago
Block matrices are gifts that keep on giving. Proofs here would otherwise be obscured by a sea of sigmas and indices.
@jaweriarafiq. • 1 year ago
You are just great.
@sheldonchou8830 • 11 months ago
Hi, why is the trace operation commutative for matrices?
@elhoplita69 • 1 year ago
Hi, how do you know that the remaining (n-1)×(n-1) matrix is unitary?
@matthewtandingan9704 • 3 years ago
Hello, is there another way to prove this by using the direct sum decomposition of the eigenspace corresponding to A and its orthogonal complement?
@NathanielMath • 3 years ago
Yep. In a sense, that's what we're doing when we construct the unitary matrix U = [v | V]. The first column, v, spans the eigenspace, and the rest of the columns, V, span the orthogonal complement of that eigenspace.
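A minimal NumPy sketch of that construction (the example matrix and the QR-based completion of v to a unitary U are illustrative assumptions, not how it is done in the video):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Take one eigenvalue/eigenvector pair and normalize the eigenvector.
lam, vecs = np.linalg.eig(A)
v = vecs[:, [0]] / np.linalg.norm(vecs[:, 0])

# QR applied to [v | I] gives a unitary matrix whose first column spans
# span{v}; the remaining columns span the orthogonal complement.
U, _ = np.linalg.qr(np.hstack([v, np.eye(4)]))

# U* A U has the block form [[lambda, *], [0, B]].
M = U.conj().T @ A @ U
print(np.allclose(M[1:, 0], 0))       # True: zeros below the (1,1) entry
print(np.allclose(M[0, 0], lam[0]))   # True: the (1,1) entry is the eigenvalue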
@eddieseabrook8614 • 3 years ago
Hi there. This video was incredibly helpful, thanks so much for uploading it. I have a question which I have tried to find the answer to from multiple sources online but haven't had any luck so far. As you mentioned in the video, the detailed structure of T and Q is not unique for the Schur triangularization. One implication of this is that the ordering of the eigenvalues along the diagonal of T is not fixed. But how about the case when A has repeated eigenvalues (e.g., a double eigenvalue of 2)? Will these two eigenvalues always be neighbours on the diagonal of T, regardless of which algorithm we use to compute the decomposition? In particular, I am considering the real Schur decomposition, where a real matrix A = QTQ^T for some orthogonal matrix Q. Then the rule is that along the diagonal of T we now have 1x1 and 2x2 blocks, the former being real eigenvalues of A and the latter being 2x2 matrices corresponding to any complex eigenvalues of A. However, I think the same answer to the above question would apply to this case, except obviously that you cannot split up any of the 2x2 blocks that represent complex conjugate pairs of eigenvalues.
@NathanielMath • 3 years ago
Hi Eddie. Even if the eigenvalues are repeated, they need not be next to each other in a Schur triangularization (or in a 2x2 block diagonalization). The reason for this is simply that any triangular matrix is already in Schur triangular form, so if you purposely create a triangular matrix with repeated entries not next to each other, then that's a valid counterexample. However, particular implementations of algorithms that are used to compute Schur triangularizations might favor putting repeated eigenvalues next to each other (I wouldn't be surprised if most do, but I don't know for certain that *all* do).
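A concrete instance of that counterexample, sketched in Python (the particular 3x3 matrix is just an assumed example):

import numpy as np

# This upper triangular matrix is already a Schur triangularization of itself
# (take U = I), yet its repeated eigenvalue 2 sits at positions (1,1) and (3,3),
# with the eigenvalue 5 in between.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 5.0, 3.0],
              [0.0, 0.0, 2.0]])
U = np.eye(3)

print(np.allclose(T, U @ T @ U.T))   # True: T = U T U* holds trivially
print(np.diag(T))                    # [2. 5. 2.]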
@eddieseabrook8614 • 3 years ago
@@NathanielMath Thanks a lot for the swift response. This explains the issue fully for me, much appreciated! One other small question. As I understand it, there are some algorithms for ordering Q and T so that the eigenvalues on the diagonal of T become ordered. If we were to do this, for example so that the eigenvalues are ordered in decreasing order, is there any relationship between the subspace spanned by the top Schur vectors and the subspace spanned by the top eigenvectors?
@NathanielMath • 3 years ago
@@eddieseabrook8614 I don't believe so. I think it would be very difficult to get a relationship like that since, for example, the matrix might not have a full set of n linearly independent eigenvectors. So the sum total of all eigenspaces might just be 1-dimensional, while the set of all Schur vectors spans the entire n-dimensional space.
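A small Python illustration of that point, using a 3x3 Jordan block as an assumed example (and SciPy's scipy.linalg.schur):

import numpy as np
from scipy.linalg import schur

# Jordan block: its only eigenvalue is 2, with a 1-dimensional eigenspace.
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])

# rank(J - 2I) = 2, so the eigenspace has dimension 3 - 2 = 1.
print(np.linalg.matrix_rank(J - 2 * np.eye(3)))   # 2

# The Schur vectors (columns of U) still form an orthonormal basis of all of R^3.
T, U = schur(J)
print(np.allclose(U.T @ U, np.eye(3)))            # True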
@eddieseabrook8614 • 3 years ago
@@NathanielMath Okay, that makes sense. Thanks a lot for your comments! The video was also very helpful :)
@advancedappliedandpuremath • 1 year ago
Sir, can we have a lower triangularization?
@NathanielMath • 1 year ago
Yep, to get a lower triangularization of a matrix A, just compute an upper triangularization of its transpose A^T.
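A sketch of that trick in Python (the random matrix is an assumed example; for complex matrices the transpose, not the conjugate transpose, is the one to take):

import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Upper triangularize the transpose: A^T = U T U*, with T upper triangular.
T, U = schur(A.T, output="complex")

# Transposing (without conjugating) gives A = V L V*, with L lower triangular
# and V = conj(U) still unitary.
L = T.T
V = U.conj()

print(np.allclose(A, V @ L @ V.conj().T))   # True
print(np.allclose(L, np.tril(L)))           # True: L is lower triangular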
@advancedappliedandpuremath • 1 year ago
@@NathanielMath Great, Sir, thanks.
@advancedappliedandpuremath • 1 year ago
One last thing I want to ask, Sir: is there another way to get an upper triangular similar matrix? Say, for diagonalizable matrices, can we find T some other way easily?