The deeper meaning of matrix transposition

100k Q&A: https://forms.gle/dHnWwszzfHUqFKny7

Transposing isn't just swapping rows and columns; it's really about changing perspective while making the same measurements. By understanding how to transpose a general linear map, we can visualize matrix transposition much more directly. We will also rely heavily on the concept of covectors, and touch lightly on metric tensors in special/general relativity and on adjoints in quantum mechanics.

To my knowledge, this way of visualizing transposition is original. Most people use the SVD (singular value decomposition) for such a visualization, but I think it is much less direct than this, and the SVD is mainly a tool for numerical methods, so it seems somewhat unnatural to explain linear transformations through a numerical method (of course, the SVD is extremely useful). Please let me know if you know of other people who have used this specific visualization.

The concept I'm introducing here is usually called the "pullback" (and the original linear transformation would in fact be called the "pushforward"), but as shown in the video, another way of thinking about transposition is the notion of the "adjoint".
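
To make the "same measurement, different perspective" idea concrete, here is a minimal numpy sketch (my own illustration, not from the video) of the standard defining property of the transpose of a linear map: measuring Av with a covector w gives the same number as measuring v with Aᵀw.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # a linear map from R^2 to R^3
v = rng.standard_normal(2)        # a vector in the source space
w = rng.standard_normal(3)        # components of a covector on the target space

# Same measurement, two perspectives:
# measuring A v with w equals measuring v with A^T w.
lhs = w @ (A @ v)        # phi(A v)
rhs = (A.T @ w) @ v      # (A^T phi)(v)
print(np.isclose(lhs, rhs))   # True
```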

Remarks:
(1) I call a covector a "measuring device", not only because its level-set representation looks like a ruler when you take a strip of the plane, but also because of its connection to quantum mechanics. A "bra" in quantum mechanics is a covector and can be thought of as a "measurement", in the sense of "how likely is it that you measure this state" (sort of).
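
A toy numpy illustration of this "measuring device" picture (my own example, not from the video): a bra acts on a ket by conjugating its components and summing, and the squared magnitude of that number is the Born-rule probability of "measuring this state".

```python
import numpy as np

# A normalized state |psi> and a normalized "measurement" state |phi> in C^2.
psi = np.array([1, 1j]) / np.sqrt(2)
phi = np.array([1, 0], dtype=complex)

# The bra <phi| is the covector v -> <phi|v>; np.vdot conjugates its first argument.
amplitude = np.vdot(phi, psi)
probability = abs(amplitude) ** 2     # Born rule: chance of finding psi in state phi
print(amplitude, probability)         # roughly (0.707+0j) and 0.5
```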

(2) I deliberately do not use row vectors to describe covectors, not only because that only works in finite-dimensional spaces, but also because it makes the order awkward when we say that the transpose matrix *acts* on the covector. We usually apply transformations on the *left*, but if you treat the covector as a row vector, the transpose matrix has to act on the *right*.
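
A short numpy sketch of this ordering issue (my own illustration): with the row-vector convention the matrix multiplies the covector on the right, while with the transpose it acts on the left like any other map; the resulting components agree.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
w = np.array([1.0, -1.0])    # components of a covector

as_row    = w @ A            # row-vector convention: the matrix acts on the *right*
as_column = A.T @ w          # transpose convention: the map acts on the *left*
print(np.allclose(as_row, as_column))   # True: same components, different bookkeeping
```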

(3) You can do a sort of "exercise" to check this transpose visualization for all (non-singular) matrices, but I think the algebra is a bit too tedious. This is why I spent a lot of time talking about the overview of transposition – to make the explanation as natural as possible.
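
If you prefer a numerical spot-check to the tedious algebra, here is one possible sketch (my own, with an arbitrary level value c): sample points on a level line of Aᵀw and verify that A sends them onto the corresponding level line of w.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 2))        # a (generically non-singular) 2x2 matrix
w = rng.standard_normal(2)             # covector on the target plane
c = 1.7                                # which level line ("ruler marking") to test

wT = A.T @ w                           # the pulled-back covector A^T w
v0 = c * wT / (wT @ wT)                # a point on the line {v : (A^T w) . v = c}
direction = np.array([-wT[1], wT[0]])  # direction along that line
points = v0 + np.outer(np.linspace(-5, 5, 11), direction)

before = points @ wT                   # (A^T w) . v  for each sampled v
after  = (points @ A.T) @ w            # w . (A v)    for each mapped point A v
print(np.allclose(before, c), np.allclose(after, c))   # True True
```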

Further reading:

**GENERAL**

(a) Transpose of a linear map (Wikipedia)
https://en.wikipedia.org/wiki/Transpose_of_a_linear_map

(b) Vector space not isomorphic to its dual (for infinite-dimensional vector spaces):
https://math.stackexchange.com/questions/35779/what-can-be-said-about-the-dual-space-of-an-infinite-dimensional-real-vector-spa
https://math.stackexchange.com/questions/58548/why-are-vector-spaces-not-isomorphic-to-their-duals/58598#58598

**RELATIVITY**

(a) Metric/inverse metric as vector-covector correspondence: https://en.wikipedia.org/wiki/Raising_and_lowering_indices
https://en.wikipedia.org/wiki/Minkowski_space#Raising_and_lowering_of_indices
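
A minimal numpy sketch of index lowering/raising (standard bookkeeping, assuming the (-, +, +, +) sign convention for the Minkowski metric): the metric turns contravariant components into covariant ones, and the inverse metric undoes it.

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])       # Minkowski metric, signature (-, +, +, +)

v_upper = np.array([2.0, 1.0, 0.0, 0.0])   # contravariant components v^mu
v_lower = eta @ v_upper                    # lowering: v_mu = eta_{mu nu} v^nu
print(v_lower)                             # [-2.  1.  0.  0.]

print(np.linalg.inv(eta) @ v_lower)        # raising with the inverse metric recovers v^mu
```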

**ADJOINTS**

(a) Inner products (the prerequisite for defining adjoints; the analogue of the dot product in Euclidean space): https://en.wikipedia.org/wiki/Inner_product_space

(b) Adjoints (another way of thinking about transposition, though I think of it mainly as the complex analogue of transposition; see the small numerical check after item (d)): https://en.wikipedia.org/wiki/Hermitian_adjoint

(c) Riesz representation theorem (more relevant to adjoints; regarding the statement that "we choose certain covectors to act on", the dual here is the "continuous" dual, which is very relevant in QM): https://en.wikipedia.org/wiki/Riesz_representation_theorem

(d) Self-adjoint operators (Hermitian operators in QM, but also useful in Sturm-Liouville theory in ODEs):
https://en.wikipedia.org/wiki/Self-adjoint_operator
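
For items (b) and (d), here is a small numpy check (my own sketch, not from the video): the Hermitian adjoint is the conjugate transpose, it satisfies the same "same measurement" relation with respect to the complex inner product, and a self-adjoint matrix has real eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Adjoint = conjugate transpose:  <w, A v> = <A* w, v>
# (np.vdot conjugates its first argument, matching the physicists' inner product).
print(np.isclose(np.vdot(w, A @ v), np.vdot(A.conj().T @ w, v)))   # True

# A self-adjoint (Hermitian) matrix equals its own conjugate transpose and has
# real eigenvalues; this is one reason observables in QM are modelled by such operators.
H = A + A.conj().T
print(np.allclose(np.linalg.eigvals(H).imag, 0.0))                 # True (up to rounding)
```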

Video chapters:
00:00 Introduction
00:56 Chapter 1: Overview
04:29 Chapter 2: Visualizing covectors
09:32 Chapter 3: Visualizing the transposition
16:18 Two more examples of transposition
19:51 Chapter 4: Subtleties (special relativity?)

In addition to commenting on the video, you are welcome to fill out the Google Form linked below, which helps me create better videos by taking your math level into account:
https://forms.gle/QJ29hocF9uQAyZyH6

If you want to learn more about interesting math, stay tuned for the next video!

Subscribe and see you in the next video!

If you're wondering how I make all of these videos: even though they are stylistically similar to 3Blue1Brown's, I don't use his animation engine Manim, but I'll probably reveal how I make them in a step-by-step tutorial at a potential subscriber milestone, so do subscribe!

Social networks:

Facebook: https://www.facebook.com/mathemaniacyt
Instagram: https://www.instagram.com/_mathemaniac_/
Twitter: https://twitter.com/mathemaniacyt
Patreon: https://www.patreon.com/mathemaniac (support if you want it and can afford it!)
Merch: https://mathemaniac.myspreadshop.co.uk
Ko-fi: https://ko-fi.com/mathemaniac (for one-off support)

For my contact email, see my channel's About page (on desktop).

See you next time!

Please share this video with your friends and family if you find it useful.
