What is a tensor anyway?? (from a mathematician)

167,956 views

Michael Penn

2 years ago

Suggest a problem: forms.gle/ea7Pw7HcKePGB4my5
Please Subscribe: kzfaq.info...
Patreon: / michaelpennmath
Merch: teespring.com/stores/michael-...
Personal Website: www.michael-penn.net
Randolph College Math: www.randolphcollege.edu/mathem...
Randolph College Math and Science on Facebook: / randolph.science
Research Gate profile: www.researchgate.net/profile/...
Google Scholar profile: scholar.google.com/citations?...
If you are going to use an ad-blocker, consider using Brave and tipping me BAT!
brave.com/sdp793
Buy textbooks here and help me out: amzn.to/31Bj9ye
Buy an amazon gift card and help me out: amzn.to/2PComAf
Books I like:
Sacred Mathematics: Japanese Temple Geometry: amzn.to/2ZIadH9
Electricity and Magnetism for Mathematicians: amzn.to/2H8ePzL
Abstract Algebra:
Judson(online): abstract.ups.edu/
Judson(print): amzn.to/2Xg92wD
Dummit and Foote: amzn.to/2zYOrok
Gallian: amzn.to/2zg4YEo
Artin: amzn.to/2LQ8l7C
Differential Forms:
Bachman: amzn.to/2z9wljH
Number Theory:
Crisman(online): math.gordon.edu/ntic/
Strayer: amzn.to/3bXwLah
Andrews: amzn.to/2zWlOZ0
Analysis:
Abbott: amzn.to/3cwYtuF
How to think about Analysis: amzn.to/2AIhwVm
Calculus:
OpenStax(online): openstax.org/subjects/math
OpenStax Vol 1: amzn.to/2zlreN8
OpenStax Vol 2: amzn.to/2TtwoxH
OpenStax Vol 3: amzn.to/3bPJ3Bn
My Filming Equipment:
Camera: amzn.to/3kx2JzE
Lens: amzn.to/2PFxPXA
Audio Recorder: amzn.to/2XLzkaZ
Microphones: amzn.to/3fJED0T
Lights: amzn.to/2XHxRT0
White Chalk: amzn.to/3ipu3Oh
Color Chalk: amzn.to/2XL6eIJ

Comments: 630
@qm_turtle
@qm_turtle 2 жыл бұрын
I always loved the definition for tensors in many physics textbooks: "a tensor is a mathematical object that transforms like a tensor".
@Erik_Caballero
@Erik_Caballero 2 жыл бұрын
Wait! You mean to say that the floor here is made of floor?!
@qm_turtle
@qm_turtle 2 жыл бұрын
@@Erik_Caballero precisely that
@theadamabrams
@theadamabrams 2 жыл бұрын
I mean, a "vector" in mathematics is "an element of a vector space", so that tensor definition seems perfect to me.
@iheartalgebra
@iheartalgebra 2 жыл бұрын
@@theadamabrams And let's not forget that a "vector space" in turn is simply a "space consisting of vectors" xD
@jankriz9199
@jankriz9199 2 жыл бұрын
my friend always used to say aloud when he heard this definition: "ok boomer" :D
@AndrewDotsonvideos
@AndrewDotsonvideos 2 жыл бұрын
I'm gonna have to come back to this one for sure
@vinvic1578
@vinvic1578 2 жыл бұрын
Hey Andrew so cool to see you here! You should cover this in a drink and derive video ;)
@cardboardhero9950
@cardboardhero9950 2 жыл бұрын
ayyy smart people
@axelperezmachado3500
@axelperezmachado3500 2 жыл бұрын
Wait. Wasn't a tensor just something that transforms like a tensor? We have been lied to!
@captainsnake8515
@captainsnake8515 2 жыл бұрын
@@axelperezmachado3500 physicists have subtly different definitions for vectors and tensors than mathematicians do.
@rickdoesmath3945
@rickdoesmath3945 2 жыл бұрын
By the way, I loved the video on motivation.
@hensoramhunsiyar3431
@hensoramhunsiyar3431 2 жыл бұрын
Great exposition. A video on how this concept is related to the concept of tensors in physics and other fields would be great.
@k-theory8604
@k-theory8604 2 жыл бұрын
Hey, if spamming videos in the comments isn't allowed, I'm happy not to do it in the future, but I just made a short video about this on my channel.
@poproporpo
@poproporpo 2 жыл бұрын
@@k-theory8604 if it is related to the topic at hand, it’s not really shameless advertisement and we appreciate the reference. In this case it’s fine.
@athens9k
@athens9k 2 жыл бұрын
Tensors as defined in this video are objects defined over a vector space. Tensors used in physics are really tensor fields on a manifold M. For every point p on a manifold, we have a natural vector space given by the tangent space TpM. On each tangent space, we can define some sort of tensor. A physicist's tensor is a smoothly varying field of tensors. A nice fact about manifolds is that they can be locally parameterized by Euclidean space. These parameterizations are usually called charts. We can use charts to parameterize all tangent spaces within a neighborhood of any given point. This allows us to write down the components of a smoothly varying field of tensors using the coordinates given by the chart.

Then comes the "tensor transformation law." If some point p sits in the intersection of two different charts, we have two sets of local coordinates coming from each chart. The tensor transformation law simply says that changing from one set of coordinates A to the other set of coordinates B should change the components of the tensor in such a way that the local description of the tensor in the B coordinates matches the transformed version of the tensor in the A coordinates. You can think about this compatibility condition as gluing the local descriptions of the tensor together to form a global tensor field on the whole manifold.

The concepts are related but not identical. To summarize: a mathematician's tensor is an object defined over a vector space; a physicist's tensor is a field of mathematician's tensors on a manifold.
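For concreteness, the compatibility condition can be written in components; this is an editorial illustration (not from the video), shown for a (1,1)-tensor field on a chart overlap, with one Jacobian factor per index:

```latex
% Compatibility ("tensor transformation") law on a chart overlap, for a (1,1)-tensor field:
% components in the B coordinates \tilde{x} versus components in the A coordinates x.
\tilde{T}^{k}{}_{l}
  \;=\;
  \frac{\partial \tilde{x}^{k}}{\partial x^{i}}\,
  \frac{\partial x^{j}}{\partial \tilde{x}^{l}}\,
  T^{i}{}_{j}
  \qquad \text{(sum over } i, j\text{)}
```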
@mikhailmikhailov8781
@mikhailmikhailov8781 2 жыл бұрын
It's the same thing; this is just the definition.
@mikhailmikhailov8781
@mikhailmikhailov8781 2 жыл бұрын
It is related by being the same fucking concept.
@cicciocareri85
@cicciocareri85 2 жыл бұрын
I'm an engineer and I've used tensors and dyadic product in mathematical physics, electromagnetism and continuum mechanics. I'd like to see more of this!
@snnwstt
@snnwstt 2 жыл бұрын
I second that. Maybe with the implications of a change of basis for the vectors (like Cartesian to oblique, or polar (2D), or curvilinear). Surely that will be even greater with derivatives (first and second order).
@mastershooter64
@mastershooter64 2 жыл бұрын
lol I thought the most advanced math engineers used was basic math like vector calc and linear algebra and some integral transforms
@cicciocareri85
@cicciocareri85 2 жыл бұрын
@@mastershooter64 it depends where you study :-)
@mastershooter64
@mastershooter64 2 жыл бұрын
@@cicciocareri85 which field of engineering uses the most advanced math?
@snnwstt
@snnwstt 2 жыл бұрын
@@mastershooter64 Finite element techniques often use simple "reference elements" which are then ... tortured. Depending on the desired "continuity", the numerical derivatives implied by the (partial) differential equation(s) borrow heavily from tensor theory.
@buxeessingh2571
@buxeessingh2571 2 жыл бұрын
I absolutely want to see a video comparing different types of tensors. I got confused in a graduate algebra class and a graduate geometry class because I was first exposed to tensors in mechanical engineering and then in modern physics.
@JM-us3fr
@JM-us3fr 2 жыл бұрын
My understanding is that the tensors in physics are just these simple tensors that Michael is talking about, but using their matrix representation instead. But just like you can talk about a whole matrix (a_ij) by only considering a single general entry a_ij, physicists prefer to talk about only a single entry of their tensors, which may also require superscripts.
@TheElCogno
@TheElCogno 2 жыл бұрын
I think tensors in physics, at least in general relativity, are really tensor fields from differential geometry.
@rtg_onefourtwoeightfiveseven
@rtg_onefourtwoeightfiveseven 2 жыл бұрын
​@@JM-us3fr AFAIK in physics there's an explicit restriction with how tensors transform under coordinate transformations. Is this restriction also implicit in the mathematical definition, or no?
@JM-us3fr
@JM-us3fr 2 жыл бұрын
@@rtg_onefourtwoeightfiveseven I’m not sure if these are the same, but I know when you want to change the basis of your vector space, and you want to represent an alternating tensor with respect to the new basis, then you can just multiply the 1st representation by the determinant of the basis transformation matrix. But you’re probably referring to the reference frame transformation in general relativity. I’m not quite sure how this works.
@schmud68
@schmud68 2 жыл бұрын
@@rtg_onefourtwoeightfiveseven Yes, it is implicit in the mathematical definition. You start with a differentiable manifold which has local coordinate charts, but you define tensor fields on the manifold independently of coordinate charts. So, they don't depend on the choice of coordinates. Let us work point-wise on the manifold. When you see the tensors in GR, what you are looking at is a choice of coordinate chart which determines a basis for the tangent/cotangent spaces (these are vector spaces associated to points on the manifold), which then allows you to express tensors as linear combinations of tensor products of these basis vectors. The coefficients of these linear combinations are the coefficients you see in Einstein notation in GR; these are, mathematically, NOT tensors.

Now, I said the tensors themselves are completely coordinate independent. What this means is that the basis vectors (determined by a coordinate transformation) will transform inversely to the coefficients so that they cancel out. This is where the Jacobian/inverse Jacobian come in for the coordinate transformations.

Finally, the upper and lower indices of tensors really refer to the tangent and cotangent spaces, where the cotangent space is the dual of the tangent space. A rank (1,1) tensor is built from a linear combination of tangent vectors tensored with cotangent vectors. Now, if the manifold has a Riemannian metric, then it can be used to provide a canonical isomorphism between tangent and cotangent spaces (given by raising/lowering indices with the metric in GR notation).

Hopefully this gives some ideas. There is quite a lot going on and I have swept some things under the rug, but if you want to formalise this further you need: vector bundles, tensor products of vector bundles, sections of bundles, and the universal property of tensor products.
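As an editorial gloss on the comment above, the chart-induced expansion and the metric's index lowering look like this (a sketch for a (1,1)-tensor at a point):

```latex
% Chart-induced expansion of a (1,1)-tensor at a point, and index lowering with a metric g.
% The coefficients T^{i}{}_{j} are what appear in Einstein notation; the basis elements
% \partial_i and dx^j transform oppositely, so T itself is coordinate independent.
T \;=\; T^{i}{}_{j}\; \partial_i \otimes dx^{j},
\qquad
T_{kj} \;=\; g_{ki}\, T^{i}{}_{j}.
```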
@TIMS3O
@TIMS3O 2 жыл бұрын
Really nice video. One thing that is worth mentioning is that the reason one wants the relations mentioned is that they are exactly the relations needed to turn bilinear maps into linear maps. This essentially reduces the study of bilinear maps to the study of linear maps, which one already knows from linear algebra. This property of turning bilinear maps into linear maps is called the universal property of the tensor product.
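Stated symbolically (an editorial addition summarizing the universal property mentioned above):

```latex
% Universal property: every bilinear map out of V x W factors uniquely through V \otimes W.
% Here (v, w) \mapsto v \otimes w is the canonical bilinear map.
\text{For every bilinear } B : V \times W \to U
\text{ there is a unique linear } \widetilde{B} : V \otimes W \to U
\text{ with } \widetilde{B}(v \otimes w) = B(v, w).
```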
@schmud68
@schmud68 2 жыл бұрын
Yeah, I thought this was very important too
@alonamaloh
@alonamaloh 2 жыл бұрын
This is exactly how the tensor product was introduced to me in my first year Linear Algebra class in college, but I think your exposition is more clear than what I remember getting at school. At the time I couldn't understand what this was about at all. The definition through the universal property regarding bilinear maps made a lot more sense to me. You may want to mention it if you make further videos on this subject.
@sslelgamal5206
@sslelgamal5206 2 жыл бұрын
Nice, thank you! All the guides on the internet only look at it through the physicist's POV; an abstract mathematical construction was missing. I hope this series continues 👍👌
@lexinwonderland5741
@lexinwonderland5741 2 жыл бұрын
PLEASE post more!! Especially loved how you went through the formal product and coset (although I would like some more background/elaboration there), and I would love to see the other proofs and the isomorphism to R6 you teased at the end!
@pablostraub
@pablostraub 2 жыл бұрын
I would have liked the critical concept of "span" to be explained. Also, at the beginning there was a comment that "obviously" the dimension of the star product would be infinite (I was guessing 5 or 6, but not infinite), so I got lost.
@TheElCogno
@TheElCogno 2 жыл бұрын
The way I understand it, the span of a set is the set of all finite linear combinations of elements of that set, and the formal product is infinite dimensional because any v*w is a basis vector, since it cannot be expressed as a linear combination of other elements of that type.
@user-jc2lz6jb2e
@user-jc2lz6jb2e 2 жыл бұрын
The dimension is the maximum number of independent vectors you can have. R^3 has at most 3: for example, [1,0,-9], [0,4,0], [1,2,3]. You put any more vectors, and you get relations between them. For the star product, every v*w is independent from every r*s (where either component is different). You have infinitely many options for the first component, then infinitely many options for the second, so the dimension is infinite.
@brooksbryant2478
@brooksbryant2478 2 жыл бұрын
Is the formal product essentially just the Cartesian product or is there a difference I’m missing?
@user-jc2lz6jb2e
@user-jc2lz6jb2e 2 жыл бұрын
@@brooksbryant2478 The basis of the star product is just the Cartesian product. But the span is much bigger. You're taking every (finite) linear combination from this basis to make the star product space.
@pbroks13
@pbroks13 2 жыл бұрын
The formal product is basically a "dummy" operation. You just stick the elements together, but you are not allowed to do anything to them. So, for example, if * is a formal product, you could NOT say that 2*3 = 3*2 or anything like that. 2*3 and 3*2 are their own separate things.
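For readers who think in code, here is a minimal sketch of this "dummy" construction (hypothetical code, not from the video): a formal sum is just a dictionary from pairs (v, w) to coefficients, and no simplification happens until the quotient is taken.

```python
# Minimal sketch of the free vector space on V x W: a formal sum is a dict mapping
# pairs (v, w) to real coefficients.  The pair (v, w) plays the role of v * w.
from collections import defaultdict

def star(v, w):
    """The formal product v * w: just the pair, treated as a single basis element."""
    return {(v, w): 1.0}

def add(x, y):
    """Add two formal sums coefficient-wise."""
    out = defaultdict(float)
    for s in (x, y):
        for key, c in s.items():
            out[key] += c
    return dict(out)

# In the free space, (1,0)*(0,1) + (0,1)*(0,1) and (1,1)*(0,1) are DIFFERENT elements:
a = add(star((1, 0), (0, 1)), star((0, 1), (0, 1)))
b = star((1, 1), (0, 1))
print(a == b)  # False: distributivity only appears after quotienting by the subspace I
```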
@renerpho
@renerpho 2 жыл бұрын
Yes, more on this, please! Particularly on tensors in physics and how they do (or don't) relate to your formal definition.
@MasterHigure
@MasterHigure 2 жыл бұрын
The short version is, the "tensors" (letters with upper and lower indices) that appear in physics (at least in general relativity; I know little about, say, material stresses) are the coefficients of the linear combinations we see here. The basis vectors and the tensor product symbol are elided.
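In symbols, this elision looks like the following (editorial illustration):

```latex
% What a physicist writes as "the tensor T^{\mu\nu}" is, in the language of this video,
% the coefficient array of an expansion in a chosen basis (basis vectors and \otimes elided):
T \;=\; \sum_{\mu,\nu} T^{\mu\nu}\, e_\mu \otimes e_\nu .
```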
@k-theory8604
@k-theory8604 2 жыл бұрын
@@MasterHigure No, the tensors are not the coefficients. In physics, we are often concerned with vector fields. Essentially, we imagine these as some sort of vector sitting at each point in space, varying continuously. However, in order to formally define such a thing, one first needs to define the tangent space to a (manifold) space at every point. Then, one thinks about all of these tangent spaces together as forming one large geometric object, called the tangent bundle. Even more generally, one takes the tensor products of each tangent space with itself a number of times, and then considers that as one big collection of objects. Then, tensor fields in physics are just maps from the underlying space to the (generalized) tangent bundle. TL;DR: They are related, but the physics tensor is much more complicated. One needs the definition in this video to define the version from physics.
@schmud68
@schmud68 2 жыл бұрын
Yeah exactly, then you also need the cotangent bundle and the notion of the universal property of tensor products. Then, after specifying a coordinate chart, you can evaluate everything using the basis provided by the chart and get the usual physics Einstein notation. Quite a journey lol
@Handelsbilanzdefizit
@Handelsbilanzdefizit 2 жыл бұрын
MachineLearning-Student: "Hey, look at my tensors!"
Physics-Student: "No, better not. But look at my tensors!"
Maths-Student: "No, better not."
@thenateman27
@thenateman27 2 жыл бұрын
Don't fear my Levi-Civita operator and my Electromagnetic tensor
@mikhailmikhailov8781
@mikhailmikhailov8781 2 жыл бұрын
@@thenateman27 Electromagnetic 2-form*
@Fetrovsky
@Fetrovsky 2 жыл бұрын
It all makes perfect sense, but it would have helped to define the star operation between vectors, and then what a tensor is and why that name was chosen right around the time it was defined.
@kenzou776
@kenzou776 Жыл бұрын
What is the star operation between vectors??
@mr.es1857
@mr.es1857 Жыл бұрын
What is the star operation between vectors?? Please develop
@philippg6023
@philippg6023 Жыл бұрын
The small star between two vectors is a 'dummy' operation. So * maps v and w to v * w, which is an element of V * W. Therefore the star is just notation. Then he says all these star products are defined to be linearly independent; hence the elements v * w form a basis of V * W. (This yields immediately, for example, that V*W is infinite dimensional.) I hope this helps you. Edit: an example: in the Cartesian product, (a, b) + (c, d) = (a+c, b+d); however, a*b + c*d is not equal to (a+c)*(b+d).
@Fetrovsky
@Fetrovsky Жыл бұрын
@@philippg6023 So, if I get you correctly, there is no "star" operation per se but it's a representation of any linear operation that has certain characteristics. Is that correct?
@philippg6023
@philippg6023 Жыл бұрын
@@Fetrovsky Maybe you mean the correct thing but you say it a little bit wrong. v * w is just a symbol, which has no meaning at all. Hence the star product is called a formal product. If you have a good understanding of bases, then you might like this explanation: V*W is a vector space with the Cartesian product as a basis.
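To contrast the formal product with the quotient, here is what the imposed relations buy you in V ⊗ W (an editorial example; in the formal product V * W the left-hand side stays unexpanded as its own basis element):

```latex
% In V \otimes W the quotient relations give bilinearity, so for a, c in V and b, d in W:
(a + c) \otimes (b + d) \;=\; a \otimes b + a \otimes d + c \otimes b + c \otimes d ,
% which is generally NOT equal to a \otimes b + c \otimes d, matching the "Edit" example above.
```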
@dhaka_mathematical_school
@dhaka_mathematical_school Жыл бұрын
This lecture is insanely beautiful. Please prepare/make a couple more videos on tensor products: (1) Tensors as multilinear operators, (2) tensor component transformation (covariant and contravariant), and (3) tensors as invariant/geometric objects. Thanks so much.
@user-ys3ev5sh3w
@user-ys3ev5sh3w Жыл бұрын
@rachmondhoward2125 Insanely beautiful construction example. Let's (A)(D)simplicyan be positional natural A-ary SUM(D)-digit number system. where A,D -some natural number sequences arbitrary, but equal length. Then: d-vertex simplex is a (2)(d)simplicyan. square is a (3)(2)simplicyan. cube is a (3)(3)simplicyan. d-dimensional (n-1)*2 -gon is a (n)(d)simplicuan pentagon is a (11)(1)simplicyan. d-dimensional (n-1)/2 -gon is a (n)(d-1)simplicuan Maya/Egypt pyramid is a (2,3)(1,2)simplicyan, let's enumerate it: 100 is a vertex above square. 022 (=NOT 100) is opposite edge 020 002 is are 2vertices between wich lies edge 022=002+020. 120 (=100+020) is edge between verices 100 and 020. 102 (=100+002) is edge between verices 100 and 002. 122 (100+020+002) is infinity(content) located between 3 vertices 100 020 002 and 3 edges 120 102 022. from that moment we have 2 variants: a)vertex 001 connected to vertex 020.(i choose it) b)vertex 001 connected to vertex 002. 001 (=100+100) is vertex connected to vertex 100. 021 (=001+020) is edge between vertices 001 and 020. 010 (=001+001) is vertex connected to vertex 001. 011 (=001+010) is edge between vertices 001 and 010. 012 (=002+010) is edge between vertices 002 and 010. 101 (=001+100) is edge between vertices 001 and 100. 110 (=100+010) is edge between vertices 100 and 010. 111 (=100+010+001) is triangle between vertices 100 010 001. 121 (=100+020+001) is triangle between vertices 100 020 001. 112 (=100+010+002) is triangle between vertices 100 010 002. 000 is zero is externity of pyramid . As you see: 1. infinity(content) is like triangle. 2. square doesn't exists, sguare + volume above it upto vertex 100 is Zero.
@Mr.Not_Sure
@Mr.Not_Sure Жыл бұрын
@@user-ys3ev5sh3w Your construction has numerous mistakes and inconsistencies: 1. You count interior of a k-simplex, but you don't count interiors of a square and cube. 2. Compared with (n-1)/2-gon, your (n-1)*2-gon has 4 times more "gon"s (whatever they are), but its corresponding simplician has n times more indices. 3. Equation "001=100+100" is false. 4. And what about dodecahedron? It's 3-dimensional 12-hedron with 30 edges, but doesn't correspond neither to (7)(3) nor (16)(3) simplician.
@user-ys3ev5sh3w
@user-ys3ev5sh3w Жыл бұрын
@@Mr.Not_Sure you are right. In equation "001=100+100" i mean that carry rotate. Preface. Positional natural a-ary (az^m) d-digit number systems can represent some kind of d-dimensional polytopes in geometry. If a=z^m then a-ary d-digit number system equals z-ary (m*d)-digit number system, because z^m^d=z^(m*d). For example: binary d-digit number system is a d-vertex simplex.(vertices is a numbers with digital root=1, edges is a numbers with digital root=2 and so on). 3-ary d-digit number system is a d-cube. Let d=2: if 1-chain (2 vertices + 1 edge) shift 1 times we receive 2-cube with 3^2=9 faces, i.e. square. 6-ary d-digit number system is a d-thorus. Let d=2: If 3-ring (1D triangle) shift in 3D 3 times and "press"(glue) first and last 1D triangles we recieve 2-thorus with 6^2=36 faces = 3*3 square + (3+3)*3 edges + 3*3 vertices. Conclusion. All positional natural a-ary (az^m) d-digit number systems with a^d numbers are represented by 3 types of d-polytopes (simplex (binary), cube (odd-ary) , thorus (even-ary) ) with a^d faces . For simplex d means amount of vertex or hyperplanes, for any other polytopes dimension. For me as a programmer, it's curious to know that difference in faces between consequent such polytopes is hexagonal numbers. To enumerate more complex polytope may be used positional natural A-ary D-digit number system , (A)(D)simplician. Where A,D - some natural number sequence arbitrary but equal length. Then 2D Maya/Egypt pyramid is are positional natural (2,3^2)-ary (1,1)-digit number system or (2,3^2)(1,1)simplician. digital root of (1,1)=2 is dimension of pyramid. 10 is a vertex above square. 08 (=NOT 10) is one of opposite edges. 06 02 is are 2 vertices between which lies edge 08=02+06. 16 (=10+06) is edge between vertices 10 and 06. 12 (=10+02) is edge between vertices 10 and 02. 18 (10+06+02) is triangle, infinity(content) located between 3 vertices 10 06 02 and 3 edges 16 12 08. 01 (=10+10) is vertex connected to vertex 10. 07 (=01+06) is edge between vertices 01 and 06. 03 (=01+opposite vertex 02) is vertex connected to vertex 01 and 02 instead of absent edge between 01 and 02.(exeption from rule?). 04 (=01+03) is edge between vertices 01 and 03. 05 (=02+03) is edge between vertices 02 and 03. 11 (=01+10) is edge between vertices 01 and 10. 13 (=10+03) is edge between vertices 10 and 03. 14 (=10+03+01) is triangle between vertices 10 03 01. 17 (=10+06+01) is triangle between vertices 10 06 01. 15 (=10+03+02) is triangle between vertices 10 03 02. 00 is square(zero). 18 faces=5 vertices+8 edges+4 triangles+1 square=3^2*2^1 numbers. 2D dodecahedron has 12 pentagons+30 edges+20 vertices=62 faces= 31^1*2^1 numbers of (2,31)(1,1)simplician (even thorus-like 2-polytope). 3D dodecahedron has 62+I=63 faces =7*3*3 numbers of (7,3)(1,2)simplician (odd cube-like 3-polytope). Simplex differs from any other polytope. Simplex is only "not preesed" polytope. Outside and Inside parts of this polytope is considered by me to be faces. As result simplex has 2^d faces (no 2^d-1). If to increment dimension they became from potentional to real faces. Outside and Inside parts both are counted. In other polytopes 1 or 1/2 of dimension is "pressed"("curved") in variouse ways. For example in even(thorus-like) 2-dimensional in 3D (no 3-dimensional) M aya/Egypt pyramid Outside part of "pressed" dimension is square, Inside part is triangle, so both Outside and Inside parts (interior and exterior) are not counted. 
In odd (cube-like, in A odd numbers only) polytope only Outside part (zero) is pressed in one of hyperplanes, so Outside part is not counted, only interior is counted, . And so on. "No distinction between numbers and shape. Numbers could not exist without shape." Pythagoras (reincarnation of Euphorbos).
@user-ys3ev5sh3w
@user-ys3ev5sh3w Жыл бұрын
@@Mr.Not_Sure Lets enumerate 3D dodecahedron ,wich is (7,3)(1,2)simplician with 7^1*3^2=63 faces. It's evident: 1. (m+n,..)(1,..)=(m,..)(1,..)+(n,..)(1,..) because of (m+n)^1=m^1+n^1. 2. (m,..)(i+j,..)=(m,..)(i,..)*(m,..)(j,..) because of m^(i+j)=m^i*m^j. 3. (m,m,..)(i,j,..)=(m,..)(i+j,..) because of m^i*m^j=m^(i+j). 4. (m^n,..)(i,..)=(m,..)(n*i,..) because of m^n^i=m^(n*i). 5. (m,n,..)(i,i,..)=(m*n,..)(i,..) because of m^i*n^i=(m*n)^i. 6. ( m ,1,..)(i,j,..)=(m..)(i,..) because of m^i*1^j=m^i.. Use 1..4: (7,3)(1,2)=(4+3,3)(1,2)=(4,3)(1,2)+(3,3)(1,2)=(2,3)(2,2)+(3)(3)=(6)(2)+(3)(3); So 3D dodecahedron equals 3D cube (3)(3) + 2D thor (6)(2). Lets enumerate 1D chain (3)(1). It's easy. Let edge be 2 because 2 is maximal number(infinity) Let rest two vertices be 0 1. Lets enumerate 2D square (3)(2)=(3)(1)*(3)(1). It's easy. Let 2D chain in center be located in 2 of first digit. Let rest two 1D chains be located in 0 1 of first digit. Lets enumerate 3D cube (3)(3)=(3)(1)*(3)(2). It's easy. Let 3D chain in center be located in 2 of first digit. Let rest two 2D chains be located in 0 1 of first digit. Lets enumerate 1D ring(triangle) (6)(1)=(3)(1)+(3)(1). It's easy. Let 1D chain be located in 0..2 of digit Let rest inversed 1D chain located in 3..5 of digit. Lets enumerate 2D thor (6)(2)=(6)(1)*(6)(1). It's easy. Let 3 1D rings be located in 0..2 of first digit Let 3 2D rings be located in 3..5 of first digit. Lets enumerate 3D dodecahedron (7,3)(1,2)=(6)(2)+(3)(3). It's easy. Let 3D cube (3)(3) be located in 0..2 of first digit unmodified as above. Let 2D thor (6)(2) be located in 3..6 of first digit in such way: (6)(2)=(4,3)(1,2)=(2,3)(1,2)+(2,3)(1,2)= 2D maya/egypt pyramid +2D maya/egypt pyramid. Let 2D maya/egypt pyramid be located in 3..4 of first digit Let 2D maya/egypt pyramid be located in 5..6 of first digit At last we must "press" 6 vertex of cube to triangles. As result 6 vertex of cube became interiors of 6 hedrons. 6 rings of thor are added to close this 6 new interiors. We receive 12 hedrons=6 old + 6 new. 30 edges=12 edges of cube + 3*3 edges + 3*3 square of thor 20 vertices=(8-6) vertices of cube + 3*3 vertices of 3 1D rings +3*3 edges of 3 2D rings of tohr.
@Julian-ot8cs
@Julian-ot8cs 2 жыл бұрын
I've been dying to get a good introduction to tensors from a mathematical perspective, and this video really does it for me! This was very fun to watch, and I'd love to see more!!!!
@TheMauror22
@TheMauror22 2 жыл бұрын
Please keep these series going! It was very cool!
@azeds
@azeds 2 жыл бұрын
Content. Video quality. Voice. All of this deserves 100/100.
@jacobadamczyk3353
@jacobadamczyk3353 2 жыл бұрын
"If you're in I, in the quotient space you're equal to zero" never thought of cosets/quotients like this but very helpful!
@andreyv3609
@andreyv3609 2 жыл бұрын
What a pleasure to finally have a clear, pure (actual) math exposition on the subject. Good job!
@descheleschilder401
@descheleschilder401 10 ай бұрын
Hi there! I love your way of making things clear! Very clear and in plainly spoken English! You make each step easy to follow, as if it all comes about very naturally! You're a great teacher! So keep up the good work!
@bilmoorhead7389
@bilmoorhead7389 2 жыл бұрын
Yes, please, more on this topic.
@markkennedy9767
@markkennedy9767 2 жыл бұрын
I would definitely look forward to more stuff on tensors. Your exposition is really good.
@caldersheagren
@caldersheagren 2 жыл бұрын
Love this, straight out of my abstract algebra class - although we did tensor modules instead of vector spaces
@Racnive
@Racnive 2 жыл бұрын
Thank you! This helped demystify tensors for me. I appreciate starting from the generic span of infinite possibilities, specifying the basic necessary properties for this object, and showing what emerges from those conditions.
@LucienOmalley
@LucienOmalley 2 жыл бұрын
Beautiful presentation as usual... thanks for your work !
@RichardTalcott
@RichardTalcott 2 жыл бұрын
EXCELLENT! Please MORE videos on this & related topics.
@ed.puckett
@ed.puckett 2 жыл бұрын
Thank you, your videos are so good. This is my vote for more on this subject.
@whispercat56235
@whispercat56235 2 жыл бұрын
Thank you very much for this, I hope to see more tensor related content from you !
@wongbob4813
@wongbob4813 2 жыл бұрын
More please! Loved this gentle introduction to tensors :)
@glenm99
@glenm99 2 жыл бұрын
This is incidental to the main topic, but I love the way that's worded at 15:36 ... "But now we can extract something from I, at no cost, because something from I is deemed as being equal to zero in the quotient." When I took algebra (so many years ago), despite knowing the statements of and being able to prove the theorems, I struggled for a long time with the practical application of this idea. And then, when you put it like that... it's so simple!
@rickdoesmath3945
@rickdoesmath3945 2 жыл бұрын
10:50 This always happens in mathematics:

Analysis: OK so I have integrals now! I finally can define this seminorm on the set of integrable functions. I hope it's a norm. What the f? The seminorm of this function is 0 but the function is not identically 0, even though it SHOULD BE. Wait, not all is lost. I can quotient my set by the equal-almost-everywhere relation, f yeah! This is literally L^1.

Algebra: Look at this cute integral domain! Let's build its fraction field! I can use ordered pairs. But wait, something is weird. Look: 2/3 and 4/6 SHOULD BE equal but they ain't. Wait, not all is lost. I can quotient using the equivalence relation a/b equivalent to c/d when ad = bc. This is literally the smallest extension of an integral domain that is also a field.
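Spelled out (editorial addition), the two quotient constructions in the comment are:

```latex
% L^1 as a quotient by the null functions, and the fraction field as a quotient of pairs.
L^{1} \;=\; \{\, f \text{ integrable} \,\} \big/ \{\, f : \textstyle\int |f| = 0 \,\},
\qquad
\operatorname{Frac}(R) \;=\; \bigl( R \times (R \setminus \{0\}) \bigr) \big/ \sim,
\quad (a,b) \sim (c,d) \iff ad = bc .
```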
@QWERTYUIOPASDFGH2675
@QWERTYUIOPASDFGH2675 2 жыл бұрын
I would love to see more of these kinds of videos and on this subject in particular!
@Nickelicious7
@Nickelicious7 2 жыл бұрын
Love this, please do more
@nosy-cat
@nosy-cat 2 жыл бұрын
Beautifully done. Watched this today for the second time, and this time around I feel like I really understand what's going on. Would love to see more on this topic!
@mattc160
@mattc160 2 жыл бұрын
Please do more of this!!!!
@samuelbam3748
@samuelbam3748 2 жыл бұрын
Another element which shows the difference between V*W and the tensor product of V and W really well is the vector 0*0, which is just another independent generator but NOT equal to the zero vector of V*W.
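Indeed, once the scalar relation is quotiented in, that generator collapses (a one-line editorial check):

```latex
% In the free space V * W, the pair 0 * 0 is just another basis element, but after
% quotienting by the relation (c v) * w - c (v * w) it becomes the zero vector:
0 \otimes 0 \;=\; (0 \cdot 0) \otimes 0 \;=\; 0 \cdot (0 \otimes 0) \;=\; 0 .
```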
@smoosq9501
@smoosq9501 2 жыл бұрын
very well presented lecture, thank you so much
@punditgi
@punditgi 2 жыл бұрын
Definitely want more of these videos!
@philippevandordrecht1619
@philippevandordrecht1619 2 жыл бұрын
This actually helped so much, thanks.
@AlisterMachado
@AlisterMachado 2 жыл бұрын
This is such a nice view on the Tensor product! Definitely hoping to see more :)
@ElPikacupacabra
@ElPikacupacabra 2 жыл бұрын
Thank you for posting this. Great video. Yes, I'd like more on this topic, and on abstract algebra in general.
@guyarbel2387
@guyarbel2387 2 жыл бұрын
What an amazing video ! Thank you so much :)
@theliminalsystem973
@theliminalsystem973 2 жыл бұрын
This was so clear!! Thank you :)
@elramon7038
@elramon7038 2 жыл бұрын
I just started studying physics and literally everybody is talking about tensors. I luckily stumbled across this video which happened to be really helpful for my understanding. Also if you could do videos on different uses of tensors in different fields, that would be awesome. Keep up the great work!
@LorenzoChiaveriniDBTF
@LorenzoChiaveriniDBTF 2 жыл бұрын
Great video. I have been studying tensors for continuum mechanics and I am definitely interested. It would be great if you could show the relationship between this more general definition and the other ones I saw (linear mappings and multilinear mappings).
@cycklist
@cycklist 2 жыл бұрын
Wonderfully explained.
@AJ-et3vf
@AJ-et3vf Жыл бұрын
Awesome video. Thank you
@bandreon
@bandreon 2 жыл бұрын
Brilliant class! Another of those things you should know but never really did.
@dominikstepien2000
@dominikstepien2000 2 жыл бұрын
Great video, I had similar introduction in Differential Forms course, but your explanation was far better than my professor's one. I would love to see more about this. I don't know much about different perspectives on tensor products, but I have heard about usage of them in physics and machine learning and I have no clue how it could be related.
@IustinThe_Human
@IustinThe_Human 2 жыл бұрын
this video connects for me so many dots, it was amazing
@user_2793
@user_2793 2 жыл бұрын
Exactly what I need before looking at multiparticle states in QM! Thank you.
@FT029
@FT029 2 жыл бұрын
Good video! It's really funny how, to get the desired properties (scalar mult, distributivity), you just quotient out by a bunch of stuff! My abstract algebra professor constantly referred to the tensor product as a "universal, bilinear map", and all our homework problems only used that definition, nothing concrete. So it's great to see some examples! I would be curious to see more on the topic.
Жыл бұрын
I think getting desired properties via a quotient is a fairly general technique, isn't it? That's also how you can get complex numbers out of pairs of real numbers. Or how you can get interesting groups out of free groups etc.
@myt-mat-mil-mit-met-com-trol
@myt-mat-mil-mit-met-com-trol 2 жыл бұрын
I barely remember a similar approach from my introductory notes on multilinear algebra, where category-theory-style proofs came before the formal definition of the tensor product. Despite getting a really good mark in the subject, I never really grasped the idea, or how it relates to the way tensors are approached in other disciplines, say general relativity. At the time I promised myself to study Spivak's well-known book (Calculus on Manifolds), but I couldn't keep the promise, as I got busy with obligations which never required me to fulfill it, nor asked me to use tensors. So now, more than twenty years later, barely remembering, books still on shelves, obligations not really changed, I hopefully watch enough YouTube videos to keep my reflections alive. So you have me as a keen viewer.
@Evan490BC
@Evan490BC 2 жыл бұрын
Read Spivak's book even now. It is the most intuitive (to me at least), abstract exposition of tensors, chains, differential forms, etc I could find.
@7177YT
@7177YT 9 ай бұрын
Excellent! Thank you! ❤
@krelly90277
@krelly90277 2 жыл бұрын
Thank you for this video. Please consider making more videos on tensor theory.
@cheeseburger118
@cheeseburger118 2 жыл бұрын
I've always been really curious what tensors were (never went deeper into math than multivariable calc). This was a little abstract for me, so I'd love to see how they are actually applied!
@thespiciestmeatball
@thespiciestmeatball 2 жыл бұрын
This was great. I’d definitely like to see more videos like this one
@Bbdu75yg
@Bbdu75yg 2 жыл бұрын
Amazing explanation thank u so much 💞💞💞
@Stobber1981
@Stobber1981 9 ай бұрын
I know it's a year later, but this construction of tensors really helped to back-fill the hackneyed explanations given by engineers and physicists all over YT. I especially appreciated the definition of the R2*R3 basis at the end and its comparison to the matrix space basis. I'd LOVE more takes on tensors from your perspective, please!
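As an illustration of that comparison, here is a small sketch (assuming NumPy; variable names are hypothetical, not from the video) that builds the six basis tensors of R^2 ⊗ R^3 as 2x3 matrices:

```python
# Sketch: the six outer products e_i (x) f_j of standard basis vectors of R^2 and R^3
# form the standard basis of the 2x3 matrices, mirroring the video's comparison.
import numpy as np

e = np.eye(2)   # standard basis of R^2
f = np.eye(3)   # standard basis of R^3

basis = [np.outer(e[i], f[j]) for i in range(2) for j in range(3)]
print(len(basis))   # 6 basis "tensors"
print(basis[0])     # e_0 (x) f_0 is the matrix unit with a 1 in position (0, 0)

# A general element sum_{ij} a_{ij} e_i (x) f_j is then just the matrix (a_{ij}):
a = np.array([[1., 2., 3.], [4., 5., 6.]])
T = sum(a[i, j] * np.outer(e[i], f[j]) for i in range(2) for j in range(3))
print(np.allclose(T, a))  # True
```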
@Walczyk
@Walczyk Жыл бұрын
Hell yeah! This will be useful for all of us physicists needing a better understanding of tensor algebra
@abhinayaas422
@abhinayaas422 9 ай бұрын
Thanks a lot Michael🤗😊
@gnomeba12
@gnomeba12 2 жыл бұрын
Yes please. More takes on the tensor product!
@gabrielallerful
@gabrielallerful 2 жыл бұрын
Great video, I would love to see some more of those videos you mention, especially in physics!!!
@cesarjom
@cesarjom 2 жыл бұрын
It's awesome that you are exploring this topic of tensors. Please continue with plans to develop more topics on applications of tensors, especially in physics and of course general relativity.
@sanch023
@sanch023 2 жыл бұрын
Fabulous, I want more on tensors and their applications.
@halleritis
@halleritis 2 жыл бұрын
Looking forward to your promised follow up video!! :)
@m.walther6434
@m.walther6434 2 жыл бұрын
Great Video, please more about Tensors!
@GordonKindlmann
@GordonKindlmann 2 жыл бұрын
Thanks for making this great video. I am very curious how you would present the idea of covariant vs contravariant indices within a tensor.
@peizhengni1346
@peizhengni1346 2 жыл бұрын
Such a good explanation, I would definitely like more tensor algebra/calculus videos! Also, a video about why they are linearly independent would be nice!
@TranquilSeaOfMath
@TranquilSeaOfMath 2 жыл бұрын
I liked this video. You have a good presentation of the topic.
@scollyer.tuition
@scollyer.tuition 2 жыл бұрын
Very nice explanation. I'd be interested in seeing more along these lines. Maybe you could talk about how tensor products convert bilinear maps into linear maps.
@muenstercheese
@muenstercheese 2 жыл бұрын
I've only heard about tensors from machine learning, and this is blowing my mind. I love this, thanks for the awesome production.
@alonamaloh
@alonamaloh 2 жыл бұрын
I think the word "tensor" in machine learning is closer to the programming notion of "array": The objects described in this video have more structure, in some sense.
@AdrianBoyko
@AdrianBoyko 2 жыл бұрын
“More structured” or “more constrained”?
@alonamaloh
@alonamaloh 2 жыл бұрын
@@AdrianBoyko I think I meant what I said. Tensors in math are associated with the notion of multilinear functions, but in a neural network that does image recognition, the input can be a "tensor" with dimensions width x height x 3(rgb) x minibatch_size. However, as far as I can tell, that's just an array, without any further structure.
@drdca8263
@drdca8263 Жыл бұрын
@@alonamaloh many of the operations done with them are multilinear maps. Specifically, the parts with the parameters that are trained are often multilinear maps with these, with non-linear parts between such multilinear maps. This seems to me somewhat tensor-y ? Like, yeah, it is all done with a basis chosen for each vector space, but, well, you *can* reason about it in a basis-free way ? Like, for an image with 3 color channels, the space of such images can be regarded as the tensor product of the space of greyscale images with the space of colors, or as the tensor product of greyscale 1D horizontal images with greyscale 1D vertical images with colors. The space of translation-invariant linear maps from greyscale images to greyscale images (of the same size) (so, convolutions) is a vector space, and I think it is the tensor product of the analogous spaces for the spaces of the horizontal and vertical 1D images. This all seems like tensors to me?
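A tiny sketch of that point (hypothetical shapes and names, assuming NumPy): the ML "tensor" is an array, and a trained layer that acts on one factor is a multilinear contraction, which is where the tensor-product picture reappears.

```python
# An ML "tensor" as an array: a batch of RGB images with shape (batch, height, width, 3).
# A layer acting on the color factor only is a contraction on that index.
import numpy as np

rng = np.random.default_rng(0)
images = rng.normal(size=(8, 32, 32, 3))   # batch x height x width x rgb
color_mix = rng.normal(size=(3, 5))        # a linear map on the color factor (hypothetical)

# Contract the color index: this applies (id (x) id (x) id (x) color_mix) to the array.
out = np.einsum("bhwc,cf->bhwf", images, color_mix)
print(out.shape)  # (8, 32, 32, 5)
```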
@XgamersXdimensions
@XgamersXdimensions 2 жыл бұрын
Awesome video! I just finished a second course in Linear Algebra that was abstract and proof based, and I had no problems following along with this video. I would love more content like this (or recommendations for textbooks).
@evankalis
@evankalis 2 жыл бұрын
I know id watch more videos on this topic for sure!
@mattcarnevali
@mattcarnevali 2 жыл бұрын
Would love to see more videos about physics topics in general!
@jongraham7362
@jongraham7362 2 жыл бұрын
Great video... I have been trying to understand tensors for many years... I've seen it from the Physics viewpoint and the Abstract Algebra derivation. This helps, thanks. More would be good.
@SpoonPhysics
@SpoonPhysics 2 жыл бұрын
Would love to see more tensors, thanks.
@somgesomgedus9313
@somgesomgedus9313 2 жыл бұрын
If this construction scares you, don't worry. It's very technical and just made to have the nice properties we want the tensor product to have. What you need when you work with the tensor product is the universal property, which gives you real control over the thing. Like many objects that are constructed for a universal property, the construction is very technical, but once you know that it exists you can almost forget the explicit construction and just concentrate on the universal property! Great video btw, I love it when I see stuff like this on YouTube!
@FranciscoMNeto
@FranciscoMNeto 2 жыл бұрын
Michael doing a series on tensors?! HELL YEAH
@JimMcCann500
@JimMcCann500 2 жыл бұрын
If you could do more on Tensors, Covariant Derivatives, Yang Mills (maybe), d'alembertian operators, that would be great!
@antoniodisessa1469
@antoniodisessa1469 2 жыл бұрын
Absolutely clear. I'd hope for a deeper treatment of the subject.
@mohamadhoseynghazitabataba987
@mohamadhoseynghazitabataba987 Жыл бұрын
Amazing, please teach more about tensors.
@ethandole2218
@ethandole2218 2 жыл бұрын
I am very interested in this topic atm, been studying representation theory, so I'd love more of this!
@stabbysmurf
@stabbysmurf 2 жыл бұрын
Thank you for this video. I've tried to learn tensors from mathematical physics texts, and I've had an awful time. It helps a great deal to see a mathematician providing a proper structural definition with a level of precision that I can understand.
@sergpodolnii3962
@sergpodolnii3962 2 жыл бұрын
Interesting topic, I hope there will be enough interest to see more on this subject.
@burkhardstackelberg1203
@burkhardstackelberg1203 2 жыл бұрын
Another perspective on tensors that I like is the one from multilinear maps, and this video is a good starting point. For every vector space, there exists a dual space of mappings of those vectors to real numbers (or whichever field you choose). A tensor product of two such mapping spaces lets you map a pair of vectors to your field, giving you the ability to create a metric on your vector space. A tensor product of a vector space and its dual space gives very much what we know as square matrices, i.e. mappings of that vector space to itself. But without the need of spelling out components.
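Concretely, that last identification can be written as follows (a finite-dimensional sketch, editorial addition):

```latex
% Simple tensors w \otimes \varphi in V \otimes V^{*} act as rank-one linear maps on V,
% and this extends to an isomorphism in finite dimensions:
(w \otimes \varphi)(v) \;=\; \varphi(v)\, w ,
\qquad
V \otimes V^{*} \;\cong\; \operatorname{End}(V).
```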
@Songfugel
@Songfugel 2 жыл бұрын
Yes, more tensors please. This is one of the topics I personally feel most often confused about, particularly with quaternions in linear algebra.
@noahtaul
@noahtaul 2 жыл бұрын
I think this could be a good series. If you’re thinking about continuing in a physics sense, or an algebraic geometry sense, I think that would be useful, but I think it would make a good series to put together a set of videos on homological algebra. Exact sequences, hom-tensor adjoint, cohomology/homology of chain complexes, etc.
@keithvanantwerp3198
@keithvanantwerp3198 2 жыл бұрын
More please, nice job. Physics/continuum mechanics and then top it off by clarifying the relation to tensors in machine learning.
@MrBebopbob
@MrBebopbob 2 жыл бұрын
Excellent video (as usual). More tensors please.
@petergregory7199
@petergregory7199 Жыл бұрын
Merry Vector Michael, and a Happy New Tensor!
@Nicolas100399
@Nicolas100399 2 жыл бұрын
I would love to see a series made out of this, like the one on the wedge product you made. That series was a favorite of mine.
@net51cc
@net51cc 2 жыл бұрын
I would absolutely like to see more.
@ohno8774
@ohno8774 2 жыл бұрын
That was fascinating
@Hank-ry9bz
@Hank-ry9bz 2 ай бұрын
4:50 bookmark: examples in R2*R3; 10:55 quotient space; 18:15 basis; 24:00 basis example
@kapoioBCS
@kapoioBCS 2 жыл бұрын
I am looking forward to more videos on more advanced subjects! I just finished a course on representation theory for my master's.
@linuxrwanda
@linuxrwanda 2 жыл бұрын
Thank you Michael Penn. Any textbook you would recommend that has a similar presentation of this topic? Of course other viewers are welcome to make suggestions.
@thetrickster42
@thetrickster42 2 жыл бұрын
I would love to see more videos on the Tensor product. Didn’t see this construction until grad school, just saw the physics take on it. Would love to compare.
@Awesome20801
@Awesome20801 2 жыл бұрын
I learnt tensors a year ago, and this is a great review for me!
@tannerlawson1017
@tannerlawson1017 2 жыл бұрын
I seldom give a like to videos, but this one absolutely deserves it. Thank you!