[classical music] “Lisa: Well, where’s my dad? Frink: Well, it should be obvious to even the most dimwitted individual who holds an advanced degree in hyperbolic topology that Homer Simpson has stumbled into … (dramatic pause) … the third dimension.”

Hey folks, I’ve got a relatively quick video for you today, just sort of a footnote between chapters. In the last two videos I talked about linear transformations and matrices, but I only showed the specific case of transformations that take two-dimensional vectors to other two-dimensional vectors.

In general throughout the series we’ll work mainly in two dimensions, mostly because it’s easier to actually see on the screen and wrap your mind around, but more importantly, once you get all the core ideas in two dimensions, they carry over pretty seamlessly to higher dimensions. Nevertheless, it’s good to peek our heads outside of flatland now and then to… you know, see what it means to apply these ideas in more than just those two dimensions.

For example, consider a linear transformation with three-dimensional vectors as inputs and three-dimensional vectors as outputs. We can visualize this by smooshing around all the points in three-dimensional space, as represented by a grid, in such a way that keeps the grid lines parallel and evenly spaced and which fixes the origin in place. And just as with two dimensions, every point of space that we see moving around is really just a proxy for a vector with its tip at that point; what we’re really doing is thinking about input vectors *moving over* to their corresponding outputs.

And just as with two dimensions, one of these transformations is completely described by where the basis vectors go. But now there are three standard basis vectors that we typically use: the unit vector in the x-direction, i-hat; the unit vector in the y-direction, j-hat; and a new guy, the unit vector in the z-direction, called k-hat.

In fact, I think it’s easier to think about these transformations by only following those basis vectors, since the full 3-D grid representing all points can get kind of messy. By leaving a copy of the original axes in the background, we can think about the coordinates of where each of these three basis vectors lands. Record the coordinates of these three vectors as the columns of a 3×3 matrix. This gives a matrix that completely describes the transformation using only nine numbers.

As a simple example, consider the transformation that rotates space 90 degrees around the y-axis. That means it takes i-hat to the coordinates [0,0,-1] on the z-axis; it doesn’t move j-hat, so it stays at the coordinates [0,1,0]; and then k-hat moves over to the x-axis at [1,0,0]. Those three sets of coordinates become the columns of a matrix that describes that rotation transformation.

To see where a vector with coordinates [x,y,z] lands, the reasoning is almost identical to what it was for two dimensions: each of those coordinates can be thought of as instructions for how to scale each basis vector so that they add together to give your vector. And the important part, just like in the 2-D case, is that this scaling-and-adding process works both before and after the transformation. So to see where your vector lands, you multiply those coordinates by the corresponding columns of the matrix, and then you add together the three results.

Multiplying two matrices is also similar: whenever you see two 3×3 matrices getting multiplied together, you should imagine first applying the transformation encoded by the right one, then applying the transformation encoded by the left one. It turns out that 3-D matrix multiplication is actually pretty important for fields like computer graphics and robotics, since things like rotations in three dimensions can be pretty hard to describe, but they’re easier to wrap your mind around if you can break them down as the composition of separate, easier-to-think-about rotations.

Performing this matrix multiplication numerically is, once again, pretty similar to the two-dimensional case. In fact, a good way to test your understanding of the last video would be to try to reason through what specifically this matrix multiplication should look like, thinking closely about how it relates to the idea of applying two successive transformations in space.

In the next video I’ll start getting into the determinant.
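The rotation example from the video can be sketched in NumPy (a minimal illustration of my own; the name `rotate_y_90` and the sample vector are assumptions, not from the video):

```python
import numpy as np

# Columns record where each basis vector lands under a
# 90-degree rotation about the y-axis:
#   i-hat -> [0, 0, -1],  j-hat -> [0, 1, 0],  k-hat -> [1, 0, 0]
rotate_y_90 = np.array([
    [ 0, 0, 1],
    [ 0, 1, 0],
    [-1, 0, 0],
])

# Where a vector [x, y, z] lands: scale each column by the
# corresponding coordinate and add the three results.
v = np.array([2, 3, 4])
landed = (v[0] * rotate_y_90[:, 0]
          + v[1] * rotate_y_90[:, 1]
          + v[2] * rotate_y_90[:, 2])
print(landed)  # [ 4  3 -2]

# The scale-and-add recipe is exactly the matrix-vector product.
assert np.array_equal(landed, rotate_y_90 @ v)

# Composition: the product of two matrices means applying the
# right transformation first, then the left one. Here, two
# 90-degree turns make a 180-degree rotation about the y-axis.
twice = rotate_y_90 @ rotate_y_90
print(twice @ v)  # [-2  3 -4]
```

Reading the column-by-column sum next to the `@` product is a good way to convince yourself they are the same operation.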

Really good video! I would love to see a video about quaternions as well! I think they're amazing mathematical tools for 3D rotation, and honestly, I don't really understand how they work 😥. It would be great if you explained them in one of your videos; I really enjoy them 😄

This is pretty straightforward if one grasps the previous chapters. A (linear) change of basis is a linear transformation.

Just to be clear, the matrix represented in the video is a rotation matrix right?

I wish you'd make a video about vector transformations between the Cartesian coordinate system and cylindrical/spherical systems, but from the same viewpoint you had in this video. Although I know the mathematical aspect of these transformations, I have a big problem understanding them conceptually. Of course, I think these transformations aren't linear ones anymore?

Actually, I would like to think of a non-square matrix as a square one, by adding zeros in a new column or row. It gives much better intuition into this kind of transformation, by applying knowledge from previous chapters.

Nooooo! There are no Russian subtitles anymore as of this video! I'll never become smart.

There's an old programming book titled "flights of fantasy". The author uses a lot of matrices for the 3d calculations and finally now, 20 years later, I understand why.

Just stay calm, Frinky. These babies will be in the stores while he's still grappling with the pickle matrix!! Bhay-gn-flay-vn!!

Amazing!

I can no longer watch these videos at night because I get too excited and forget to sleep

At this point in the tutorial, I just want to give you all my money. You've made my mathematics make so much sense, I feel like a child with an endless supply of candy!

Best series on youtube.

Great math demonstration! BTW, does anyone know the name of the intro music?

I got a question: at 2:00–2:15 you said that I should put the 3 basis vectors inside of a 3×3 matrix to make the matrix multiplication work later on, BUT does this also apply to row vectors, because you store them here as column vectors. I might be wrong because I've only started studying linear algebra recently, but wouldn't this mess things up if I would've wanted to use row vectors, right?

Ever thought about getting someone to make a VR 3D version of this video, Grant?

It's never too late to learn that matrices make sense. Thank you!

Why is there a negative sign in front of "y"?

thank you

at 1:52 you showed the basis vectors with the z-direction pointing up. Is that really true? Because in 2d, x goes from left to right and y goes from bottom to top. Adding the 3rd dimension and naming it z, shouldn't z then go from outside the screen to inside the screen (so to speak)? To put it another way: If in 2d i^ is the x-axis, going from left to right, and j^ is the y-axis, going from bottom to top, then why is everything named differently in 3d?

Pls tell about the golden ratio and its relation to the Fibonacci series

Awesome ^_^

I learned linear algebra a long time ago and I still use matrices and transformations, but these videos have helped me visualize them in ways I never did. Thanks!

youtube>college

Simply put, you enlightened me, thank you

Sir, what does it mean when we multiply a 2×3 and a 3×2, instead of a 3×3 and a 3×3?

what is the answer ??

I learned more in 20 minutes than I did in college

I have absolutely no need to know this stuff but it takes me to a world without anger, jealousy, hate, and greed.

where were you when I was taking these classes????????

Someone please explain the joke to me!

I wish I had known these and other concepts earlier; maybe I would see the world differently. Thanks for the much-appreciated effort spent on these videos. Hoping to see more

I enjoyed every second of this series. You explained the material better than any professor I encountered during my bachelor's and master's degrees. Well done.

I am like…the leader of the free world.

PEAK?!

Pls pls pls make one about moving from one dimension to another, like 2 to 3 dimensions. I know you briefly explained moving to a lower dimension (in the subspaces one).

Wow wow slow down egghead

Also, thank you!!

these PI'es are overcute ^_^

So basically you always and only need n² numbers to define that transformation, with n being the number of dimensions, right? E.g., a 2-D space would require 4 numbers, a 3-D space requires 9, 4-D requires 16, etc.

Hey, does this mean that getting a matrix into row-echelon form is the same as solving for where each of the basis vectors ends up?

Hello guys, I have a question:

For the thought experiment he put forth at the end of the video, does he simply end up with vectors that lie in the same plane? Because the first transformation puts the i, j, and k vectors in the same plane (so they no longer can span all of 3-D space). Then he has a second transformation which, if observed on its own, would still span all of 3-D space.

However, I do not believe any matrix multiplication will increase the span of a set of vectors. Is this correct? (I plotted the points on a 3-D graphing tool to aid in my evaluations).
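The intuition in the question above is right, and it can be checked numerically. A minimal NumPy sketch (the specific matrices are my own; the first is chosen to be rank-deficient, i.e. it flattens space onto a plane):

```python
import numpy as np

# A first transformation whose third column is the sum of the
# first two, so its outputs only span a plane (rank 2)...
flatten = np.array([
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 2],
])
# ...followed by a full-rank rotation (90 degrees about the y-axis).
rotate = np.array([
    [ 0, 0, 1],
    [ 0, 1, 0],
    [-1, 0, 0],
])

print(np.linalg.matrix_rank(flatten))           # 2
print(np.linalg.matrix_rank(rotate))            # 3
# Composition never increases rank: the first map's outputs only
# span a plane, and the second map just moves that plane around.
print(np.linalg.matrix_rank(rotate @ flatten))  # 2
```

In general, rank(AB) ≤ min(rank(A), rank(B)), which is the algebraic form of "multiplication never increases the span."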

I love u

But what about 4×4 or 5×5 matrices… is there any 4D or 5D representation for them?

you saved me! Thank you for your great visual works.

Does anyone know why we generally use the letters i, j, and k for basis vectors, instead of other letters?

These are just such incredible explanations!

3:49 big data analytics and machine learning

So I am at this chapter and I can already visualize how and why a 3×3 (or n×n) matrix can have a certain maximum number of zeroes, while many try to prove it using the determinant.

I can never thank you enough.

This just made sense of the hours of struggling I experienced with the XNA game studio way back in 2010.

MAN, I wish I'd had this so clearly demonstrated back then! I love you

Bear my children

lol my cpu started screaming the moment the 3d grid appeared

So does this mean that a 10×10 matrix essentially describes a transformation of a 10-dimensional vector space? If so, multiplying two 10×10 matrices would amount to a rotation of a 10-dimensional space. That's impossible to imagine.

Peek outside Flatland.

Obviously, if you ever had a peak anywhere, it wouldn't be flat, would it?

what is he doing at 31 seconds when he puts the transformed vectors tail to tail and draws the vertical line equivalent to (-1,2)??

This is the very core of video game development

Wonderful ^_^

Is "peak outside of flatland" a typo, or a joke? (00:46)

Great teacher!!!

This is a big service you've done for all generations… any way I could do a one time payment to support?

I learned this when I studied robotics. Sadly, I took a linear algebra course before, but the teacher didn't tell us what matrices represent (I believe he didn't even know) so the first classes of robotics were like black magic to me.

Life Saver

Excellent videos! In this one, the Spanish subtitles are a bit mixed up

What about non-square matrices?

same AB ==BA and () stuff

cool tut!!!

Thank you very much. The explanation is so nice.

So is a linear transformation just multivariable calculus with multiple outputs, each output corresponding to a variable instead of just one?

Why is linear algebra needed for machine learning?

Them 3 AM feels

There should be a global tax to support this guy and his like

Just the tip.

I'm so happy I'm not three years older!

This is the best explanation ever ..

Sir, how are active and passive transformations different, and how do you compute them?

Can somebody explain to me how he did the problem at the end of the video?

Nice idea to use the cube for orientation instead of the standard grid

Unfortunately, computer graphics requires 4D matrices

This guy sucks… said no one ever. Honestly, you are an amazing teacher.

Finally I can make 100d Transformations

beautiful videos! You are the best, thanks for enlightening the World with these 🙂

No words 4 u 🙏👌

What about a 4×4 matrix ……

How to imagine…. the 4th DIMENSION 😕😕😕

That was really helpful! Thank you

I was stuck trying to understand linear algebra without going into too many details until thankfully I found these videos. THANKS A LOT

Like for the simpsons quote

Finally, I don't understand reality.

Simply beautiful work done here. Thanks to 3Blue1Brown

I realise every time your video ends, there is a soft smile on my face and a feeling of contentment. Your videos are SOUL FOOD

I count myself as one of the luckiest people to have stumbled on this playlist just before taking linear algebra in college

haha i'm in a lot of pain and this series is distracting me from my pain thank you

His voice is so calming I started watching the first video fell asleep and now I am working 5

Can you add translation and rotation in the same video

Looking forward to understanding how I can use this in real applications.

How do we relate 3D rotation matrix (3 x 3 matrix) and its 3D angular velocity matrix (skew-symmetric matrix)? It seems like all the explanations so far are awfully complicated.

How is matrix multiplication related to vectors?

Ok you are spoiling my brain..

Matrix multiplication makes absolute sense to me after these videos. You basically apply a linear transformation to each vector of the first matrix which is your base, and the result is a new transformation for the input vector.

Wow, I'm speechless. In a matter of hours, these videos caused me to go from hating linear algebra to one of my favorite subjects. At my high school, since not many people are advanced enough to do linear algebra, we have to teach ourselves. There are about 20 or so kids in my class and we each alternate teaching the class, and everyone is new to teaching. So, we ended up doing about 6 numerical theorems for every section without thinking about it, many without proofs. I was desperate for some type of visualization of linear algebra, and I just found the complete gold mine.

I am going through these videos to refresh my memory of linear algebra from my second semester, because I will be doing machine and deep learning in my 7th semester, which started last week. Thank you very much. So far it is a beautiful series. I am looking forward to going through all 4 seasons and hope by the end I will understand the basics of neural networks so I can follow the course at my university.