Lecture Comments (10)

2 answers

Last reply by: chiedu egwuatu
Tue Sep 23, 2014 4:37 AM

Post by chiedu egwuatu on September 22, 2014

Good evening Professor Hovasapian. I hope you are well. I have really been enjoying your lectures and you have really helped me with my linear algebra. For some reason I just don't seem to understand the examples you have used for this lesson. I have really tried to understand but I just don't. Can you solve this problem: Suppose T : R2 → R2 is such that T(1, -1) = (3, -1) and T(2, 3) = (-4, 18). Find T. Or can you solve more examples so that I can understand better. Thank you again for your lectures. I just feel frustrated that I can't grasp linear transformations yet.

1 answer

Last reply by: Professor Hovasapian
Sat Sep 20, 2014 8:09 PM

Post by chiedu egwuatu on September 18, 2014

How exactly are you getting the (1,0,0), (0,1,0), and (0,0,1) for e1, e2, and e3?

1 answer

Last reply by: Professor Hovasapian
Thu Jun 13, 2013 3:35 PM

Post by Manfred Berger on June 13, 2013

I'm a bit confused about your definition in example 2: Didn't you implicitly declare F to be linear by representing it as a matrix of real numbers? After all, the mechanics of addition and scalar multiplication of matrices do, by themselves, satisfy the definition of a linear transformation.

0 answers

Post by Manfred Berger on June 13, 2013

I'm not exactly sure how wise it is to use i as a symbol for a unit vector. By the time one is dealing with hermitian matrices this could come back to haunt you.

1 answer

Last reply by: Professor Hovasapian
Fri May 3, 2013 9:26 PM

Post by Jeroen Schmidt on May 2, 2013

Just a heads up, the lecture slide (image) links have a 404 error.

Linear Transformations, Part II

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Linear Transformations 1:29
    • Linear Transformations
    • Theorem 1
    • Theorem 2
    • Example 1: Find L (-3, 4, 2)
    • Example 2: Is It Linear?
    • Theorem 3
    • Example 3: Finding the Standard Matrix

Transcription: Linear Transformations, Part II

Welcome back, and welcome back to linear algebra. In the last lesson we introduced the idea of a linear transformation, or a linear mapping.0000

They are synonymous; I will often say linear mapping, occasionally linear transformation, but they are synonymous.0009

Today I am going to continue the discussion with a couple more examples, just to develop more of an intuition about what it is that's going on.0015

This is a profoundly important concept. As we move on from here, we are going to move on into studying further structure, actually after we discuss lines and planes.0022

We are going to talk about the structure of something called a vector space, and linear mappings are going to be profoundly important in how we discuss the transformations from one vector space to another. So this idea of a linear mapping, for many of you, really is the first introduction to this abstraction; up to now you have been dealing with functions.0033

x², radical x, 3x + 5; but now we are going to make it a little bit more general, and generalize the spaces from which we pull something, manipulate it, and land someplace else.0052

A lot more abstract. We will work with specific examples, namely n-space, R2, R3, Rn, but the underlying notions are what we really want to study; the underlying structure is what's important.0063

Let's go ahead and get started, and recap what we did with linear transformations, and do some more examples. Okay, so recall what a linear map means.0081

And again we are using this language, linear, like line, but as it turns out we are using it only as language, because historically we dealt with lines before we came up with a definition of what linear means. That's the only reason we call it linear; linearity is an algebraic property, and it actually has nothing to do with lines at all.0094

Something is linear means: if we have a mapping or transformation L from RN to RM, it has to satisfy the following properties...0114

... L(A + B) = L(A) + L(B); these are vectors of course, because we are taking something from RN and moving it over to RM. And it also has to satisfy this other property: that if I multiply,0135

take a vector, multiply it by a scalar, and then do something to it, it's the same as taking the vector, doing something to it, and then multiplying it by that scalar: L(cA) = c·L(A). So these two properties have to be satisfied for any particular function or mapping that we are dealing with.0153
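The two properties above can be spot-checked numerically. Below is a minimal sketch using a hypothetical map L(x, y) = (x + y, x - y) of my own choosing, not one from the lecture; passing such spot checks illustrates the properties but does not by itself prove linearity.

```python
import numpy as np

# A hypothetical mapping from R^2 to R^2, chosen for illustration:
# L(x, y) = (x + y, x - y).  This particular map happens to be linear.
def L(v):
    x, y = v
    return np.array([x + y, x - y])

a = np.array([1.0, 2.0])
b = np.array([-3.0, 5.0])
c = 7.0

# Property 1 (additivity): L(a + b) == L(a) + L(b)
assert np.allclose(L(a + b), L(a) + L(b))

# Property 2 (homogeneity): L(c * a) == c * L(a)
assert np.allclose(L(c * a), c * L(a))
```

Swapping in a non-linear map, say L(x, y) = (x², y), would make the first assertion fail for most choices of a and b.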

Let's show what that looks like pictorially. So remember we are talking about two spaces; one of them we call the departure space, and we call it the departure space because we are taking something from this space, fiddling around with it, and landing someplace else.0171

Now they could be the same space, like for example the function f(x) = x². I am pulling a number like 5 and I am squaring it and I am getting back another number, 25, so the two spaces are the same. But they don't necessarily have to be the same; that's what makes this beautiful.0190

Okay, so let's say we have the vector A, and we have the vector B, well in this space we can of course add, so let's say we end up with this vector A + B, and we know that we can do that with vectors.0205

Let's see. Now, when we add here, addition in this space might be defined a certain way. Now mind you, it doesn't have to be the same as addition in the other space; the operations may be different, because the spaces may be different.0219

Okay, so addition in these two spaces may not necessarily be the same. Usually it will be, so it won't be a problem, and we will always specify when it is different; but understand that there is no reason to believe it has to be the same.0239

Okay, so in this case we take A; this left part here means I add A + B first, and then I apply L to it.0252

And I end up someplace. Well, what this says is that if this is a linear transformation, it has to satisfy the following property: if I add these two first, and then I transform the sum and move it to this space, whatever I end up with,0266

I should be able to get the same thing if I take A first under L and then B first under L; of course I am going to end up in two different places, and if I add these two, I should end up in the same place.0280

In other words, adding first and then applying the transformation, or applying the transformation separately and then adding: if I can reverse those and still end up in the same place, that's what makes this a linear transformation.0294

And again, that's pretty extraordinary. And the same thing: if I take a vector A, multiply it by some scalar, say 18, and then operate on it with the linear transformation, I am going to end up someplace.0308

Let's say I end up here. That means if I take the vector A, map it under L, and then multiply by 18, I should end up in the same place.0320

Again, these two things have to be satisfied for something to be linear, and again, not all maps, as we saw from the previous examples, satisfy the property. This is a very special property: the structure from one space to another, the relationship, is actually maintained. That's what makes this beautiful; now we are getting into deep mathematics.0334

Okay, let's actually represent this a little bit better, so that you can see it. So A: I can transform A under L, and it becomes L(A); I can transform B under L, and it becomes L(B).0354

Now, I can...0371

... Add these two in my departure space, so I get A + B, and then I can apply L to it to get L(A + B); or, you can do L first, do L for A and for B, and then add to get here.0375

This is an expanded version of what I sort of drew up here. It's up to you whether you want to work pictorially or algebraically; this is what's going on. Again, a profoundly important concept.0400

And again, addition in this space does not necessarily need to be the same as addition in the arrival space. They often will be; for example, if this is R2 and this is R3, well, addition of vectors is the same from space to space, you are adding components, but it doesn't necessarily need to be that way.0415

And again that's the power of this thing...0432

... Okay let's state a theorem, so...0436

... We will let L...0446

... From RN to RM and you will notice, sometimes I will do a single line, sometimes a double line, it's just the real numbers.0452

Let it be a linear mapping...0460

... Excuse me...0468

... then L of C1 times A1 + C2 times A2 + and so on, all the way to Ck times Ak, = C1 times L(A1)...0471

Oops, no, yes that's correct; let me erase this here. + C2 times L(A2), and so on.0495

Essentially this is just an extension of linearity. I can add more than just two things; instead of A + B, I can add a whole bunch of things, and I can multiply each of those vectors by a constant. So essentially what's happening here:0507

If you think about this algebraically, from what you remember as far as distribution, the linear mapping actually distributes over each of these terms.0522

It says that I can take k vectors, multiply each of them by some constant, and then apply the linear transformation to the whole combination.0531

Well, this theorem says that that is actually equal to taking each individual vector, applying the linear transformation to it, multiplying by its constant, and then adding.0543

It's just a generalization to any number of vectors that you take; that's all this says.0552

And the second theorem...0560

... Okay, again we will let L from RN to RM be a linear map. Then L of the 0 vector in RN...0565

... Maps to the 0 vector in RM. Okay, this notation is very important; notice this 0 written as a vector. This 0 is a vector, because we are talking about a particular space, let's say in this case R2.0580

This 0 point is actually considered a vector. Well, the 0 vector in RN and the 0 vector in RM are not the same; one might be a two-vector, the other a three-vector, or it might be an N-vector.0595

What this is saying is that if I take the 0, and if I subject it to the linear transformation, it actually maps to the 0 in the other space. That's kind of extraordinary. So again, if I draw a quick little picture, you know, two different spaces:0608

Let's say this is R3, and let's say this is R4, 3-space and 4-space. If I have this 0 vector here, and the 0 vector here, they are not the same thing, but they fulfill the same role in their respective spaces; they are still the 0 vector.0623

The additive identity. But if I subject it to the transformation L, I actually map the 0 in this space to the 0 in that space. Again, it's maintained; it doesn't just end up randomly someplace. The 0 goes to the 0.0637

And another one, which is actually pretty intuitive: if I take the transformation of U - V...0653

That's the same as L(U) - L(V), and again, you know that the minus sign is basically just the addition of the negative, so it's not a problem. Okay.0662
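Both parts of this theorem can be illustrated with any matrix map, since matrix multiplication is linear (as the lecture shows later). The 4×3 matrix and test vectors below are arbitrary choices of my own, not from the lecture.

```python
import numpy as np

# A hypothetical matrix map L(x) = A @ x from R^3 into R^4.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

L = lambda x: A @ x

# Theorem 2, part 1: the zero vector of R^3 lands on the zero vector of R^4.
assert np.allclose(L(np.zeros(3)), np.zeros(4))

# Theorem 2, part 2: L(u - v) == L(u) - L(v).
u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 4.0, -1.0])
assert np.allclose(L(u - v), L(u) - L(v))
```

Note that the two zero vectors in the first assertion live in different spaces (length 3 versus length 4), exactly as the lecture emphasizes.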

Lets see if we can do an example here...0677

... Should I go for it? Yeah, that's okay; we can start over here. Let me do that; let me change over to a red ink here, okay.0684

We will let L in this particular case be a transformation from R3 to R2: so a three-vector, we are going to do something to it, and we are going to end up with a two-vector...0700

... Be defined by...0715

... L(1, 0, 0). So in this case, my definition is: I don't actually have the specific mapping that I am doing, but this example is going to demonstrate that I know something about the unit vectors in this particular space.0720

Or, as you will see, I know something about three of the vectors, and we will see what happens: equals (2, -1)...0738

... L(0, 1, 0) is equal to (3, 1), excuse me, and L(0, 0, 1) is equal to (-1, 2). So again, it says that if I take the vector (1, 0, 0) in R3, in three-space,0749

Under this transformation I am defining it, I am saying that it equals this, that the vector (0, 1, 0) under the transformation L is equal to this, so I have made a statement about three vectors.0772

Now recall...0782

... That (1, 0, 0), we have specific symbols for these: we call this one E1. They are unit vectors, vectors of length 1, and we happen to give them special symbols because they are very important. (0, 1, 0) in three-space...0790

They actually form the unit vectors that are mutually orthogonal; remember, X coordinate, Y coordinate, Z coordinate. E1 we also call I.0809

Remember, (0, 1, 0) is E2, and we also call it J; so there are different kinds of symbols that we can use, but they all represent the same thing. And (0, 0, 1) is called E3, and it is represented by the vector K.0820

Okay, our task is to find L of the vector (-3, 4, 2), so again we are given that the three unit vectors map to these three points under the transformation.0836

Can we find, if we take a random vector, (-3, 4, 2), can we actually find the point in R2 that L maps it to, knowing just these three vectors? Well, as it turns out, yes we can.0852

Let me, over here, well let's see, now (-3, 4, and 2) can be written as...0869

... -3I + 4J + 2K, right, we are just representing them as a linear combination of the unit vectors I, J, K, so L...0883

... Of (-3, 4, 2) is equal to L of -3I + 4J + 2K.0898

Well that's equal to, and again this is linear, so I can just sort of distribute this linearity if you will, it is -3 times L(I)...0913

... + 4 times L(J) + 2 times L(K). Well, we already know what these are; we already know what L(I), L(J), and L(K) are.0927

It's L(1, 0, 0); this is L(0, 1, 0); this is L(0, 0, 1). So we write -3 times, and I am going to write these as column vectors, (2, -1), + 4 times (3, 1),0939

+ 2 times (-1, 2), because this (2, -1) is L(I); we defined it earlier, that was part of the definition.0958

We know that the linear transformation maps these three vectors to these three points; that much we know. Now we just sort of set it up in such a way, and we end up with -6 and 3.0968

I am going to write everything out here: 12 and 4 (please check my arithmetic, because I will often make arithmetic mistakes), and -2 and 4.0985

And then when we add these together, we end up with 4 and 11, or we can do it in coordinate form, (4, 11). So there we go: knowing where the linear transformation maps the unit vectors allows us to find the linear transformation of any other vector in that space.0996
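Example 1's computation can be sketched directly, using the three given images of the unit vectors from the lecture:

```python
import numpy as np

# From the lecture's definition: we only know where L sends the unit vectors.
L_e1 = np.array([2.0, -1.0])   # L(1, 0, 0) = L(I)
L_e2 = np.array([3.0,  1.0])   # L(0, 1, 0) = L(J)
L_e3 = np.array([-1.0, 2.0])   # L(0, 0, 1) = L(K)

# (-3, 4, 2) = -3*I + 4*J + 2*K, and linearity lets L distribute
# over that combination: L(-3, 4, 2) = -3*L(I) + 4*L(J) + 2*L(K).
result = -3 * L_e1 + 4 * L_e2 + 2 * L_e3
print(result)  # [ 4. 11.]
```

This is exactly the arithmetic in the lecture: (-6, 3) + (12, 4) + (-2, 4) = (4, 11).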

That's kind of extraordinary, okay...1019

... Now let's do another example here...1029

... Okay...1035

... Let F, this time I use the capital F, be a mapping from R2 to R3, so I am mapping something from two space to three space, okay.1039

Be defined by...1053

... The following: F of the vector (X, Y) is equal to, now I am going to represent this as a matrix, so again this is just a mapping that I am throwing out there:0059

The matrix with rows (1, 0), (0, 1), (1, -1); this is a 3 by 2 matrix...1075

... This matrix multiplication is perfectly well defined, so it says F is a mapping, notice I haven't said anything about it being linear, I just said it's a mapping, that takes a vector in R2, transforms it and turns it into a vector in R3.1087

Let's see exactly what happens here. This is the definition: it says take (X, Y), some 2-vector, multiply on the left by this matrix, and see what you end up getting; so this is 3 by 2, this is a 2 by 1.1100

Well, sure enough, you will end up getting a 3 by 1 matrix, which is a 3-vector. So we have taken a vector in R2 and mapped it to R3. Now our question is: is it linear?...1115

... That's just kind of interesting, I have this matrix multiplication, now I want to find out if it's linear, again the power of linearity, this has nothing to do with lines at all.1130

Okay, so again, when we check linearity, we check two things: the addition and the scalar multiplication. We will go through the addition here; I will have you go ahead and check the scalar multiplication if you want to. So check this, check that F of...1139

... U + V for any two vectors equals F(U) + F(V); that we can exchange the addition and the actual function itself, okay.1158

We will say that U is equal to (U1, U2); oops, let me make that like that. We will say that V is...1175

... (V1, V2). Okay, now U + V is exactly what you think it is: it is (U1 + V1, U2 + V2), okay.1188

I am going to write that; let me actually write it a little differently, as a 2-vector, a column vector. I think it might be a little bit clearer.1206

I will do this because we are dealing with matrix multiplication; we will just deal with matrices. So U1 + V1, U2 + V2.1219

Let me make sure I have my indices correct; yes, okay, now...1229

I will do a little 1 here, and now let's transform, let's do F(U + V), okay, well that's going to equal the matrix...1236

... (1, 0, 0, 1, 1, -1) times...1252

... U1 + V1, U2 + V2. Okay, so again: it's this row times that column, for each of the three rows.1261

That's how matrix multiplication works: you choose a row and you go down the column; there are two elements in this row, two elements in this column, you multiply them pairwise and add them together.1274

What you end up with is the following, U1 + V1, U2 + V2, and you get...1286

(U1 + V1) - (U2 + V2).1300

This is the 3-vector: that's the first entry, that's the second entry, and that whole thing is the third entry. So we have done the first part, the left side, okay.1313

Now let's do the right...1323

F(U) is equal to...1327

... (1, 0, 0, 1, 1, -1) times (U1, U2); that's equal to (U1, U2, U1 - U2)...1333

... Okay, now let's move to the next page...1351

... We will do F(V)...1360

... That's equal to (1, 0, 0, 1, 1, -1) times (V1, V2); that's equal to (V1, V2, V1 - V2).1363

Now we have to add F(U) and F(V). So F(U), which we just did, + F(V), which was the second thing we just did, is equal to (U1, U2, U1 - U2)1380

plus (V1, V2, V1 - V2); that's equal to U1 + V1,1400

... U2 + V2,1411

... and then (U1 - U2) + (V1 - V2); I have just rearranged and grouped the U1 with the V1, and the U2 with the V2, to get...1417

... (U1 + V1) - (U2 + V2). There we go. And as it turns out, F(U) + F(V) does in fact equal F(U + V), so yes; let me write that out.1432

F(U + V) does in fact equal F(U) + F(V).1448

Now, when we check the scalar multiplication, it will also check out. So yes, this map is linear...1460
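The additivity check for Example 2 can be replayed numerically, along with the homogeneity (scalar) check the lecture leaves as an exercise. The specific test vectors here are my own choices.

```python
import numpy as np

# Example 2's mapping F: R^2 -> R^3, defined by left-multiplication
# by the lecture's 3x2 matrix, so F(x, y) = (x, y, x - y).
M = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0]])

F = lambda v: M @ v

u = np.array([2.0, 5.0])
v = np.array([-1.0, 3.0])

# Additivity, shown in the lecture: F(u + v) == F(u) + F(v).
assert np.allclose(F(u + v), F(u) + F(v))

# Homogeneity, left as an exercise: F(c*u) == c*F(u).
assert np.allclose(F(4.0 * u), 4.0 * F(u))
```

Both checks pass for any u, v, and scalar, because matrix multiplication distributes over vector addition and commutes with scalar multiplication.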

... This is rather extraordinary: matrix multiplication is a linear mapping. Matrix multiplication allows you to map something in N-space, like R5, into, let's say, seven-space, R7.1472

And to retain the structure: being able to add the vectors in five-space first and then do the linear transformation, or do the linear transformation first, end up in seven-space, and then add; you end up in the same place.1490

That's extraordinary: matrix multiplication is a linear mapping. Notice it has nothing to do with a line; this is an algebraic property.1505

An underlying structure of the mapping itself...1513

... Okay...1517

... Therefore if you have some mapping...1521

... L, defined by the following: L of some vector X is actually equal to some matrix, some M by N matrix, multiplied by X...1533

... Then L is linear...1548

... We just proved it, always...1552

... Okay, now let's state a theorem here...1557

... If L is a mapping from RN...1564

... To RM, is a linear mapping...1571

... Here is what's amazing: then there exists a unique M by N matrix...1582

... A, such that L(X) is actually equal to that matrix A times X...1591

... For any vector in RN...1602

... Okay, this is profoundly important...1612

... We just proved that matrix multiplication is a linear mapping; the other way around is also true. If I have a linear mapping that has nothing to do with a matrix (because remember, the examples we have been dealing with up to this point have nothing to do with matrices necessarily),1616

they were just mappings, functions. If it turns out that that mapping is linear, what this theorem tells me is that there is some matrix,1630

some matrix somewhere, that actually represents that mapping. In other words, I may not even need to find it, but it tells me that the matrix actually exists: every linear mapping is associated with some M by N matrix.1641

And every M by N matrix is associated with some linear mapping. That's extraordinary: there is a correspondence between the set of all linear mappings and the set of all M by N matrices. That's extraordinary.1657

Actually, there is a way to find the matrix, and here is how. So...1673

... The matrix A...1678

... And it's quite beautiful, is found as follows...1683

The matrix A is equal to the matrix of...1691

... I take the unit vectors in my departure space, I subject them to the transformation, whatever the linear mapping happens to be, and then the vectors that I get, I set up as columns in a matrix.1700

And that's actually the matrix of my transformation, of my linear transformation: L of E1, L of E2,1717

... up to L of En, okay...1725

... Yes, alright...1730

... I will write it out, so here's what I am doing: the i-th column, let's say the fourth column, is just the linear transformation of the fourth unit vector for that space. We should probably just do an example; that works out much better.1735


... Let's...1756

... Let this be a mapping L, defined from R3 to R2; so we are taking a three-vector and transforming it into a two-vector.1764

Let it be defined by the following, L(XYZ) is equal to...1777

... I am sorry, no, this is a mapping from R3 to R3, so we are mapping it onto itself essentially; it's mapping from three-space onto three-space. Which, by the way: when the spaces you are mapping to and from happen to be the same, it's called an operator, a linear operator...1788

X + Y is the first entry, Y - Z is the second entry, X + Z is the third entry. So I take a vector, do something to it, and arrange it like this; this is what the mapping is defined by.1809

Now the question is this: we said that any linear mapping has a matrix associated with it; I can always represent a linear mapping as a matrix multiplication, which is very convenient. Let's find that matrix...1823

... And it's called the standard matrix, by the way. I myself am not too big on nomenclature; I am more interested that you actually understand what's happening. You could call it whatever name you want.1839

Okay, we said that all we have to do is take the unit vectors in the space, in this case R3, our departure space, subject them to this transformation, and then set them up as columns, and that's our matrix.1851

L of E1 equals L of, well, in three-space, (1, 0, 0), the first unit vector, the X, the I. Well that equals, let's go up here: 1 + 0,1866

... 0 - 0, and 1 + 0...1886

... We end up with (1, 0, 1). Okay, this is going to be column 1 of our matrix...1891

... L of E2 equals L(0, 1, 0); well, X + Y is 0 + 1, Y - Z is 1 - 0, and X + Z is 0 + 0.1901

So I end up with (1, 1, 0), and that's going to be column 2. If I take L(E3), which is L(0, 0, 1), well, X + Y is 0 + 0, Y - Z is 0 - 1, and X + Z is 0 + 1.1917

I end up with (0, -1, 1)...1945

... This is my column 3. So A, the standard matrix, has columns (1, 0, 1), (1, 1, 0), and (0, -1, 1)...1950

... That is my answer...1969

Let me change it to blue. This was how the linear mapping was defined, the linear mapping. Therefore I know that there is some matrix associated with this linear mapping; I could represent it as a matrix multiplication, which is very convenient.1973

Well, I take the unit vectors for this space, I subject them to this transformation, I get these things.1989

I arrange these things one after the other as the columns of the matrix, and I end up with my matrix.1996

This means that...2003

... If I want to do this mapping, all I have to do is take any vector X and multiply by this matrix on the left. Profoundly important.2010
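Example 3's recipe, apply L to each unit vector and stack the results as columns, can be sketched as follows; the check vector at the end is my own choice.

```python
import numpy as np

# Example 3's operator on R^3: L(x, y, z) = (x + y, y - z, x + z).
def L(v):
    x, y, z = v
    return np.array([x + y, y - z, x + z])

# Build the standard matrix: the i-th column is L applied to the
# i-th unit vector (the rows of the identity matrix are E1, E2, E3).
A = np.column_stack([L(e) for e in np.eye(3)])
print(A)
# [[ 1.  1.  0.]
#  [ 0.  1. -1.]
#  [ 1.  0.  1.]]

# The matrix reproduces the mapping on any vector.
v = np.array([2.0, -3.0, 5.0])
assert np.allclose(A @ v, L(v))
```

The columns (1, 0, 1), (1, 1, 0), (0, -1, 1) match the ones computed in the lecture.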

Every linear mapping is associated with an M by N matrix, and every M by N matrix represents some linear mapping somewhere.2020

That's extraordinary, so now you are not just talking about numbers arranged randomly in a square, or in some rectangular fashion, that this actually represents a linear mapping, a linear function from one space to another.2029

Okay, we will talk a little bit more about this next time. Thank you for joining us here; we will see you again. Bye-bye.2044