Raffi Hovasapian

Linear Transformations, Part II


Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operation

43m 17s

Intro
0:00
1:11
1:12
2:30
2:57
4:20
5:22
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
19:33
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31


### Linear Transformations, Part II

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Linear Transformations 1:29
• Linear Transformations
• Theorem 1
• Theorem 2
• Example 1: Find L (-3, 4, 2)
• Example 2: Is It Linear?
• Theorem 3
• Example 3: Finding the Standard Matrix

### Transcription: Linear Transformations, Part II

Welcome back to educator.com, and welcome back to linear algebra. In the last lesson we introduced the idea of a linear transformation, or a linear mapping.

They are synonymous; I will often say linear mapping, occasionally linear transformation, but they are synonymous.

Today I am going to continue the discussion with a couple more examples, just to develop more of an intuition about what it is that's going on.

This is a profoundly important concept. As we move on from here (actually, after we discuss lines and planes), we are going to study more structure.

We are going to talk about the structure of something called a vector space, and linear mappings are going to be profoundly important in how we discuss transformations from one vector space to another. For many of you, this idea of a linear mapping really is the first introduction to this kind of abstraction; up to now you have been dealing with functions.

x², radical x, 3x + 5; now we are going to make it a little more general, and generalize the spaces from which we pull something, manipulate it, and land someplace else.

It is a lot more abstract. We will work with specific examples, namely n-space: R2, R3, Rn; but the underlying notions are what we really want to study. The underlying structure is what's important.

Let's go ahead and get started, recap what we did with linear transformations, and do some more examples. Okay, so recall what a linear map means.

Again, we are using this word "linear", but it turns out we use it only as a matter of language: historically we did lines before we came up with a definition of what linear means, and that's the only reason we call it linear. Linearity is an algebraic property; it actually has nothing to do with lines at all.

Something being linear means: if we have a mapping or transformation L from Rn to Rm, it has to satisfy the following properties.

L(A + B) = L(A) + L(B); these are vectors, of course, because we are taking something from Rn and moving it over to Rm. It also has to satisfy this other property, L(cA) = cL(A): if I

take a vector, multiply it by a scalar, and then do something to it, it's the same as taking the vector, doing something to it, and then multiplying it by that scalar. These two properties have to be satisfied for any particular function or mapping that we are dealing with.
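The two defining properties can be spot-checked numerically. Below is a minimal sketch in Python with NumPy; the function name `looks_linear` and the two sample maps are my own illustrations, not part of the lecture.

```python
import numpy as np

def looks_linear(L, n, trials=100, seed=0):
    """Spot-check L(a + b) == L(a) + L(b) and L(c*a) == c*L(a)
    on random vectors; evidence of linearity, not a proof."""
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        a, b = rng.normal(size=n), rng.normal(size=n)
        c = rng.normal()
        if not np.allclose(L(a + b), L(a) + L(b)):
            return False
        if not np.allclose(L(c * a), c * L(a)):
            return False
    return True

# The projection (x, y) -> (x, 0) is linear; componentwise squaring is not.
proj = lambda v: np.array([v[0], 0.0])
square = lambda v: v ** 2
```

A single failing pair of vectors is enough to rule linearity out; passing the check only suggests it.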

Let's show what that looks like pictorially. Remember, we are talking about two spaces; one of them we call the departure space, because we are taking something from this space, fiddling around with it, and landing someplace else.

Now, they could be the same space. For example, with the function f(x) = x², I am pulling a number like 5, squaring it, and getting back another number, 25, so the two spaces are the same; but they don't necessarily have to be the same, and that's what makes this beautiful.

Okay, so let's say we have the vector A, and we have the vector B. In this space we can of course add, so let's say we end up with the vector A + B; we know that we can do that with vectors.

Now, addition in this space might be defined a certain way, and mind you, it doesn't have to be the same as addition in the other space. The operations can be different, because the spaces may be different.

So addition in these two spaces may not necessarily be the same. Usually it will be, and it won't be a problem; we will always specify when it is different, but understand that there is no reason to believe it has to be the same.

Okay, so in this case we take A; the left part here, L(A + B), means I add A + B first, and then I apply L to it.

And I end up someplace. What this says is that if this is a linear transformation, it has to satisfy the following: if I add these two first, and then I transform the sum and move it to the other space, what I end up with

should be the same thing I get if I put A through L and put B through L separately (of course landing in two different places), and then add those two results.

In other words: adding first and then applying the transformation, or applying the transformation separately and then adding. If I can reverse those steps and still end up in the same place, that's what makes this a linear transformation.

And again, that's pretty extraordinary. The same goes for scalars: if I take a vector A, multiply it by some scalar, say 18, and then operate on it with the linear transformation, I am going to end up someplace.

Let's say I end up here. That means if I instead take the vector A, map it under L, and then multiply by 18, I should end up in the same place.

Again, these two things have to be satisfied for something to be linear, and as we saw from the previous examples, not all maps satisfy the property. This is a very special property: the structure from one space to another, the relationship, is actually maintained. That's what makes this beautiful; now we are getting into deep mathematics.

Okay, let's represent this a little better, so that you can see it. I can transform A under L, and it becomes L(A); I can transform B under L, and it becomes L(B).

Now, I can add these two in my departure space, so I get A + B, and then I can apply L to it to get L(A + B); or I can do L first, for A and for B, and then add to get to the same place.

This is an expanded version of what I drew up here. It's up to you whether you want to work pictorially or algebraically; this is what's going on. Again, a profoundly important concept.

And again, addition in the departure space does not necessarily need to be the same as addition in the arrival space. Often it will be: for example, if one space is R2 and the other is R3, addition of vectors is the same from space to space, you are adding components; but it doesn't necessarily need to be that way.

And again, that's the power of this thing.

Okay, let's state a theorem.

We will let L, from Rn to Rm, be a linear mapping. (You will notice that sometimes I write R with a single line, sometimes a double line; it's just the real numbers.)

Then L(c1A1 + c2A2 + ... + ckAk) = c1L(A1) + c2L(A2) + ... + ckL(Ak).

Essentially, this is just an extension of linearity: I can add more than just two things, A + B; I can add a whole bunch of vectors, and I can multiply each of those vectors by a constant. So what's happening here,

if you think about it algebraically in terms of what you remember about distribution, is that the linear mapping actually distributes over each of these terms.

It says that I can take k vectors, multiply each of them by some constant, and then apply the linear transformation to the sum.

Well, this theorem says that that is equal to taking each individual vector, applying the linear transformation to it, and then multiplying it by its constant.

It's just a generalization to any number of vectors that you take; that's all this says.
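This extended rule can be spot-checked numerically for a matrix map. A minimal sketch in Python with NumPy; the matrix, constants, and vectors below are my own illustration, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(2, 3))        # an arbitrary linear map L(x) = M @ x from R3 to R2
L = lambda x: M @ x

k = 5
cs = rng.normal(size=k)            # constants c1..ck
As = rng.normal(size=(k, 3))       # vectors A1..Ak in R3

# Transform the combination, versus combine the transforms.
lhs = L(sum(c * a for c, a in zip(cs, As)))
rhs = sum(c * L(a) for c, a in zip(cs, As))
print(np.allclose(lhs, rhs))       # prints True
```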

And the second theorem.

Again, we will let L from Rn to Rm be a linear map. Then L of the zero vector in Rn

maps to the zero vector in Rm. Okay, this notation is very important: notice this 0 with a vector arrow. This 0 is a vector, because we are talking about a particular space, let's say in this case R2.

This zero point is actually considered a vector. Now, the zero vector in Rn and the zero vector in Rm are not the same; one might be a two-vector, one a three-vector, one an n-vector.

What this is saying is that if I take the zero vector and subject it to the linear transformation, it actually maps to the zero vector in the other space. That's kind of extraordinary. So again, if I draw a quick little picture: two different spaces.

Let's say this is R3, and let's say this is R4, three-space and four-space. If I have the zero vector here, and the zero vector here, they are not the same thing, but they fulfill the same role; in their respective spaces, they are still the zero vector,

the additive identity. But if I subject it to the transformation L, I actually map the zero in this space to the zero in that space. Again, the structure is maintained; it doesn't just end up randomly someplace. The zero goes to the zero.

And another one, which is actually pretty intuitive: if I take the transformation of U - V,

that's the same as L(U) - L(V). And again, you know that the minus sign is basically just the addition of the negative, so it's not a problem. Okay.
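Both consequences are easy to see for any matrix map. A small sketch in Python with NumPy; the matrix here is an arbitrary example of mine, not one from the lecture.

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 3))    # a linear map L: R3 -> R4
L = lambda x: M @ x

# The zero vector of R3 maps to the zero vector of R4...
print(np.allclose(L(np.zeros(3)), np.zeros(4)))   # prints True

# ...and L respects subtraction: L(u - v) == L(u) - L(v).
u, v = rng.normal(size=3), rng.normal(size=3)
print(np.allclose(L(u - v), L(u) - L(v)))         # prints True
```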

Let's see if we can do an example here.

Should I go for it? Yeah, that's okay; we can start over here. Let me change over to red ink here. Okay.

We will let L, in this particular case, be a transformation from R3 to R2; so we take a three-vector, we are going to do something to it, and we are going to end up with a two-vector.

Let it be defined by

L(1, 0, 0): in this case I don't actually have the specific formula for the mapping, but this example is going to demonstrate that I know something about the unit vectors in this particular space.

Or rather, as you will see, I know something about three of the vectors, and we will see what happens: L(1, 0, 0) = (2, -1),

L(0, 1, 0) = (3, 1), and L(0, 0, 1) = (-1, 2). So this says that if I take the vector (1, 0, 0) in R3, in three-space,

under this transformation, I am defining it: I am saying that it equals this, and that the vector (0, 1, 0) under the transformation L is equal to that. So I have made a statement about three vectors.

Now recall

that we have specific symbols for these. (1, 0, 0) we call e1; these are unit vectors, vectors of length 1, and we happen to give them special symbols because they are very important. (0, 1, 0), in three-space, is e2.

They form the unit vectors that are mutually orthogonal: remember, the x coordinate, y coordinate, z coordinate. e1 we also call i,

and we call e2 j; there are different kinds of symbols that we can use, and they all represent the same thing. And (0, 0, 1) is called e3, and it is represented by the vector k.

Okay, our task is to find L of the vector (-3, 4, 2). So again, we are given that the three unit vectors map to these three points under the transformation.

If we take an arbitrary vector, (-3, 4, 2), can we actually find the point in R2 that L maps it to, knowing only these three facts? Well, as it turns out, yes we can.

Let's see: (-3, 4, 2) can be written as

-3i + 4j + 2k; we are just representing it as a linear combination of the unit vectors i, j, k. So L

of (-3, 4, 2) is equal to L of (-3i + 4j + 2k).

Well, that's equal to, and again this is linear, so I can just distribute this linearity if you will: -3 times L(i)

+ 4 times L(j) + 2 times L(k). And we already know what L(i), L(j), L(k) are:

L(i) is L(1, 0, 0), then L(0, 1, 0), then L(0, 0, 1). So we write -3 times (2, -1), and I am going to write these as column vectors, + 4 times (3, 1)

+ 2 times (-1, 2), because this (2, -1) is L(i); we defined it earlier, that was part of the definition.

We know that the linear transformation maps these three vectors to these three points; that much we know, we just set it up that way. And now we end up with (-6, 3);

I am going to write everything out here: (12, 4); please check my arithmetic, because I will often make arithmetic mistakes; and (-2, 4).

And then, when we add these together, we end up with 4 and 11, or in coordinate form, (4, 11). So there we go: knowing where the linear transformation maps the unit vectors allows us to find the linear transformation of any other vector in that space.

That's kind of extraordinary. Okay.
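The computation in this example is exactly a linear combination of the three given images. A quick check in Python with NumPy; the data is the lecture's, the code itself is my own sketch.

```python
import numpy as np

# The given images of the unit vectors e1, e2, e3 under L.
L_e1 = np.array([2, -1])
L_e2 = np.array([3, 1])
L_e3 = np.array([-1, 2])

# (-3, 4, 2) = -3*e1 + 4*e2 + 2*e3, so by linearity:
result = -3 * L_e1 + 4 * L_e2 + 2 * L_e3
print(result)   # [ 4 11]
```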

Now let's do another example here.

Okay.

Let F (this time I use a capital F) be a mapping from R2 to R3; so I am mapping something from two-space to three-space. Okay,

let it be defined by

the following: F of the vector (x, y) is equal to, and now I am going to represent this as a matrix, so again this is just a mapping that I am throwing out there,

the matrix with rows (1, 0), (0, 1), (1, -1), a 3 by 2 matrix,

times (x, y). This matrix multiplication is perfectly well defined. So F is a mapping; notice I haven't said anything about it being linear. I just said it's a mapping that takes a vector in R2, transforms it, and turns it into a vector in R3.

Let's see exactly what happens here. The definition says: take (x, y), some two-vector, and multiply on the left by this matrix. This is 3 by 2, and this is 2 by 1,

so sure enough, you end up getting a 3 by 1 matrix, which is a three-vector. So we have taken a vector in R2 and mapped it to R3. Now our question is: is it linear?

That's kind of interesting: I have this matrix multiplication, and now I want to find out if it's linear. Again, the power of linearity; this has nothing to do with lines at all.

Okay, so when we check linearity, we check two things: we check the addition, and we check the scalar multiplication. We will go through the addition here; I will have you go ahead and check the scalar multiplication if you want to. So check that F of

U + V, for any two vectors, equals F(U) + F(V); that we can exchange the addition and the actual function itself. Okay.

We will say that U is (U1, U2), and we will say that V is

(V1, V2). Okay, now U + V is exactly what you think it is: it is (U1 + V1, U2 + V2).

Let me actually write it a little differently, as a column vector; I think it might be a little bit clearer.

I will do this because we are dealing with matrix multiplication, so we will just deal with matrices: U1 + V1, U2 + V2.

Let me make sure I have my indices correct. Yes, okay. Now

let's transform; let's do F(U + V). Well, that's going to equal the matrix

with rows (1, 0), (0, 1), (1, -1), times

the column vector (U1 + V1, U2 + V2). Again, it's this times that plus this times that, then this times that plus this times that,

and then this times that plus this times that; that's how matrix multiplication works. You choose a row and you go down the column; there are two elements in this row and two elements in this column, and you multiply them pairwise and add.

What you end up with is the following: U1 + V1, then U2 + V2, and then

(U1 + V1) - (U2 + V2).

This is the three-vector: that's the first entry, that's the second entry, and that whole thing is the third entry. So we have done the first part, the left side. Okay.

Now let's do the right side.

F(U) is equal to

the matrix with rows (1, 0), (0, 1), (1, -1), times (U1, U2); that's equal to (U1, U2, U1 - U2).

Okay, now let's move to the next page.

We will do F(V);

that's equal to the same matrix times (V1, V2), which is (V1, V2, V1 - V2).

Now we have to add F(U) and F(V). So F(U), which we just did, plus F(V), which was the second thing we just did, is equal to (U1, U2, U1 - U2) plus

(V1, V2, V1 - V2); that's equal to U1 + V1,

then U2 + V2,

and then (U1 + V1) - (U2 + V2); I have just rearranged and grouped the U1 with the V1, and the U2 with the V2.

There we go. And as it turns out, F(U) + F(V) does in fact equal F(U + V). So yes; let me write that out.

F(U + V) does in fact equal F(U) + F(V).

Now, when we check the scalar multiplication, it will also check out; so yes, this map is linear.

This is rather extraordinary: matrix multiplication is a linear mapping. Matrix multiplication allows you to map something in n-space, like R5, into, let's say, seven-space, R7,

and to retain the structure: you can add the vectors in five-space first and then do the linear transformation, or do the linear transformation first, ending up in seven-space, and then add; you end up in the same place.

That's extraordinary. Matrix multiplication is a linear mapping; notice it has nothing to do with a line. This is an algebraic property,

an underlying structure of the mapping itself.

Okay.

Therefore, if you have some mapping

L defined by the following: L of some vector X is equal to some m by n matrix A multiplied by X,

then L is linear.

We just proved it. Always.
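The additivity argument for this particular F can be replayed numerically. A minimal sketch in Python with NumPy; the matrix is the one from the example, while the sample vector values are my own.

```python
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [1, -1]])          # the 3 by 2 matrix defining F: R2 -> R3
F = lambda v: A @ v

u = np.array([5.0, 2.0])        # arbitrary sample vectors
v = np.array([-1.0, 3.0])

print(np.allclose(F(u + v), F(u) + F(v)))   # additivity holds: prints True
print(np.allclose(F(18 * u), 18 * F(u)))    # so does scalar multiplication: prints True
```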

Okay, now let's state a theorem here.

Here is what's amazing. If L, a mapping from Rn

to Rm, is a linear mapping,

then there exists a unique m by n matrix

A such that L(X) = AX,

for every vector X in Rn.

Okay, this is profoundly important.

We just proved that matrix multiplication is a linear mapping; the other way around is also true. Suppose I have a linear mapping that seemingly has nothing to do with a matrix; remember, the examples we have been dealing with up to this point have nothing to do with matrices necessarily.

They were just mappings, functions. If it turns out that that mapping is linear, what this theorem tells me is that there is some matrix,

some matrix somewhere, that actually represents that mapping. In other words, I may not ever really need to find it, but the theorem tells me that the matrix actually exists: every linear mapping is associated with some m by n matrix,

and every m by n matrix is associated with some linear mapping. That's extraordinary: there is a correspondence between the set of all linear mappings and the set of all m by n matrices.

Actually, there is a way to find the matrix, and here is how.

The matrix A,

and it's quite beautiful, is found as follows.

The matrix A is equal to the matrix whose columns are built like this:

I take the unit vectors in my departure space, I subject them to the transformation, whatever the linear mapping happens to be, and then the vectors that I get, I set them up as the columns of a matrix.

And that's actually the matrix of my linear transformation: the columns are L(e1) through

L(en). Okay.

Yes, alright.

I will write out what I am doing: the ith column, let's say the fourth column, is just the linear transformation of the fourth unit vector of that space. We should probably just do an example; that will work out much better.

Okay.

Let's see.

Let L be a mapping on R3; that is, a mapping from R3 to R3, so we are taking a three-vector and transforming it into another three-vector. We are mapping the space onto itself, essentially, three-space onto three-space. By the way, when the spaces that you are mapping to and from happen to be the same, the map is called an operator, a linear operator.

Let it be defined by the following: L(x, y, z) is equal to

x + y as the first entry, y - z as the second entry, and x + z as the third entry. So I take a vector, do something to it, and arrange it like this; this is what the mapping is defined by.

Now, the question is this: we said that any linear mapping has a matrix associated with it, that I can always represent a linear mapping as a matrix multiplication, which is very convenient. Let's find that matrix.

It's called the standard matrix, by the way. I myself am not too big on nomenclature; I am more interested that you actually understand what's happening. You could call it whatever name you want.

Okay, we said that all we have to do is take the unit vectors in the departure space, in this case R3, subject them to this transformation, and then set the results up as columns, and that's our matrix.

L(e1) equals L of, well, in three-space, (1, 0, 0) is the first unit vector, the x direction, the i. That equals, going back up to the definition, 1 + 0,

0 - 0, and 1 + 0;

we end up with (1, 0, 1). Okay, this is going to be column 1 of our matrix.

L(e2) equals L(0, 1, 0): well, x + y is 0 + 1, y - z is 1 - 0, and x + z is 0 + 0.

So I end up with (1, 1, 0), and that's going to be column 2. If I take L(e3), which is L(0, 0, 1): x + y is 0 + 0, y - z is 0 - 1, and x + z is 0 + 1.

I end up with (0, -1, 1).

This is my column 3. So A, the standard matrix, has columns (1, 0, 1), (1, 1, 0), and (0, -1, 1).
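The column-by-column construction can be written out mechanically. A sketch in Python with NumPy; the mapping is the lecture's, while the helper name `standard_matrix` is my own.

```python
import numpy as np

def L(v):
    """The lecture's map L(x, y, z) = (x + y, y - z, x + z)."""
    x, y, z = v
    return np.array([x + y, y - z, x + z])

def standard_matrix(L, n):
    """Column i is L applied to the ith unit vector of R^n."""
    return np.column_stack([L(np.eye(n)[i]) for i in range(n)])

A = standard_matrix(L, 3)
print(A)
# [[ 1.  1.  0.]
#  [ 0.  1. -1.]
#  [ 1.  0.  1.]]

# Multiplying by A on the left reproduces the mapping on any vector.
v = np.array([2.0, -1.0, 3.0])
print(np.allclose(A @ v, L(v)))   # prints True
```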

Let me change to blue. This was how the linear mapping was defined. Therefore, I know that there is some matrix associated with this linear mapping, and I can represent it as a matrix multiplication, which is very convenient.

Well, I take the unit vectors for this space, I subject them to this transformation, and I get these things.

I arrange these things one after the other as the columns of the matrix, and I end up with my matrix.

This means that,

if I want to do this mapping, all I have to do is take any vector X and multiply it by this matrix on the left. Profoundly important.

Every linear mapping is associated with an m by n matrix, and every m by n matrix represents some linear mapping somewhere.

That's extraordinary. So now you are not just talking about numbers arranged randomly in a square or in some rectangular fashion; the matrix actually represents a linear mapping, a linear function from one space to another.

Okay, we will talk a little bit more about this next time. Thank you for joining us here at educator.com; we will see you again. Bye-bye.
