Raffi Hovasapian

Matrix of a Linear Map
Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be an m x n Matrix
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be a Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31
Lecture Comments (13)

1 answer

Last reply by: Professor Hovasapian
Thu May 1, 2014 9:24 PM

Post by Josh Winfield on April 21, 2014

I thank you for every hour, minute, and second you spent understanding these concepts so you could explain them simply to us and inspire a love for mathematics and a sense of intuition that will only continue to grow. Cheers Raffi, thanks Chief.

1 answer

Last reply by: Professor Hovasapian
Fri Mar 28, 2014 5:46 PM

Post by Hoa Huynh on March 23, 2014

Dear Professor,
At 24:30, you said A(nat basis) = [(1,0), (1,0), (0,-1)]. I do not get where it comes from. Please explain it to me.

0 answers

Post by Manfred Berger on June 25, 2013

Thank you. I'll see you, quite literally, in Multivariate Calculus.

1 answer

Last reply by: Professor Hovasapian
Tue Sep 11, 2012 12:23 AM

Post by Ian Vaagenes on September 10, 2012

Hi Raffi,
Great course; any chance we'll get a lecture on singular value decomposition?
Best,
Ian

0 answers

Post by Shahaz Shajahan on August 24, 2012

Hi, sorry to be a pain, but I have sent another question on the Facebook page. If you could just look at it for me, please - thank you.

1 answer

Last reply by: Professor Hovasapian
Wed Aug 15, 2012 6:17 PM

Post by Brendan Hu on August 15, 2012

I watched all of your lectures this summer and I've learned so much. As Peter has, I just wanted to express my gratitude. I've talked to my dad about your knowledge of and passion for math, and we both agree that we wish we saw teachers like you more often. I also feel (and hope) that it was great preparation for my Linear Algebra class at Berkeley in the Fall. Thanks so much Dr. Hovasapian :)

1 answer

Last reply by: Professor Hovasapian
Thu Jul 26, 2012 12:55 AM

Post by Hengyao [Peter] Han on July 25, 2012

I really can't express how useful your lectures were to me; I really wouldn't have done well in my class if I had never found your lectures on this site. It's so worth it. Thank you so much Professor Hovasapian!

Matrix of a Linear Map

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Matrix of a Linear Map 0:11
    • Theorem 1
    • Procedure for Computing the Matrix: Step 1
    • Procedure for Computing the Matrix: Step 2
    • Procedure for Computing the Matrix: Step 3
    • Matrix of a Linear Map: Property
    • Example 1
    • Example 2
    • Example 3

Transcription: Matrix of a Linear Map

Welcome back to Educator.com, welcome back to linear algebra.

Today we are going to talk about the matrix of a linear map.

Okay. Let us just jump right in. We have already seen that when you have a linear map from RN to RM, let us say from R3 to R5... that that linear map is always representable by some matrix - a 5 by 3 matrix in this case. Always.

So, today, we want to generalize that result and deal with linear maps in general, not necessarily from one Euclidean space to another Euclidean space, but any vector space at all.

So, let us go ahead and start with a theorem. Most of you are familiar, of course, with the fundamental theorem of calculus. There is also something called the fundamental theorem of algebra, which concerns the roots of a polynomial equation.

In some sense, you can consider the theorem that I am about to write the fundamental theorem of linear algebra.

It sort of ties everything together, if you want to call it that. Some people do, some people don't; it certainly is not historically referred to that way, the way the others are... but everything we have done in this course has sort of come to this one point.

Let us go ahead and write it down very carefully, talk about it, and do some examples.

Here - okay - the statement of this theorem is a bit long, but there is nothing strange about it. Let L from V to W be a linear map from an n-dimensional vector space into an m-dimensional vector space.

Again, we are talking about finite dimensional vector spaces, always. We are not talking about infinite dimensional vector spaces.

There is a branch of mathematics that does deal with those, called functional analysis, but we are concerned with the finite case.

An n-dimensional vector space, sorry about that. Okay... we will let s, which equals the set v1, v2, and so on to vn, be a basis for V.

And t, which equals w1, w2, and so on to wm, will be a basis for W, the arrival space.

I really love referring to them as the departure and arrival spaces. It is a lot more clear that way.

Then, the m by n matrix A, whose jth column is the coordinate vector of L(vj) with respect to t - and I will explain what all of this means in just a minute, do not worry about the notation - is the matrix associated with the linear map, and it has the following property:

L(x) with respect to t is equal to A times x with respect to s. Okay. So, let me read through this and talk a little bit about what it means.
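In symbols, here is a standard rendering of that statement (the bracket notation for coordinate vectors matches the earlier lesson on coordinates; it is not written this way on the slide):

```latex
% Columns of A: the T-coordinate vectors of the images of the S-basis vectors,
A = \bigl[\; [L(v_1)]_T \;\; [L(v_2)]_T \;\; \cdots \;\; [L(v_n)]_T \;\bigr],
% and the property: for every x in V,
\qquad [L(x)]_T = A\,[x]_S .
```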

So, L is a linear map from a finite dimensional vector space, excuse me, to another finite dimensional vector space. The dimensions do not necessarily have to be the same; it is a linear map.

Okay. S is a basis for V, the departure space. T is a basis for the arrival space. Okay. Then, there is a matrix associated with this linear map of 2 arbitrary vector spaces.

There is a matrix associated with this, and here is what its columns are: for example, if I want the first column of this particular matrix, I actually perform L, the linear map, on the basis vectors of the departure space.

Then, once I find those values, I find their coordinate vectors with respect to the basis t.

You remember coordinates... and once I put them in the columns, that is the matrix that is associated with this linear map - the same way that a matrix was associated with a linear map from one Euclidean space to another, say R4 to R7...

You know, giving you a 7 by 4 matrix. Well, remember we talked about coordinates. A vector can be represented as a linear combination of the elements of the basis.

So, if we have a basis for a particular space, the coordinates are just the particular constants that make up that linear combination.

In some sense, we are sort of associating a random 5-dimensional vector space with R5. We are giving it numbers; that is, we are labeling it. That is what we are doing, and it has an interesting property.

If I take some random vector in the departure space, perform the map on it, and then find its coordinate vector with respect to the basis t, it is the same as if I take that x before I do anything to it, find its coordinate vector with respect to s in the departure space, and then multiply it by this particular matrix. I get the same answer.

So, let us just do some examples, and I think it will make a lot more sense. But before I do... let me write out an explicit procedure for computing the matrix of the linear map.

Okay. So, let us do this in red. Alright.

So: the procedure for computing the matrix of L from V to W, the matrix of a linear map, with s and t as the respective bases.

So s is a basis for V, t is a basis for W, just as in the theorem up here. S we will represent as v1, v2, all the way to vn; t we will represent as w1, w2, all the way to wm.

Okay. So, step 1: compute L(vj). In other words, take all of the vectors in the basis for the departure space and perform the particular linear operation on them. Just perform the function and see what you get.

Step 2: once you have those, find the coordinate vector of each with respect to t. What that means - and if you do not remember, you can review the previous lesson, where we talked about coordinate vectors and did a fair number of examples - is: express the L(vj) that you got from the first step as a linear combination of the vectors w1, w2, to wm, the vectors in the t basis. We will be doing this in a minute, so do not worry if you do not remember the procedure.

Step 3: we set the thing we got from step 2 as the jth column of the matrix.

So, we do it for each basis vector of the departure space... if we have n basis vectors, we will have n columns.

That will be our matrix, and we are done. Okay.
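Here is a minimal computational sketch of this three-step procedure. It is not from the lecture: Python with NumPy and the helper name matrix_of_linear_map are my own assumptions, with the basis vectors given as coordinate arrays so that step 2 becomes a linear solve.

```python
import numpy as np

def matrix_of_linear_map(L, S, T):
    """Sketch: build the m x n matrix A with [L(x)]_T = A [x]_S.

    L -- function taking an n-vector to an m-vector
    S -- list of n basis vectors of the departure space
    T -- list of m basis vectors of the arrival space
    """
    # Basis t as the columns of an m x m matrix, so that finding
    # coordinates in t is the solve T_mat c = L(v).
    T_mat = np.column_stack([np.asarray(w, dtype=float) for w in T])
    cols = []
    for v in S:                                       # Step 1: apply L to each basis vector of s.
        Lv = np.asarray(L(np.asarray(v, dtype=float)), dtype=float)
        c = np.linalg.solve(T_mat, Lv)                # Step 2: coordinates of L(v) in the basis t.
        cols.append(c)                                # Step 3: this becomes the jth column.
    return np.column_stack(cols)
```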

Before I jump in, I actually want to give you a little pictorial representation of what is actually going on here. So, if I take x... let me show you what I am talking about. Let me go back to blue.

What does that property mean? [L(x)]t = A[x]s - this was the last thing that we wrote in the theorem. It says that the coordinate vector of the image of some random vector in the departure space is equal to the matrix that we end up computing, times that vector's coordinate vector with respect to the basis s of the departure space. Here is what this means.

It means if I take some random x in the departure space, I can perform L on it and I get L(x), of course. And then from there I can go ahead and find the coordinate vector, which is L(x) with respect to the basis t.

So, in other words, I go from my departure space to my arrival space, and then I actually convert that to some coordinates, because I need to deal with numbers.

Well, as it turns out, what I can do instead is just take x in my departure space, find its coordinates with respect to the basis of the departure space, and then multiply by A, the matrix that I compute.

They end up being the same thing. I can either go directly, or I can go through the matrix. You will see this often in algebraic courses in mathematics... you will often see different paths to a particular place that you want to get to.

You can either do it directly from L, or you can go through the matrix. And it is nice to have these options, because sometimes this option might not be available, and sometimes this might be the only one available. At least you have a path to get there.

Those of you who have studied multivariable calculus, or are doing so now, are going to be discussing something called Green's Theorem and Stokes' Theorem, and possibly the generalized version of those.

I don't know; it depends on the school that you are attending. But essentially, what those theorems do is allow you to express an integral as a different kind of integral.

Instead of solving a line integral or a surface integral, you end up solving an area integral or a volume integral, which you know how to do already from your basic calculus. It allows you a different path to the same place. That is what is going on.

That is essentially what the fundamental theorem of calculus is: an alternate path to the same place. This is just an algebraic version of it.

We want different paths just to get some place, because often one path is better than the other - easier than the other. So again, it means if I want to take a vector, I can find its coordinates in the arrival space by just doing it directly.

But if that path is not available and I have the matrix, I can just take its coordinate vector in the departure space, multiply it by the matrix, and I end up with the same answer. That is kind of extraordinary.

Again, it is all a property of this linear map, and the maintenance of the structure from one space to another. Okay, let us just jump into the examples, because I think that is going to make the most sense.

Example... so, our linear map is going to be from R3 to R2. In this case we are using Euclidean spaces, going from 3 dimensions to 2 dimensions, and it is defined by L(x,y,z) = (x + y, y - z). We take a 3-vector, we map it to a 2-vector: three entries in, two entries out. Okay.

Now, we have our two bases that we are given. The basis s is (1,0,0), (0,1,0), (0,0,1) - the natural basis for R3.

And t is going to be the natural basis for R2. I know you guys know what is going on. Okay.

So, the first thing we are going to do is calculate L(v1), which is L(1,0,0), which equals (1 + 0, 0 - 0), which equals (1,0). Boom. That is that one.

We do the same thing for the others. L(v2), which is L(0,1,0), is going to equal (1,1), and if I do L(0,0,1), I end up with (0,-1). Okay.

So, I found L of the basis vectors of the departure space. Now, I want to express these with respect to the basis for the arrival space, namely with respect to t.

Well, since t is the natural basis, I do not have to do anything here; again, when we are dealing with a natural basis, we do not have to change anything.

So, as it turns out, L(v1) with respect to the basis t happens to equal (1,0), and the same for the others.

L(v2) with respect to the basis t, because it is the natural basis, is equal to (1,1), and then L(v3) with respect to the natural basis t is equal to (0,-1).

Now, I take these 3 and arrange them as columns; therefore my matrix A has columns (1,0), (1,1), (0,-1). That is the matrix for my transformation.

My linear transformation is associated with this matrix. Very nice.
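Example 1 in code, reusing the hypothetical matrix_of_linear_map sketch from above; with natural bases, the coordinate solve in step 2 is trivial, so the columns of A are just the images themselves:

```python
import numpy as np

L = lambda v: np.array([v[0] + v[1], v[1] - v[2]])   # L(x,y,z) = (x+y, y-z)

S = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]  # natural basis of R^3
T = [np.array([1, 0]), np.array([0, 1])]                             # natural basis of R^2

A = matrix_of_linear_map(L, S, T)
# Columns (1,0), (1,1), (0,-1), i.e.
# [[ 1.  1.  0.]
#  [ 0.  1. -1.]]
```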

Okay. Now, let us do the second example. We are going to change the basis - we are not going to use the natural basis - and we are going to see how the matrix changes. Alright.

I am going to do this one in red... Okay. So, everything is the same as far as the linear map is concerned. Let me write it out again... L is a mapping from R3 to R2, defined by L(x,y,z) = (x + y, y - z).

Now, our basis s = (1,0,1), (0,1,1), (1,1,1)... and our basis t is now going to equal (1,2) and (-1,1). So we have changed the bases.

Well, we will see what happens. So, let us calculate L(v1): that is going to equal (1,-1). I will let you verify this.

L(v2) = (1,0) and L(v3) = (2,0). Okay.

To find L(vj) with respect to the basis t, here is what we have to do. We need to express each of these as a linear combination of the basis for the arrival space.

In other words, we need to express L(v1), which is equal to (1,-1), as a1 × (1,2) + a2 × (-1,1) - two constants.

(1,2) and (-1,1): that is the basis for our arrival space. So, we need to find constants a1 and a2 such that this linear combination of them equals this vector. That is the whole idea behind coordinate vectors.

Okay, and I am going to write all of this out explicitly so we see what we are looking at... L(v2), which is equal to (1,0): I want that to equal b1 × (1,2) + b2 × (-1,1)... and L(v3), which equals (2,0): I want that to equal c1 × (1,2) + c2 × (-1,1).

Okay. Well, we are going to solve an augmented system, except now we are looking for 3 solutions - 1, 2, 3 - so we are going to augment with 3 new columns.

So, here is what we form. We form the matrix with columns (1,2) and (-1,1), and then we augment it with these 3: (1,-1), (1,0), (2,0).

I convert to reduced row echelon form: the left block becomes the identity, and the three augmented columns become (0,-1), (1/3,-2/3), (2/3,-4/3). There we go.

Now I have found each L(vj) and converted it into a coordinate vector... here, here, here, that is what these are. These are the coordinates of the 3 things that I found up here, with respect to the basis of the arrival space.

Now I just take these, and that is my matrix... A has columns (0,-1), (1/3,-2/3), (2/3,-4/3). This is our matrix, and notice it is not the same matrix that we had before.

Its rows are (0, 1/3, 2/3) and (-1, -2/3, -4/3). The previous A that we got with respect to the natural basis - let me do this in black, actually - in the first example, had columns (1,0), (1,1), (0,-1).

Same linear map... this is one matrix with respect to one basis, for the same linear map. This matrix is different because we changed the basis. This is very, very important, and I will discuss it towards the end of the lesson.
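The same assumed helper handles the new bases; the linear solve inside it plays the role of the three-column augmented system reduced by hand above:

```python
S2 = [np.array([1, 0, 1]), np.array([0, 1, 1]), np.array([1, 1, 1])]
T2 = [np.array([1, 2]), np.array([-1, 1])]

A2 = matrix_of_linear_map(L, S2, T2)
# Columns (0,-1), (1/3,-2/3), (2/3,-4/3), i.e.
# [[ 0.          0.33333333  0.66666667]
#  [-1.         -0.66666667 -1.33333333]]
```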

I will actually digress into a bit of a philosophical discussion of what is going on here.

Okay. So, we found our matrix with respect to that basis; now, let us confirm that property that we have.

So, we said that there is this property... let me go back to blue... L of some random vector in the departure space, as a coordinate vector with respect to the basis t, is equal to A times the coordinate vector of x with respect to the basis s of the departure space.

Okay. So, A is the matrix that we just found, the one with columns (0,-1), (1/3,-2/3), (2/3,-4/3). Okay.

Let us just pick a random x... let us let it equal (3,7,5). A random vector.

Okay. Well, L(x), or L(3,7,5), equals (3 + 7, 7 - 5), which is equal to (10,2). Okay.

Let us circle that in red and just set it aside. That is our transformation: (10,2). Okay.

Now, x = (3,7,5) with respect to s, the basis of the departure space, equals the following... we want to express this with respect to the basis s, which was (1,0,1), (0,1,1), (1,1,1), if you want to flip back and take a look at that basis.

We are looking for a1, a2, a3 such that a1 × (1,0,1) + a2 × (0,1,1) + a3 × (1,1,1) = this (3,7,5). Okay.

As it turns out, when I set this up as an augmented matrix, Ax = b, and solve it... I end up with the following: (-2,2,5).

This equals my x = (3,7,5) with respect to the basis s. Okay, so we will set that off for a second. Now, let me take this coordinate vector, which is (-2,2,5), and multiply it by the matrix that I just got - so let me basically perform the right side here.

When I do that, I get the matrix with rows (0, 1/3, 2/3) and (-1, -2/3, -4/3) times what I just got, which is (-2,2,5).

When I perform that, I end up with (4,-6)... put a blue circle around that.

Okay. That means... this (4,-6) is the coordinate vector, with respect to the basis t of the arrival space, of the (10,2) that I got directly by solving the transformation.

The property tells me that that thing is equal to A × [x]s; well, I found [x]s, multiplied it by the matrix, and this is what I got.

Now... that means that this thing, expanded in the basis t, should give me (10,2), right?

So, if I take 4 times the first basis vector of t, which is (1,2), minus 6 times the other basis vector, (-1,1), and actually perform this, I end up with (10,2).

This is the same as that. That confirms the property. That is what is going on here. Really, what is ultimately important here - the first two examples - is the ability to compute the transformation matrix.

It allows me, instead of doing the transformation directly, which may or may not be difficult, to just do a matrix multiplication problem, which is usually very, very easy.
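And a numerical check of the property with x = (3,7,5), continuing the earlier sketches under the same assumptions (NumPy's solver finds the coordinate vectors that the lecture finds via augmented matrices):

```python
x = np.array([3, 7, 5], dtype=float)

T2_mat = np.column_stack(T2)                 # basis t as columns
S2_mat = np.column_stack(S2)                 # basis s as columns

lhs = np.linalg.solve(T2_mat, L(x))          # [L(x)]_t  -> [ 4. -6.]
x_s = np.linalg.solve(S2_mat, x)             # [x]_s     -> [-2.  2.  5.]
rhs = A2 @ x_s                               # A [x]_s   -> [ 4. -6.]

assert np.allclose(lhs, rhs)                 # the property [L(x)]_t = A [x]_s
assert np.allclose(T2_mat @ lhs, L(x))       # expanding (4,-6) in t recovers (10,2)
```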

Okay. So, now, to our philosophical discussion. You notice that the same linear transformation gave rise to two different matrices.

Well, that is because we used two different bases. So, what you are seeing here is an example of something very, very profound. Not just mathematically, but very profound physically - very profound with respect to the nature of reality.

Something exists - a linear map, in this case. Something independent exists, and it is independent of our representation.

In other words, in order for us to handle it - notice, with this linear map - we have to deal with coordinates. We have to choose a basis.

In one example, we chose the natural basis. In the second example, we chose a different basis. Both of them are perfectly good bases, and the matrix that actually represents the map changes... the linear map does not change.

So, as it turns out, the linear map is that thing underneath, which does exist; but in order to handle that linear map, we need to give it labels. We need to give it a frame of reference. We need to be able to "measure" it.

That is what science is all about... we are taking things that exist and we are assigning labels to them.

Well, let me take this a little bit deeper... all of your lives you have been told: if you have some coordinate system - the standard Cartesian coordinate system - and if I tell you that from this (0,0) point I move 3 spaces to the right and then 7 spaces up, there is this point (3,7).

Well, (3,7) is just the label that we have attached to it. That point in that 2-dimensional Euclidean vector space exists whether I call it (3,7) or not; if I change bases, it might be something entirely different - it might be (4,-19), depending on the basis that I choose.

Again, these things exist independent of the labels that we assign to them, the representations that we use to handle them. We need to be able to represent them somehow so that we can handle them, but the representation is not the thing in itself.

It is very, very important to be able to distinguish between those two. Okay.

Is it reducing it to something that it is not? No. It is just a label, but it is important for us to recognize that; so when we speak about the point (3,7), we need to understand that we have given that point a representation so that we can handle it.

We can manipulate it mathematically so that we can assign it some sort of value in the real world.

But its existence is not contingent on that. It exists whether we handle it or not; that is what is amazing. What is amazing about abstract mathematics is that we can make statements about the existence of something without having to handle it.

It is the engineers and physicists that actually label them, give them frames of reference, so that they can actually manipulate them. That is all that is going on here.

Okay, with that, I am going to close it out.

Thank you for joining us at Educator.com, and thank you for joining us for linear algebra. Take good care, bye-bye.
