Raffi Hovasapian

Linear Mappings Revisited

Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

Linear Mappings Revisited

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.


Transcription: Linear Mappings Revisited

Welcome back to Educator.com and welcome back to linear algebra.

In our last lesson, we talked about the diagonalization of symmetric matrices, and that sort of closed out and rounded out the general discussion of eigenvalues, eigenvectors, eigenspaces, things like that.

The eigenvalue and eigenvector problem is a profoundly important problem in all areas of mathematics and science, especially in the area of differential equations, and partial differential equations in particular. It shows up in all kinds of interesting guises.

Of course, differential equations and partial differential equations are pretty much what science is all about, essentially, because all phenomena are described via differential and partial differential equations.

So this eigenvector, eigenvalue problem will show up profoundly often.

Today we are going to talk about linear mappings, and the matrices associated with linear mappings. Mostly just the linear mappings. We will get to the matrices in the next couple of lessons.

But, you remember some lessons back, we actually talked about linear mappings, but we mostly talked about them from, say, a 3-dimensional space to a 4-dimensional space.

RN to RM, some kind of space that we are familiar with for the most part. But, you also remember we have used examples where the space of continuous functions is a vector space, the space of polynomials of a given degree is a vector space...

So, the points in these vector spaces do not necessarily have to be points the way that we are used to thinking about them; they can be any kind of mathematical object, as long as they satisfy the properties of a vector space.

Well, vector spaces are nice, and we like to have that structure to deal with, but linear algebra really becomes interesting when you discuss mappings, and in particular linear mappings between vector spaces.

Now, we are going to speak more abstractly about vector spaces and linear mappings, as opposed to specifically from RN to RM.

Much of this is going to be review, which is very, very important for what it is we are going to be doing next. So, let us get started.

Okay. Let us start off like we always do with a definition. Let us go to black. I have not used black ink in a while.

Definition: let v and w be vector spaces. A linear mapping, which is also called a linear transformation, is a map L from v into w, and this "into" is very important for our definition.

We will talk more later about why we choose "into" here rather than another word... L is a function assigning a unique vector, which we signify as L(u), in w to each vector u in v, such that the following hold. (a) If I have two vectors u and v, both of them in v, and if I add them and then apply the linear mapping, that is the same as applying the linear mapping to each of them separately and then adding the results... for u and v in v.

The second thing, (b): if I take some vector u and multiply it by a constant and then apply the function, it is the same as if I were to take the vector alone, apply the linear function, and then multiply by the constant.

For u in v, and k any real number. Okay. Let us stop and take a look at this really, really carefully here.
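
Written compactly, the two conditions of the definition just described are:

\[ L(u + v) = L(u) + L(v), \qquad L(k\,u) = k\,L(u) \]

for all u, v in v and any real scalar k.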

This -- let me use red -- this plus sign here on the left, this is addition in the space v. Let me draw these out so you see them.

That is v, this is w; my linear map is going to take something from here, do something to it, and it is going to land in another space, okay?

So, this addition on the left, this is addition in this space... the vectors u and v, they are here... I add them and then I apply L.

This addition over here, this is addition in this space. They do not have to be the same. It is very, very important to realize that. This is strictly symbolic.

As you go on in mathematics, everything is going to become more symbolic and not necessarily have the meanings that you are used to seeing them with.

Yes, it is addition, but it does not necessarily mean the same addition.

So, for example, I can have R3, where I add vectors, and the addition of vectors is totally different. A vector plus a vector is... yes, we add individual components, but we are really adding a vector and a vector, two objects.

This might be the real number system, where I am actually adding a number and a number. Those are not the same things, because numbers are not vectors.

So, we symbolize it this way, as long as you understand that over on the left it is addition in the departure space, and over here on the right it is addition in the arrival space.

Okay, so, let us talk about what this means... if I have a vector u and if I have a vector v, I can add them in my v space, I stay in my v space, and I get this vector u + v. It is another vector.

It is the closure property; it is a vector space. Then, if I apply the linear mapping L to it, I end up with some L(u + v). That is what this symbolizes. I do something to it, and I end up somewhere.

In order for this to be linear, it says that I can add them and then apply L, or I can apply L to u to get L(u), and I can apply L to v separately to get L(v), and now when I add these, I end up with the same thing.

Again, this is extraordinary. This is what makes a linear mapping. Okay? It has nothing to do with the idea of a line. That is just the name we happen to use for it. We could have called it anything else.

But, it actually preserves structure moving from one space to another space. There is no reason why that should happen, and yet there it is. We give it a special name. Really, extraordinarily beautiful.
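
As a quick illustration (not from the lecture itself), here is a minimal Python sketch that checks the two conditions numerically for the projection map discussed below; the names L, u, v, and k are just illustrative:

    import numpy as np

    def L(v):
        # A sample map from R3 to R2: the projection (x, y, z) -> (x, y).
        return np.array([v[0], v[1]])

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([-4.0, 0.5, 2.0])
    k = 2.5

    # Condition (a): adding first and then mapping equals mapping first, then adding.
    print(np.allclose(L(u + v), L(u) + L(v)))   # True

    # Condition (b): scaling first and then mapping equals mapping first, then scaling.
    print(np.allclose(L(k * u), k * L(u)))      # True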

Okay. If the linear mapping happens to be from a space onto itself, or a space onto a copy of itself... in other words, R3 to R3, R5 to R5, the space of polynomials to the space of polynomials, we have a special name for it... we call it an operator.

Let me separate these words here... We call it a linear operator.

Operator theory is an entirely different branch of mathematics unto itself. Operators are the most important of the linear mappings, or the most ubiquitous.

Okay. Let us recall a couple of examples of linear operators -- of linear maps, I am sorry -- recall our previous examples of linear maps.

We had something called a projection... the projection was a map from, let us say, R3 to, let us say, R2.

Defined by L of the vector (x, y, z) equals (x, y): I take a 3-vector and I end up spitting out a 2-vector.

I just take the first two components, (x, y); it is called a projection. We call it a projection because we are using our intuition to name this.

It is as if we shine a light on a 3-dimensional object: the shadow is a 2-dimensional object. All shadows are two dimensional. That is what a projection is. I am projecting the object onto a certain plane. I am projecting a 3-dimensional object onto its 2-dimensional shadow, creating the shadow. That is a linear map.
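
For reference (this just restates the map given above in matrix form), the projection can be written as multiplication by a 2 x 3 matrix:

\[ L\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} x \\ y \end{pmatrix} \]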

Dilation. This is a linear map; this is actually a linear operator, R3 to R3, and it is defined by L of some vector u equals r × u. I am basically just multiplying the vector by some real number r, where r is bigger than 1.

Dilation means to make bigger. So, I expand the vector.

A contraction is the same thing, so I will just put ditto marks here... it is defined by the same formula, except now r is going to be greater than 0 and less than 1. So, I take a vector and I make it smaller. I contract it, I shrink it.

We have something called reflection. L is from R2 to R2... this is also a linear operator... I am mapping something in the plane to something in the plane. It is defined by L of the vector (x, y), or the point (x, y), equals (x, -y). That is it. I am just reflecting it across the x axis.

There is also a reflection across the y axis if I want, where it is the x that becomes negative. Same thing.
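
A minimal Python sketch of these two maps (the function names are my own, just for illustration):

    import numpy as np

    def scale(u, r):
        # Dilation when r > 1; contraction when 0 < r < 1.
        return r * u

    def reflect_x(v):
        # Reflection across the x axis in R2: (x, y) -> (x, -y).
        return np.array([v[0], -v[1]])

    print(scale(np.array([1.0, 2.0, 3.0]), 2.0))   # [2. 4. 6.]     (dilation)
    print(scale(np.array([1.0, 2.0, 3.0]), 0.5))   # [0.5 1.  1.5]  (contraction)
    print(reflect_x(np.array([3.0, 4.0])))         # [ 3. -4.]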

The final one, rotation, which is probably the most important and the most complex... and most beautiful, as it turns out... of the linear maps... also a linear operator, R2 to R2, or R3 to R3. We can rotate in 3 dimensions.

We can actually rotate in any number of dimensions. Again, mathematics is not constrained by the realities of physical space. That is what makes mathematics beautiful.

These things that exist and are real in real space exist and are real in any number of dimensions... defined by L(u): I take a vector and multiply it by the following matrix, the one with rows (cos(Θ), -sin(Θ)) and (sin(Θ), cos(Θ)), times the vector u.

If I take a vector u, and I multiply it on the left by this matrix -- cos(Θ), -sin(Θ), sin(Θ), cos(Θ), this two by two matrix -- I actually rotate this vector by the angle Θ. That is what I am doing.
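
Here is a brief Python sketch of that rotation (again, my own illustration; the angle is in radians):

    import numpy as np

    def rotate(u, theta):
        # Rotate a 2-vector counterclockwise by the angle theta using
        # the matrix [[cos(theta), -sin(theta)], [sin(theta), cos(theta)]].
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        return R @ u

    # Rotating (1, 0) by 90 degrees gives (0, 1), up to rounding.
    print(np.round(rotate(np.array([1.0, 0.0]), np.pi / 2), 10))   # [0. 1.]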

Every time you turn on your computer screen, every time you play a video game, it is these linear maps that are actually making your video game, making your computer screen, possible. That is what is happening on the screen.

We are just taking images, and we are projecting them, dilating them, contracting them, reflecting them, rotating them... at very high speeds, of course... but this is all that is happening. It is all just linear algebra taking place in your computer, on your screen.

Okay. So, in order to verify that a function is a linear mapping, we have to check the function against the definition. That means we have to check part a and part b. Okay, so let us do an example.

Let us see, let v be an n-dimensional vector space... this does not say anything about the nature of the space, just that it is n-dimensional... we do not know what the objects are... an n-dimensional vector space.

Let s = {v1, v2, ..., vN} be a basis for v.

Okay. We know that for a vector v in the vector space v, because this is a basis, we can write it as a linear combination of the elements of the basis, with a series of constants: c1 × v1, and so on. That is the whole idea of a basis: it is linearly independent and it spans the entire space.

Every vector in that space can be written as a linear combination. A unique linear combination, in fact: v = c1v1 + c2v2 + ... + cNvN.

Okay. So, let us just throw that out there. Now, let us define our linear map, which takes v and maps it to the space RN, to n-space.

So some n-dimensional vector space v, and it is going to map to our n-dimensional Euclidean space, RN, defined by L(v): whatever the vector is, I end up with its coordinate vector.

So, I have some vector v in some random vector space that has this basis. Well, anything in v can be written this way.

Therefore, what L does is take this v and map it to the RN space, to the list of coordinates, which is just the constants that make up the representation in terms of the basis.

So, I am taking the vector v, and I am spitting out the coordinate vector of v with respect to this basis s.

Okay. Is this linear map -- I am sorry -- is this map linear? Is L linear? We do not know if it is linear; we just know that it is a mapping from one vector space to another.

Well, let us check a. For a, we need to check whether L of the sum of two vectors in v equals L(u) + L(v).

Well, let us do it. L(u + v). Well, we just use our definition, we just plug the definition in.

That is equal to the coordinate vector of u + v. Well, we already know that coordinates are themselves linear. So, this is equal to the coordinate vector of u with respect to s plus the coordinate vector of v with respect to s, but that is just L(u) + L(v).

So, I have shown that this equals that. So, part a is taken care of. So, now let us do part b.

We need to show that L(k × u)... does it equal k × L(u)?

Well, L(k × u) is the coordinate vector of k × u, and the coordinate vector of k × u with respect to s is equal to k × the coordinate vector of u with respect to s.

That is equal to k × L(u), because that is the definition. So, we have shown that L(ku) equals k × L(u), so yes, b is also taken care of.

So, this map, which takes any vector from a vector space with a given basis and spits out the coordinate vector with respect to the basis s -- which is just the coefficients that make the vector up -- this is a linear map. That is all we are doing; we are just checking the definition.
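
To make this concrete, here is a small Python sketch (my own illustration) using R2 with the hypothetical basis {(1, 1), (1, -1)}; the coordinate vector is found by solving a linear system:

    import numpy as np

    # Columns of B are the basis vectors v1 = (1, 1) and v2 = (1, -1).
    B = np.array([[1.0,  1.0],
                  [1.0, -1.0]])

    def coord(v):
        # Coordinate vector of v with respect to the basis: solve B c = v for c.
        return np.linalg.solve(B, v)

    u = np.array([3.0, 1.0])
    w = np.array([0.0, 4.0])
    k = -2.0

    # The two linearity conditions hold for the coordinate map.
    print(np.allclose(coord(u + w), coord(u) + coord(w)))   # True
    print(np.allclose(coord(k * u), k * coord(u)))          # True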

Let us throw out a nice little theorem here. Let L from v to w be a linear map... a linear transformation. Then (a): L of the 0 vector in v is equal to the 0 vector in w.

So, we put this v and w to remind us that we are talking about different spaces. If I take the 0 vector in v, and if I apply L to it, it maps to the 0 vector in my arrival space. That is kind of extraordinary, actually.

And (b), which will make sense -- it is just the inverse of addition. It says L(u - v) = L(u) - L(v), so we are just extending this to subtraction.
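
One way to see part (a), filling in a step the lecture leaves out: apply the scaling condition with k = 0:

\[ L(0_v) = L(0 \cdot 0_v) = 0 \cdot L(0_v) = 0_w \]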

Okay. Now, let us have another theorem, which will come in handy. Let L be a mapping from v to w, and we will let it be a linear mapping of an n-dimensional vector space into w.

Also, let s = {v1, v2, ..., vN}, just like before, be a basis for v. So, I have a vector space v, I have a basis for v, and I have some function L which takes a vector in v and spits out something in w.

If u is in v, then L(u) is completely determined... I am going to be a little bit clearer here. Let me actually write out all of my letters. Completely determined by the set {L(v1), L(v2), ..., L(vN)}.

I will tell you what this means. If I have a vector space v, and if I have a basis for that vector space... and if I take some random vector in v and apply the linear transformation to it, I end up somewhere in w. Well, because I have a basis for v, I know exactly where I am going to end up in w, because all I have to do is take these basis vectors, v1 to vN, apply L to them, and L(v1), L(v2), L(v3), all the way to L(vN)... they actually end up being precisely the vectors, in some sense, that are needed to describe where I ended up in w. That is what linearity means.

For this particular theorem, I am not going to go ahead and give you a complete proof, but I am going to make it plausible for you here. So, let us take this vector v, in the vector space v... well, we know that we can write v as c1v1 + c2v2 + so on and so forth + cNvN.

Okay. Now, let us apply L to this. Well, L(v) is equal to L of this whole thing: L(c1v1 + c2v2 + so on and so forth + cNvN).

Well, L is a linear map. That is the hypothesis... that is the given part of the theorem. It is a linear map, so we just pull out the linearity by definition. That equals c1 × L(v1) + c2 × L(v2) + so on and so forth + cN × L(vN).

So, again, if I have a basis for my departure space, and I take some random vector v in that departure space and transform it, you know, apply some function to it, I already know what my answer is going to be... it is going to be precisely the coefficients c1, c2, all the way to cN, multiplied by the transformation applied to the basis vectors.

All I have to do is operate on the basis vectors, stick in the coefficients that I got from my original v, and I have got my answer: where I ended up in my arrival space.

So, again, where I end up in my arrival space is completely determined by the linear transformation's action on the basis of the departure space.
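
A short sketch of this theorem in action, with hypothetical data: the basis of R2 and the images L(v1), L(v2) in R3 below are made up purely for illustration:

    import numpy as np

    # Hypothetical basis of R2 and the assumed known images L(v1), L(v2) in R3.
    v1, v2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
    Lv1, Lv2 = np.array([1.0, 0.0, 2.0]), np.array([0.0, 3.0, -1.0])

    def L(u):
        # Write u = c1 v1 + c2 v2, then by linearity L(u) = c1 L(v1) + c2 L(v2).
        c1, c2 = np.linalg.solve(np.column_stack([v1, v2]), u)
        return c1 * Lv1 + c2 * Lv2

    # L is completely determined by its values on the basis vectors.
    print(L(np.array([3.0, 1.0])))   # [2. 3. 3.]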

Okay. Thank you for joining us at Educator.com for this particular review of linear mappings. We will see you next time.
