Raffi Hovasapian

Kernel and Range of a Linear Map, Part II

Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be a Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

Lecture Comments (1)

0 answers

Post by Manfred Berger on June 25, 2013

Am I correct in assuming that Theorem 3b only holds for finite dimensional spaces?

Kernel and Range of a Linear Map, Part II

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Kernel and Range of a Linear Map 1:39
    • Theorem 1 1:40
    • Example 1: Part A 2:32
    • Example 1: Part B 8:12
    • Example 1: Part C 13:11
    • Example 1: Part D 14:55
    • Theorem 2 16:50
    • Theorem 3 23:00

Transcription: Kernel and Range of a Linear Map, Part II

Hello and welcome back to Educator.com, and welcome back to linear algebra.

Today we are going to continue our discussion of the kernel and range of a linear map, or linear transformation.

In the previous lesson, we left off by defining what the range of a linear map is.

Real quickly though, let me go back and discuss what the kernel of a linear map is.

Basically, the kernel of a linear map from a vector space V to a vector space W is the set of all vectors in V that map to the zero vector. That is it.

So, if I have one vector that goes to 0, that is the kernel. If I have 5 vectors that map to 0, those 5 vectors form the kernel. If I have an infinite number of vectors that all map to the same thing, the zero vector in W, that is what the kernel is.

Recall that the kernel is not only a subset of the vector space V, but also a subspace, so it is a very special kind of thing.

As a subspace, you can find a basis for it. Okay. Now, we defined the range also. The range is all those vectors in W, the arrival space, that are images of some vector in V.

So, if there is something in V that maps to a given vector in W, all of those vectors in W that are represented... that is the range.

That does not mean it is all of W... it can be all of W, which we give a special name -- we will talk about that in a second -- but it is just those vectors in W that are mapped from vectors in V under the linear transformation L.

Okay. Now, let us go ahead and get started with our first theorem concerning the range.

Well, just like the kernel is a subspace of the departure space, the range happens to be a subspace of the arrival space.

So, our first theorem says: the range of L is a subspace of W, for L a linear mapping from V to W.

So, again, the kernel is a subspace of V; the range is a subspace of W.

Okay. Let us do an example here concerning ranges and kernels and things like that. Ranges, actually.

So, we will say that L is a mapping from R3 to R3 itself -- again, when a space maps to itself, we call the map a linear operator, but it is still just a linear map -- and let it be defined by L of a vector x equals the matrix product of (1,0,1), (1,1,2), (2,1,3) with x, where x is (x1, x2, x3) in component form.

So if I have a vector x, the linear transformation is multiplication by that matrix on the left.

Okay. Our question is: is L onto? So, this onto thing... remember, we said that the range of a linear map is those vectors in the arrival space W that are the image of some vector from the departure space V.

Well, if every vector in W is the image of some vector in V -- that is, if every single vector in W is represented -- that is what we mean by onto.

That means the linear map literally maps onto the entire space W, as opposed to a range which is just a subspace of it, a part of it. That is all onto means: all of W is represented.

Okay. So, let us take a random vector in the arrival space, in this case R3; we will just call it w, with components a, b, and c.

It is just some random vector in the arrival space. Okay.

Now, the question is: can we find some vector in the departure space that is the pre-image of this w in the arrival space? That is the whole idea. So, we speak about the image, and we speak about the pre-image.

So, I am starting from the perspective of w: if I take any vector w, can I find something in V that actually maps to that w? That is what we want to know. Is every vector in W represented? Okay.

So, the question we want to answer is: can we find (x,y,z), also in R3 because R3 is the departure space, such that (1,0,1), (1,1,2), (2,1,3) × (x,y,z) equals our (a,b,c), our random vector in W? That is what we want to find.

We want to find x, y, and z such that this holds. What values of a, b, c will make this possible? Well, we go ahead and form the augmented system, (1,0,1), (1,1,2), (2,1,3), and we augment it with a.

We augment it with a... b... c... okay, that is our augment, and then we subject it to Gauss-Jordan elimination to take it to reduced row echelon form. When we do that, we end up with the rows (1,0,1), (0,1,1), (0,0,0).

Over here, in the augmented column, we end up with a... b - a... c - a - b. So let us talk about what this means. Notice this last row here is all 0's, and this is c - a - b over here.

The only way that this is a consistent system, the only way that this has a solution, is if c - a - b = 0.

So, when I take a random vector in W and subject it to the conditions of this linear map, the relationship among a, b, and c has to be that c - a - b = 0.

What this means is that this is very specific. I cannot just take any random numbers. I cannot just take (5,7,18).

The relationship among these 3 numbers a, b, c for the vector in W has to be such that c - a - b = 0, which means that not every vector in W is represented. So, this is not onto.

Okay. I hope that makes sense. Again, I took a random vector; I need to be able to solve this system with every single vector possible, but this system tells me that it has a solution only if c - a - b = 0.

Only very specific numbers actually do that. Yes, there may be an infinite number of them, but they do not represent all of the vectors in W; therefore, this linear map is not onto.
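
(If you would like to check a computation like this with software, here is a minimal sketch using Python's sympy library -- the choice of tool is mine for illustration; the lecture does everything by hand. L is onto exactly when rank(A) equals the dimension of the arrival space, and the left null space of A reproduces the consistency condition c - a - b = 0.)

    import sympy as sp

    A = sp.Matrix([[1, 0, 1],
                   [1, 1, 2],
                   [2, 1, 3]])

    # L(x) = Ax maps R3 into R3; L is onto exactly when the columns
    # of A span all of R3, i.e. when rank(A) = 3.
    print(A.rank())          # prints 2, so L is not onto

    # The consistency condition comes from the left null space of A:
    # if y is a row vector with yA = 0, then Ax = w requires yw = 0.
    print(A.T.nullspace())   # [Matrix([[-1], [-1], [1]])]  ->  -a - b + c = 0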

Okay. Now, let us do something else. Continue the example... part b.

Now, the question is: find a basis for the range of L.

We know the range is a subspace, so we know it has a basis. Let us go ahead and see what this range is.

In other words, let us take L of some random (x,y,z) in V, the departure space... well, L is, of course, that matrix (1,0,1), (1,1,2), (2,1,3) times (x,y,z), and when I do this matrix multiplication, I end up with the following.

The vector that I get is x + z, then x + y + 2z, and the third entry is going to be 2x + y + 3z. I just got that from basic matrix multiplication... this times that, plus this times that, plus this times that, and then go to the second row... this times that, this times that...

Remember matrix multiplication? We did it really early on. Okay.

So, this thing I can actually pull apart... it becomes the following: x × (1,1,2) -- I just take the coefficients of the x's -- plus y × (0,1,1), plus z × (1,2,3).

Therefore, that vector, that vector, and that vector... let me actually write them as a set, though we have not found the basis yet. This just gives me the span. Okay.

So, the vectors (1,1,2), (0,1,1), and (1,2,3) span the range of L.

So remember, for a set of vectors that spans a subspace or a space to be a basis, it has to be a linearly independent set.

So, once we have our spanning set, we need to check these three vectors to make sure that they are linearly independent. We do that by taking the matrix with these vectors as columns, augmenting it with the 0 vector, and turning it into reduced row echelon form; the vectors whose columns have the leading entries form a basis for the space.

So, let us take that matrix again -- the columns are (1,1,2), (0,1,1), (1,2,3) -- we augment with (0,0,0), we turn it into reduced row echelon form, and we end up with the following.

We end up with the columns (1,0,0) and (0,1,0) -- both of those columns have leading entries -- then (1,1,0), no leading entry there, and of course (0,0,0).

So, our first column and our second column have leading entries, which means the vectors corresponding to the first and second columns, namely (1,1,2) and (0,1,1), form a basis for this space.

That means the third vector was actually a linear combination of these two. The full set is not linearly independent, but these two alone are linearly independent. That is what this process showed.

So, now I am going to take the first two vectors, the first two columns, (1,1,2) and (0,1,1)... this is a basis for the range. Notice, the basis for the range has 2 vectors in it.

The number of vectors in a basis is the dimension of that subspace. So, the range has dimension 2.

However, our arrival space W was R3, which has dimension 3. Since the dimension of the range is 2, the range is not the entire space. This confirms what we found in part a.

It confirms the fact that this linear map is not onto, and in fact this procedure is probably the best thing to do... find a spanning set for the range, reduce it to a basis, and then just count the number of vectors in the basis.

If the dimension of the range is less than the dimension of the arrival space, it is not onto. If the dimensions are equal, it is onto.
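
(Continuing the sympy sketch from part a: columnspace() performs exactly this spanning-set-to-basis reduction, keeping the pivot columns.)

    import sympy as sp

    A = sp.Matrix([[1, 0, 1],
                   [1, 1, 2],
                   [2, 1, 3]])

    # The range of L(x) = Ax is the column space of A; columnspace()
    # returns the pivot columns, which form a basis for that subspace.
    basis = A.columnspace()
    print(basis)        # [Matrix([[1],[1],[2]]), Matrix([[0],[1],[1]])]
    print(len(basis))   # 2 -> dim(range L) = 2 < 3, so again L is not onto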

Okay. Let us continue with this and extract a little bit more information. Part c: find the kernel of this linear map, the kernel of L. Okay.

So now -- let me erase this -- what we want... well, the kernel is, again, all of the vectors that map to 0 in the arrival space, which means we want to solve the homogeneous system.

We want to find all the vectors x that map to 0 in W. Well, that is just... take the matrix -- let me do it as rows -- (1,0,1), (1,1,2), (2,1,3), times (x,y,z), equal to (0,0,0). Okay.

Then, when we form the augmented matrix and reduce it -- it is the same reduction we just did -- we end up with the following: the vectors (x,y,z) in V that satisfy this take the form (-r,-r,r), which is the same as r × (-1,-1,1).

This is one vector; therefore, the vector (-1,-1,1) is a basis for the kernel. The kernel is a subspace of the departure space, and this is a basis for it. It is one-dimensional.
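
(In the same sympy sketch, nullspace() solves the homogeneous system Ax = 0 and returns a basis for the kernel directly; it also answers part d below, since a linear map is 1 to 1 exactly when the kernel is the zero subspace.)

    import sympy as sp

    A = sp.Matrix([[1, 0, 1],
                   [1, 1, 2],
                   [2, 1, 3]])

    # nullspace() solves Ax = 0; the result is a basis for ker L.
    print(A.nullspace())             # [Matrix([[-1], [-1], [1]])]

    # 1 to 1 would require an empty basis list (kernel = {0}):
    print(len(A.nullspace()) == 0)   # False -> L is not 1 to 1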

Okay, part d: is L 1 to 1? Do you remember what 1 to 1 means? It means any two different vectors in the departure space map to two different vectors in the arrival space... 1 to 1.

Okay. Well, let us see. The dimension of the kernel of L is 1, which is what we just got up here. That implies that it is not 1 to 1.

The reason is... in order to be 1 to 1, the dimension of the kernel needs to be 0. That means the only thing in the kernel is the zero vector of the departure space. That means only 0 maps to 0.

That is the only thing that maps to 0; everything else maps to something else. If that is the case -- when only the 0 of the departure space is in the kernel -- we can say, and this is one of the theorems we had in the last lesson, that the linear map is 1 to 1.

We know that it is not onto. Now we also know that it is not 1 to 1. Okay.

So, this is not 1 to 1. Now, I would like you to notice something. Let me put this in blue.

We had our departure space V as R3, which is 3-dimensional. The dimension of the range of the linear map is equal to 2. The dimension of the kernel of this linear map was equal to 1. And 2 + 1 = 3. This is always true. It is not a coincidence.

So, let us express this as a theorem. A profound, profound theorem.

Okay. Let L be a linear mapping from V to W -- because we are talking about linear maps, after all -- a linear map of an n-dimensional vector space V into -- notice we did not say onto -- an m-dimensional vector space W.

Then, the dimension of the kernel of L plus the dimension of the range of L is equal to the dimension of V, the departure space. Let us stop and think about that for a second.

If we have a linear map, and let us say my departure space is 5-dimensional, well, I know that the relationship between the kernel of that linear map and the range of that linear map is that the sum of their dimensions is always going to equal the dimension of the departure space.

So, if I have a 5-dimensional departure space, let us say R5, and I happen to know that my kernel has dimension 2, then I know that my range has dimension 3. I also know that I am already dealing with a linear map that is neither 1 to 1 nor onto.

This is kind of extraordinary. Now, recall from a previous discussion, when we were talking about matrices, how a matrix has those fundamental spaces.

It has the column space, it has the row space, and it has the null space, which is the space of all of the vectors that map to 0 -- which is exactly what the kernel is.

So, we can also express this in the analogous form -- you have already seen this theorem before: the nullity of a matrix plus the rank of the matrix, which is the dimension of the row space, or column space, is equal to n for an m x n matrix.

Of course, we have already said that when we are dealing with the Euclidean spaces Rn and Rm, every linear map is representable by a matrix. So, we already did the matrix version of this.

Now we are doing the general linear mapping. We do not necessarily have to be talking about Euclidean space, R2 to R3; it can be any n-dimensional space. For example, the space of polynomials of degree less than or equal to 5. Boom, there you have a particular finite-dimensional vector space.

Again, given any linear map, the dimension of the kernel of that linear map plus the dimension of the range of that linear map is going to equal the dimension of the departure space. A deep, profound theorem.
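
(For the matrix version, the theorem is easy to verify numerically; this continues the same sympy sketch from the example above.)

    import sympy as sp

    A = sp.Matrix([[1, 0, 1],
                   [1, 1, 2],
                   [2, 1, 3]])

    nullity = len(A.nullspace())   # dim(ker L)   = 1
    rank    = A.rank()             # dim(range L) = 2
    n       = A.cols               # dimension of the departure space R3

    assert rank + nullity == n     # 2 + 1 == 3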

Okay. Now, let me see here. I wanted to talk a little bit about the nature of 1 to 1 and onto, just to give you a pictorial representation of what it really means, and then state a theorem for the case when the arrival space and the departure space happen to be of the same dimension.

So, let us draw some pictures, and as you will see, the sizes that I draw them are going to be significant.

If I have V, and if I have W... so this is V, this is W... departure space, arrival space... a 1 to 1 map means that everything over here, everything in V, maps to a different element in W.

It does not map to everything in W, but it maps to something different in W. However, everything in V is represented, and it goes to something over here.

I drew it like this to let you know that there is this size issue. In some sense, W is "bigger" than V, and I use the term bigger in quotes.

Now, let us do an onto map. So, this one is 1 to 1, but it is not onto. Let me actually write that... "not onto".

Now we will do the other version. We will do something that is onto, but not 1 to 1.

So, let us make this V, and let us make this W. Now, an onto map means that everything, every single vector in W, comes from some vector in V.

But that does not mean that every vector in V maps to something different in W. Now everything in W is represented, so this is onto, but not 1 to 1.

Okay. It could also be that two different vectors here actually map to the same vector here.

The point is that every single vector in W is the image, under the linear map, of something from V.

But it does not have to be the image of everything from V... just of something from V.

Okay. Now, 1 to 1 and onto together. Let me state my theorem and we will draw our picture.

Theorem. Let L be a linear map from V to W, and let the dimension of V equal the dimension of W.

So, it is a mapping -- not necessarily from a space to itself, but the two spaces have the same dimension. This is the most general idea: not just R3 to R3 or R4 to R4, but some finite-dimensional vector space of, say, dimension 5 -- some set of objects -- mapping to a different set of objects that happens to have the same dimension. They do not have to be the same objects.

But, if the dimensions of the two spaces are the same, then we can conclude the following -- well, let me write it as an if-then statement: if L is 1 to 1, then L is onto; and vice versa, if L is onto, then L is 1 to 1.

So, if I am dealing with a map from one vector space to another, where the dimensions of the two spaces, the departure and the arrival, are the same... if I know that it is 1 to 1, I know that it is onto; if I know that it is onto, I know that it is 1 to 1.

This makes sense intuitively if you think about the fact that every vector in V pairs off with a different vector in W, and all of W is accounted for. We call this a 1 to 1 correspondence between the two sets.

The fact that they are the same dimension makes this intuitively reasonable. In some sense, those spaces are the same size, if you will.

Again, we use the term "size" in quotes. It looks something like this... everything in V is represented, everything in V maps to something in W, and all of W is represented.

In some sense, I am saying that they are equal, that they are the same size.

Again, we use those terms loosely, but it gives you a way of thinking about what it means... a smaller space going into a bigger one, a bigger space going onto a smaller one.

This whole idea applies when they happen to be of the same dimension -- the same size, so to speak.

Then a 1 to 1 map implies that it is onto, and an onto map implies that it is 1 to 1.
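
(For matrix maps between spaces of the same dimension, this theorem has a concrete test: the map x -> Ax for a square matrix A is 1 to 1, and therefore also onto, exactly when A is non-singular. One last sympy sketch, where B is a matrix I made up for contrast.)

    import sympy as sp

    A = sp.Matrix([[1, 0, 1],
                   [1, 1, 2],
                   [2, 1, 3]])
    # det(A) = 0: A is singular, so x -> Ax is neither 1 to 1 nor onto.
    print(A.det())   # 0

    B = sp.Matrix([[1, 0, 0],
                   [0, 2, 0],
                   [0, 0, 3]])
    # det(B) != 0: B is non-singular, so x -> Bx is both 1 to 1 and onto.
    print(B.det())   # 6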

Okay. Thank you for joining us at Educator.com. We will see you next time.
