  Raffi Hovasapian

Kernel and Range of a Linear Map, Part I

Slide Duration:

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
1:11
1:12
2:30
2:57
4:20
5:22
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
19:33
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Plane
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

### Kernel and Range of a Linear Map, Part I

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Kernel and Range of a Linear Map 0:28
• Definition 1
• Example 1
• Example 2
• Definition 2
• Example 3
• Theorem 1
• Theorem 2
• Definition 3
• Theorem 3

### Transcription: Kernel and Range of a Linear Map, Part I

Welcome back to educator.com, welcome back to linear algebra.0000

Today we are going to be talking about something called the kernel and the range of a linear map, so we talked about linear maps... we recalled some of the definitions, well, recalled the definition of a linear map... we did a couple of examples on how to check linearity.0004

Now we are going to talk about some specific... get a little bit deeper into the structure of a linear map, so let us just jump in and see what we can do.0020

Okay. Let us start off with a definition here. A linear map L from v to w is said to be 1 to 1 if, for all v1 and v2 in v, v1 not equal to v2 implies that L(v1) does not equal L(v2).0029

Basically what this means is that each vector in v maps to a completely different element in w. Now, we have seen examples where... let us just take a function that you know, like x².0099

Well, I know that if I take 2 and I square it, I get 4. Well, if I take a different x, -2, and I square it, I also get 4. So, as it turns out, for that function x², the 2 and the -2 map to the same number... 4.0116

That is not 1 to 1. 1 to 1 means every different number maps to a completely different number, or maps to a completely different object in the arrival space.0133

So, let us draw what that means. Essentially what you have is... that is the departure space, and that is the arrival space, this is v, this is w, if I have v1, v2, v3... each one of these goes some place different.0144

They do not go to the same place... distinct, distinct, distinct, because these are distinct, that is all it is. This is just a formal way of saying it, and we call it 1 to 1... which makes sense... 1 to 1, as opposed to 2 to 1, like the x² example.0164

Okay. An alternative definition here... this is an implication in mathematics. It says that if this holds, then this follows.0180

It means that if I know this, then this is true. Well, as it turns out, there is something called the contrapositive, where I... it is equivalent to saying, well, here let me write it out...0191

So, I will end up using both formulations when I do the examples. That is why I am going to give you this equivalent condition for what 1 to 1 means.0203

An equivalent condition for 1 to 1 is that L(v1) = L(v2), implies that v1 = v2.0214

This is sort of a reverse way of saying it. If I note that I have two values here, L(v1) = L(v2), I automatically know that v1 and v2 are the same thing.0234

This is our way of saying, again, that this thing... that two things do not map to one thing. Only one thing maps to one thing distinctly.0246

This one... the only reason we have two formulations of it is different problems... sometimes this formulation is easier to work with from a practical standpoint, vs. this one.0256

As far as intuition and understanding it, this first one is the one that makes sense to me personally. Two different things map to two different things. That is all this is saying.0267

Okay. Let us do an example here. A couple of examples, in fact. Example... okay.0276

Let L be a mapping from R2 to R2, so this is a linear operator... defined by L of the vector (x, y) equal to (x + y, x - y).0285

Okay. We will let v1 be x1, y1, we will let v2 be x2, y2... we want to show... we are going to use the second formulation... L(v1) = L(v2)... implies that v1 = v2.0314

So, we are trying to show that it is 1 to 1, and we are going to use this alternate condition.0359

Let us let this be true... so L(v1) = L(v2). That means (x1 + y1, x1 - y1) = L(v2), which is (x2 + y2, x2 - y2).0364

Well, these are equal to each other. That means I get this equation, x1 + y1 = x2 + y2, and from the second part, these are equal, so let me draw these are equal and these are equal.0394

So, x1 - y1 = x2 - y2. Alright.0413

The way I have arranged these, if I actually just add these equations straight down, I get 2x1 = 2x2, which implies that x1 = x2.0422

When I put these back, I also get, y1 = y2. This means that v1, which is x1, y1, is equal to v2.0437

So, by starting with the supposition that this is the case, I have shown that this is the case, which is precisely what this implication means. Implication means that when this is true, it implies this.0451

Working this out mathematically, I start with this, I follow the train of logic, and if I end up with this, that means the implication is true.0465

This implication is the definition of 1 to 1, therefore yes. This map is 1 to 1. In other words, every single vector that I take, that I map, will always map to something different.0474
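As a side note, the map in Example 1 is multiplication on the left by the matrix with rows (1, 1) and (1, -1), so its 1-to-1-ness can also be checked numerically. A minimal sketch, assuming numpy is available (this code is not part of the lecture itself):

```python
import numpy as np

# L(x, y) = (x + y, x - y) is multiplication on the left by this matrix.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# A square matrix gives a 1-to-1 map exactly when it is invertible,
# i.e. when its determinant is nonzero.
det = np.linalg.det(A)
print(det)                     # -2.0 (nonzero), so L is 1-to-1

# Spot check: two distinct vectors land on distinct images.
v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 0.0])
print(A @ v1, A @ v2)          # (3, -1) versus (3, 3): different
```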

Okay. Let us do a second example here. Example 2. L will be R3 to R2, so it is a linear map, not a linear operator.0491

It is defined by L(x, y, z) = (x, y). This is our projection mapping. Okay, I will take some random x, y, z... instead of variables we will actually use numbers.0511

Let us let v1 = (2,4,5), and we will let our second vector = (2,4,-7).0531

Well, note that v1 is not equal to v2. These two are not equal to each other.0544

However, let us see if this implies... question: does it imply that L(v1) does not equal L(v2)?0551

Well, L(v1) is (2,4)... if I take (2,4,5), I take the first two components... and the question... does it equal (2,4), which is L(v2)?0564

Yes. I take that one and that one, v2... (2,4), (2,4) = (2,4)... so therefore, this implication is not true.0580

I started off with 2 different vectors, yet I ended up mapping to the same vector in R2. In other words, what happened was this: in these two spaces, I had 2 separate vectors in my departure space.0591

They both mapped to the same vector, (2,4). That is not 1 to 1. This is 2 to 1. So, no, not 1 to 1.0604
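The projection in Example 2 is itself multiplication by a 2 x 3 matrix, so the failure of 1-to-1 can be reproduced numerically as well. A small sketch, again assuming numpy and not taken from the lecture:

```python
import numpy as np

# The projection L(x, y, z) = (x, y) as a 2 x 3 matrix.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

v1 = np.array([2.0, 4.0, 5.0])
v2 = np.array([2.0, 4.0, -7.0])

# Two distinct vectors with identical images: the map is not 1-to-1.
print(P @ v1)   # [2. 4.]
print(P @ v2)   # [2. 4.]
```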

Okay. Now, we can go ahead and go through this process to check 1 to 1, but as it turns out, we often would like simpler ways to decide whether a certain linear mapping or a certain mapping is 1 to 1.0617

As it turns out, there is an easier way, so let us introduce another definition. This time I am going to do it in red. This is a profoundly important definition.0632

Let L, a mapping from v to w, be a linear map... you have actually seen a variant of this definition under a different name, and you will recognize it immediately when I write it down.0645

Okay. The kernel of L is the subset of v, the departure space, consisting of all vectors... let us actually use a vector symbol for this... all vectors v, such that L(v) = the 0 vector in w.0665

So, the kernel of a linear map is the set of all those vectors in v that map to 0 in the arrival space.0722

Let us draw a picture of this. Very important. That is the departure space v, this is the arrival space w, if I have a series of vectors, I will just mark them as x's and I will put the 0 vector here.0732

Let us say I have 3 vectors in v that map to 0, those three vectors, that is my kernel of my linear map. It is the set of vectors, the collection of vectors that end up under the transformation mapping to 0.0750

Null space. You should be thinking of something called the null space. It is essentially the same thing that we are talking about here.0769

So, where are we now? Okay. So, in this particular case, this vector, this vector, this vector would be the kernel of this particular map, whatever it is, L.0775

Okay. Note that 0 in v is always in the kernel of L, right? Because under a linear map, the 0 vector in the departure space maps to the 0 vector, so I know that at least 0 is in our kernel.0788

I might have more vectors in there, but at least I know the 0 is in there.0810

Okay. Let us do an example. L(x, y, z, w) = (x + y, z + w)... this is a mapping from R4 to R2.0816

We want all vectors in R4 that map to (0,0). Okay? We want all vectors v in R4 such that L(v) equals the 0 vector.0836

In other words, we want it to equal (0,0). Okay, well, when we take a look at this thing right here, x + y = 0, z + w = 0.0854

Well, you get x = -y and z = -w. So, as it turns out, all vectors of the following form... if I let y = r, and if I let w = s, well, what you get is the following.0880

So, from these two equations I end up with the vector (-r, r, -s, s). So, here I let y = r, and I let w = s.0903

Yes, I let y = r and w = s, therefore z = -s and x = -r. So, that is what you get.0928

Every vector of this form... so you might have (-1, 1, 0, 0) or (-2, 2, 0, 0)... every vector of this form is in the kernel of this particular linear map.0937

So, there is an infinite number of these. So, the kernel has an infinite number of members in here.0951

Now, we come to some interesting theorems here. If the mapping L from v to w is a linear map, then the kernel of L is a subspace.0961

So before, we said it is a subset. But it is a very special kind of subset. The kernel is actually a subspace of our departure space v. So, extraordinary.0989

Let us look at the example that we just did, we have this linear mapping, we found the kernel... the kernel is all vectors of this form... well, this is the same as r × (-1,1,0,0) + s × (0,0,-1,1).1002

Therefore... these little triangles mean 'therefore'... (-1,1,0,0), that vector... what is wrong with this writing... I think I am writing too fast, I think that is what is happening here.1032

So, (-1,1,0,0) and (0,0,-1,1) together form a basis for the kernel of L.1050

So here, we found the kernel... all vectors of this form... and we were able to break it up into two vectors here.1073

Well, since we discovered this theorem says that it is not only a subset, it is actually a subspace... well, subspaces have bases, right?1083

Well, this actually is a basis for the kernel and the dimension of the kernel here is dimension 2, because I have 2 vectors in my basis. That is the whole idea of dimension.1090
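The basis just found can also be recovered mechanically, since the kernel is the null space of the 2 x 4 matrix representing L. A sketch using sympy (an assumption on my part; the lecture does this by hand):

```python
from sympy import Matrix

# L(x, y, z, w) = (x + y, z + w) as a 2 x 4 matrix.
A = Matrix([[1, 1, 0, 0],
            [0, 0, 1, 1]])

# The kernel of L is the null space of A; sympy returns a basis for it.
basis = A.nullspace()
for b in basis:
    print(b.T)        # (-1, 1, 0, 0) and (0, 0, -1, 1), printed as rows

print(len(basis))     # 2, so the kernel has dimension 2
```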

Now, let us see what else we have got. Suppose we have a linear map L which maps from RN to RM.1106

And suppose it is defined by matrix multiplication; then the kernel of L is just the null space.1129

So, if I have a linear map where the mapping is: take a vector and multiply it by a matrix on the left... well, the kernel of that linear map is all of the vectors which map to 0.1151

So, the kernel is just the null space of that matrix. I mean, this is the whole definition... it is this homogeneous system: A, the matrix A, times x is equal to 0.1165

The theorem says a linear mapping is 1 to 1 if and only if the kernel of L is equal to the 0 vector... let me redo this last part... if and only if the kernel of L equals the 0 vector in v.1183

If the only vector in my departure space that maps to 0 in the arrival space is the 0 vector, that tells me that the linear map is 1 to 1. That means that every element v in the departure space maps to a different element of w.1214

All I need to do is make sure that the 0 vector is the only vector in the kernel.1235

In other words, it is of dimension 0. Okay. We have got a corollary to that.1242

Actually, you know, the corollary is not altogether that... it is important, but we will deal with it again, so I do not really want to mention it here. I have changed my mind.1262
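This criterion is easy to check computationally: for a map given by a matrix, the kernel is {0} exactly when the rank of the matrix equals its number of columns. A minimal sketch with numpy (not part of the lecture):

```python
import numpy as np

def is_one_to_one(A):
    # A matrix map is 1-to-1 exactly when its kernel is {0},
    # i.e. when rank(A) equals the number of columns of A.
    return np.linalg.matrix_rank(A) == A.shape[1]

# The matrix of Example 1: 1-to-1.
print(is_one_to_one(np.array([[1, 1], [1, -1]])))        # True

# The projection of Example 2: not 1-to-1 (it was 2-to-1).
print(is_one_to_one(np.array([[1, 0, 0], [0, 1, 0]])))   # False
```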

Now, let me introduce our last definition before we close it out.1271

If L from v to w is linear, if the mapping is linear, then the range of L is the set of all vectors in w that are images under L of vectors in v.1280

Okay, let us just show what that means. This is our departure space, our arrival space, this is w, this is v. Let us say I have v1, v2, v3, v4, and v5.1342

Let us say v1 maps to w1, let us say v2 also maps to w1, let us say v3 maps to w2, and let us say v4 maps to w3, and v5 maps to w3.1360

The range is w1, w2, w3. It is all of the vectors in w that come from some vector in v, under L.1380

Now, that does not mean that every single vector... we will talk more about this actually next lesson, where I will introduce the distinction between into and onto.1394

So, this is not saying that every single vector in w is the image of some vector that is mapped under L.1408

It says that all of the vectors in w that actually come from some vector in v, that is the range. So, the range is a subset of w.1417

You are going to see in a second, in my last theorem before we close out this lesson, that the range is actually a subspace of w.1428

So, again, the range is exactly what you have known it to be all of these years.1437

Normally, we speak of the domain and the range, we speak about the whole space. That is not the case here.1444

The range is only those things in the arrival space that are actually represented, mapped, from some vector in v.1450

It is not all of the space, the arrival space could be all of the arrival space, but it is not necessarily that way.1457

Okay. So, let us do something like, actually let me do another picture just for the hell of it, so that you see.1465

So, we might have... so this is v... and this is w... so the kernel might be some small little subset of that, that is a subset of v, also happens to be a subspace.1477

Well the range might be, some subset of w. All of these vectors in here come from some vector in here.1490

Okay, so it is not the entire space, and it is also a subspace. Okay. That is going to be our final theorem before we close out this lesson.1501

If L, which maps v to w, the vector spaces, is linear, then range of L is a subspace... subspace of w.1513

So, the kernel is a subspace of the departure space, the range is a subspace of the arrival space.1536
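For a map given by a matrix, these two subspaces are concrete: the kernel is the null space of the matrix, and the range is its column space. A sketch with sympy (assumed available) for the kernel example from earlier in this lesson:

```python
from sympy import Matrix

# L(x, y, z, w) = (x + y, z + w) as a 2 x 4 matrix.
A = Matrix([[1, 1, 0, 0],
            [0, 0, 1, 1]])

kernel_basis = A.nullspace()      # basis of a subspace of the departure space R^4
range_basis = A.columnspace()     # basis of a subspace of the arrival space R^2

print(len(kernel_basis))          # 2: the kernel is 2-dimensional
print(len(range_basis))           # 2: here the range happens to be all of R^2
```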

We are going to close it out here, but I do want to say a couple of words before we actually go to the next lesson where we are going to talk about some relationships between the kernel and the range.1547

I am going to ask you to recall something that we discussed called the rank-nullity theorem. We said that the rank of a matrix plus the dimension of the null space, which we called the nullity, is equal to the number of columns, n.1555

Recall that, and in the next lesson we are going to talk about the dimension of the kernel, the dimension of the range space, and the dimension of the departure space.1574

It is a really extraordinarily beautiful relationship that exists. Certainly one of the prettiest that I personally have ever seen.1585
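As a preview of that relationship, for a matrix map the dimensions line up as rank + nullity = n, and you can already verify it on the kernel example from this lesson. A sketch with sympy (assumed available; the formal statement comes next lesson):

```python
from sympy import Matrix

# The 2 x 4 matrix of the kernel example L(x, y, z, w) = (x + y, z + w).
A = Matrix([[1, 1, 0, 0],
            [0, 0, 1, 1]])

rank = A.rank()                   # dimension of the range of L
nullity = len(A.nullspace())      # dimension of the kernel of L
n = A.cols                        # dimension of the departure space R^4

print(rank, nullity, n)           # 2 2 4, and indeed 2 + 2 = 4
```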

So, with that, thank you for joining us here at educator.com, we will see you next time.1592
