WEBVTT mathematics/linear-algebra/hovasapian
00:00:00.000 --> 00:00:04.000
Welcome back to educator.com, welcome back to linear algebra.
00:00:04.000 --> 00:00:20.000
Today we are going to be talking about something called the kernel and the range of a linear map, so we talked about linear maps... we recalled some of the definitions, well, recalled the definition of a linear map... we did a couple of examples on how to check linearity.
00:00:20.000 --> 00:00:29.000
Now we are going to talk about some specific... get a little bit deeper into the structure of a linear map, so let us just jump in and see what we can do.
00:00:29.000 --> 00:01:39.000
Okay. Let us start off with a definition here. Okay... a linear map L from v to w is said to be 1 to 1, if for all v1 and v2 in v, v1 not equal to v2, implies that L(v1) does not equal L(v2)... excuse me.
00:01:39.000 --> 00:01:56.000
Basically what this means is that each vector in v maps to a completely different element of w. Now, we have seen examples where... let us just take a function like x², that you know of.
00:01:56.000 --> 00:02:13.000
Well, I know that if I take 2 and I square it, I get 4. Well, if I take a different x, -2, and I square it, I also get 4. So, as it turns out, for that function, x², the 2 and the -2, they map to the same number... 4.
00:02:13.000 --> 00:02:24.000
That is not 1 to 1. 1 to 1 means every different number maps to a completely different number, or maps to a completely different object in the arrival space.
00:02:24.000 --> 00:02:44.000
So, let us draw what that means. Essentially what you have is... that is the departure space, and that is the arrival space, this is v, this is w, if I have v1, v2, v3... each one of these goes some place different.
00:02:44.000 --> 00:03:00.000
They do not go to the same place: distinct, distinct, distinct, because these are distinct, that is all it is. This is just a formal way of saying it, and we call it 1 to 1... which makes sense... 1 to 1, as opposed to 2 to 1, like the x² example.
00:03:00.000 --> 00:03:11.000
Okay. An alternative definition here, if I want one. This is an implication in mathematics: it says that if this holds, then this implies this.
00:03:11.000 --> 00:03:23.000
It means that if I know this, then this is true. Well, as it turns out, there is something called the contrapositive, where I... it is equivalent to saying, well, here let me write it out...
00:03:23.000 --> 00:03:34.000
So, I will end up using both formulations when I do the examples. That is why I am going to give you this equivalent condition for what 1 to 1 means.
00:03:34.000 --> 00:03:54.000
An equivalent condition for 1 to 1 is that L(v1) = L(v2), implies that v1 = v2.
00:03:54.000 --> 00:04:06.000
This is sort of a reverse way of saying it. If I note that I have two values here, L(v1) = L(v2), I automatically know that v1 and v2 are the same thing.
00:04:06.000 --> 00:04:16.000
This is our way of saying, again, that this thing... that two things do not map to one thing. Only one thing maps to one thing distinctly.
00:04:16.000 --> 00:04:27.000
The only reason we have two formulations of it is that for different problems, sometimes this formulation is easier to work with from a practical standpoint vs. this one.
00:04:27.000 --> 00:04:36.000
As far as intuition and understanding it, this first one is the one that makes sense to me personally. Two different things map to two different things. That is all this is saying.
00:04:36.000 --> 00:04:45.000
Okay. Let us do an example here. A couple of examples, in fact. Example... okay.
00:04:45.000 --> 00:05:14.000
Let L be a mapping from R2 to R2, so this is a linear operator... be defined by L of the vector (x, y) is equal to (x + y, x - y).
00:05:14.000 --> 00:05:59.000
Okay. We will let v1 be (x1, y1), we will let v2 be (x2, y2)... we want to show... we are going to use the second formulation... L(v1) = L(v2) implies that v1 = v2.
00:05:59.000 --> 00:06:04.000
So, we are trying to show that it is 1 to 1, and we are going to use this alternate condition.
00:06:04.000 --> 00:06:34.000
Let us let this be true... so L(v1) = L(v2). That means (x1 + y1, x1 - y1) = L(v2), which is (x2 + y2, x2 - y2).
00:06:34.000 --> 00:06:53.000
Well, these are equal to each other. That means I get this equation, x1 + y1 = x2 + y2, and from the second part, these are equal, so let me draw these are equal and these are equal.
00:06:53.000 --> 00:07:02.000
So, x1 - y1 = x2 - y2. Alright.
00:07:02.000 --> 00:07:17.000
The way I have arranged these, if I actually just add these equations straight down, I get 2x1 = 2x2, which implies that x1 = x2.
00:07:17.000 --> 00:07:31.000
When I put these back, I also get y1 = y2. This means that v1, which is (x1, y1), is equal to v2.
00:07:31.000 --> 00:07:45.000
So, by starting with the supposition that this is the case, I have shown that this is the case, which is precisely what this implication means. Implication means that when this is true, it implies this.
00:07:45.000 --> 00:07:54.000
Well, working this out mathematically, I start with this and I follow the train of logic, and if I end up with this, that means this implication is true.
00:07:54.000 --> 00:08:11.000
This implication is the definition of 1 to 1, therefore yes. This map is 1 to 1. In other words, every single vector that I take, that I map, will always map to something different.
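The argument in Example 1 can be sketched in Python: adding and subtracting the two component equations recovers (x, y) from L(x, y) = (x + y, x - y), which is exactly why equal images force equal inputs. This is an illustration, not part of the lecture; the helper names `L` and `recover` are my own.

```python
def L(v):
    """The map from Example 1: L(x, y) = (x + y, x - y)."""
    x, y = v
    return (x + y, x - y)

def recover(w):
    """Invert L by solving x + y = a, x - y = b: add and subtract the equations."""
    a, b = w
    return ((a + b) / 2, (a - b) / 2)

# If L(v1) = L(v2) forces v1 = v2, then v is fully determined by its image:
for v in [(2.0, 3.0), (-1.0, 4.0), (0.0, 0.0)]:
    assert recover(L(v)) == v
print("each image determines its input: consistent with 1 to 1")
```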
00:08:11.000 --> 00:08:31.000
Okay. Let us do a second example here. Example 2. L will be R3 to R2, so it is a linear map, not a linear operator.
00:08:31.000 --> 00:08:51.000
It is defined by L(x,y,z) = (x, y). This is our projection mapping. Okay, let us take some random x, y, z... instead of variables we will actually use numbers.
00:08:51.000 --> 00:09:04.000
Let us let v1 = (2,4,5), and we will let our second vector = (2,4,-7).
00:09:04.000 --> 00:09:11.000
Well, now note that v1 is not equal to v2. These two are not equal to each other.
00:09:11.000 --> 00:09:24.000
However, let us see if this implies... question, does it imply that L(v1) does not equal L(v2).
00:09:24.000 --> 00:09:40.000
Well, L(v1) is (2,4)... if I take (2,4,5), I take the first two entries... and the question... does it equal (2,4), which is L(v2)?
00:09:40.000 --> 00:09:51.000
Yes. I take that one and that one from v2... (2,4)... (2,4) = (2,4)... so therefore, this implication is not true.
00:09:51.000 --> 00:10:04.000
I started off with 2 different vectors, yet I ended up mapping to the same vector in R2. In other words what happened was these 2 spaces, okay, I had 2 separate vectors in my departure space.
00:10:04.000 --> 00:10:17.000
I had this vector (2,4), they both mapped to the same thing. That is not 1 to 1. This is 2 to 1. So, no, not 1 to 1.
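Example 2 can be checked concretely in Python (a small illustrative sketch; the function name `proj` is my own):

```python
# Two distinct vectors in R^3 that land on the same vector in R^2
# under the projection L(x, y, z) = (x, y).

def proj(v):
    x, y, z = v
    return (x, y)

v1 = (2, 4, 5)
v2 = (2, 4, -7)
assert v1 != v2               # distinct inputs...
assert proj(v1) == proj(v2)   # ...same image: the map is not 1 to 1
print(proj(v1))
```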
00:10:17.000 --> 00:10:32.000
Okay. Now, we can go ahead and go through this process to check 1 to 1, but as it turns out, we often would like simpler ways to decide whether a certain linear mapping or a certain mapping is 1 to 1.
00:10:32.000 --> 00:10:45.000
As it turns out, there is an easier way, so let us introduce another definition. This time I am going to do it in red. This is a profoundly important definition.
00:10:45.000 --> 00:11:05.000
Let L be a mapping from v to w... you have actually seen a variant of this definition under a different name, and you will recognize it immediately when I write it down... be a linear map.
00:11:05.000 --> 00:12:02.000
Okay. The kernel of L is the subset of v, the departure space, consisting of all vectors v... let us actually use a vector symbol for this... all vectors v such that L(v) = the 0 vector in w.
00:12:02.000 --> 00:12:12.000
So, the kernel of a linear map is the set of all those vectors in v that map to 0 in the arrival space.
00:12:12.000 --> 00:12:30.000
Let us draw a picture of this. Very important. That is the departure space v, this is the arrival space w, if I have a series of vectors, I will just mark them as x's and I will put the 0 vector here.
00:12:30.000 --> 00:12:49.000
Let us say I have 3 vectors in v that map to 0, those three vectors, that is my kernel of my linear map. It is the set of vectors, the collection of vectors that end up under the transformation mapping to 0.
00:12:49.000 --> 00:12:55.000
Null space. You should be thinking about something called the null space. It is essentially the same thing that we are talking about here.
00:12:55.000 --> 00:13:08.000
So, where are we now? Okay. So, in this particular case, this vector, this vector, this vector would be the kernel of this particular map, whatever it is, L.
00:13:08.000 --> 00:13:30.000
Okay. Note that 0 in v is always in the kernel of L, right? Because for a linear map, the 0 vector in the departure space maps to the 0 vector in the arrival space, so I know that at least 0 is in our kernel.
00:13:30.000 --> 00:13:36.000
I might have more vectors in there, but at least I know the 0 is in there.
00:13:36.000 --> 00:13:56.000
Okay. Let us do an example. L(x,y,z,w) = (x + y, z + w). This is a mapping from R4 to R2.
00:13:56.000 --> 00:14:14.000
We want all vectors in R4 that map to (0,0). Okay? We want all vectors v in R4 such that L(v) equals the 0 vector.
00:14:14.000 --> 00:14:40.000
In other words, we want it to equal (0,0). Okay, well, when we take a look at this thing right here, x + y = 0, z + w = 0.
00:14:40.000 --> 00:15:03.000
Well, you get x = -y, z = -w, so as it turns out, all vectors of the following form... if I let y = r, and if I let w = s, something like that, well, what you get is the following.
00:15:03.000 --> 00:15:28.000
So, these are my two equations, so I end up with the vector (-r, r, -s, s). So, here I let y = ... it looks like r, and it looks like I let w = s.
00:15:28.000 --> 00:15:37.000
Yes, I let y = r, w = s, therefore z = -s, and x = -r. So, that is what you get.
00:15:37.000 --> 00:15:51.000
Every vector of this form, so with r = 1 and s = 2 you might have (-1, 1, -2, 2)... every vector of this form is in the kernel of this particular linear map.
00:15:51.000 --> 00:16:01.000
So, there is an infinite number of these. So, the kernel has an infinite number of members in here.
00:16:01.000 --> 00:16:29.000
Now, we come to some interesting theorems here. If L from v to w is a linear map, then the kernel of L is a subspace.
00:16:29.000 --> 00:16:42.000
So before, we said it is a subset. But it is a very special kind of subset. The kernel is actually a subspace of our departure space v. So, extraordinary.
00:16:42.000 --> 00:17:12.000
Let us look at the example that we just did, we have this linear mapping, we found the kernel... the kernel is all vectors of this form... well, this is the same as r × (-1,1,0,0) + s × (0,0,-1,1).
00:17:12.000 --> 00:17:30.000
Therefore... these little triangles mean therefore... (-1,1,0,0), that vector... what is wrong with these writings? I think I am writing too fast, I think that is what is happening here.
00:17:30.000 --> 00:17:53.000
So, (-1,1,0,0) and (0 ... this is not going to work... (0,0,-1,1) is a basis for the kernel of L.
00:17:53.000 --> 00:18:03.000
So here, we found the kernel, all vectors of this form, we were able to break it up into a... two sets of vectors here.
00:18:03.000 --> 00:18:10.000
Well, since this theorem says that it is not only a subset, it is actually a subspace... well, subspaces have bases, right?
00:18:10.000 --> 00:18:26.000
Well, this actually is a basis for the kernel and the dimension of the kernel here is dimension 2, because I have 2 vectors in my basis. That is the whole idea of dimension.
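A small Python sketch of this kernel computation (the names `L` and `kernel_vec` are my own): every combination r(-1,1,0,0) + s(0,0,-1,1), that is, every vector (-r, r, -s, s), should map to (0, 0).

```python
def L(v):
    """The map from the kernel example: L(x, y, z, w) = (x + y, z + w)."""
    x, y, z, w = v
    return (x + y, z + w)

def kernel_vec(r, s):
    """r*(-1, 1, 0, 0) + s*(0, 0, -1, 1), the general kernel vector."""
    return (-r, r, -s, s)

# Sample a grid of (r, s) values; every combination lands on (0, 0):
for r in range(-2, 3):
    for s in range(-2, 3):
        assert L(kernel_vec(r, s)) == (0, 0)
print("all sampled kernel combinations map to zero")
```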
00:18:26.000 --> 00:18:49.000
Now, let us see what else we have got. If L, which maps from Rn to Rm, is a linear map.
00:18:49.000 --> 00:19:11.000
And if it is defined by matrix multiplication, then the kernel of L is just the null space of that matrix.
00:19:11.000 --> 00:19:25.000
So if I have a linear map, where the mapping says that if I have some vector, I take that vector and I multiply it by a matrix on the left... well, the kernel of that linear map is all of the vectors which map to 0.
00:19:25.000 --> 00:19:43.000
So, the kernel is just the null space of that matrix. I mean, this is the whole definition; it is this homogeneous system... A, the matrix A, times x is equal to 0.
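If you want to compute a null space numerically, one common approach uses the SVD: the right singular vectors belonging to zero singular values span the null space. Here is a sketch with NumPy, using the matrix of the earlier map L(x, y, z, w) = (x + y, z + w); the tolerance 1e-10 is an arbitrary choice of mine.

```python
import numpy as np

# Matrix of L(x, y, z, w) = (x + y, z + w), so L(v) = A @ v.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])

# Null space via SVD: rows of Vt past the rank span the null space of A.
_, S, Vt = np.linalg.svd(A)
rank = int((S > 1e-10).sum())
null_basis = Vt[rank:]              # each row is a null-space basis vector

assert null_basis.shape[0] == 2         # nullity = 2, matching the lesson
assert np.allclose(A @ null_basis.T, 0) # each basis vector maps to zero
print("null space dimension:", null_basis.shape[0])
```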
00:19:43.000 --> 00:20:14.000
The theorem says a linear mapping is 1 to 1 if and only if the kernel of L is equal to the 0 vector... let me redo this last part... if and only if the kernel of L equals the 0 vector in v.
00:20:14.000 --> 00:20:35.000
If the only vector in my departure space that maps to 0 in the arrival space is the 0 vector, that tells me - excuse me - that the linear map is 1 to 1. That means that every element v in the departure space maps to a different element of w.
00:20:35.000 --> 00:20:42.000
All I need to do is make sure that the 0 vector is the only vector in the kernel.
00:20:42.000 --> 00:21:02.000
In other words, it is of dimension 0. Okay. We have got a corollary to that.
00:21:02.000 --> 00:21:11.000
Actually, you know, the corollary is not all together that... it is important but we will deal with it again, so I do not really want to mention it here. I have changed my mind.
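The theorem gives a practical test: compute the nullity and check whether it is 0. A NumPy sketch on the two maps from the examples (the matrices A and B below are my renderings of those maps):

```python
import numpy as np

# Example 1: L(x, y) = (x + y, x - y) as a matrix.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
# Example 2: the projection L(x, y, z) = (x, y) as a matrix.
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# nullity = number of columns minus rank.
nullity_A = A.shape[1] - np.linalg.matrix_rank(A)
nullity_B = B.shape[1] - np.linalg.matrix_rank(B)

assert nullity_A == 0   # kernel is just {0}: the map is 1 to 1
assert nullity_B == 1   # nontrivial kernel: the projection is not 1 to 1
print("nullities:", nullity_A, nullity_B)
```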
00:21:11.000 --> 00:21:20.000
Now, let me introduce our last definition before we close it out.
00:21:20.000 --> 00:22:22.000
If L from v to w is linear, if the mapping is linear, then the range of L is the set of all vectors in w that are images under L of vectors in v.
00:22:22.000 --> 00:22:40.000
Okay, let us just show what that means. This is our departure space, our arrival space, this is w, this is v. Let us say I have v1, v2, v3, v4, and v5.
00:22:40.000 --> 00:23:00.000
Let us say v1 maps to w1, let us say v2 also maps to w1, let us say v3 maps to w2, and let us say v4 maps to w3, and v5 maps to w3.
00:23:00.000 --> 00:23:14.000
The range is w1, w2, w3. It is all of the vectors in w that come from some vector in v, under L.
00:23:14.000 --> 00:23:28.000
Now, that does not mean that every single vector... we will talk more about this actually next lesson, where I will introduce the distinction between into and onto.
00:23:28.000 --> 00:23:37.000
So, this is not saying that every single vector in w is the image of some vector that is mapped under L.
00:23:37.000 --> 00:23:48.000
It says that all of the vectors in w that actually come from some vector in v, that is the range. So, the range is a subset of w.
00:23:48.000 --> 00:23:57.000
You are going to see in a second, in my last theorem before we close out this lesson, that the range is actually a subspace of w.
00:23:57.000 --> 00:24:04.000
So, again, the range is exactly what you have known it to be all of these years.
00:24:04.000 --> 00:24:10.000
Normally, when we speak of the domain and the range, we speak about the whole space. That is not the case here.
00:24:10.000 --> 00:24:17.000
The range is only those things in the arrival space that are actually represented, mapped, from some vector in v.
00:24:17.000 --> 00:24:25.000
It is not all of the space... the range could be all of the arrival space, but it is not necessarily that way.
00:24:25.000 --> 00:24:37.000
Okay. So, let us do something like, actually let me do another picture just for the hell of it, so that you see.
00:24:37.000 --> 00:24:50.000
So, we might have... so this is v... and this is w... so the kernel might be some small little subset of that, that is a subset of v, also happens to be a subspace.
00:24:50.000 --> 00:25:01.000
Well the range might be, some subset of w. All of these vectors in here come from some vector in here.
00:25:01.000 --> 00:25:13.000
Okay, so it is not the entire space, and it is also a subspace. Okay. That is going to be our final theorem before we close out this lesson.
00:25:13.000 --> 00:25:36.000
If L, which maps v to w, the vector spaces, is linear, then range of L is a subspace... subspace of w.
00:25:36.000 --> 00:25:47.000
So, the kernel is a subspace of the departure space, the range is a subspace of the arrival space.
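For a matrix map, the range is the column space of the matrix, which is why it is a subspace of the arrival space but not necessarily all of it. A small NumPy sketch (the matrix A here is a made-up example, not one from the lesson): it maps R1 into R2, and every image lies on the line spanned by (1, 1).

```python
import numpy as np

# A made-up example: A maps R^1 into R^2; its column space is span{(1, 1)}.
A = np.array([[1.0],
              [1.0]])

# Every image A @ v is a multiple of the single column, so it sits on that line:
for t in (-2.0, 0.5, 3.0):
    image = A @ np.array([t])
    assert np.allclose(image, t * np.array([1.0, 1.0]))

# The range has dimension 1 inside R^2: a proper subspace of the arrival space.
assert np.linalg.matrix_rank(A) == 1
print("range dimension:", np.linalg.matrix_rank(A))
```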
00:25:47.000 --> 00:25:55.000
We are going to close it out here, but I do want to say a couple of words before we actually go to the next lesson where we are going to talk about some relationships between the kernel and the range.
00:25:55.000 --> 00:26:14.000
I am going to ask you to recall something that we discussed called the rank-nullity theorem. We said that the rank of a matrix + the dimension of the null space, which we called the nullity, is equal to n, the number of columns of the matrix.
00:26:14.000 --> 00:26:25.000
Recall that, and in the next lesson we are going to talk about the dimension of the kernel, the dimension of the range space, and the dimension of the departure space.
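The rank-nullity statement can be checked on the matrix of the kernel example from this lesson, L(x, y, z, w) = (x + y, z + w): the rank is 2, the nullity is 2, and they add up to n = 4 columns. A NumPy sketch:

```python
import numpy as np

# Matrix of L(x, y, z, w) = (x + y, z + w): 2 rows, n = 4 columns.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank

# Rank-nullity: rank + nullity = n, the number of columns.
assert rank + nullity == A.shape[1]
print(rank, "+", nullity, "=", A.shape[1])
```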
00:26:25.000 --> 00:26:32.000
It is a really extraordinarily beautiful relationship that exists. Certainly one of the prettiest that I personally have ever seen.
00:26:32.000 --> 00:26:38.000
So, with that, thank you for joining us here at educator.com, we will see you next time.