WEBVTT mathematics/linear-algebra/hovasapian
00:00:00.000 --> 00:00:04.000
Welcome back to Educator.com and welcome back to linear algebra.
00:00:04.000 --> 00:00:08.000
Today we are going to be talking about orthogonal complements.
00:00:08.000 --> 00:00:19.000
So, rather than doing a preamble discussion of what it is, let us just jump into some definitions and it should make sense once we actually set it out in a definition form.
00:00:19.000 --> 00:00:32.000
Okay. So, let us start with our definition. It is a little long, but nothing strange.
00:00:32.000 --> 00:00:49.000
Let w be a subspace of RN, so we are definitely talking about N-space here. Okay.
00:00:49.000 --> 00:01:10.000
A vector u which is a member of RN is said to be orthogonal to w, so notice orthogonal to w as a subspace.
00:01:10.000 --> 00:01:34.000
Orthogonal to an entire subspace, if it is orthogonal to every vector in that subspace.
00:01:34.000 --> 00:02:22.000
Okay. The set of all vectors in RN that are orthogonal to all vectors in w is called the orthogonal complement of w.
00:02:22.000 --> 00:02:37.000
And... is symbolized by w with a little perpendicular mark on the top, and they call it w perp.
00:02:37.000 --> 00:02:41.000
The top right. Okay. So, let us look at this definition again.
00:02:41.000 --> 00:02:50.000
So, w is a subspace of RN, okay? So it could be dimension 1, 2, 3, all the way up to N, because RN is a subspace of itself.
00:02:50.000 --> 00:03:00.000
A vector u in RN is said to be orthogonal to that subspace if it is orthogonal to every vector in that subspace.
00:03:00.000 --> 00:03:07.000
So, the set of all vectors that are orthogonal, we call it the orthogonal complement of w.
00:03:07.000 --> 00:03:14.000
It is symbolized by that w with a little perpendicular mark on the top right, called w perp.
00:03:14.000 --> 00:03:19.000
Let us give a little bit of a picture so that we see what it is we are looking at.
00:03:19.000 --> 00:03:25.000
So, let us just deal in R3, and let us say that, so let me draw a plane.
00:03:25.000 --> 00:03:34.000
As you know, a plane is 2-dimensional so it is in R3, and then let me just draw some random vectors in this plane. Something like that.
00:03:34.000 --> 00:03:46.000
Well, if I have some vector like that, which is perpendicular to this plane, so this plane... let us call that w.
00:03:46.000 --> 00:03:51.000
So, that is some subspace of R3, and again, let me make sure that I write it down... so we are dealing with R3.
00:03:51.000 --> 00:04:01.000
This two dimensional plane is a subspace of R3, and every single vector in there is of course... well, it is a vector in the plane.
00:04:01.000 --> 00:04:08.000
Then, if I take this vector here, well every single vector that is perpendicular to it is going to be parallel to this vector, right?
00:04:08.000 --> 00:04:13.000
So, when we speak about parallel vectors, we are really only speaking about one vector, one direction.
00:04:13.000 --> 00:04:28.000
So, as it turns out, if this is w, this vector right here and all of the scalar multiples of it, like shortened versions of it, long versions of it, this is your w perp.
00:04:28.000 --> 00:04:39.000
Because, this vector, any vector in here, is going to end up being perpendicular to every one of these vectors. This is the orthogonal complement of that.
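This R3 picture can be sketched numerically. Here is a minimal check using NumPy (an assumption; the lecture works purely by hand, and these spanning vectors are hypothetical): two vectors span a plane w, the cross product gives a normal vector, and every vector in the plane is orthogonal to that normal, so the normal's scalar multiples form w perp.

```python
import numpy as np

# hypothetical spanning vectors for a plane w in R3
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])

# the cross product is perpendicular to both v1 and v2,
# so it is perpendicular to every vector a*v1 + b*v2 in the plane
n = np.cross(v1, v2)

for a, b in [(1.0, 0.0), (0.0, 1.0), (2.5, -3.0)]:
    w_vec = a * v1 + b * v2
    assert abs(float(np.dot(w_vec, n))) < 1e-9

# n and all of its scalar multiples make up w perp
```

Any scalar multiple of `n` passes the same check, which is exactly the statement that w perp here is the 1-dimensional line spanned by the normal.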
00:04:39.000 --> 00:04:47.000
So, it helps to use this picture working in R3, and working with either dimension 1 or 2, because we can actually picture it.
00:04:47.000 --> 00:04:54.000
For something like R4 or R5, I mean I can go ahead and tell you what it is that you will be dealing with.
00:04:54.000 --> 00:05:02.000
So let us say in R4 you have a subspace that is 2-dimensional, that is some kind of plane so to speak in R4.
00:05:02.000 --> 00:05:14.000
Well, the orthogonal complement of that is going to be every vector that is perpendicular to that 2-dimensional subspace, and it is actually going to end up being 2-dimensional as well.
00:05:14.000 --> 00:05:24.000
The idea is we have this subspace and we have a bunch of vectors that are orthogonal to every vector in that subspace.
00:05:24.000 --> 00:05:31.000
The set of all of those vectors that are orthogonal are called the orthogonal complement. That is all that it means.
00:05:31.000 --> 00:05:38.000
Okay. Let us actually do a little bit of an example here.
00:05:38.000 --> 00:05:48.000
So, let us say... well actually, you know what, let us just jump into a theorem and we will get into an example in a minute.
00:05:48.000 --> 00:06:11.000
So, theorem... let w be a subspace of RN... okay.
00:06:11.000 --> 00:06:23.000
Then, w perp is a subspace of RN.
00:06:23.000 --> 00:06:28.000
So, if w is a subspace, w perp, its orthogonal complement, is also a subspace.
00:06:28.000 --> 00:06:33.000
We do not have to go through that procedure of checking whether it is a subspace.
00:06:33.000 --> 00:06:42.000
And... it is interesting... that the intersection of w and w perp is the 0-vector.
00:06:42.000 --> 00:06:53.000
So, again, they are subspaces so they have to include the 0 vector, both of them, but that is the only thing common between the two subspaces of w and w perp, its orthogonal complement.
00:06:53.000 --> 00:06:59.000
That is the only thing they have in common; they both pass through the origin.
00:06:59.000 --> 00:07:05.000
Okay. So, now let us do our example.
00:07:05.000 --> 00:07:27.000
Let us see. We will let w be a subspace of, this time we will work in R4, with basis w1, w2.
00:07:27.000 --> 00:07:33.000
So, w1, w2, these two vectors form a basis for our subspace w.
00:07:33.000 --> 00:07:46.000
And... w1 is going to be (1, 1, 0, 1), and I have just written this vector in horizontal form, not as a list with commas; it does not really matter.
00:07:46.000 --> 00:07:54.000
w2 is going to be the vector (0, -1, 1, 1).
00:07:54.000 --> 00:07:59.000
So, you have 2 vectors, they form a basis for the subspace w in R4.
00:07:59.000 --> 00:08:18.000
Now, our task is find a basis for the orthogonal complement, w perp. Find a basis for the subspace of all of the vectors that are orthogonal to all of the vectors in w, that has these 2 vectors as a basis.
00:08:18.000 --> 00:08:32.000
Okay, well, so, let us just take some random... okay... so we will let u, let us choose u equal to some random vector in R4.
00:08:32.000 --> 00:08:38.000
a, b, c, d, we want to be as general as possible... a, b, c, d.
00:08:38.000 --> 00:08:52.000
Well, so we are looking for the following. We want... actually, let me see, let this be -- I am sorry -- let this be a random vector in the orthogonal complement.
00:08:52.000 --> 00:08:58.000
Okay. So, we are just going to look for some random vector, see if we can find values for a, b, c, d.
00:08:58.000 --> 00:09:03.000
We are going to take a vector in the orthogonal complement, and we know that this is going to be true.
00:09:03.000 --> 00:09:15.000
We know that because w perp and w are orthogonal to each other, we know that u ⋅ w1 = 0.
00:09:15.000 --> 00:09:30.000
We know that u ⋅ w2... let me make this dot a little more clear, we do not want that.... is equal to 0, right?
00:09:30.000 --> 00:09:38.000
So, because they are orthogonal complements, we know that they are orthogonal, which means that their dot product is equal to 0.
00:09:38.000 --> 00:09:42.000
Well, these are just a couple of equations, so let us actually do this.
00:09:42.000 --> 00:09:55.000
So, if I do u ⋅ w1, I get a + b + 0 + d = 0.
00:09:55.000 --> 00:10:08.000
Then, if I do u ⋅ w2, I get 0 - b + c + d = 0.
00:10:08.000 --> 00:10:18.000
When we solve this using the techniques that we have at our disposal... I am going to go ahead and do it over here.
00:10:18.000 --> 00:10:29.000
So, this is just a homogeneous system, you set up the coefficient matrix, reduced row echelon form, the columns that have... that do not have a leading entry, those are your free variables... r, s, t, u, v, whatever you want.
00:10:29.000 --> 00:10:33.000
Then you solve for the other variables that do have leading entries
00:10:33.000 --> 00:10:55.000
When you do this, you end up with the following. You get a, b, c, and d; the vector takes on the form r × (-1, 1, 1, 0) + s × (-2, 1, 0, 1).
00:10:55.000 --> 00:11:03.000
So, those two vectors form a basis for the orthogonal complement w perp.
00:11:03.000 --> 00:11:22.000
Therefore, we will set it up in set notation, let me just write it again: (-1, 1, 1, 0)... comma, (-2, 1, 0, 1)... is a basis for w perp.
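The computation above can be double-checked numerically. Here is a minimal sketch using NumPy (an assumption; the lecture solves the homogeneous system by hand): we verify that each candidate basis vector of w perp is orthogonal to both basis vectors of w, and that the two candidates are linearly independent.

```python
import numpy as np

# basis of w from the example
w1 = np.array([1, 1, 0, 1])
w2 = np.array([0, -1, 1, 1])

# candidate basis of w perp read off from the homogeneous solution
u1 = np.array([-1, 1, 1, 0])   # the r-direction of the solution space
u2 = np.array([-2, 1, 0, 1])   # the s-direction of the solution space

# orthogonal to every basis vector of w means orthogonal to all of w
for u in (u1, u2):
    assert int(np.dot(u, w1)) == 0
    assert int(np.dot(u, w2)) == 0

# u1 and u2 are linearly independent, so they form a basis for w perp
assert np.linalg.matrix_rank(np.vstack([u1, u2])) == 2
```

Because orthogonality to w1 and w2 extends by linearity to every linear combination of them, checking the two basis vectors is enough.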
00:11:22.000 --> 00:11:29.000
So, that is it. We started with a basis of two vectors in R4.
00:11:29.000 --> 00:11:42.000
Then, just by virtue of the fact that we know that the orthogonal complement is going to be orthogonal to every single vector in this, so it is certainly going to be orthogonal to these two... I pick a random vector in this orthogonal complement.
00:11:42.000 --> 00:11:48.000
I write my equation... orthogonal just means the dot product equals 0, get a homogeneous system.
00:11:48.000 --> 00:12:05.000
I solve the homogeneous system and I set it up a way where I can basically read off the basis for my solution space of this homogeneous system, which is the basis for, in this particular case, based on this problem, the orthogonal complement.
00:12:05.000 --> 00:12:13.000
Nice, straightforward, nothing we have not done. We have seen dot products, we have seen homogeneous systems, we have seen bases; nothing here is new.
00:12:13.000 --> 00:12:22.000
Now we are just applying it to this new idea of 2 subspaces being orthogonal to each other. Being perpendicular to each other.
00:12:22.000 --> 00:12:30.000
Of course, perpendicularity, as you know from your geometric intuition, only makes sense in R2 and R3, which is why we do not use the word perpendicular; we use the word orthogonal, but it is the same thing in some sense.
00:12:30.000 --> 00:12:40.000
So, you might have a 6-dimensional subspace being orthogonal to a 3-dimensional subspace "whatever that means".
00:12:40.000 --> 00:12:47.000
Well, geometrically, pictorially, we do not know what that means. We cannot actually picture that. We have no way of representing it geometrically.
00:12:47.000 --> 00:12:54.000
But, we know what it means algebraically. The dot product of two vectors in those spaces is equal to 0.
00:12:54.000 --> 00:13:02.000
Okay. One of the things that I would like you to notice when we had R4, you notice that our w had dimension 2.
00:13:02.000 --> 00:13:13.000
Its basis had 2 vectors, dimension 2... and you noticed when we had w perp, the orthogonal complement, we ended up with 2 vectors as a basis, also in dimension 2.
00:13:13.000 --> 00:13:28.000
Notice that the dimension of the subspace w + the dimension of its orthogonal complement added up to 4, the actual dimension of the space. That is not a coincidence.
00:13:28.000 --> 00:13:46.000
So, let us write down a theorem... Let w be a subspace of RN.
00:13:46.000 --> 00:13:56.000
Then, RN, the actual space itself, is the direct sum w ⊕ w perp. So, let me talk about this thing.
00:13:56.000 --> 00:14:06.000
This little plus sign with a circle around it, it is called a direct sum, and I will speak about it in just a minute.
00:14:06.000 --> 00:14:20.000
Okay. Essentially what this means is... we will have to speak a little bit more about it, but one of the things that it means is that w intersect w perp, the only thing they have in common is like we said before... the 0 vector.
00:14:20.000 --> 00:14:30.000
These are both subspaces, so they have to have at least the 0 vector in common. They do not share anything else in common.
00:14:30.000 --> 00:14:50.000
Okay. Yet another theorem, and I will talk about the sum in just a moment, but going back to the problem that we just did, this basically says that if I take the subspace w and its orthogonal complement, and if I somehow combine them -- which we will talk about it in a minute -- we will actually end up getting this space itself, the 4-dimensional space.
00:14:50.000 --> 00:15:06.000
So if I had a 6-dimensional space and I know that I am dealing with a subspace of 2-dimensions, w, I know that the orthogonal complement is going to have dimension 4 because 2 + 4 has to equal 6, or 6 - 2 = 4, however you want to look at it.
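This dimension count is just the rank-nullity theorem in disguise: w perp of the row space of a matrix is its null space, so dim(w) + dim(w perp) = n. A short NumPy sketch (an assumption; the lecture does not use code) using the matrix whose rows are the basis of w from the example:

```python
import numpy as np

# rows are the basis vectors of w from the example in R4
A = np.array([[1, 1, 0, 1],
              [0, -1, 1, 1]])

n = A.shape[1]                      # ambient dimension, here 4
dim_w = np.linalg.matrix_rank(A)    # dimension of w (the row space)
dim_w_perp = n - dim_w              # rank-nullity: dimension of w perp

# dim(w) + dim(w perp) must add up to the dimension of the whole space
assert dim_w + dim_w_perp == n
```

This matches what we saw by hand: a 2-dimensional w in R4 forced a 2-dimensional w perp, and in the 6-dimensional case a 2-dimensional w would force a 4-dimensional w perp.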
00:15:06.000 --> 00:15:15.000
Okay. Let us do another theorem here. Just a little bit of an informational theorem, which will make sense.
00:15:15.000 --> 00:15:32.000
If w is a subspace of RN, then (w perp) perp = w.
00:15:32.000 --> 00:15:40.000
This just says that if you take an orthogonal complement of some subspace and you take the orthogonal complement of that, you are going to end up getting the original subspace.
00:15:40.000 --> 00:15:50.000
Nothing new about that, I mean like a function... if you take the inverse of a function and then you take the inverse of the inverse, you get the function back. That is all it is. Very, very intuitive.
00:15:50.000 --> 00:15:56.000
Okay. Now, let us discuss this symbol some more. This + symbol.
00:15:56.000 --> 00:16:10.000
So, when we write... this direct sum symbol -- I am sorry -- when we write w + w perp, these are subspaces, okay?
00:16:10.000 --> 00:16:16.000
We do not... this is a symbol for the addition of subspaces, we are not actually doing the operation of addition.
00:16:16.000 --> 00:16:27.000
What this means, so this symbolizes the addition of a subspace. This whole thing is a space.
00:16:27.000 --> 00:16:43.000
What this means is that something... it means that if I have some w1 -- no, let me make it a little bit more general, there are going to be too many w's floating around.
00:16:43.000 --> 00:16:49.000
So, if I have a, this direct sum symbol, plus b, okay?
00:16:49.000 --> 00:17:20.000
It is the space made up of vectors v, such that v is equal to some a + b, where the vector a comes from the space a and the vector b comes from the space b.
00:17:20.000 --> 00:17:35.000
So, this symbol, this direct sum symbol... it means if I take some vector in the subspace a... and a vector in the subspace b, and I actually add those vectors like I normally would, I am going to get some vector.
00:17:35.000 --> 00:17:50.000
Well, that vector belongs to this space. When I see this symbol, I am talking about a space. In some sense what I have done is I have taken 1 whole space and I have attached another space right to it.
00:17:50.000 --> 00:17:58.000
In the case of the example that we did, we had a 2-dimensional subspace, we added a 2-dimensional orthogonal complement to it, and what I got was the entire space R4.
00:17:58.000 --> 00:18:08.000
That is what is happening here. That is what this direct sum symbol means. It symbolizes the addition of spaces, the putting together of spaces.
00:18:08.000 --> 00:18:17.000
But these are spaces, spaces that contain individual vectors.
00:18:17.000 --> 00:18:32.000
Okay. Let us see. Let us do a little bit further here. Let us take R4, expand upon this...
00:18:32.000 --> 00:18:43.000
Let us let w = ... well not equal, let us say it has a basis.
00:18:43.000 --> 00:18:56.000
Let w have as a basis the vectors (1,0,0,0) and (0,1,0,0).
00:18:56.000 --> 00:19:01.000
So, let us say that w is the subspace that has these 2 vectors as a basis.
00:19:01.000 --> 00:19:25.000
So, it is a 2-dimensional subspace, and we will let w perp have basis (0,0,1,0)... (0,0,0,1)... okay, as a basis.
00:19:25.000 --> 00:19:40.000
Now, if I take w, the direct sum w perp, well, that is equal to R4... right?
00:19:40.000 --> 00:20:07.000
So, a vector in R4... let us say for example... which is let us just say some random vector (4,3,-2,6), which is a vector in R4, it can be written as... well, it can be written as a vector from this subspace + a vector from this subspace.
00:20:07.000 --> 00:20:20.000
Just like what we defined, that is what a direct sum means. This w + the w perp, means take a vector from here, add it to a vector from here, and you have a vector in the sum, which happens to be R4.
00:20:20.000 --> 00:20:31.000
We can write it as (4,3,0,0)... this vector right here is in the space w.
00:20:31.000 --> 00:20:40.000
We can add it to the vector (0,0,-2,6), which is a vector in w perp.
00:20:40.000 --> 00:21:02.000
What is nice about this representation, this direct sum representation is that -- let us see -- this representation is unique.
00:21:02.000 --> 00:21:13.000
So, when I write a particular vector as a direct sum of 2 individual subspaces, the way that I write it is unique. There is no other way of writing it.
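The unique split of (4, 3, -2, 6) can be sketched as code. A minimal example with NumPy (an assumption; the lecture does this by inspection): with the standard bases above, projecting onto w keeps the first two coordinates and projecting onto w perp keeps the last two, and the two pieces add back up to the original vector.

```python
import numpy as np

v = np.array([4.0, 3.0, -2.0, 6.0])

# orthonormal bases for w and w perp, as rows (the standard basis vectors)
B_w = np.array([[1, 0, 0, 0],
                [0, 1, 0, 0]], dtype=float)
B_wp = np.array([[0, 0, 1, 0],
                 [0, 0, 0, 1]], dtype=float)

# orthogonal projection of v onto each subspace
proj_w = B_w.T @ (B_w @ v)      # the piece of v living in w
proj_wp = B_wp.T @ (B_wp @ v)   # the piece of v living in w perp

# the direct-sum decomposition: v = (piece in w) + (piece in w perp)
assert np.allclose(proj_w, [4, 3, 0, 0])
assert np.allclose(proj_wp, [0, 0, -2, 6])
assert np.allclose(proj_w + proj_wp, v)
```

Because the two projections are onto orthogonal subspaces, this decomposition is the unique one the theorem promises: no other choice of a w-piece and a w perp-piece sums to v.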
00:21:13.000 --> 00:21:17.000
Okay. So that gives us a nice basic idea of orthogonal complements to work with.
00:21:17.000 --> 00:21:20.000
We will continue on next time some more with orthogonal complements.
00:21:20.000 --> 00:21:27.000
Thank you for joining us at Educator.com and we will see you for the next installment of linear algebra. Take care.