WEBVTT mathematics/linear-algebra/hovasapian
00:00:00.000 --> 00:00:04.000
Welcome back to educator.com and welcome back to linear algebra.
00:00:04.000 --> 00:00:30.000
Today, we are going to start on a new topic. A very, very, very important topic. Probably the single most important topic, both in terms of the underlying structure of linear mappings and matrices, and also of profoundly practical importance in all areas of science and math... quantum mechanics, engineering, all areas of physics, all areas of mathematics.
00:00:30.000 --> 00:00:39.000
We are going to be discussing Eigenvalues and Eigenvectors. So, let us jump on in and see if we can make sense of this.
00:00:39.000 --> 00:00:56.000
Okay. Recall if you will, so if... a is n by n... and for the discussion of Eigenvalues and Eigenvectors, we are always going to be talking about matrices that are n by n.
00:00:56.000 --> 00:01:03.000
So, we are no longer going to be talking about 5 by 6, 3 by 9, it is always going to be 3 by 3, 4 by 4, 2 by 2... things like that.
00:01:03.000 --> 00:01:47.000
Okay. So, if a is n by n, we know that the function L, which is a mapping from Rn to Rn defined by L(x) = a × x... the multiplication of some vector x by that matrix... we know that it is a linear mapping.
00:01:47.000 --> 00:02:02.000
So, this we know. When we are given an n by n matrix, and we use that matrix to multiply on the left of some vector in Rn, we know that what we get is a linear mapping. Okay.
00:02:02.000 --> 00:02:35.000
What we want to do... we wish to discuss the situation where the vector x and the result of what I do to x, which is multiply it by the matrix on the left, are parallel to each other... and by parallel, what we are really saying is that they are scalar multiples of each other.
00:02:35.000 --> 00:02:55.000
In other words, when a × x is just a scalar multiple of x.
00:02:55.000 --> 00:03:04.000
In other words, I do not map it to a completely different vector altogether; all I do is take the vector x and I either expand it or contract it or leave it the same length.
00:03:04.000 --> 00:03:13.000
So, I am keeping it in its own space. I do not jump to another space. That is really what is going on here with this idea of Eigenvalue and Eigenvector.
00:03:13.000 --> 00:03:26.000
It has to do with starting in a space, taking a vector in that space, multiplying it by a matrix, and instead of twisting and turning that vector, turning it into something else... just -- you know -- dilating it, making it bigger or smaller.
00:03:26.000 --> 00:03:33.000
That is all we are doing. Still staying in that space. Staying parallel. Okay.
00:03:33.000 --> 00:03:54.000
Now, let us see what we have got. Let us start with a definition. Let a be an n by n matrix. The real number λ...
00:03:54.000 --> 00:04:00.000
It is always symbolized with a λ... this is traditional.
00:04:00.000 --> 00:04:32.000
...is called an Eigenvalue of a if there exists a non-zero vector x such that the matrix product a × x just gives me some scalar multiple, λ × x.
00:04:32.000 --> 00:04:45.000
So, again, all this is saying is that we are starting with a vector x, if I multiply it by a matrix, it is the same as multiplying that vector by some scalar multiple.
00:04:45.000 --> 00:04:52.000
Instead of twisting it and turning it, all I have done is expand it, contract it, or left it the same. Okay.
00:04:52.000 --> 00:05:46.000
Now, every non-zero vector satisfying this relation is called an Eigenvector... whoa, that was interesting... an Eigenvector of a associated with the Eigenvalue λ.
00:05:46.000 --> 00:06:02.000
Okay. So, our central equation here is this one, our definition. It basically says, again, if I take a vector x in a space, and if I multiply it by a matrix, an n by n matrix, I am going to be transforming that vector... turning it into something else.
00:06:02.000 --> 00:06:11.000
If all I do to it is expand or contract that vector x, then there is actually some number by which I expand or contract it.
00:06:11.000 --> 00:06:24.000
That is called an Eigenvalue, and every vector that satisfies this condition... meaning every vector that when multiplied by the matrix only ends up being expanded or contracted or left the same.
00:06:24.000 --> 00:06:31.000
That is called an Eigenvector associated with the Eigenvalue, associated with the matrix a.
00:06:31.000 --> 00:06:41.000
Very, very important relation. Again, we are staying in this space. We are not jumping to another space. We are just moving along in a parallel fashion.
00:06:41.000 --> 00:07:07.000
Okay. One thing we definitely want to note here is that the 0 vector cannot be an Eigenvector, but 0, the real number can be an Eigenvalue.
00:07:07.000 --> 00:07:25.000
So, once again, the 0 vector cannot be an Eigenvector. We just exclude that possibility, but the number 0 can be an Eigenvalue. Okay. That is the only caveat with respect to this.
00:07:25.000 --> 00:07:38.000
Quick example... let us say that a is the matrix (0,1/2,1/2,0). Okay.
00:07:38.000 --> 00:07:53.000
Well, if we take a × the vector -- let us just say (1,1) -- well that is equal to (0,1/2,1/2,0) × (1, 1), that is what this is.
00:07:53.000 --> 00:08:06.000
That is equal to 0 × 1 + 1/2 × 1, which is 1/2, and then 1/2 × 1 + 0 × 1, which is 1/2... so all of that is equal to 1/2 × (1,1).
00:08:06.000 --> 00:08:16.000
Notice what I have done here. a × the vector (1,1) is equal to 1/2 × (1,1).
00:08:16.000 --> 00:08:29.000
My Eigenvalue λ = 1/2, because that is all a did... just simply by virtue of this multiplication... all I did was shrink it by 1/2.
00:08:29.000 --> 00:08:41.000
λ = 1/2, and the vector (1,1) happens to be one of the Eigenvectors. It is an Eigenvector... not the only Eigenvector.
00:08:41.000 --> 00:08:48.000
Oftentimes, for a given Eigenvalue you have an infinite number of Eigenvectors. We will show you why in a minute.
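As a quick numerical sanity check -- not part of the original lecture, and assuming NumPy is available -- the 2 by 2 example above can be verified in a few lines: multiplying (1,1) by the matrix really does just halve it.

```python
import numpy as np

# The 2x2 matrix from the example above.
a = np.array([[0.0, 0.5],
              [0.5, 0.0]])
x = np.array([1.0, 1.0])

ax = a @ x    # matrix-vector product a × x
lam = 0.5     # the claimed Eigenvalue

# a @ x should equal (1/2) * x, confirming (1, 1) is an Eigenvector.
print(ax)                        # [0.5 0.5]
print(np.allclose(ax, lam * x))  # True
```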
00:08:48.000 --> 00:09:06.000
Again, ax does nothing but expand or contract a vector. Okay. Now, a given λ can have many Eigenvectors.
00:09:06.000 --> 00:09:17.000
Often, we are only interested in 1... we do not necessarily need to list them all... so 1 will do.
00:09:17.000 --> 00:09:26.000
So a given λ can have many Eigenvectors associated with it, and here is why.
00:09:26.000 --> 00:09:40.000
Well, suppose I take a × (R × x), where R is some number... if I just take any vector x and multiply it by any non-zero number, that gives me an infinite number of vectors.
00:09:40.000 --> 00:09:55.000
Then, if I multiply that by a, we can reverse the order... a × (R × x) is equal to R × (a × x), which is R × λx... because a × x is equal to λx, right? λ is an Eigenvalue.
00:09:55.000 --> 00:10:03.000
Well, that is equal to λ × (R × x). Notice what I have got: a × (Rx) = λ × (Rx).
00:10:03.000 --> 00:10:14.000
If I have a given Eigenvector x, any non-zero scalar multiple of x is also an Eigenvector associated with that Eigenvalue.
00:10:14.000 --> 00:10:35.000
Okay. Let us do an example again. This time, we will let a equal (1, 1, -2, 4)... that is, first row (1, 1), second row (-2, 4). Okay.
00:10:35.000 --> 00:10:59.000
This time we want to actually find the Eigenvalues and associated Eigenvectors of a.
00:10:59.000 --> 00:11:34.000
So, a given matrix can have Eigenvalues and Eigenvectors associated with it. Okay. Now, what do we want? We want real numbers λ and non-zero vectors x, which I will write in component form... (x1, x2)... such that, well, ax = λx.
00:11:34.000 --> 00:11:54.000
Well, a is (1,1,-2,4)... x is (x1,x2), and that equals λ × (x1,x2)... λ's, x's, all these symbols everywhere... okay.
00:11:54.000 --> 00:12:22.000
When we actually multiply this out, we get the following system: x1 + x2 = λx1, and we get -2x1 + 4x2 = λx2... let us fiddle around with this a little bit.
00:12:22.000 --> 00:12:51.000
Let me bring this over here, and this over here... and set it equal to 0. So, I am going to write the equivalent version. It is going to be (λ - 1) × x1, right? I have λx1 - x1... I can pull out the x1... and I get (λ - 1) × x1 - x2 = 0.
00:12:51.000 --> 00:13:14.000
I also get 2x1 -- I moved it over to that side -- plus λx2 - 4x2, which is (λ - 4) × x2. So, 2x1 + (λ - 4) × x2 = 0. Right? Okay.
00:13:14.000 --> 00:13:22.000
Now, take a look at this linear system right here. It is a homogeneous system, okay? 2 by 2.
00:13:22.000 --> 00:13:40.000
Let me go back to my blue ink. This system has a non-trivial solution... remember the list of non-singular equivalences? It has a non-trivial solution, if and only if the determinant of the coefficient matrix is equal to 0.
00:13:40.000 --> 00:14:03.000
So -- this one I definitely want to write as clearly as possible, so let me start again -- the coefficient matrix is (λ - 1, -1, 2, λ - 4), and we set its determinant = 0.
00:14:03.000 --> 00:14:16.000
This homogeneous system has a non-trivial solution if and only if the determinant is 0. Well, the determinant of a 2 by 2 is this × this - that × that.
00:14:16.000 --> 00:14:51.000
So, I end up with (λ - 1) × (λ - 4) + 2 = 0. I get λ² - 5λ + 4 + 2. I get λ² - 5λ + 6 = 0. All I am doing is following the math... that is all I am doing.
00:14:51.000 --> 00:15:10.000
Let me rewrite this -- there are too many lines here -- and go to red... λ² - 5λ + 6 = 0.
00:15:10.000 --> 00:15:34.000
This factors into (λ - 2)(λ - 3) = 0, which implies that λ1 = 2, λ2 = 3. These are my Eigenvalues associated with that matrix, and all I did was solve this homogeneous system, right? Okay.
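As an aside -- this is not in the lecture, but it is an easy check with NumPy -- the roots of the characteristic polynomial λ² - 5λ + 6 should agree with the Eigenvalues computed directly from the matrix:

```python
import numpy as np

a = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Roots of the characteristic polynomial λ² - 5λ + 6 = 0 ...
roots = np.roots([1.0, -5.0, 6.0])

# ... should agree with the Eigenvalues computed directly from the matrix.
eigs = np.linalg.eigvals(a)

print(np.sort(roots.real))  # approximately [2. 3.]
print(np.sort(eigs.real))   # approximately [2. 3.]
```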
00:15:34.000 --> 00:15:48.000
Now we want to find the Eigenvectors associated with the Eigenvalues. Well, I have 2 Eigenvalues, so I am going to be solving 2 systems to find the associated Eigenvectors.
00:15:48.000 --> 00:16:04.000
Let me show you what I just did here. I started with ax = λ × x. Bring this over here and set it equal to 0.
00:16:04.000 --> 00:16:25.000
λ... let us make the λ look like a λ and the x look like an x... λ × x - a × x is equal to 0. So, let me put the 0 vector on the right, because that is our habit.
00:16:25.000 --> 00:16:37.000
Let me factor out the x... well, we are talking about matrices here, so since a is a matrix and λ is a scalar, I multiply that scalar by the identity matrix: (λ × in - a) × x = 0.
00:16:37.000 --> 00:16:50.000
Remember what the identity matrix is... it is just that matrix with 1's all along the diagonal. I need it because λ by itself is a scalar, and matrix subtraction has to be defined.
00:16:50.000 --> 00:16:56.000
This is the equation that I solve. So, for every λ that I get... I put it into this equation which is the thing that I had in the previous page.
00:16:56.000 --> 00:17:10.000
I put it into this equation, I solve the homogeneous system, I get my Eigenvectors for that Eigenvalue, and then I do the same for 3.
00:17:10.000 --> 00:17:38.000
So, now let us actually go through the process. Okay. This, if you recall, was this: (λ - 1) × x1 - x2 = 0, and 2x1 + (λ - 4) × x2 = 0.
00:17:38.000 --> 00:17:50.000
So, if I were going to take my λ = 2 Eigenvalue, I would put this 2 in here and solve the associated homogeneous system.
00:17:50.000 --> 00:18:18.000
So, I would get 2 - 1 is 1, which gives x1 - x2 = 0; and in the second equation, λ is 2, so 2 - 4 is -2, which gives 2x1 - 2x2 = 0.
00:18:18.000 --> 00:18:31.000
Well, that tells me x1 is equal to x2, which means I can choose x2 to be anything I want, so let us just call it R.
00:18:31.000 --> 00:18:45.000
Therefore, any vector of the form (R,R), is an Eigenvector for this Eigenvalue 2.
00:18:45.000 --> 00:19:04.000
Okay. Alright. What this means is, if I take a, and if I take any vector of the form (R,R)... (1,1), (2,2), (3,3), (4,4)... all I end up doing is doubling it... that is what this is telling me.
00:19:04.000 --> 00:19:20.000
All vectors of this form, that have the same entry twice... when I multiply by the matrix a, all I do is end up doubling the length. That is what this is telling me. Only vectors of this form are associated with this Eigenvalue.
00:19:20.000 --> 00:19:39.000
Now, let us do the λ = 3. Well, λ = 3... we end up putting it back into those original equations, so that is (3 - 1) × x1 - x2 = 0.
00:19:39.000 --> 00:20:12.000
We have 2x1 + (3 - 4) × x2 = 0, because 3 is our Eigenvalue. The first equation gives 2x1 - x2 = 0, and the second also gives 2x1 - x2 = 0. This tells us that 2x1 = x2... x1 = x2/2.
00:20:12.000 --> 00:20:24.000
Therefore, our vector x is, well, if x2 is equal to R, then x1 = R/2.
00:20:24.000 --> 00:20:43.000
So, every vector of the form (R/2,R), like for example (1,2), (2,4), (4,8), (8,16)... those are the Eigenvectors associated with the Eigenvalue 3.
00:20:43.000 --> 00:21:05.000
That means that if I take the matrix a which was given, and if I take some vector like (9,18), which is of the form (R/2,R), all I am going to do is multiply that vector by a factor of 3: a × (9,18) = 3 × (9,18). That is what is happening here.
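Both families of Eigenvectors can be checked numerically -- again, an illustrative addition assuming NumPy, not part of the lecture. Any (R,R) should be scaled by 2, and any (R/2,R) by 3:

```python
import numpy as np

a = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Any vector of the form (R, R) is an Eigenvector for λ = 2 ...
for R in [1.0, 2.0, 5.0]:
    x = np.array([R, R])
    assert np.allclose(a @ x, 2.0 * x)

# ... and any vector of the form (R/2, R) is an Eigenvector for λ = 3.
for R in [2.0, 4.0, 18.0]:
    x = np.array([R / 2.0, R])
    assert np.allclose(a @ x, 3.0 * x)

print("both Eigenvector families check out")
```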
00:21:05.000 --> 00:21:23.000
Okay. Let us move on, we are going to have a little bit of a definition here. We just did this, so now we are going to actually... this equation that we came up with that we solved to get the Eigenvalue, we are going to give it a special name.
00:21:23.000 --> 00:22:24.000
So, definition... we will let a = [aij] be an n by n matrix. The determinant of λ × the identity matrix - a, that is det(λ × in - a), written out in symbolic form, is the determinant of the array whose first row is λ - a11, -a12, -a13, ..., -a1n.
00:22:24.000 --> 00:22:47.000
Then, of course, the other rows follow the same pattern: -a21, λ - a22, and so on, with -an1, -an2 starting the last row, all the way down to λ - ann on the diagonal...
00:22:47.000 --> 00:23:08.000
This determinant is called the characteristic polynomial... characteristic polynomial of a.
00:23:08.000 --> 00:23:30.000
Now, when I set that characteristic polynomial, in other words the determinant of λ × in - a, equal to 0, it is called the characteristic equation.
00:23:30.000 --> 00:23:41.000
That was the polynomial... the polynomial set equal to 0 is the characteristic equation of a.
00:23:41.000 --> 00:24:06.000
Okay. Let us do an example. Let a = (1, -2, 1, 1, 0, -1, 4, 4, -5)... that is, first row (1, -2, 1), second row (1, 0, -1), third row (4, 4, -5).
00:24:06.000 --> 00:24:28.000
Okay, so, we want to find the determinant of λ × in - a, which is... so you see what this looks like, let me actually do this... this is 3 by 3.
00:24:28.000 --> 00:24:51.000
So, it is going to be λ × i3 - (1, -2, 1, 1, 0, -1, 4, 4, -5)...
00:24:51.000 --> 00:25:20.000
We are going to take the determinant of this thing... which means I have (λ, 0, 0, 0, λ, 0, 0, 0, λ) - (1, -2, 1, 1, 0, -1, 4, 4, -5).
00:25:20.000 --> 00:25:52.000
I end up with: λ - 1, then -(-2), which is 2, then -1, across the first row... then -1, λ - 0, which is λ, and -(-1), which is 1, across the second row... then -4, -4, and λ - (-5), which is λ + 5, across the third row.
00:25:52.000 --> 00:26:12.000
Then I take the determinant of that, and when I actually end up doing that and going through it, I end up with λ³ + 4λ² - 3λ - 6.
00:26:12.000 --> 00:26:24.000
This is the characteristic polynomial. If I want to find the Eigenvalues, I have to find the roots of this characteristic polynomial. I set it equal to 0, and that will give me the Eigenvalues.
00:26:24.000 --> 00:26:34.000
When I get the Eigenvalues, I put each one back into this form and I solve the homogeneous system to get the Eigenvectors. We will do more of that in just a minute.
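The characteristic polynomial of this 3 by 3 matrix can also be obtained numerically -- a sketch assuming NumPy, using the matrix as written out above; the lecture itself does this by hand:

```python
import numpy as np

a = np.array([[1.0, -2.0, 1.0],
              [1.0,  0.0, -1.0],
              [4.0,  4.0, -5.0]])

# np.poly returns the coefficients of det(λI - a), highest degree first.
coeffs = np.poly(a)
print(coeffs)  # approximately [1, 4, -3, -6], i.e. λ³ + 4λ² - 3λ - 6
```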
00:26:34.000 --> 00:26:44.000
Okay. So, let us close off this section with just a theorem.
00:26:44.000 --> 00:27:08.000
An n by n matrix a is singular... does not have an inverse... if and only if 0 is an Eigenvalue of a.
00:27:08.000 --> 00:27:20.000
In other words, if 0 is not an Eigenvalue of the matrix a, that matrix is non-singular. It has an inverse. So this is one item that we are going to add to our list of non-singular equivalences.
00:27:20.000 --> 00:27:26.000
We had 9 of them; now we are going to have 10. We are going to add a 10th item.
00:27:26.000 --> 00:27:45.000
Okay. That 10th item added to the list of non-singular equivalences... 0 is not an Eigenvalue of a... that is the same as saying that a is non-singular.
00:27:45.000 --> 00:28:01.000
It is the same as saying that the determinant is non-zero... all of those things that -- you know -- we have on that list. So, this is the 10th equivalence.
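This equivalence is easy to see numerically with a toy example -- an illustrative addition assuming NumPy. A matrix with a repeated row is visibly singular, and sure enough 0 shows up among its Eigenvalues:

```python
import numpy as np

# A visibly singular matrix: the second row is twice the first.
s = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.isclose(np.linalg.det(s), 0.0))              # True: singular
print(np.any(np.isclose(np.linalg.eigvals(s), 0.0)))  # True: 0 is an Eigenvalue
```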
00:28:01.000 --> 00:28:37.000
Another theorem. The Eigenvalues of a are the real roots of the characteristic polynomial of a... okay. So, you might have a fifth-degree polynomial... it has 5 roots.
00:28:37.000 --> 00:28:45.000
Well, there is no guarantee that all of those 5 roots are going to be real. Some of them might be complex... and if they are complex, they are going to come in complex conjugate pairs.
00:28:45.000 --> 00:28:51.000
So, if you know one of them is complex, you know 2 of them are complex. That means at most 3 of them can be real.
00:28:51.000 --> 00:29:08.000
If you have 3 of them that are complex, that means a 4th is also complex, so only 1 of them is going to be real. Recall from college algebra: polynomial equations, solutions to polynomial equations... the real roots are where the graph hits the x axis.
00:29:08.000 --> 00:29:16.000
So, when you have a characteristic polynomial, it is the real values that are the Eigenvalues of that associated matrix. Okay.
00:29:16.000 --> 00:29:41.000
Let us try something here. We will try an example. We will let a = (2, 2, 3, 1, 2, 1, 2, -2, 1)... this is our matrix a.
00:29:41.000 --> 00:30:04.000
Our characteristic polynomial, when we set it up... again, this is something that you can do with mathematical software... our characteristic polynomial is λ³ - 5λ² + 2λ + 8.
00:30:04.000 --> 00:30:32.000
When we actually factor this, we end up with λ1 = 2, λ2 = 4, λ3 = -1. The degree of the polynomial is 3, which means we have 3 roots. We found those 3 roots... 2, 4, and -1... they are all real.
00:30:32.000 --> 00:30:45.000
All of these are Eigenvalues. Okay. Now, let us find the Eigenvectors associated with these Eigenvalues. Let us actually find a specific Eigenvector, not like we did last time where we found a general Eigenvector.
00:30:45.000 --> 00:31:00.000
Okay. In so doing, we are of course going to solve this: (λ × i3 - the matrix a) × x = 0.
00:31:00.000 --> 00:31:22.000
This is the equation -- okay, this is not going to work... too many lines all over the place... let us try this again -- (λ × i3 - a) × x = 0, the zero vector.
00:31:22.000 --> 00:31:29.000
We are going to solve this equation, homogeneous system in order to find the associated Eigenvector.
00:31:29.000 --> 00:32:08.000
So, for λ = 2... I get the following augmented system: (0, -2, -3 | 0), (-1, 0, -1 | 0), (-2, 2, 1 | 0)... I am hoping to god my arithmetic is correct here... and when I subject it to reduced row echelon form -- I want you to see at least one of them worked out.
00:32:08.000 --> 00:32:42.000
We end up with (1, 0, 1 | 0), (0, 1, 3/2 | 0), (0, 0, 0 | 0). So, leading entries here and here; no leading entry in the third column. Therefore, I can take x3 = R, which gives x2 = -3/2 × R and x1 = -R.
00:32:42.000 --> 00:33:05.000
Well, I can set R to anything, so why not just take R = 1. So, a particular Eigenvector... a specific Eigenvector... is (-1, -3/2, 1). This is an Eigenvector associated with the Eigenvalue 2.
00:33:05.000 --> 00:33:11.000
There are an infinite number of them, just different values of R. That is all it is.
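The reduced row echelon step above can be reproduced with SymPy -- an illustrative addition assuming SymPy is available; the lecture does this step by hand. Note the free third column, which is where the parameter R comes from:

```python
from sympy import Matrix, Rational

# Coefficient matrix of (2 × i3 - a) x = 0 for λ = 2, from the lecture.
m = Matrix([[ 0, -2, -3],
            [-1,  0, -1],
            [-2,  2,  1]])

rref, pivots = m.rref()
print(rref)    # Matrix([[1, 0, 1], [0, 1, 3/2], [0, 0, 0]])
print(pivots)  # (0, 1): no pivot in column 3, so x3 is free... set x3 = R
```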
00:33:11.000 --> 00:33:30.000
Okay. When my Eigenvalue is -1, I get some matrix, I subject it to reduced row echelon form, and I get vectors of the form (-R, 0, R).
00:33:30.000 --> 00:33:44.000
Well, let us just take a specific value... R = 1, so (-1, 0, 1). This Eigenvector is associated with the Eigenvalue -1.
00:33:44.000 --> 00:34:00.000
Now, we will do λ = 4 -- let me get my notation here and make sure that I am correct... λ1 = 2, λ2 = 4, λ3 = -1... yes, so this is λ2 = 4.
00:34:00.000 --> 00:34:20.000
For that Eigenvalue, we end up with the general form (4R, 5/2 × R, R), which for R = 1 gives us (4, 5/2, 1).
00:34:20.000 --> 00:34:33.000
So, given a certain matrix, we can find its Eigenvalues, and we can solve the homogeneous system to find its Eigenvectors... so we have a nice structure developing here for a particular matrix.
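All three hand-computed Eigenpairs for this 3 by 3 example can be cross-checked at once -- a sketch assuming NumPy; `np.linalg.eig` will normalize its Eigenvectors differently, but each hand-computed vector should still satisfy a × v = λ × v:

```python
import numpy as np

a = np.array([[2.0,  2.0, 3.0],
              [1.0,  2.0, 1.0],
              [2.0, -2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(a)  # columns of eigvecs are Eigenvectors

# The Eigenvalues found by hand were 2, 4, and -1 (in some order).
print(np.sort(eigvals.real))  # approximately [-1.  2.  4.]

# Each hand-computed Eigenvector still satisfies a @ v = λ v.
hand = {2.0:  np.array([-1.0, -1.5, 1.0]),
        4.0:  np.array([4.0, 2.5, 1.0]),
        -1.0: np.array([1.0, 0.0, -1.0])}
for lam, v in hand.items():
    assert np.allclose(a @ v, lam * v)
print("all three hand-computed Eigenvectors verified")
```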
00:34:33.000 --> 00:35:01.000
So, let us do a quick recap. We have the definition of Eigenvalue, Eigenvector... If I have a matrix a, and I have a vector that I multiply by a, and what I end up with is some scalar multiple of that vector, well, that scalar multiple is called an Eigenvalue.
00:35:01.000 --> 00:35:11.000
The vectors that actually satisfy this condition are called Eigenvectors associated with that Eigenvalue.
00:35:11.000 --> 00:35:39.000
I set this equal to 0. I move this over to that side and end up with 0 = λx - ax... let me just bring this 0 over to the other side... so I end up with (λ × in - a) × x = 0.
00:35:39.000 --> 00:36:07.000
Okay. For this to have a non-trivial solution, well, the determinant of this thing, λ × in - a, has to equal 0.
00:36:07.000 --> 00:36:17.000
So, I take the determinant of that matrix, set it equal to 0, and that gives me the Eigenvalues. Okay?
00:36:17.000 --> 00:36:35.000
That is the characteristic polynomial. Set equal to 0, it is the characteristic equation. And for each λi, for each Eigenvalue that I get, for each real root of the characteristic polynomial...
00:36:35.000 --> 00:36:48.000
We put each λi back into this equation, we solve that homogeneous system, and we find our vectors that satisfy it... our basis.
00:36:48.000 --> 00:37:13.000
We find the associated Eigenvectors by solving (λi × in - a) × x = 0.
00:37:13.000 --> 00:37:26.000
So, we have a matrix a. We set this up: we take the parameter λ × the identity matrix, and we subtract from it the matrix a.
00:37:26.000 --> 00:37:36.000
Now, you can do it either way. You can go a - λ × in; it does not matter. I did λ × in - a because I like the leading λ term to be positive... that is just a personal choice of mine.
00:37:36.000 --> 00:37:49.000
You end up with this equation. Well, you take the determinant of the matrix that you get... this thing λ × in - a... you set it equal to 0, and you find the roots... those are the Eigenvalues.
00:37:49.000 --> 00:38:00.000
When you take each of those Eigenvalues and put it, in turn, back into this equation, you solve the homogeneous system to get the Eigenvector associated with that respective Eigenvalue.
00:38:00.000 --> 00:38:09.000
So, that takes us through the basic structure of Eigenvalues and Eigenvectors. In our next lesson we are going to continue on and dig a little deeper into the structure of these things.
00:38:09.000 --> 00:38:11.000
Thank you for joining us at Educator.com, we will see you next time.