  Raffi Hovasapian

Eigenvalues and Eigenvectors

Slide Duration:

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
1:11
1:12
2:30
2:57
4:20
5:22
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
19:33
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in IR3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in IR3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

### Eigenvalues and Eigenvectors

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

### Transcription: Eigenvalues and Eigenvectors

Welcome back to educator.com and welcome back to linear algebra.0000

Today, we are going to start on a new topic. A very, very, very important topic.0004

Probably the single most important topic, both in terms of the underlying structure of linear mappings and matrices, and also of profoundly practical importance in all areas of science and math... quantum mechanics, engineering, all areas of physics, all areas of mathematics.0008

Today, we are going to be discussing Eigenvalues and Eigenvectors. So, let us just jump on in and see if we can make sense of this.0030

Okay. Recall if you will, so if... a is n by n... and for the discussion of Eigenvalues and Eigenvectors, we are always going to be talking about matrices that are n by n.0039

So, we are no longer going to be talking about 5 by 6, 3 by 9, it is always going to be 3 by 3, 4 by 4, 2 by 2... things like that.0056

Okay. So, if a is n by n, we know that the function L, which is a mapping from RN to RN defined by the multiplication of some vector x by that matrix... we know that it is a linear mapping.0063

So, this we know. That when we are given a n by n matrix, and we use that matrix to multiply on the left of some vector in RN, we know that what we get is a linear mapping. Okay.0107

What we want to do, so we wish to discuss this situation where the vector x and anything I do to x which is multiply it by the matrix on the left are parallel to each other... by parallel, what we are really saying is that they are scalar multiples of each other.0122

In other words, or when a × x is just a scalar multiple of x.0155

In other words, I do not just map it to a completely different vector all together, all I do is take the vector x and I either expand it or contract it or leave it the same length.0175

So, I am keeping it in its own space. I do not jump to another space. That is really what is going on here with this idea of Eigenvalue and Eigenvector.0184

It has to do with starting in a space, taking a vector in that space, multiplying it by a matrix, and instead of twisting it and turning that vector, turn it into something else... just -- you know -- dilating it, making it bigger or smaller.0193

That is all we are doing. Still staying in that space. Staying parallel. Okay.0206

Now, let us see what we have got. Let us start with a definition. Let a be an n by n matrix, the real number Λ.0213

It is always symbolized with a Λ; this is traditional.0234

It is called an Eigenvalue of a if there exists a non-zero vector x such that the matrix a × x just gives me some scalar Λ × x.0240

So, again, all this is saying is that we are starting with a vector x, if I multiply it by a matrix, it is the same as multiplying that vector by some scalar multiple.0272

Instead of twisting it and turning it, all I have done is expand it, contract it, or left it the same. Okay.0285

Now, every non-zero vector satisfying this relation is called an Eigenvector... whoa, that was interesting... an Eigenvector of a associated with the Eigenvalue Λ.0292

Okay. So, our central equation here is this one, our definition. It basically says, again, if I take a vector x in a subspace, and if I multiply it by a matrix, and n by n matrix, I am going to be transforming that vector... turning it into something else.0346

If, all I do to it is expand or contract that vector x, and there is actually some number by which I expand or contract it.0362

That is called an Eigenvalue, and every vector that satisfies this condition... meaning every vector that when multiplied by the matrix only ends up being expanded or contracted or left the same.0371

That is called an Eigenvector associated with the Eigenvalue, associated with the matrix a.0384

Very, very important relation. Again we are staying in this space. We are not doing anything to it. We are just moving along that space in a parallel fashion.0391

Okay. One thing we definitely want to note here is that the 0 vector cannot be an Eigenvector, but 0, the real number can be an Eigenvalue.0401

So, once again, the 0 vector cannot be an Eigenvector. We just exclude that possibility, but the number 0 can be an Eigenvalue. Okay. That is the only caveat with respect to this.0427

Quick example... let us say that a is the matrix (0,1/2,1/2,0). Okay.0445

Well, if we take a × the vector -- let us just say (1,1) -- well that is equal to (0,1/2,1/2,0) × (1, 1), that is what this is.0458

That is equal to 0 × 1 + 1/2 × 1, which is 1/2, and then 1/2 × 1 + 0 × 1, which is 1/2... all of that is equal to 1/2 × (1,1).0473

Notice what I have done here. a × the vector (1,1) is equal to 1/2 × (1,1).0486

My Eigenvalue Λ = 1/2, because that is all a did... just simply by virtue of this multiplication... all I did was shrink it by 1/2.0496

Λ = 1/2, and the vector (1,1) happens to be one of the Eigenvectors. It is an Eigenvector... not the only Eigenvector.0509
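
This worked example is easy to verify numerically. Here is a quick sketch using NumPy (the tool choice is ours; the lecture does everything by hand):

```python
import numpy as np

# The lecture's example: a = (0, 1/2, 1/2, 0), read as rows (0, 1/2) and (1/2, 0).
a = np.array([[0.0, 0.5],
              [0.5, 0.0]])
x = np.array([1.0, 1.0])

# Multiplying by a only shrinks x by 1/2, so Λ = 1/2 is an Eigenvalue
# and (1, 1) is an Eigenvector associated with it.
assert np.allclose(a @ x, 0.5 * x)
```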

Oftentimes, for a given Eigenvalue you have an infinite number of Eigenvectors. We will show you why in a minute.0521

Again, ax does nothing but expand or contract a vector. Okay. Now, a given Λ can have many Eigenvectors.0528

Often, we are only interested in 1... we do not necessarily need to list them all... so 1 will do.0546

So a given Λ can have many Eigenvectors associated with it, and here is why.0557

Well, if I take a × some number R × x, if I just take any vector x and I multiply it by any number, that is an infinite number of vectors that I can get.0566

Then, if I multiply that by a, we can reverse this... we can do R × a × x is equal to r × Λ x... because a(x) is equal to Λx, right? Λ is an Eigenvalue.0580

Well, that is equal to Λ × r(x). Notice what I have got. a × R(x) = Λ × R(x).0595

If I have a given vector x, any scalar multiply of x is also an Eigenvector associated with that Eigenvalue.0603

Okay. Let us do an example again. This time, we will let a equal (1, 1, -2, 4)... that is, the matrix with rows (1, 1) and (-2, 4). Okay.0614

This time we want to actually find the Eigenvalues and associated Eigenvectors of a.0635

So, a given matrix can have Eigenvalues and Eigenvectors associated with it. Okay. Now, what do we want. So, we want real numbers Λ and all of the variables x, which I will write in component form... x1, x2, such that, well, ax = Λx.0659

Well, a is (1,1,-2,4)... x is (x1,x2) = Λ × x1... Λ's x's, all these symbols everywhere... x2... okay.0694

When we actually multiply this out, we get the following system: x1 + x2 = Λx1, and we get -2x1 + 4x2 = Λx2... let us fiddle around with this a little bit.0714

Let me bring this over here, and this over here... and set it equal to 0, so I am going to write the equivalent version. It is going to be Λ - 1 × x1, right? I have Λ x1 - x1... I can pull out the x1 and I get Λ - 1 × x1 - x2 = 0.0742

I also get 2x1, moved it over to that side... + Λx2 - 4x2, which is Λ - 4 × x2 = 0. Right? Okay.0771

Now, take a look at this linear system right here. It is a homogeneous system, okay? 2 by 2.0794

Let me go back to my blue ink. This system has a non-trivial solution... remember the list of non-singular equivalences? It has a non-trivial solution, if and only if the determinant of the coefficient matrix is equal to 0.0802

So, if I have -- no, this one I definitely want to write as clear as possible... start again -- coefficient matrix is (Λ - 1, - 1, 2, Λ - 4)... the determinant = 0.0820

This homogeneous system has the non-trivial solution and the determinant is 0. Well, the determinant is this. The determinant of a 2 by 2 is this × this - that × that.0843

So, I end up with (Λ - 1) × (Λ - 4) - (-1)(2) = 0. I get Λ2 - 5Λ + 4 + 2. I get Λ2 - 5Λ + 6 = 0. All I am doing is following the math... that is all I am doing.0856

Let me rewrite this, there are too many lines here... + 6 = 0... rewrite it again... go to red... Λ2 - 5Λ + 6 = 0.0891

This factors into (Λ - 2)(Λ - 3), and this implies that Λ1 = 2, Λ2 = 3. These are my Eigenvalues associated with that matrix, and all I did was solve this homogeneous system, right? Okay.0910
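
The Eigenvalues just found can be double-checked numerically. In this sketch, np.poly returns the coefficients of det(Λ·I - a) and np.roots finds its roots:

```python
import numpy as np

# The 2 by 2 example: rows (1, 1) and (-2, 4).
a = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Coefficients of the characteristic polynomial Λ^2 - 5Λ + 6.
coeffs = np.poly(a)
# Its roots are the Eigenvalues, Λ1 = 2 and Λ2 = 3.
eigenvalues = np.sort(np.roots(coeffs))

assert np.allclose(coeffs, [1.0, -5.0, 6.0])
assert np.allclose(eigenvalues, [2.0, 3.0])
```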

Now we want to find the Eigenvectors associated with the Eigenvalues. Well, I have 2 Eigenvalues, so I am going to be solving 2 systems to find the associated Eigenvectors.0934

Let me show you what I just did here. I started with ax = Λ × x. Bring this over here and set it equal to 0.0948

Λ... let us make the Λ look like a Λ and the x look like an x... Λx - a × x is equal to 0. So, let me put the 0 vector over here... so I am working on the right because that is our habit.0964

Let me factor out the x... well, Λ ×... we are talking about matrices here, so since this is a matrix a, Λ is a scalar... I just multiply that scalar by the identity matrix.0985

Remember what the identity matrix is... it is just that matrix with 1's all along the diagonals, because I need matrix subtraction to be defined.0997

This is the equation that I solve. So, for every Λ that I get... I put it into this equation which is the thing that I had in the previous page.1010

I put it into this equation, I solve the homogeneous system, I get my Eigenvectors for that Eigenvalue, and then I do the same for 3.1016

So, now let us actually go through the process. Okay. This if you recall, was this. Λ - 1 × x1 - x2 = 0, and it was 2x1 + Λ - 4x2 is equal to 0.1030

So, if I were going to take my Λ = 2 Eigenvalue, I would put this 2 in here, and solve the associated homogeneous system.1058

So, I would get 2 - 1 is 1. So, I would get x1 - x2 = 0, and I would get 2x1 + Λ is 2... 2 - 4 is -2, so it is -2x2 is equal to 0.1070

Well, that says x1 is equal to x2, which means that I can choose x2 to be anything that I want, so let us just call it R.1098

Therefore, any vector of the form (R,R), is an Eigenvector for this Eigenvalue 2.1111

Okay. Alright. What this means is if I take a, and if I take any vector of the form (R,R)... (1,1), (2,2), (3,3), (4,4)... all I end up doing is multiplying it by 2... that is what this is telling me.1125

All vectors of this form, that have the same entry... when I multiply by the matrix a, all I do is end up doubling its length. That is what this telling me. Only vectors of this form are associated with this Eigenvalue.1144

Now, let us do the Λ = 3. Well, Λ = 3... we end up putting it back into that original equations, so that is 3 - 1 × x1 - x2 = 0.1160

We have 2x1 + 3 - 4, because 3 is our Eigenvalue, x2 = 0. We end up with 2x1 - x2 = 0... 2x1 - 2x2 = 0... This tells us that 2x1 = x2... x1 = x2/2.1179

Therefore, our vector x is, well, if x2 is equal to R, then x1 = R/2.1212

So, every vector of the form (R/2,R), like for example (2,4), (4,8), (8,16), (12,24)... those are the Eigenvectors associated with the Eigenvalue 3... for instance, a × (9,18) is going to actually end up equaling 3 × (9,18).1224

That means that if I take the matrix a which was given, and if I take some vector like (9,18), which is of the form (R/2,R), all I am going to do is I am going to multiply that vector by a factor of 3. That is what is happening here.1243
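
Both families of Eigenvectors can be verified in a few lines (again a NumPy sketch, not part of the lecture):

```python
import numpy as np

a = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Every (R, R) is an Eigenvector for Λ = 2; every (R/2, R) for Λ = 3.
for r in (1.0, 5.0, -3.0):
    assert np.allclose(a @ np.array([r, r]), 2 * np.array([r, r]))
    assert np.allclose(a @ np.array([r / 2, r]), 3 * np.array([r / 2, r]))

# In particular, (9, 18) has the form (R/2, R), and a(9, 18) = (27, 54) = 3(9, 18).
result = a @ np.array([9.0, 18.0])
assert np.allclose(result, [27.0, 54.0])
```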

Okay. Let us move on, we are going to have a little bit of a definition here. We just did this, so now we are going to actually... this equation that we came up with that we solved to get the Eigenvalue, we are going to give it a special name.1265

So, definition, we will let a = (aij) be an n by n matrix... the determinant of Λ × the identity matrix - a is, in symbolic form, the determinant whose first row is Λ - a11, -a12, -a13, ..., -a1n.1283

Then, of course, the second row is -a21, Λ - a22, ... and so on, down to the last row: -an1, -an2, ..., Λ - ann.1344

This determinant is called the characteristic polynomial... characteristic polynomial of a.1367

Now, when I set that characteristic polynomial, in other words the determinant of Λ × in - a, when I set it equal to 0, it is called the characteristic poly... it is called the characteristic equation -- I am sorry.1388

That is the polynomial... it is the characteristic equation... is the characteristic equation of a.1410

Okay. Let us do an example. Let a = (1, -2, 1, 1, 0, -1, 4, 4, -5)... that is, rows (1, -2, 1), (1, 0, -1), and (4, 4, -5).1421

Okay, so, we want to find the determinant of Λ × in - a, which is... so you see what this looks like, let me actually do this... this is 3 by 3.1446

So, it is going to be Λ × in - (1, -2, 1, 1, 0, -1, 4, 4, -5)...1468

We are going to take the determinant of this thing... which means I have (Λ, 0, 0, 0, Λ, 0, 0, 0, Λ) - (1, -2, 1, 1, 0, -1, 4, 4, -5).1491

I end up with: Λ - 1... then - (-2) is 2... then -1. Next row: 0 - 1 is -1, Λ - 0 is Λ, - (-1) is 1. Last row: -4, -4, Λ + 5.1520

Then I take the determinant of that, and when I actually end up doing that and going through it, I end up with Λ3 + 4Λ2 - 3Λ - 6.1552

This is a characteristic polynomial. If I want to find the Eigenvalues, I have to find the roots of this characteristic polynomial. I set it equal to 0, that will give me the Eigenvalues.1572
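
The same characteristic polynomial can be recovered numerically; np.poly computes the coefficients of det(Λ·I - a) directly from the matrix:

```python
import numpy as np

# The 3 by 3 example: rows (1, -2, 1), (1, 0, -1), (4, 4, -5).
a = np.array([[1.0, -2.0, 1.0],
              [1.0, 0.0, -1.0],
              [4.0, 4.0, -5.0]])

# Coefficients of Λ^3 + 4Λ^2 - 3Λ - 6, up to floating-point round-off.
coeffs = np.poly(a)
assert np.allclose(coeffs, [1.0, 4.0, -3.0, -6.0])
```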

When I get the Eigenvalues, I put it back into this form and I solve the homogeneous system to get the Eigenvectors. We will do more of that in just a minute.1584

Okay. So, let us close off this section with just a theorem.1594

An n by n matrix a is singular... does not have an inverse... if and only if 0 is an Eigenvalue of a.1604

In other words, if 0 is not an Eigenvalue of matrix a, that matrix is non-singular. It has an inverse, so this is one item that we are going to add to our list of non-singular equivalences.1628

We had 9 of them, now we have are going to have 10. We are going to add a 10th item.1640

Okay. That 10th item added to the list of non-singular equivalences... 0 is not an Eigenvalue of a... that is the same as saying that a is non-singular.1646

It is the same as saying that the determinant exists... all of those things that -- you know -- we have for those... for that list. So, this is the 10th equivalence.1665
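
This tenth equivalence is easy to see numerically; a small sketch (the test matrices here are our own choices, not from the lecture):

```python
import numpy as np

# Singular: the second row is twice the first, so the determinant is 0...
s = np.array([[1.0, 2.0],
              [2.0, 4.0]])
# ...and, as the theorem says, 0 shows up among the Eigenvalues.
assert np.isclose(np.linalg.det(s), 0.0)
assert np.any(np.isclose(np.linalg.eigvals(s), 0.0))

# Non-singular (the earlier 2 by 2 example): no zero Eigenvalue.
a = np.array([[1.0, 1.0],
              [-2.0, 4.0]])
has_zero = bool(np.any(np.isclose(np.linalg.eigvals(a), 0.0)))
assert not has_zero
```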

Another theorem. The Eigenvalues of a are the real roots of the characteristic polynomial... okay. So, you might have a polynomial fifth degree... it has 5 roots.1681

Well, there is no guarantee that all of those 5 roots are going to be real. Some of them might become complex... if they are complex, they are going to come in complex conjugate pairs.1717

So, if you know one of them is complex, you know 2 of them are complex. That means only 3 of them can be real.1725

If you have 3 of them that are complex, that means the 4th is also complex. That means only 1 of them is going to be real. Recall from college algebra, polynomial equations, and solutions to polynomial equations: roots are where the graph hits the x axis.1731

So, when you have a characteristic polynomial, it is the real values that are the Eigenvalues of that associated matrix. Okay.1748

Let us try something here. We will try an example. We will let a = (2, 2, 3, 1, 2, 1, 2, -2, 1)... this is our matrix a.1756

Our characteristic polynomial when we set it up... again, this is something that you can do on the mathematical software... our characteristic polynomial... is Λ3 - 5Λ2 + 2Λ + 8.1781

When we actually factor this out, we end up with Λ1 = 2... Λ2 = 4. Λ3 = -1. The degree of the polynomial is 3, which means we have 3 roots. We found those 3 roots... (2,4,-1), they are all real.1804

All of these are Eigenvalues. Okay. Now, let us find the Eigenvectors associated with these Eigenvalues. Let us actually find a specific Eigenvector, not like we did last time where we found a general Eigenvector.1832

Okay. In so doing, we are going to solve, of course, when we do... so we are going to solve this... Λ × i3 - the matrix a × x = 0.1845

This is the equation -- okay, this is not going to work... too many lines all over the place... this is too strange, let us try this again -- Λ × i3 - a... x = 0, the vector.1860

We are going to solve this equation, homogeneous system in order to find the associated Eigenvector.1882

So, for Λ = 2... I get the following system... the augmented matrix (0, -2, -3, 0), (-1, 0, -1, 0), (-2, 2, 1, 0)... I am hoping to god my arithmetic is correct here... when I subject it to reduced row echelon... I want you to see at least one of them.1928

We end up with (1, 0, 1, 0), (0, 1, 3/2, 0), (0, 0, 0, 0). So, leading entries here and here. Leading entry not there. Therefore I can take x3 = R, and then x2 = (-3/2)R, and x1 = -R.1962

Well, I can set R to anything, so why do I not just take R = 1. So, a particular Eigenvector... a specific Eigenvector is (-1, -3/2, 1). This is an Eigenvector associated with the Eigenvalue 2.1985

There are an infinite number of them, just different values of R. That is all it is.1985

Okay. When I take the next Λ, when my Eigenvalue is -1, I get some matrix, I subject it to reduced row echelon form, and I get a vector of the form (-R, 0, R).1991

Well, let us just take specific values. Let us just take R = -1, so (1, 0, -1). This Eigenvector is associated with this Eigenvalue, -1.2010

Now, we will do Λ = 4... Λ3 = 4 -- get some notation here, make sure that I am correct -- Λ1, Λ2, and Λ3 = 4... yes, we are correct.2024

For that Eigenvalue, we end up with the general (4R, (5/2)R, R), which with R = 1 gives us (4, 5/2, 1).2040

So, given a certain matrix, we can find its Eigenvalues, we can solve the homogeneous system to find its Eigenvectors, so we had a nice structure developing here for a particular matrix.2060
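
All three Eigenpairs from this example can be checked at once with np.linalg.eig, which returns the Eigenvalues together with one unit-length Eigenvector per column:

```python
import numpy as np

a = np.array([[2.0, 2.0, 3.0],
              [1.0, 2.0, 1.0],
              [2.0, -2.0, 1.0]])

# The Eigenvalues should be 2, 4, and -1, in some order.
vals, vecs = np.linalg.eig(a)
assert np.allclose(np.sort(vals.real), [-1.0, 2.0, 4.0])

# Check a·v = Λ·v for each Eigenpair; each column is a scalar multiple
# of one of the vectors found above: (-1, -3/2, 1), (1, 0, -1), (4, 5/2, 1).
for lam, v in zip(vals, vecs.T):
    assert np.allclose(a @ v, lam * v)
```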

So, let us do a quick recap. We have the definition of Eigenvalue, Eigenvector... If I have a matrix a, if I have a vector that I multiplied by, if what I end up with is some scalar multiple of that vector, well the scalar multiple is called an Eigenvalue.2073

The vectors that actually satisfy this condition are called Eigenvectors associated with that Eigenvalue.2101

I solve this for 0. I move this over to that side, and I end up with 0 = Λx - ax, let me just bring this 0 over here, so I end up with Λ × in - a... × x = 0.2111

Okay. For this to have a non-trivial solution, well, the determinant of this thing Λ in - a has to equal 0.2139

So I take the determinant of that... this matrix that I get, set it equal to 0, that gives me the Eigenvalues. Okay?2167

That is the characteristic polynomial. This is the characteristic equation, and for each Λi, for each Eigenvalue that I get, for each root, real root, of the characteristic polynomial.2177

We put each Λi back into this equation and we solve that homogeneous system and find our basis... our vectors that satisfy that.2195

We find the associated Eigenvectors by solving Λi × in - a × x = 0.2208

So, we have a matrix a. We set this up, we take the parameter Λ × the identity matrix... we subtract from it the a matrix.2233

Now, you can do it either way. You can go a - Λ, it does not matter. I did Λ - a because I like Λ to be positive, that is just a personal choice of mine.2246

You end up with this equation. Well, you take the determinant of the matrix that you get... this thing Λ × in - a, you set it equal to 0, you find the roots... those are the Eigenvalues.2256

When you take each of those Eigenvalues and put it in turn back into this equation, solve the homogeneous system to get the associated Eigenvector with that respective Eigenvalue.2269
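
The recap above is essentially an algorithm, and it can be sketched end to end. In this sketch the function name and the SVD-based null-space step are our own choices, not from the lecture:

```python
import numpy as np

def eigenpairs(a, tol=1e-9):
    """Recap procedure: characteristic polynomial -> real roots ->
    for each root Λ, solve (Λ·I - a)x = 0 for an Eigenvector."""
    n = a.shape[0]
    roots = np.roots(np.poly(a))  # roots of det(Λ·I - a)
    pairs = []
    for lam in (r.real for r in roots if abs(r.imag) < tol):
        m = lam * np.eye(n) - a
        # The null space of the singular matrix m is spanned by the right
        # singular vector for the smallest singular value (last row of vt).
        _, _, vt = np.linalg.svd(m)
        pairs.append((lam, vt[-1]))
    return pairs

a = np.array([[2.0, 2.0, 3.0],
              [1.0, 2.0, 1.0],
              [2.0, -2.0, 1.0]])
pairs = eigenpairs(a)
assert sorted(round(lam, 6) for lam, _ in pairs) == [-1.0, 2.0, 4.0]
for lam, v in pairs:
    assert np.allclose(a @ v, lam * v)
```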

So, that takes us through the basic structure of Eigenvalues and Eigenvectors. In our next lesson we are going to continue on and dig a little deeper into the structure of these things.2280

Thank you for joining us at Educator.com, we will see you next time.2289
