Raffi Hovasapian

Similar Matrices & Diagonalization
Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31
Lecture Comments (10)

2 answers

Last reply by: Hen McGibbons
Sun Apr 24, 2016 5:54 PM

Post by Ahmed Alzayer on September 28, 2015

I have a 2x2 matrix B =

cos x    -sin x
sin x     cos x

It can still be diagonalized even though the roots are imaginary; can you clarify?

1 answer

Last reply by: Professor Hovasapian
Wed Nov 13, 2013 3:05 AM

Post by Eddie Chan on November 12, 2013

Hi Raffi,

I received a question about "If A and B are diagonalisable n x n matrices, so is A + B." I have no idea how to prove or disprove it.

0 answers

Post by Manfred Berger on June 23, 2013

In Theorem 2, I get why the multiplicity of the eigenvalues impacts whether or not a matrix is diagonalizable, but why does it matter that the values are real? If the characteristic polynomial has complex roots, P has complex entries. Somehow I don't see why that changes anything.

0 answers

Post by Manfred Berger on June 23, 2013

Could 2 eigenvectors of an n*n matrix ever be linearly dependent?

1 answer

Last reply by: Carlo Lam
Tue Apr 30, 2013 12:54 AM

Post by Carlo Lam on April 30, 2013

Is P always an arbitrary matrix?

0 answers

Post by Matt C on April 27, 2013

I guess I don't understand what you are saying for Example 4 at 20:18, when you say this is only one vector. When I plugged in all the values for lambda, I got three eigenvectors [[1,0,0], [0,1,0], [0,1,0]]. If there is a way you could explain it, that would be nice. I was feeling pretty good up until this point with eigenvalues, eigenvectors, and diagonalization.

Similar Matrices & Diagonalization

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Similar Matrices and Diagonalization 0:25
    • Definition 1 0:26
    • Example 1 2:00
    • Properties 3:38
    • Definition 2 4:57
    • Theorem 1 6:12
    • Example 3 9:37
    • Theorem 2 12:40
    • Example 4 19:12
    • Example 5 20:55
    • Procedure for Diagonalizing Matrix A: Step 1 24:21
    • Procedure for Diagonalizing Matrix A: Step 2 25:04
    • Procedure for Diagonalizing Matrix A: Step 3 25:38
    • Procedure for Diagonalizing Matrix A: Step 4 27:02

Transcription: Similar Matrices & Diagonalization

Welcome back to Educator.com and welcome back to linear algebra.

In our previous lesson, we introduced the notion of Eigenvector and Eigenvalue.

Again, very profoundly important concepts throughout mathematics and science.

Today, we are going to dig a little bit deeper and introduce the notion of similar matrices and the idea of diagonalization.

So, let us jump right on in. Let us start with a definition... Let me go to blue ink here.

Okay. A matrix B is said to be similar to a matrix A if there is a non-singular matrix P such that B = P inverse × A × P. Let us see what this definition says.

If I have some matrix A, and I find some other matrix P, and I multiply on the left of A by P inverse and on the right by P... so if I take P inverse × A × P... the matrix B that I get, I say that B is similar to A.

So, there is a relationship between B and A if I can actually sandwich this matrix A between some matrix P and the inverse of P.

In the course of this lesson, we are going to talk about how to find this matrix P, and about what this matrix B looks like. Really, really quite beautiful.

Okay. A quick example, just so you see what this looks like in real life. If I let the matrix A equal (1,1,-2,4), just a little 2 by 2... and if I say I have P, which is (1,1,1,2)... well, if I calculate the inverse of P, that is going to equal (2,-1,-1,1), okay?

Now, as it turns out, if I take B... if I actually do P inverse × A × P, so I multiply that by that, then by that... I end up with the following: I end up with the matrix (2,0,0,3).

Now, what you might want to do is take a look at the previous lesson. This matrix A... and I will discuss this in a minute, just to let you know what is coming ahead... we have dealt with this matrix A before, in the previous lesson.

We found the Eigenvalues for it. The Eigenvalues were 2 and 3. Well, notice what we did.

We found this matrix P... we took the inverse of that, we multiplied P inverse × A × P, and we ended up with a diagonal matrix, where the entries on the diagonal are exactly the Eigenvalues of A. That is what is going to end up being so beautiful.

Here is what is even better. If you remember, the two Eigenvectors that we found for the two Eigenvalues 2 and 3 were exactly (1,1) and (1,2).

So, this matrix P is going to end up being made up of the actual Eigenvectors for the Eigenvalues. That is just a little preview of what is coming.
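As a quick numerical check of this example, here is a minimal sketch assuming numpy is available (the lecture itself just says "mathematical software"):

```python
import numpy as np

# The 2 by 2 example from the lecture: A = (1,1,-2,4) by rows.
A = np.array([[1.0, 1.0],
              [-2.0, 4.0]])

# Columns of P are the Eigenvectors (1,1) and (1,2) found in the previous lesson.
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])

B = np.linalg.inv(P) @ A @ P
print(np.round(B, 10))  # [[2, 0], [0, 3]] -- diagonal entries are the Eigenvalues
```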

Okay. Just some quick properties of similarity. The first property: A is similar to A, of course... intuitively clear.

Second: if B is similar to A, then A is similar to B. That just means that if B = P inverse × A × P, we can multiply on the left by P and on the right by P inverse... the P × P inverse goes away... and we get A = P × B × P inverse, so it is the same relation the other way around.

And third: if A is similar to B, and B is similar to C, then A is similar to C. That is transitivity.

So, standard properties... we will be using those in a second. Another definition. We say a matrix A is diagonalizable... did I spell that correctly?... diagonalizable... if it is similar to a diagonal matrix.

In this case, we say A can be diagonalized.

Okay. Now, let us see what we have got. Alright. Profoundly, profoundly, profoundly important theorem.

An n by n matrix is diagonalizable if, and only if, it has n linearly independent Eigenvectors.

In this case, A is similar to a diagonal matrix D, where D = P inverse × A × P, and the diagonal elements of D are the Eigenvalues of A.

It is like we did before. A is similar to the diagonal matrix D, and the entries on that diagonal are precisely the Eigenvalues of A.

Well, P is the matrix whose columns, respectively... it means respective to the individual Eigenvalues, so Eigenvalue λ1 gives column 1, λ2 gives column 2, λ3 gives column 3, and so forth.

Each of these columns, respectively, is one of the n linearly independent Eigenvectors of A.

So, again, this is what we did in our example. Now we state it as a theorem: an n by n matrix is diagonalizable if and only if it has n linearly independent Eigenvectors.

So, if I have a 3 by 3, I need 3 linearly independent Eigenvectors. If I have a 5 by 5, I need 5 linearly independent Eigenvectors.

In this case, A is similar to a diagonal matrix D whose diagonal elements are precisely the Eigenvalues of A, and the matrix P in this relation is the matrix made up of the columns that are the n linearly independent Eigenvectors of A.

So, from A, I can derive the matrix D, I can derive the matrix P, and the relationship is precisely D = P inverse × A × P.

Let us just do an example here. Okay. We will let A equal the matrix with rows (1,2,3)... (0,1,0)... (2,1,2).

I am not going to go through the entire process; again, I use mathematical software to do this... to find the Eigenvalues and to find the Eigenvectors.

Here is how it works out. As it turns out, one of the Eigenvalues is 4. When I solve the homogeneous system, 4 generates the following Eigenvector: (1,0,1).

A second Eigenvalue is -1, also real. It generates the Eigenvector (-3,0,2). The third Eigenvalue is equal to 1... all distinct, all real... and it generates (1,-6,4) when I solve the homogeneous system.

Therefore, P has columns (1,0,1), (-3,0,2), (1,-6,4).

If I want to find P inverse, which I can, it is not a problem -- you know what, I will go ahead and write it out here; it is not going to be too big of an issue -- its columns are (2/5, -1/5, 0), (7/15, 1/10, -1/6), (3/5, 1/5, 0), and of course my diagonal matrix D is going to end up being (4,0,0), (0,-1,0), (0,0,1).

If I were to confirm... yes, I would find that D does in fact equal P inverse × A × P. We found the Eigenvalues... found the associated Eigenvectors... these are linearly independent, so we put them as columns in P... we find P inverse, and if we multiply A by P inverse on the left and P on the right, we end up with a diagonal matrix where the entries on the main diagonal are precisely the Eigenvalues: 4, -1, 1.

So, 4 is the first; that is why its Eigenvector is the first column. That is what we meant by "respectively."
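For readers following along with software, here is a sketch of the same computation, assuming sympy is installed (note that sympy's eigenvects may scale the basis vectors differently from the lecture's choices):

```python
from sympy import Matrix, diag

A = Matrix([[1, 2, 3],
            [0, 1, 0],
            [2, 1, 2]])

# eigenvects() returns (Eigenvalue, multiplicity, Eigenspace basis) triples.
for val, mult, basis in A.eigenvects():
    print(val, mult, [list(v) for v in basis])

# Columns of P are the Eigenvectors, in the same order as the Eigenvalues in D.
P = Matrix([[1, -3, 1],
            [0, 0, -6],
            [1, 2, 4]])
D = diag(4, -1, 1)
assert P.inv() * A * P == D  # exact check in rational arithmetic
```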

Okay. Now, if all the roots of the characteristic polynomial -- which is what we solve to find the Eigenvalues of A -- are real and distinct... in other words, if the Eigenvalues of the matrix are real and distinct... then A is diagonalizable. Always.

Note how we wrote this. If all of the roots of the characteristic polynomial are real and distinct, then A is diagonalizable. There is no "if and only if" here.

That does not mean that if A is diagonalizable, the roots of the characteristic polynomial, the Eigenvalues, are real and distinct.

It is possible for A to be diagonalizable and have roots that are not distinct. You might have a 4 by 4, and you might have an Eigenvalue 4, and then (1,1,1)... that 1 might be an Eigenvalue 3 times over, but the matrix will still be diagonalizable.

So, again: "if... then" does not mean it works the same backwards. It is not the same as "if and only if"... "if and only if" means it goes both ways.

So: if the roots of the polynomial are real and distinct, the matrix is diagonalizable. If it is diagonalizable, the Eigenvalues may or may not be real and distinct.

Okay. Now, the characteristic polynomial for non-distinct Eigenvalues looks like this.

Well, we know we are dealing with some characteristic polynomial... λ^n + a1·λ^(n-1) + ... + an.

Well, every time we find a root, we can factor... that is what the fundamental theorem of algebra says: every polynomial can be factored into linear factors.

For the non-distinct roots, we end up with factors like this... (λ - λi)^ki.

So, if some root ends up showing up 5 times, that means I have 5 factors for that root: λ - 1, λ - 1, λ - 1, λ - 1, λ - 1... well, that collapses to the single factor (λ - λi) raised to the appropriate power.

This is λ1, this is λ2... one such factor for each λi. Okay.

Well, this ki is called the multiplicity of the Eigenvalue λi.

It can be shown... we already dealt with the distinct case: if the Eigenvalues of A are all real and distinct, A is diagonalizable.

Now that we have introduced this idea of multiplicity, suppose our characteristic polynomial has multiple roots. Say I have a fourth-degree equation: one of the roots is 4, another root is 3, and the other roots are 2 and 2.

Well, 4 has multiplicity 1, 3 has multiplicity 1, and 2, because it shows up twice, has multiplicity 2.

It can be shown that if the Eigenvalues of A are all real, then A can still be diagonalized if and only if, for each λi of multiplicity ki, we can find ki linearly independent Eigenvectors.

This means that the null space of the equation we solve to find the Eigenvectors, (λi·In - A)x = 0, has dimension ki.

In other words, suppose I have a root of the characteristic polynomial, an Eigenvalue, that has multiplicity 3... let us say 1 shows up 3 times.

Well, when I put it into this homogeneous equation, if I can actually find 3 linearly independent vectors... if the dimension of that null space is 3... I can diagonalize that matrix.

If not, I cannot diagonalize that matrix. Okay.

So, let us see what we have got. We will let A have rows (0,0,1), (0,1,2), (0,0,1).

That is interesting... okay.

When we take the characteristic polynomial of this, we end up with the following: λ × (λ - 1)^2... so we have λ1 = 0, λ2 = 1, and λ3 also equal to 1, so the Eigenvalue 1 has multiplicity 2.

Well, let us just deal with this multiplicity-2 Eigenvalue, 1. When we solve the homogeneous system, we find that the Eigenvectors x all have the form (0,r,0).

Well, this is only one vector. Here the Eigenvalue has multiplicity 2, so in order for this matrix to be diagonalizable, when I solve that homogeneous system to find the actual Eigenvectors, I need 2 linearly independent vectors, not just the 1. This is only one vector, so this is not diagonalizable.

Now, mind you, it still has Eigenvalues, and it still has associated Eigenvectors, but it is not diagonalizable. I cannot find some matrix P that satisfies that similarity property.
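A small sympy sketch (my own illustration, assuming sympy, not the lecture's software) confirms this: the Eigenvalue 1 has multiplicity 2 but only a 1-dimensional Eigenspace.

```python
from sympy import Matrix

A = Matrix([[0, 0, 1],
            [0, 1, 2],
            [0, 0, 1]])

print(A.charpoly().as_expr())  # expands to lambda**3 - 2*lambda**2 + lambda
print(A.is_diagonalizable())   # False

# Each triple: (Eigenvalue, multiplicity, basis of its Eigenspace)
for val, mult, basis in A.eigenvects():
    print(val, mult, len(basis))  # Eigenvalue 1: multiplicity 2, only 1 vector
```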

Okay. So, now, let us try this one. Let A have rows (0,0,0), (0,1,0), (1,0,1).

Well, as it turns out, the characteristic polynomial is also λ × (λ - 1)^2.

So, again, we have λ1 = 0, λ2 = 1, λ3 = 1, so our Eigenvalue 1 has multiplicity 2, and we want to find 2 Eigenvectors.

We want to find the dimension of the null space associated with this Eigenvalue. If it has dimension 2, we are good: we can actually diagonalize this.

Let us see what we have. When we solve the associated homogeneous system for λ = 1, the augmented matrix is (1,0,0,0), (0,0,0,0), (-1,0,0,0).

When we subject this to reduced row echelon form, it becomes (1,0,0,0), (0,0,0,0), (0,0,0,0)... so we have x1 = 0, and x2 and x3 are free parameters: x2 = r, x3 = s. We can rewrite the solution as r × (0,1,0) + s × (0,0,1), so there you go.

We have two Eigenvectors... our basis for that null space has 2 Eigenvectors, so it is of dimension 2. It matches the multiplicity of the Eigenvalue; therefore, this can be diagonalized.

So, let us go ahead and actually finish the diagonalization process. When I go back and solve for the Eigenvalue λ1 = 0, I get the following Eigenvector: (r,0,-r) in general, and a specific choice would be -- let us see -- (1,0,-1).

Therefore, our matrix P has columns (1,0,-1), (0,1,0), (0,0,1). This is our matrix P.

It is, of course, diagonalizable. The matrix D is going to end up with rows (0,0,0), (0,1,0), (0,0,1)... the entries along the main diagonal are the Eigenvalues of our matrix. We can find the inverse of P, and when we multiply we will find that D does, in fact, equal P inverse × A × P.
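And the corresponding check for this second matrix, where the Eigenspace of 1 is 2-dimensional (again a sketch assuming sympy):

```python
from sympy import Matrix, diag

A = Matrix([[0, 0, 0],
            [0, 1, 0],
            [1, 0, 1]])

print(A.is_diagonalizable())  # True: Eigenvalue 1 has a 2-dimensional Eigenspace

P = Matrix([[1, 0, 0],
            [0, 1, 0],
            [-1, 0, 1]])       # columns: (1,0,-1), (0,1,0), (0,0,1)
assert P.inv() * A * P == diag(0, 1, 1)
```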

So, now, the procedure for diagonalizing a matrix A; this is going to be our recap. One: the first thing we want to do is form the characteristic polynomial, symbolized f(λ) = det(λ·In - A)... λ is our variable... λ times the identity matrix, minus A.

That is our polynomial. Then, two: we find the roots of the characteristic polynomial... okay?

If they are not all real, then you can stop... A cannot be diagonalized.

Okay. Three: for each Eigenvalue λi of multiplicity ki, find a basis for the null space of (λi·In - A)x = 0.

So, for each Eigenvalue λi of multiplicity ki, find a basis for the null space... also called the Eigenspace.

If the dimension of the null space that we just found is less than ki, then you can stop: A cannot be diagonalized.

If not... well, then, four: let P be the matrix whose columns are the n linearly independent Eigenvectors found above. Then P inverse × A × P = D, where D has the λi along the diagonal. The diagonal entries are the Eigenvalues.

Okay. So, let us recap. Given an n by n matrix A, we want to form the characteristic polynomial.

Once we have the characteristic polynomial, we want to find its roots. They have to all be real.

If they are not all real, then you can stop. You cannot diagonalize the matrix.

If they are all real, then for each Eigenvalue of multiplicity ki, you want to find a basis for the null space of that equation... we solve the homogeneous system.

Well, if the dimension of that null space is equal to ki, we can continue. If not, we can stop: we cannot diagonalize the matrix.

But if we can continue... for each distinct Eigenvalue, we are going to have at least 1 Eigenvector. We take those n linearly independent Eigenvectors that we just found and arrange them as columns, respectively.

So λ1, column 1; λ2, column 2... that is going to be our matrix P. When we take P inverse × A × P, that is equal to our diagonal matrix D, the entries of which are the respective Eigenvalues of our original matrix A.
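Here is a minimal end-to-end sketch of this four-step procedure, assuming sympy; the helper name diagonalize_real is my own, and sympy's eigenvects() packages steps 1 through 3 (characteristic polynomial, roots with multiplicities, Eigenspace bases) in one call:

```python
from sympy import Matrix, diag

def diagonalize_real(A: Matrix):
    """Return (P, D) with P.inv() * A * P == D, or None when A cannot
    be diagonalized over the reals. A hypothetical helper following
    the lecture's four steps."""
    columns, eigenvalues = [], []
    # Each triple is a root lambda_i of f(lambda) = det(lambda*I_n - A),
    # its multiplicity k_i, and a basis for the null space of
    # (lambda_i*I_n - A)x = 0, i.e. the Eigenspace.
    for val, mult, basis in A.eigenvects():
        if not val.is_real:
            return None        # Step 2: a root is not real -- stop
        if len(basis) < mult:
            return None        # Step 3: Eigenspace dimension < k_i -- stop
        columns.extend(basis)  # Step 4: Eigenvectors become the columns of P
        eigenvalues.extend([val] * mult)
    P = Matrix.hstack(*columns)
    return P, diag(*eigenvalues)

# The diagonalizable example from this lesson:
A = Matrix([[0, 0, 0], [0, 1, 0], [1, 0, 1]])
P, D = diagonalize_real(A)
assert P.inv() * A * P == D
```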

So, we will be dealing more with Eigenvalues and Eigenvectors in our next lesson; we are not quite finished with this... it gets a little bit deeper, and we will have a little more exposure to it.

So, until then, thank you for joining us at Educator.com, and we will see you next time.
