Raffi Hovasapian

Rank of a Matrix, Part II

Table of Contents

I. Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
II. Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
III. Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
IV. Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
V. Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
VI. Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

Lecture Comments (10)

1 answer

Last reply by: Professor Hovasapian
Sat Aug 8, 2015 10:15 PM

Post by matt kruk on July 19, 2015

hi professor so if a system of equations only has the trivial solution is it still possible to solve it in differential equations. for example variation of parameters relies on having the solutions to the homogeneous system. would you just be forced to use undetermined coefficients to solve the system?

2 answers

Last reply by: Matt C
Thu Apr 4, 2013 7:06 PM

Post by Matt C on April 3, 2013

I am just curious on this question? Is it possible to take a basis of a null space and work all the way back into the rref form? But you are not giving the size of the matrix, just the basis of the null space. I know, I can figure out the number of free variables and the how many columns are in the matrix, but is it possible to figure out how many pivots and rows are in the matrix? I was just bored so I tried to work backwards, but seem to be getting stuck.

1 answer

Last reply by: Professor Hovasapian
Wed Apr 3, 2013 12:58 AM

Post by Matt C on April 2, 2013

For the first example part I entered the matrix into my calculator and put it in row reduce echelon form and I did not get what you did. I then went back in the notes (because you used this matrix in the previous episode) and I noticed that you forgot to put a negative in front of the 1 in row 4 column 1 in the 4x5 matrix. So the last row in the 4x5 matrix should be (-1,2,0,4,-3). That seems to work. Let me know if that is correct please.

1 answer

Last reply by: Professor Hovasapian
Mon Dec 3, 2012 12:39 AM

Post by Eduardo Voloch on December 2, 2012

There is an error on "b)", 4th row, 5th column, should be -3 and not 3. Excellent video though. Really well explained. Thank you so much!

0 answers

Post by Ken Mullin on January 21, 2012

Very well demonstrated and explained.
However, unless row and column manipulations were "drilled" to watchers in earlier videos (I skipped immediately to this section, having some familiarity with the topic), I'm wondering how many may be following the final matrix appearance in RREF.

Rank of a Matrix, Part II

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Rank of a Matrix 0:17
    • Example 1: Part A 0:18
    • Example 1: Part B 5:58
    • Rank of a Matrix Review: Rows, Columns, and Row Rank 8:22
    • Procedure for Computing the Rank of a Matrix 14:36
    • Theorem 1: Rank + Nullity = n 16:19
    • Example 2 17:48
    • Rank & Singularity 20:09
    • Example 3 21:08
    • Theorem 2 23:25
  • List of Non-Singular Equivalences 24:24
    • List of Non-Singular Equivalences

Transcription: Rank of a Matrix, Part II

Welcome back to Educator.com, and welcome back to linear algebra.

This lesson, we are going to continue the discussion of row rank and column rank that we started in the last lesson.

So, this is going to be the rank of a matrix, part 2.

Let us just go ahead and jump right in -- go ahead and switch over to a blue ink here.

Recall from a previous lesson the following matrix.

So, we have a matrix A... it is (1,3,2,1), (-2,2,3,2), (0,8,7,0), (3,1,2,4), (-4,4,3,-3).

Okay. So, this is a 4 by 5.

A 4 by 5 matrix. Okay. Now, let us consider just the columns of this matrix.

I will call it the set C. So, we have the (1,3,2,1), we have (-2,2,3,2), we have (0,8,7,0), we have (3,1,2,4), (-4,4,3,-3).

What we have is 1, 2, 3, 4, 5 vectors in R4... 5 vectors in R4. Okay.

Now, we said that the rows of a matrix, treated as individual vectors, span a space called the row space.

Well, similarly, the columns of the matrix span a subspace, like we defined in the previous lesson, called the column space.

Now, what we want to do is find a basis for the column space.

Consisting of arbitrary vectors... they do not necessarily have to be from this set.

We want to find a basis for the span of this set, but I do not necessarily want them to be from this set.

So, find a basis for the column space consisting of arbitrary vectors.

Now, if you remember from our last lesson, when we have a set of vectors, and we want to find a basis for the span of that set of vectors, but we do not care if the vectors in that basis come from the original set... we set up those vectors as rows.

Then, we do reduced row echelon form, and the non-zero rows form a basis.

So, let us do that. Here, the columns are this way.

We want to find a basis for the column space consisting of arbitrary vectors, so I am going to write the columns as rows, because that is the procedure.

So, I am going to write (1,3,2,1), (-2,2,3,2), (0,8,7,0), (3,1,2,4), (-4,4,3,-3).

I am going to convert that to reduced row echelon form, and I end up with (1,0,0,11/24), (0,1,0,-49/24), (0,0,1,7/3), and 0's everywhere else.

My 3 non-zero rows are these. They form a basis for the span of the columns of the original matrix.

So, I can choose the set (1,0,0,11/24)... that is one vector.

Notice, in the matrix I had written them as rows, but now I am just writing them as columns because I tend to prefer writing them this way.

(0,1,0,-49/24), and (0,0,1,7/3), if I am not mistaken, that is correct.

Yes. This set forms a basis for the column space.

Okay. Now, we want to find a basis for the original set of vectors consisting of vectors from that actual set... either all of them, or few of them.0356

So, when we do that, we set them up as columns, and we solve the associated homogeneous system.0371

So, here is what we are going to set up... (1,3,2,-1), (-2,2,3,2), (0,8,7,0), (3,1,2,4), (-4,4,3,3), and of course the associated system goes that way.0377

We convert to reduced row echelon form, we end up with the following.0406

We end up with (1,0,0,0), (0,1,0,0), (2,1,0,0), (0,0,1,0), (1,1,-1,0), and 0's in the final column.0412

Let us go to blue. Leading entry, leading entry, leading entry. In other words, the first, the second, and the fourth column.0431

Therefore, the first, second and fourth column form a basis. Therefore, I can take the vectors (1,3,2,-1), here.0440

I can take the vector (-2,2,3,2).0460

The fourth column, (3,1,2,4)... this set forms a basis.0467

A good basis for the column space.0475

Column rank equals 3, because there are 3 vectors that go into the basis. Again, the rank is the dimension of that space.0485
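
The two techniques above can be checked mechanically. Below is a minimal sketch in Python (my own illustration, not part of the lecture) that row-reduces with exact fractions; the matrix entries use the sign corrections pointed out in the lecture comments, namely (1,3,2,-1) and (-4,4,3,-3).

```python
from fractions import Fraction

def rref(rows):
    """Return (R, pivots): the reduced row echelon form of a matrix
    (a list of row lists) and its pivot column indices, in exact arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue                        # no pivot in this column
        m[r], m[piv] = m[piv], m[r]         # move the pivot row up
        lead = m[r][c]
        m[r] = [x / lead for x in m[r]]     # scale to a leading 1
        for i in range(len(m)):             # clear the rest of column c
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

# The 4x5 matrix A written by rows; its columns are the five 4-vectors
# above (with the sign corrections noted in the lecture comments).
A = [[ 1, -2, 0, 3, -4],
     [ 3,  2, 8, 1,  4],
     [ 2,  3, 7, 2,  3],
     [-1,  2, 0, 4, -3]]

# Technique 1: write the columns as rows (the transpose) and row-reduce;
# the non-zero rows are a basis of arbitrary vectors for the column space.
Rt, _ = rref([list(col) for col in zip(*A)])
print(Rt[:3])        # the rows (1,0,0,11/24), (0,1,0,-49/24), (0,0,1,7/3)

# Technique 2: the pivot columns of rref(A) pick out which ORIGINAL
# columns form a basis: the 1st, 2nd, and 4th (indices 0, 1, 3).
_, pivots = rref(A)
print(pivots)        # [0, 1, 3]
```

Both runs reproduce the lecture's answers: the fraction rows 11/24, -49/24, 7/3, and the pivot columns 1, 2, and 4.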

Okay. Now, let us recap what we did, just now and in the previous lesson. Here is what we did.

We had A, okay? I will write it one more time. I know it is getting a little tedious, but I suppose it is always good to see it... the columns (1,3,2,-1), (-2,2,3,2), (0,8,7,0), (3,1,2,4), (-4,4,3,-3).

We had this original matrix A. Okay, it is a 4 by 5 matrix.

The row space consists of 4 vectors in R5.

The column space consists of 5 vectors in R4. Okay.

Using two different techniques, we found a basis for the row space... that was in the previous lesson.

For the row space, we dealt with the rows using two different techniques. One, we set them up as rows, and we got a basis consisting of arbitrary vectors.

Then, we set up these rows as columns, we solved the associated system, and we got a basis consisting of vectors from the original set.

So, with two different techniques, we ended up with a row rank equal to 3.

Okay. Now the columns, like we said, the columns form a set of 1, 2, 3, 4, 5 vectors in R4.

Well, again, this was for rows, now for columns.

In the problem we just did, using two different techniques, we found a basis for the column space.

Column rank is equal to 3. Let me stop this for a second.

Random matrix A... the rows consist of 4 vectors in R5. Using the two techniques, we found a basis, and each basis consists of 3 vectors apiece. Row rank was 3.

The columns are 5 vectors in R4. R5 and R4 have nothing to do with each other; they are completely different spaces.

I mean, their underlying structure might be the same, but they are completely different spaces. The vectors in one space have 4 elements; the vectors in the other space have 5 elements in them.

Using two different techniques, we found a basis for the column space. The column rank ends up being 3. Okay.

This 3, this is not a coincidence... not a coincidence.

As it turns out, for any random matrix, m by n, the row rank equals the column rank.

So, let me put this in perspective for you. I took a rectangular array, random, in this case 4 by 5.

It could be 7 by 8... it could be 4 by 13... I treat the rows as vectors, and I treat the columns as vectors...

If I find a basis for the span of the collection of vectors that make up the rows, and a basis for the span of the collection of vectors that make up the columns... those two collections have nothing to do with each other.

Yet, they end up with the same number of vectors.

Well, the column rank is the row rank, so now we just call it the rank.

So, because that is the case, because the column space and the row space end up having the same number of vectors in their bases, we just call it the rank.

So, we no longer refer to it as the row rank of a matrix, or the column rank of a matrix; we call it the rank of a matrix.

Now, I want you to stop and think about how extraordinary this is.

A collection, a rectangular array of numbers... let us say 3 by 17.

You have some vectors in R3, and you have vectors in R17. They have absolutely nothing to do with each other, and yet the bases for the spaces that these vectors span end up with the same number of vectors.

There is no reason in the world for believing that that should be the case, and yet there it is.

Simply by virtue of a rectangular arrangement of numbers. That is extraordinary beyond belief, and we have not even gotten to the best part yet.

Now, we are just going to call it the rank from now on. So, I do not have to find both the row rank and the column rank of a matrix; I can just take my pick.

So, let us just stick with rows. I go with rows. You are welcome to go with columns if you want.

Okay. So, as a recap... our procedure for computing the rank of a matrix A.

Okay. 1. Transform the matrix A to a reduced row echelon matrix B.

2. The number of non-zero rows is the rank. That is it. Nice and easy.
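
This two-step procedure translates directly into code. Here is a minimal sketch (my own illustration, not the lecturer's) that does the elimination with exact fractions so floating-point round-off never blurs whether a row is zero:

```python
from fractions import Fraction

def rank(rows):
    """Rank = number of non-zero rows after Gaussian elimination,
    done with exact fractions (a row is zero exactly or not at all)."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue                    # no pivot in this column
        m[r], m[piv] = m[piv], m[r]     # move the pivot row up
        for i in range(r + 1, len(m)):  # eliminate below the pivot
            if m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1                          # one more non-zero row locked in
    return r

# the 4x5 matrix from this lesson (signs as corrected in the comments)
A = [[ 1, -2, 0, 3, -4],
     [ 3,  2, 8, 1,  4],
     [ 2,  3, 7, 2,  3],
     [-1,  2, 0, 4, -3]]
print(rank(A))   # 3
```

Counting pivots during forward elimination is equivalent to counting the non-zero rows of the reduced row echelon form, since each pivot produces exactly one non-zero row.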

Now, recall from a previous lesson: we defined something called the nullity.

That was the dimension of the null space. In other words, it is the dimension of the solution space of the homogeneous system Ax = 0.

Okay? Theorem. Profoundly important result, insanely beautiful result. We have to know this.

If you do not walk away with anything else from linear algebra, know this theorem, because I promise you, if you can drop this theorem in one of your classes in graduate school, you will make one hell of an impression on your professors.

They probably do not even know this themselves, some of them... but it is a beautiful, beautiful theorem.

The rank of a matrix A plus the nullity of A is equal to n.

So, think about what this means. If I have an m by n matrix, say a 5 by 6 matrix... 5 rows, 6 columns.

n is 6. The rank of that matrix plus the nullity of that matrix equals 6.

If I know that I have a matrix with n = 6, and I find the nullity, I know what the rank is automatically, by virtue of this equation.

If I know what the rank is, I know what the nullity is. If I know what the rank and nullity are, I know what space I am dealing with.

If I have a rank of 5, and if I have a nullity of 3, then I know that I am dealing with an 8-dimensional space. Amazing, amazing, amazing theorem. It comes up in a lot of places.

Okay. Let us do some examples here.

Let us go... okay... simply by virtue of a random arrangement of numbers in a rectangular array that we call a matrix.

(1,1,4,1,2), (0,1,2,1,1), (0,0,0,1,2), (1,-1,0,0,2), (2,1,6,0,1)... okay.

We have this random matrix; it is 1, 2, 3, 4, 5 by 1, 2, 3, 4, 5... this is a 5 by 5 matrix, so here n = 5.

Okay. In reduced row echelon form, you get (1,0,2,0,1), (0,1,2,0,-1), (0,0,0,1,2), and we get (0,0,0,0,0), (0,0,0,0,0)...

We have 1, 2, 3 non-zero rows. Rank = 3.

Well, if rank = 3 and n = 5, I know that my solution space to the associated homogeneous system that goes with this matrix has to have dimension 2, because rank + nullity = n.

3 + 2 = 5. That is extraordinary. In fact, this is from a previous example.

If you go back to the previous lesson where we actually calculated the solution space, you will find that there were 2 vectors.

So, 2 vectors, dimension 2; here you have rank 3, and it confirms the fact that 3 + 2 = 5.
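
Rank + Nullity = n is easy to verify on this very matrix. A small sketch (illustrative code of mine, not from the lecture), again using exact-fraction elimination:

```python
from fractions import Fraction

def rank(rows):
    """Count the non-zero rows left after exact-fraction Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            if m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# the 5x5 matrix from the example above
A = [[1,  1, 4, 1, 2],
     [0,  1, 2, 1, 1],
     [0,  0, 0, 1, 2],
     [1, -1, 0, 0, 2],
     [2,  1, 6, 0, 1]]
n = len(A[0])
print(rank(A), n - rank(A))   # rank 3, so the nullity must be 5 - 3 = 2
```

The nullity falls out for free: once the rank is known, the dimension of the solution space of Ax = 0 is forced to be n minus the rank.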

Okay. Now, let us throw out a theorem here that has to do with rank and singularity.

Actually, you know what, let me define it here... let me go to blue... rank and singularity. If you remember, singularity has to do with determinants.

So, a non-singular matrix is one whose determinant... well, a non-singular matrix is one that actually has an inverse. That is the actual definition of non-singularity, something that has an inverse, and that corresponds to the determinant not being equal to 0.

And... you remember that list of non-singular equivalences? We are actually going to recap it at the end of this lesson and add a few more things to it.

So, rank and singularity... an n by n matrix A is non-singular... meaning it has an inverse... if and only if rank = n.

So, if I calculate the rank and the rank equals n, that means it is not singular. That means it has an inverse. That means its determinant is non-zero.

Okay. Let us do some quick examples of this one. We will let A equal (1,2,0), (0,1,3), (2,1,3).

We convert to reduced row echelon form. We get (1,0,0), (0,1,0), (0,0,1).

Okay. There is 1, there is 2, there is 3 non-zero rows in that reduced row echelon form. Rank = 3.

Well, we have a 3 by 3, and the rank = 3; therefore, this is non-singular, and it implies that the associated homogeneous system has only the trivial solution.

Again, this goes back to that list of equivalences. One thing implies a whole bunch of other things.

Okay. Another example. Let us let matrix B equal (1,2,0), (1,1,-3), (1,3,3).

Let us convert to reduced row echelon form. We end up with (1,0,-6), (0,1,3), and (0,0,0).

We have that, we have that... we have 2. So, rank is equal to 2.

Well, n is 3 and the rank is 2. It is less than 3, not equal to 3... so 2 less than 3 implies that B is singular.

It does not have an inverse. It implies that there does exist a non-trivial solution for the homogeneous system.
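
Both 3 x 3 examples can be checked with the same rank machinery. The sketch below (my own illustration) applies the rank test for singularity:

```python
from fractions import Fraction

def rank(rows):
    """Count non-zero rows after exact-fraction Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            if m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def is_nonsingular(square):
    """An n x n matrix is non-singular (has an inverse) iff its rank is n."""
    return rank(square) == len(square)

A = [[1, 2, 0], [0, 1, 3], [2, 1, 3]]   # rank 3 -> non-singular
B = [[1, 2, 0], [1, 1, -3], [1, 3, 3]]  # rank 2 -> singular
print(is_nonsingular(A), is_nonsingular(B))   # True False
```

A rank of n certifies the determinant is non-zero and the homogeneous system has only the trivial solution; a rank below n certifies the opposite.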

Okay. One more theorem here that is very, very nice.

We will not necessarily do an example of this, but it is good to know.

The non-homogeneous system, Ax = b, has a solution if and only if the rank of the matrix A is equal to the rank of the matrix A augmented by b.

So, if I take A and compute its rank, and then form the augmented matrix and compute the rank of that matrix... if those are equal, then I know that the system has a solution.

Now, of course, we have techniques for finding this solution, and that is important, but sometimes it is nice just to know that a solution exists.
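
Here is a small sketch of that consistency test; the 2 x 2 system below is a made-up example of mine, not from the lecture:

```python
from fractions import Fraction

def rank(rows):
    """Count non-zero rows after exact-fraction Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            if m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def has_solution(A, b):
    """Ax = b is consistent iff rank(A) == rank of A augmented by b."""
    aug = [row + [bi] for row, bi in zip(A, b)]
    return rank(A) == rank(aug)

A = [[1, 2], [2, 4]]             # rank 1: the second row is twice the first
print(has_solution(A, [3, 6]))   # True: b lies in the column space of A
print(has_solution(A, [3, 7]))   # False: the augmented rank jumps to 2
```

Appending b can only ever raise the rank by one; when it does, b lies outside the column space of A, so no combination of the columns can reach it.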

Okay. Now, let us talk about our list of non-singular equivalences, and let us add to that list.

So, the list of non-singular equivalences. This is for an n by n matrix... you remember, because an n by n matrix is the only one for which a determinant is actually defined. Okay.

All of the following are equivalent. In other words, one is the same as the other.

Each one implies each and every other one.

One: A is non-singular.

Two: Ax = 0, the homogeneous system, has only the trivial solution... and you remember the trivial solution is just 0, all 0's.

Three: A is row-equivalent to In, the identity matrix. That is the one with all 1's on the main diagonal; everything else is 0.

Four: Ax = b, the associated non-homogeneous system, has a unique solution... one and only one.

Five: the determinant of A is non-zero.

Six: A has rank n.

Seven: A has nullity 0.

Eight: the rows of A form a linearly independent set of vectors in Rn.

Nine: the columns do the same thing. The columns of A form a linearly independent -- I will just abbreviate it as LI -- set of vectors in Rn.

So, I can make all of these statements if I have some random matrix A which is n by n, let us say 5 by 5.

Let us say I calculate its rank, and its rank ends up being n. Then I know that all of these other things are true.

A is non-singular; that means that it has an inverse. I know that the associated homogeneous system has the trivial solution only.

I know that I can convert A to the identity matrix, in this case I5... I know that the associated non-homogeneous system, for any particular b, has one and only one solution.

I know that the determinant is not 0.

I know that the nullity is 0; the solution space contains only the trivial solution.

I know that the rows of A form a linearly independent set of vectors in Rn.

I know that the columns of A form a linearly independent set of vectors in Rn.

So, again, we have a matrix... the rows are a set of vectors, and they behave a certain way; they span a space. The dimension of that space is the row rank.

The columns of that matrix span a space. The dimension of that subspace is called the column rank.

The row rank and the column rank end up being the same, no matter what rectangular array we have. We call that the rank.

The rank + the nullity, which is the dimension of the solution space of the associated homogeneous system, is always equal to n.

That is amazing. That is beautiful, and it is going to have even further consequences, as we will see in subsequent lessons.

Thank you for joining us here at Educator.com, thank you for joining us for linear algebra. We will see you next time, bye-bye.
