  Raffi Hovasapian

Homogeneous Systems

Slide Duration:

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
1:11
1:12
2:30
2:57
4:20
5:22
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
19:33
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

### Homogeneous Systems

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Homogeneous Systems 0:51
• Homogeneous Systems
• Procedure for Finding a Basis for the Null Space of Ax = 0
• Example 1
• Example 2
• Relationship Between Homogeneous and Non-Homogeneous Systems

### Transcription: Homogeneous Systems

Welcome back to Educator.com and welcome back to linear algebra.0000

We have been investigating the structure of vector spaces and subspaces recently, getting a little deeper into it to understand what is actually going on in the vector space.0004

Today we are going to continue to talk about that of course, and we are going to talk about homogeneous systems.0015

As you gleaned already, homogeneous systems are profoundly important, not only in linear algebra, but they end up being of central importance in the theory of differential equations.0022

In fact, at the end of this lesson, I am going to take a little bit of a digression to talk about the nature of solutions in the field of differential equations as related to linear algebra.0033

Linear algebra and differential equations, they are very, very closely tied together. So, it is very difficult to separate one from the other.0043

Let us go ahead and get started.0050

Okay, so, we know that the null space of... so what we have is this homogeneous system, for example, ax = 0, where a is some m by n matrix.0054

This is just the matrix form of the normal linear system that we are used to, in this case a homogeneous system because everything on the right hand side is 0.0072

x, again, is a vector; it is an n by 1 vector that is a solution to this particular equation.0080

Now, we know that given this, we know that the null space, or the space of all solutions to this equation.0091

In other words, the set of all of the vectors x that satisfy this equation is a subspace of Rn.0102

Well, very important question, a very important problem is finding the basis for this subspace.0120

If you recall what a basis is, a basis is a set of vectors that actually spans the entire space that we are dealing with and is also linearly independent.0137

So, you remember, you might have a series of vectors that spans a space, but it might not be linearly independent.0148

Or, you might have some vectors that are linearly independent, but they might not span the space.0153

A basis is something that satisfies both of those properties.0158

Again, it spans the space that we are talking about, and the vectors in it are linearly independent.0162

In this case, the space that we are talking about is the null space. The space of solutions to the homogeneous system ax = 0.0167
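As a quick numerical sanity check of why the solutions of ax = 0 form a subspace: any linear combination of solutions is again a solution. A short sketch using NumPy (the matrix and vectors below are made up for illustration; they are not from the lecture):

```python
import numpy as np

# A hypothetical singular matrix, so that Ax = 0 has non-trivial solutions
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

x1 = np.array([2.0, -1.0])   # A @ x1 = 0
x2 = np.array([-4.0, 2.0])   # A @ x2 = 0

# Closure: a linear combination of solutions is again a solution,
# because A @ (c1*x1 + c2*x2) = c1*(A @ x1) + c2*(A @ x2) = 0
combo = 3.0 * x1 + 5.0 * x2
assert np.allclose(A @ x1, 0)
assert np.allclose(A @ x2, 0)
assert np.allclose(A @ combo, 0)
```

This closure under addition and scalar multiplication is exactly what makes the null space a subspace rather than just a set of solutions.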

So, here is the procedure for finding a basis for the null space of ax = 0, where a is an m by n matrix -- of course, I will not keep mentioning that.0177

The first thing we do, well, we solve ax = 0 by Gauss Jordan elimination to reduced row echelon form, as always.0210

Now, if there are no arbitrary constants, in other words, if there are no columns without leading entries, then the null space equals the set containing only the 0 vector.0229

What that means is that there is no basis, no null space. There is no null space, essentially -- well, there is, it is the 0 vector, but there is no basis for it.0263

In other words, there is no collection of vectors. Okay.0272

The dimension of the null space, which we called the nullity if you remember, equals 0.0277

Okay. Our second possibility is if arbitrary constants do exist after we reduce it to reduced row echelon form.0293

What that means is that if there are columns that do not have leading entries, those are... the x... the values corresponding to those columns, let us say it is the third and fifth column, so x3 and x5, I can give them any value I want. That is what the arbitrary constant means.0302

So, if arbitrary constants exist, then, write the solution x = c1x1 + c2x2 + ... + ckxk, however many of these vectors and constants there are.0321

Well, once you do that, the set s, which consists of this x1, this x2, all the way up to xk... x1, x2... xk... is a basis for the null space.0358

Let me write space over here. So, again, what we do when we want to find the basis of a null space of a homogeneous system.0390

We solve the homogeneous system with reduced row echelon form.0402

We check to see if there are no columns that do not have a leading entry, meaning if all of the columns have a leading entry, there are no arbitrary constants.0407

Our null space is the 0 vector. It has no basis, and the dimension of the null space, the nullity in other words equals 0.0414

Let me go ahead and put nullity here. Nullity is the dimension of the null space. In other words, it is the number of vectors in a basis for that space; here, it is 0.0423

If arbitrary constants do exist, meaning if there are columns that do not have a leading entry, then we can read off the solution for that homogeneous system.0435

We can write it this way, and the vectors that we get end up being our basis.0442
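The two-case procedure above can be sketched on a computer. The lecture does not name its software, so here is one possible check using SymPy, whose `Matrix.rref()` returns the reduced row echelon form together with the pivot columns (the non-pivot columns are exactly the arbitrary constants); the matrix itself is made up for illustration:

```python
from sympy import Matrix

# A small illustrative matrix (values are made up)
A = Matrix([[1, 2, 3],
            [2, 4, 6]])

R, pivots = A.rref()                               # Step 1: reduce to RREF
free_cols = [j for j in range(A.cols) if j not in pivots]

if not free_cols:
    basis = []                                     # null space is {0}: no basis, nullity 0
else:
    basis = A.nullspace()                          # one basis vector per free column

nullity = len(basis)
# Every basis vector really does solve Ax = 0
assert all(all(x == 0 for x in (A * v)) for v in basis)
print("free columns:", free_cols, "nullity:", nullity)
```

For this matrix the RREF is (1,2,3), (0,0,0), so columns 2 and 3 (indices 1 and 2 here, since Python counts from 0) have no leading entries, and the nullity is 2.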

Let us do an example, and as always, it will make sense, hopefully.0450

Let us see, so our example... find, not find the... find a basis.0458

So, there is usually more than one basis. Find a basis. Basis is not necessarily unique.0476

Find a basis for, and the nullity of the null space for the following system.0484

(1,1,4,1,2), (0,1,2,1,1), this is a one, sorry about that, it looks like a 7, (0,0,0,1,2), (1,-1,0,0,2), and (2,1,6,0,1).0506

x1, x2, x3, x4, x5 = (0,0,0,0,0).0538

This is our homogeneous system. We are looking for this right here.0552

We are looking for all vectors, x1, x2, x... all vectors whose components x1, x2, x3, x4, x5... that is what we are looking for.0558

It is a solution space. So, we are going to try to find the solution, and when we do, we are going to try to write it as a linear combination of certain number of vectors.0566

Those certain number of vectors are going to be our basis for that solution space.0574

Okay. So, let us go ahead and just reduce it to reduced row echelon form.0578

What we end up getting is the following... (1,0,2,1,0), (0,1,2,0 -- oops, I forgot a number here.0586

(1,0,2... let me go back... what I am doing here is... I am going to take the augmented matrix.0608

I am going to take this matrix and I am going to augment it with this one, so let me actually rewrite the whole thing.0627

It is (1,1,4,1,2,0), (0,1,2,1,1,0), (0,0,0,1,2,0), (1,-1,0,0,2,0), (2,1,6,0,1,0). Okay.0632

So, this is the matrix that we want to submit to reduced row echelon form.0655

I apologize, I ended up doing just the matrix a.0659

When we submit it to reduced row echelon form using our software, we get the following.0663

(1,0,2,0,1,0), (0,1,2,0,-1,0), (0,0,0,1,2,0), (0,0,0,0,0,0), and of course the last... not of course... but... there is no way of knowing ahead of time.0669

What we end up with is something like this. So, now let us check to see which columns actually do not have leading entries.0695

This first column does, this second column does, this third column does not... so, third.0702

The fourth column has a leading entry, it is right there... the fifth column... okay.0711

So the third and the fifth columns, they are arbitrary constants.0719

In other words, these of course correspond to x1, x2, x3, x4, and x5.0723

So, x3 and x5, I can let them be anything that I want.0732

I am going to give them arbitrary parameters. So, let me go ahead and write x3... I am going to call x3 R.0739

It just stands for any number.0748

And... x5 is going to be S, any parameter. Columns with non-leading entries.0752

Now, when I solve for, like for example if I take x1, it is going to end up being the following.0759

x1 = -2R - S, and here is why... that row... well this is x1 + 2x3 + x5.0767

So, x1 = -2x3 - x5.0785

Well, -2x3 - x5 is -2R - S, that is where this x1 comes from.0793

I am just solving now for the x1, x2 and x4. That is where I am going to get the following equations.0800

Then, I am going to rewrite these in such a way that I can turn them into a vector that I can read right off.0809

So, x2 here is equal to -2R + S, because here, this is 2R over here, this is -S, it becomes +S when I move it over.0815

x5, our last one, is equal to -2S, and here is the reason why: because it is -- I am sorry, this is x4 not x5, x5 is S.0828

x4 + 2S = 0, therefore x4 = -2S.0843

So, this is our solution. Notice, we have an infinite number of solutions. Now we are going to rewrite this in order... x1, 2, 3, 4, 5, as opposed to how I wrote it here, which was just reading it off, to make it as simple as possible.0851

So, let me move forward. What I end up with is x1 = -2R - S, x2 = -2R + S, x3 = R, x4 = -2S, and x5 = S.0866

So, now x1, 2, 3, 4, 5... this is the vector that I wanted. This is the arrangement.0896

Well, take a look. R and S are my parameters. This is equivalent to my following.0901

I can write this as this vector, which is just x1 x2 in vector form... is equal to... I pull out an R and this becomes (-2, -2, 1, 0, 0).0908

That is what this column right here is... -2R, -2R, R, 0, 0.0925

Then this one right here, I pull out an S... +S × (-1,1,0,-2,1). There you go.0935

This particular homogeneous system has the following solution, as it is some arbitrary number × this vector + some arbitrary number × this vector.0950

Well, these are just constants. Any sort of number that I can imagine from the real numbers... any number at all, so this is a linear combination.0960

Therefore, that vector and that vector form a basis for our solution space.0970

Therefore, we have our basis: the set (-2,-2,1,0,0) and (-1,1,0,-2,1).0977

This is the basis of the null space. Well, the dimension is, well how many are there... there are 2.0997

So, the null space has a dimension 2. The nullity is 2.1005

So, notice what it is that we actually had. We had the system of 1, 2, 3, 4, 5... 1, 2, 3, 4, 5, we had this 5 by 5 system.1016

So, R5 all the way around. Well, within that 5 dimensional space, 2 of those dimensions are occupied by solutions to the homogeneous system.1026

That is what is going on here. If you think about it, what you have is any time you have 2 vectors, you have essentially a plane, what we call a hyper-plane.1040

Because, you know, we are talking about, you know, a 5-dimensional space.1051

But, that is all that is going on here. So, we solve a homogeneous system, we reduce the solution to this, and because it is a linear combination of vectors, the 2 vectors that we get actually form a basis for our solution space.1054

We count how many vectors we get, and that is the dimension of our null space, which is a subspace of the whole space.1068
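Example 1 can be double-checked on a computer. A sketch using SymPy (one possible tool; the lecture's software is not specified): the two vectors read off above really do solve ax = 0, and the null space is 2-dimensional.

```python
from sympy import Matrix

# The 5 x 5 coefficient matrix from Example 1
A = Matrix([[1,  1, 4, 1, 2],
            [0,  1, 2, 1, 1],
            [0,  0, 0, 1, 2],
            [1, -1, 0, 0, 2],
            [2,  1, 6, 0, 1]])

v1 = Matrix([-2, -2, 1, 0, 0])   # the vector multiplying R
v2 = Matrix([-1, 1, 0, -2, 1])   # the vector multiplying S

# Both are solutions of the homogeneous system
assert all(x == 0 for x in (A * v1))
assert all(x == 0 for x in (A * v2))

# The null space is 2-dimensional: nullity 2, matching the lecture
assert len(A.nullspace()) == 2
```

SymPy's own `nullspace()` basis may look different from v1 and v2, which is fine: a basis is not unique, as noted above.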

Let us do another example. Let us take the system (1,0,2), (2,1,3), (3,1,2) × (x1, x2, x3) = (0,0,0).1081

We set up the augmented matrix for the homogeneous system, which is (1,0,2,0), (2,1,3,0), (3,1,2,0). Just to let you know, that is what the augment is.1110

We subject it to reduced row echelon form, and we get the following... We get (1,0,0,0), (0,1,0,0), (0,0,1,0).1130

There are no columns here that do not have leading entries. Basically what this is saying is that x1 is 0, x2 is 0, x3 is 0. That is our solution.1144

We have x1 = 0, x2 = 0, x3 = 0, all of which is equivalent to the vector x = the 0 vector.1155

When we have the 0 vector as the solution space, only the trivial solution, we have no basis for this system.1168

The nullity is 0. The dimension of the solution space is 0. The only solution to this is the 0 vector.1176
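Example 2 is the "no free columns" case, and it can be confirmed the same way (again a SymPy sketch, not the lecture's own tool):

```python
from sympy import Matrix

# The 3 x 3 coefficient matrix from Example 2
A = Matrix([[1, 0, 2],
            [2, 1, 3],
            [3, 1, 2]])

# Its RREF is the identity: every column has a leading entry
R, pivots = A.rref()
assert pivots == (0, 1, 2)

# So the null space is just {0}: only the trivial solution,
# no basis, and the nullity is 0
assert A.nullspace() == []
```

Equivalently, this matrix is non-singular (its determinant is non-zero), so the homogeneous system admits only the trivial solution.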

Okay. Now, let us go ahead and talk... take a little bit of a digression and talk about the relation between the homogeneous and non-homogeneous systems.1186

So, we have been dealing with homogeneous systems, but as we know, there is also the associated non-homogeneous system, ax = b, for some vector b.1197

So, that is the non-homogeneous and this is the homogeneous version of it.1206

0 on the right, or some b on the right. There is a relationship that exists between the two. This relationship becomes profoundly important not only for linear algebra, but especially for differential equations.1213

Because, often times, we will have a particular solution to a differential equation, but we want to know some other solution.1224

But, maybe we cannot actually solve the equation. Believe it or not, the situation comes up all the time.1232

As it turns out, we do not have to solve the equation. We can solve a simpler equation. The homogeneous version, which is actually quite easy to solve for differential equations, and here is the relationship.1237

Okay. If, some vector xb is a solution, some solution that I just happen to have, to the system ax = b, and x0 is a solution to the associated homogeneous system, okay?1248

Then, if I add these two, xb and x0, the sum of those two is also a solution.1285

It is also a solution to ax = b. So, if I happen to have the non-homogeneous system and if I happen to know some solution for it, and if I happen to know some solution to the homogeneous system, which is usually pretty easy to find, then I can add those 2 solutions.1306

That is a third solution, or that is a second solution to ax = b.1322

That is actually kind of extraordinary. There is no reason for it to actually be that way, and yet there it is.1328

What is more, every solution to the non-homogeneous system ax = b, every single solution, can be written as a particular solution, which we call xb, plus something from the null space.1337

Symbolically, xb + x0, what this says is that when I have the system ax = b, and I happen to know a solution to it and I also happen to know the solution to the homogeneous system, all the solutions.1389

In other words, I happen to know the null space every single solution to the non-homogeneous system consists of some particular solution to the non-homogeneous system plus something from the null space.1406

So, if I have, I mean obviously I am going to have maybe an infinite number of solutions to the null space, all I have to do is take any one of those solutions from the null space, add it to the particular solution that I have for the non-homogeneous, and I have my collection of every single solution to the non-homogeneous system.1422

That is amazing. What makes this most amazing is that usually when we are dealing with a non-homogeneous system, more often than not, you can usually just guess a solution.1441

We can check it, and if it works, that is a good particular solution.1450

Then, what we do is instead of solving that equation, we actually end up solving the homogeneous system, which is pretty easy to solve.1453

Then we just take any solution we want from the homogeneous system, add it to the solution that we guessed, and we have any solution that we want for the particular problem at hand.1459

That is extraordinary. It is a technique that you as engineers and physicists will use all the time.1470

Very, very important for differential equations. Okay.1477
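The relationship just described can be sketched numerically in NumPy (the matrix, right-hand side, and the two solutions below are made up for illustration): if xb solves ax = b and x0 is anything from the null space, then xb + x0 solves ax = b as well.

```python
import numpy as np

# A hypothetical singular system Ax = b with infinitely many solutions
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])

xb = np.array([3.0, 0.0])    # a particular solution: A @ xb = b
x0 = np.array([-2.0, 1.0])   # a homogeneous solution: A @ x0 = 0

assert np.allclose(A @ xb, b)
assert np.allclose(A @ x0, 0)

# Particular solution + anything from the null space still solves Ax = b:
# A @ (xb + c*x0) = A @ xb + c*(A @ x0) = b + 0 = b
for c in (0.0, 1.0, -4.5):
    assert np.allclose(A @ (xb + c * x0), b)
```

That one line of algebra in the comment is the whole reason the trick works, and it is the same reason the particular-plus-homogeneous decomposition appears in differential equations.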

We will go ahead and stop it there today. Thank you for joining us at educator.com, we will see you next time.1480
