Raffi Hovasapian

Linear Independence

Slide Duration:

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
1:11
1:12
2:30
2:57
4:20
5:22
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
19:33
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31


### Linear Independence

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Linear Independence 0:32
• Definition
• Meaning
• Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
• Example 1
• Example 2

### Transcription: Linear Independence

Welcome back to Educator.com and welcome back to linear algebra.0000

In the last lesson we talked about something called the span of a given set of vectors.0004

In other words, the span is the set of all linear combinations of those particular vectors -- all the vectors that can be represented by them.0010

Today we are going to be talking about a related concept, again, a very, very profoundly important concept called linear independence, and its dual, linear dependence.0019

So, let us write out some definitions and get started right away and jump into the examples, because again, examples are the things that make it very, very clear... I think.0032

So, let us define linear dependence. Even though we speak about linear independence, the actual definition is written in terms of linear dependence.0041

Well, let us actually -- okay -- vectors v1, v2... vk are linearly dependent.0058

We will often just say dependent or independent without saying linear. We are talking about linear algebra so it goes without saying.0090

If there exist constants, c1, c2... all the way to ck... not all 0 -- that is important, because that is easy -- such that c1v1 + c2v2 + ... + ckvk = 0, or the 0 vector.0096

So, let us look at this again. So, if vectors v1 to vk, if I have a set of vectors and if I can somehow, if there exist constants -- no matter what they are, but not all of them 0.0134

If I can arrange them in such a way, some linear combination of them, if they add up to 0, we call that linearly dependent, okay?0148

A couple of vectors are linearly dependent, and let us talk about what that actually means. Okay.0155

Oh, by the way, if it is not the case, that is when they are linearly independent.0165

If you cannot find this -- if the only way to make this true is for all of the individual c's to be 0 -- that means that they are linearly independent.0170

Here is what the meaning is, just so you get an idea of what is going on.0181

If we solve this equation... actually, I am going to number this equation. I have not done this before but this is definitely 1 equation that we are going to want to number.0189

We will call it equation 1, because we are going to refer to it again and again. It is a very important equation.0193

If you solve this equation for any one of these vectors, so let us just choose one arbitrarily, let us choose that.0200

I am going to write that as c2v2 = well, we are going to move all of these over to the right... -c1v1 - c3v3 - so on, all the way to the end... -ckvk.0209

Then I am going to go ahead and divide by c2, so I get v2 = this whole thing.0232

Let me just... I hope you do not mind, I am going to call this whole thing the capital Z.0241

So, v2 = Z/c2... well, as you can see, if this is true, then you can always solve for one of these vectors, and this vector is always going to be some linear combination of the other vectors.0249

That is what dependence means. Each of those vectors is dependent on the other vectors.0265

In other words, it can be represented as some combination of the others. That is why it is called dependent.0271

So, that is all that means. It is nothing strange, it makes perfect sense, and if this relationship does not exist, then it is independent.0279

In other words, 1 vector cannot be represented as a combination of the others. It is independent. That is what independence means.0287

Okay. Let us jump into examples, because that is what is important.0297

Actually, let me talk about it... let me list at least the procedure. It is analogous to what we did before.0302

The procedure for determining if a given list of vectors is linearly independent or linearly dependent... LD, LI... abbreviations.0309

The first thing we do, well, we form equation 1. Remember when we were dealing with span over on the right hand side?0335

We did not have 0, we had some arbitrary vector. Now, for linear independence or linear dependence, we set it equal to 0.0343

So, we form equation 1, which is a homogeneous system.0352

Then, 2, then we solve that system, and here are the results.0365

If you find out it only has the trivial solution, that means all of the c's the constants are 0... that implies that it is independent.0373

The other thing is, if there exists a non-trivial solution... just one, could be many, but if there is just one, so again, you remember this reverse e means there exists.0391

So, if there exists a non-trivial solution, that implies that it is dependent.0411

Again, we are just solving linear systems. That is all we are doing, and the solutions to these linear systems give us all kinds of information about the underlying structure of this vector space.0422

Whether something spans it, whether something is linearly independent or dependent, and of course, all of these will make more sense as we delve deeper into the structure of the vector space. Okay.0432
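
The two-step procedure above can be sketched in code (a minimal sketch, using exact fractions so no round-off creeps in; the helper names `rank` and `is_independent` are ours, not from the lecture):

```python
from fractions import Fraction

def rank(rows):
    """Gauss-Jordan elimination over exact fractions; returns the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0])):
        # look for a usable pivot at or below the current pivot row
        r = next((i for i in range(pivots, len(m)) if m[i][col] != 0), None)
        if r is None:
            continue  # no pivot in this column: it corresponds to a free variable
        m[pivots], m[r] = m[r], m[pivots]
        piv = m[pivots][col]
        m[pivots] = [x / piv for x in m[pivots]]
        for i in range(len(m)):
            if i != pivots and m[i][col] != 0:
                f = m[i][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[pivots])]
        pivots += 1
    return pivots

def is_independent(vectors):
    """Form equation 1: the vectors become the COLUMNS of a homogeneous system.
    Only the trivial solution exists exactly when every column gets a pivot."""
    n = len(vectors[0])
    rows = [[v[i] for v in vectors] for i in range(n)]
    return rank(rows) == len(vectors)
```

For instance, `is_independent([(1, 2), (2, 4)])` returns `False`, since the second vector is twice the first.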

So, let us start with our example... this is going to be not a continuation of what we did for the span, but I guess kind of a further discussion of it.0442

You remember in the last example of the last lesson, we found we had this homogeneous system and we found solutions for x, and we found 2 vectors that actually span the entire solution space, the null space.0455

Those vectors were as follows... (-1,1,0,0), (-2,0,1,1).0472

So, we know that these two vectors span the solution space to this particular equation, based on what A was.0485

I will not write down what A is. It is not necessary.0492

Now, the question is... we know that they span the null space... the question is are they linearly independent or dependent.0497

So, our question here is... are these two vectors LI or LD.0506

Our procedure says form equation 1. So, form equation 1.0523

That is just c1 × this vector, (-1,1,0,0) + c2 × (-2,0,1,1)... and we set it equal to the 0 vector which is just all 0's.0530

So, we do not need the vector mark anymore... (0,0,0,0). Okay.0546

Then, what we end up having is the following... this is equivalent to the following... (-1,-2,0), we are just taking the coefficients, that is all we are doing.0552

(1,0,0), (0,1,0), (0,1,0), and when we subject this to Gauss-Jordan elimination, reduced row echelon, we end up with the following... c2 = 0, c1 = 0.0566

That means that all of the constants, we only have two constants in this case, so this is only the trivial solution.0585

Therefore, they are independent. There you go, that is it. It is that simple.0596

You set up the homogeneous system, you solve the homogeneous system, and you decide whether it is dependent or independent. Fantastic technique.0605
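
For the special case of just two vectors, dependence simply means one is a scalar multiple of the other, so the conclusion can be double-checked without any row reduction (a sketch; `parallel` is a hypothetical helper name):

```python
def parallel(u, v):
    """Two vectors are dependent exactly when one is a scalar multiple of the other.
    Cross-multiplying avoids division: u and v are parallel iff u[i]*v[j] == u[j]*v[i]
    for every pair of coordinates i, j."""
    n = len(u)
    return all(u[i] * v[j] == u[j] * v[i] for i in range(n) for j in range(n))

u, v = (-1, 1, 0, 0), (-2, 0, 1, 1)
print(parallel(u, v))  # False: neither is a multiple of the other, so they are independent
```

Note the cross-multiplication test also correctly flags a set containing the zero vector as dependent, since 0 is a scalar multiple of anything.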

Okay. Let us consider the vector space of polynomials again. Let us consider p2, again.0617

p2, the set of all polynomials of degree < or = 2.0626

Let us look at 3 vectors in there... we have p1t, which is equal to t2 + t + 2.0631

We have p2t, which is equal to 2t2 + t, we have p3t, which is equal to 3t2 + 2t + 2.0642

So, they are just, you know, random vectors in this particular space, in other words random polynomials.0656

Well, we want to know if these three as vectors are the linearly dependent or independent.0661

Well, do what we do. We set up equation 1, which is the following. We take arbitrary constants... c1 × p1t + c2 × p2t, we will write everything out here... we want things to be as explicit as possible.0671

Plus c3 × p3t, and we set it equal to 0, that is our homogeneous system.0688

Now, we actually expand this by putting in what these p1, p2, p3 are. Okay.0695

We get c1 × (t2 + t + 2) + c2 × (2t2 + t) + c3 × (3t2 + 2t + 2) = 0.0702

Now, let us actually... this one I am going to do explicitly... there is no particular reason why, I just decided that it would be nice to do this one explicitly.0725

So, I have c1t2 + c1t + 2c1 + 2c2t2 + c2t + 3c3t2 + 2c3t + 2c3 = 0.0734

Algebra makes me crazy, just like it makes you crazy, because there are a whole bunch of things floating around to keep track of it all.0762

Just go slowly and very carefully and be systematic. That is... do not ever do anything in your head.0769

That is the real secret to math, do not do anything in your head. You will not be impressing anyone.0774

I collect the terms... the t2 terms, so I have that one, that one, and that one, and I end up with... so let me write these out as t2 × c1 + 2c2 + 3c3.0780

Then, I will take the t terms... there is a t, there is a t, there is a t, and I will write that as a second line here, just to be clear what it is that we are doing.0800

c1 + c2 + 2c3... then I have plus the... well, the rest of the terms.0811

That one... and that one... and is there one that I am missing? No. It looks like it is okay.0821

So, it is going to be + 2c1 + 2c3 and all of this... sum... is equal to 0.0829

Again, that means that this is 0, this is 0, this is 0. That is what this system is.0843

So, we will write that, because everything is 0 on the right, so all of these have to be 0 in order to make this left side 0.0850

So, I get c1 + 2c2 + 3c3 = 0.0858

Note, we do not want these lines floating around. We want to be able to see everything here.0867

c1 + c2 + 2c3, is equal to 0.0875

2c1 + 2c3 = 0, this is of course equivalent to... I will just take the coefficients... (1,2,3,0), (1,1,2,0), (2,0,2,0), okay.0885

So, this is the system that we want to solve, and we are going to subject that to reduced row echelon.0906

So, I put a little arrow to let you know what is happening here and what you end up with is (1,0,1,0), (0,1,1,0), (0,0,0,0).0913

So, let us take a look at our reduced row echelon. We have this is fine, yes. That is a leading entry... that is fine, that is a leading entry.0934

There is no leading entry here. Remember, when we solve a homogeneous system in reduced row echelon form, this means we have an infinite number of solutions, because this one can be any parameter.0944

If this is a free parameter, well, I can choose any number for it, and that means these two will be based on it.0954

Therefore, we have infinite solutions. In other words, there does exist a non-trivial solution.0963

So, there exists a non-trivial solution, which implies dependence... that means that those three polynomials that I had, one of them can be expressed as a linear combination of the other two.0974

So, they are not completely independent. At least one of them depends on the others.0996
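
In fact, the reduced system above (c1 + c3 = 0, c2 + c3 = 0) lets us read off a non-trivial solution explicitly: taking c3 = -1 gives c1 = c2 = 1, so p1 + p2 - p3 = 0. A quick check on the coefficient vectors (a sketch, writing each polynomial as its (t^2, t, constant) triple):

```python
# Each polynomial as its (t^2, t, constant) coefficient triple
p1 = (1, 1, 2)   # t^2 + t + 2
p2 = (2, 1, 0)   # 2t^2 + t
p3 = (3, 2, 2)   # 3t^2 + 2t + 2

# Non-trivial solution of equation 1: c1 = 1, c2 = 1, c3 = -1
combo = tuple(a + b - c for a, b, c in zip(p1, p2, p3))
print(combo)  # (0, 0, 0): p1 + p2 - p3 is the zero polynomial, so the set is dependent
```

Equivalently, p3 = p1 + p2: the third polynomial is exactly the sum of the first two, which is the dependence the row reduction detected.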

So, we have dependence. Again, today we talked about linear independence and dependence.1006

The previous lesson we talked about the span, so, make sure you recall... we are still studying a linear system when we do that, but with a span we choose an arbitrary vector... that is our solution on the right hand side of the equation, that linear combination that we write.1013

For linear dependence and independence we are solving a homogeneous system. We just set everything equal to 0. Make sure to keep those straight.1028

Thank you for joining us here for a discussion of Linear Algebra at Educator.com. We will see you next time.1036
