Raffi Hovasapian

Basis & Dimension

Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be a Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

Lecture Comments (17)

1 answer

Last reply by: Professor Hovasapian
Fri Jan 20, 2017 3:34 AM

Post by Shih-Kuan Chen on January 19, 2017

Are orthogonal vectors always linearly independent, and are linearly independent vectors always orthogonal?

2 answers

Last reply by: Professor Hovasapian
Fri Mar 25, 2016 10:40 PM

Post by David Löfqvist on March 19, 2016

Maybe wrong place again, but could you explain the vector triple product? I know how to calculate it, but I have no idea what I'm calculating.
Thanks again for your great lessons!

1 answer

Last reply by: Professor Hovasapian
Fri Mar 14, 2014 7:29 PM

Post by Josh Winfield on February 26, 2014

6:48 "This is the non-trivial solution"..........should be trivial solution  

1 answer

Last reply by: Professor Hovasapian
Thu Jan 9, 2014 3:39 AM

Post by Joel Fredin on January 8, 2014

Raffi.

For the first, I totally love your videos. You are a really really great teacher, so keep it up! :)

For the second, are you going through how to use the determinant to calculate the area or the volume of a Geometric shape? I can't seem to find it anywhere :( Maybe a little bit off topic but i hope you will answer my question relative soon. Thank you very much for all of your hard work, I don't think you know how much it helps me.

Joel

2 answers

Last reply by: Christian Fischer
Tue Oct 1, 2013 2:29 AM

Post by Christian Fischer on September 25, 2013

Hi Raffi: Just a question for Theorem 2: is the following true
a) S is a subset of V (not a subspace)
b) w=SpanS is a few of the vectors in S (so a subset of S if not all the vectors in S are required to span S)

Have a great day!!

1 answer

Last reply by: Professor Hovasapian
Tue Oct 16, 2012 4:02 AM

Post by Suhaib Hasan on October 16, 2012

Thanks for the quick explanations in the beginning with unit vectors i, j, and k; it definitely helped me get a better understanding for the basis of a vector space.

1 answer

Last reply by: Professor Hovasapian
Tue Aug 14, 2012 8:29 PM

Post by Shahaz Shajahan on August 14, 2012

if you solved the system and you are left with the trivial solution, is that enough to show that all the vectors in the set are basis? as i've seen a few examples which show the solution involving the inverse of the matrix, im not entirely sure why the inverse relates to the unique solution, would you be able to explain please, thanks

0 answers

Post by Kamal Almarzooq on December 12, 2011

i like this theorem too :)

Basis & Dimension

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Basis and Dimension 0:23
    • Definition 0:24
    • Example 1 3:30
    • Example 2: Part A 4:00
    • Example 2: Part B 6:53
    • Theorem 1 9:40
    • Theorem 2 11:32
    • Procedure for Finding a Subset of S that is a Basis for Span S 14:20
    • Example 3 16:38
    • Theorem 3 21:08
    • Example 4 25:27

Transcription: Basis & Dimension

Welcome back to Educator.com and welcome back to linear algebra.

Last couple of lessons, we talked about linear independence, and we talked about the span.

Today we are going to talk about something called basis and dimension, and we are going to use linear independence and span to define those things.

So, let us get started. Okay. Let us start with a definition here.

Again, math usually always starts with some definition. Okay.

Vectors v1, v2, and so on... all the way to vk are said to form a basis for vector space V...

If 1... v1, v2, all the way to vk span V.

And 2... v1, v2... vk are independent... linearly independent, but I will just write independent.

So, again, in the case of a set of vectors that is both independent and happens to span a vector space, or some subspace that we happen to be interested in dealing with...

We actually give it a special name. It is called a basis. Now, you can have vectors that are independent, but do not necessarily span a space.

So, for example, if I had 3-space, I could take the i vector and the j vector.

Well, they certainly are independent, they are orthogonal, they have nothing to do with each other... and yet they do not span the entire space. They only span a part of the space... in other words, the xy plane.

Or, you can have vectors that span the entire space, but are not necessarily independent.

So, again, let us take 3-space. I can have i, j, k, and let us say I decided to take also the vector 5k, another vector in the direction of k, but 5 times its length.

Well, it is a different vector. So, there are four vectors, and they span the space... every single vector can be written as a linear combination of i, j, k, and 5k... but they are not linearly independent.

They are dependent, because 5k can be written as a constant × k. They are parallel... they are the same thing.

So, again, you can have something that spans a space but is not independent, and you can have vectors that are independent but do not necessarily span the space.
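
By the way, this sort of claim is easy to check with your mathematical software. Here is a minimal sketch in Python with the sympy library (an assumption on my part; any software that computes rank works the same way): put the vectors in as the columns of a matrix. Rank equal to the number of columns means independent; rank equal to the dimension of the space means spanning.

    from sympy import Matrix

    # Columns are i, j, k, and 5k in 3-space.
    A = Matrix([[1, 0, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 1, 5]])
    print(A.rank())           # 3 = dim(R3): the columns span R3
    print(A.rank() < A.cols)  # True: 4 vectors but rank 3, so they are dependent

    # Columns are just i and j.
    B = Matrix([[1, 0],
                [0, 1],
                [0, 0]])
    print(B.rank() == B.cols)  # True: independent
    print(B.rank())            # 2 < 3: they span only a plane, not all of R3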

What we want is something that does both. When it does both, it is called a basis for that space... profoundly important.

Okay. Let us see what we can do. Let us just throw out some basic examples.

Okay. So, the one we just discussed, e1, e2, and e3... they form a basis for R3.

Just like e1, e2, e3, e4, e5 would form a basis for R5.

Let us do a computational example here.

We are going to take a list of four vectors.

v1 = (1,0,1,0), v2 = (0,1,-1,2)... these are vectors, by the way, I better notate them as such.

v3 = (0,2,2,1) and v4 = (1,0,0,1). Again, I just wrote them in a list; you can write them as vectors, anything you want.

Let me see. We want to show that these four vectors... so, show that these form a basis for R4.

Well, what do we need to show that they form a basis? Two things: we need to show that the vectors span R4, in this case, and we need to show that they are linearly independent.

So, let us get started, and see which one we want to do first.

Let us go ahead and do independence first. So, again, we form the following.

So, equation 1. Remember, c1v1 + c2v2... I am not going to write out everything, but it is good to write out the equation which is the definition of dependence and independence... + c3v3 + c4v4 = the 0 vector.

When I put the vectors v1, v2, v3, v4 in here, multiply the c's, get a linear system, convert that to a matrix, I get the following: (1,0,0,1,0).

Again, that final 0 is there... (0,1,2,0,0), (1,-1,2,0,0)... again, the columns of the matrix are just the vectors v1 through v4... (0,2,1,1,0). Okay.

When I subject this to reduced row echelon form, I get the following: c1 = 0, c2 = 0, c3 = 0, c4 = 0.

Again, you can confirm this with your mathematical software. This is the trivial solution... having only the trivial solution implies independence. Good.
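
Here is a sketch of that confirmation, again a minimal Python/sympy version (sympy assumed available):

    from sympy import Matrix

    # Columns are v1, v2, v3, v4 from the example above.
    A = Matrix([[1, 0, 0, 1],
                [0, 1, 2, 0],
                [1, -1, 2, 0],
                [0, 2, 1, 1]])

    rref_matrix, pivot_columns = A.rref()
    print(pivot_columns)  # (0, 1, 2, 3): a leading entry in every column, so
                          # c = 0 is the only solution -- the vectors are independent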

So, part of it is set... now let us see about the span. Well, for the span, we need to pick an arbitrary vector in R4, since we are dealing with R4.

We can just call it -- I do not know -- (a,b,c,d), and we need to set up the following equation.

I will not use c, because we used that before; I will use k... k1v1 + k2v2 + k3v3 + k4v4... symbolism in mathematics just gets crazy. Very tedious sometimes.

And... I will just call it v arbitrary... just some vector v.

Although, again, we can set up the solution... we can go (1,0,1,0), (0,1,-1,2), (0,2,2,1), (1,0,0,1), and we can do (a,b,c,d).

You know what, let me go ahead and just write it out, so you see it.

We have (1,0,0,1), (0,1,2,0), (1,-1,2,0), (0,2,1,1), and of course our vector... this time it is not (0,0,0,0), it is going to be (a,b,c,d).

Again, the nice thing about mathematical software is that it actually solves this symbolically, not necessarily numerically.

So, it will give you a solution for k1, k2, k3, k4 in terms of a, b, c, and d. Well, there does exist a solution.

Okay. There does exist a solution. That means that any arbitrary vector can be represented by these 4 vectors.

So, let us see... v1, v2, v3, and v4 span R4.

We found something that spans R4, and we also found that they are linearly independent, so yes, these vectors are a basis.

Are they the best basis? Maybe, maybe not... it depends on the problem at hand... but they are a basis, and it is a good basis for R4.
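
And here is a sketch of that symbolic solve, under the same sympy assumption:

    from sympy import Matrix, symbols

    a, b, c, d = symbols('a b c d')

    # Same matrix as before: columns are v1, v2, v3, v4.
    A = Matrix([[1, 0, 0, 1],
                [0, 1, 2, 0],
                [1, -1, 2, 0],
                [0, 2, 1, 1]])

    # Solve A*k = (a, b, c, d) symbolically.  A solution that exists for
    # every (a, b, c, d) means the four vectors span R4.
    k = A.solve(Matrix([a, b, c, d]))
    print(k)  # each k_i comes out as a formula in a, b, c, d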

Okay. Let us list a theorem here... theorem... if S, the set of vectors v1 and so on and so forth up to vk, is a basis for V...

So, if the set is a basis for V, then every vector in V can be written in 1 and only 1 way, as a linear combination of the vectors in S.

That is not "of S"... we should write: of the vectors in S.

So, in other words, if I know that S is a basis for the vector space, any vector in that vector space can only be represented 1 way.

That means the particular representation, the constants that are chosen, is unique. Not multiple ways... it is unique.

Another theorem... actually, let me write this one in blue, because we are possibly going to do something with this one.

Let S = v1... vk be a set of non-zero vectors in V, and we will let W equal the span of S.

So, we have this set S, there is a span of it, and we call that W, because it may not span the entire vector space... that is why we are giving it a different name... but obviously it is in V, so it is going to be some subspace of V.

Then, some subset of S is a basis for W. Okay, let us stop and think about what this means.

I have a vector space V, I have some arbitrary collection of vectors that I have taken from V, and I call that set S... just a list of vectors.

I know that these vectors span some part of V.

I call that W, if I need to give it a name, or I can just refer to it as the span of S.

Well, if I take some subset of this... maybe all of it, so either k vectors or fewer than k vectors... some subset of it actually forms a basis for the span.

That makes sense. Again, you have some set of vectors that spans an entire space. Well, either all of the vectors together are independent, in which case that is your basis...

Or, they might be dependent, which means that you should be able to throw out a couple of them and reduce the number.

Either way, you can get some set of vectors from here, some collection, that actually forms a basis for the span.

Let us see how this works in terms of... a real life example. Okay.

We are going to list a procedure for finding a subset of S, of any S, that is a basis for the span of S.

Let me actually move forward. Let me write down what this is.

Procedure for finding a subset of S that is a basis for the span of S.

Okay. First thing we are going to do: we want to form c1v1 + c2v2 + so on and so forth... + ckvk = 0.

We want to set up the homogeneous system, okay?

Now, we want to solve the system by taking it to reduced row echelon form.

Now, here is the best part. The vectors corresponding to the leading entries form a basis for span S.

This is actually kind of extraordinary. I love this, and I do not know why, but it is amazing.

I have this collection of vectors that spans a particular space.

I set up the homogeneous system and I subject it to Gauss-Jordan elimination, bring it down to reduced row echelon form, and as you know, not every column needs to have a leading entry.

Well, for the columns that do have a leading entry... I keep those, which means that I throw out all of the others.

The original vectors that correspond to those columns that have leading entries... they actually form a basis for my span of S.

So, let us just do an example and see what happens. Let us take the following vectors.

Let me do this in red, actually... so v1 = (1,2,-2,1).

v2 = (-3,0,-4,3)... and of course these are vectors, so let me notate them as such... v3 = (2,1,1,-1).

v4 = (-3,3,-9,6).

v5 = (9,3,7,-6).

So, we have 5 vectors... we want to find a subset of these vectors... it might be all 5, it might be 2, it might be 3, it might be 4... that forms a basis for the span of S.

Okay? Okay. So, for step 1, we do this thing right here.

So, we set up this equation and we put these vectors in for this equation, and we end up with the following system.

Columns... these vectors just going down... (1,2,-2,1)... or you can do them across... either way.

(1,-3,2,-3,9)... 1, 2, 3, 4, 5, because we have 5 vectors... 5 columns, and of course the augmented column is going to be 0.

(2,0,1,3,3,0), (-2,-4,1,-9,7,0), (1,3,-1,6,-6,0)... good.

We are going to subject that to reduced row echelon form.

When we do that, let me just put that there and write that there. Let me see...

Let me move on to the next page, that is not a problem.

So, we have subjected that matrix to reduced row echelon form, and we end up with the following rows: (1,0,1/2,3/2,3/2,0) and (0,1,-1/2,3/2,-5/2,0), with 0's everywhere else.

So our reduced row echelon form looks like this.

Well, leading entry, leading entry, no leading entries anywhere else.

So, vector number 1, vector number 2... v1 and v2 form a basis.

So, the basis is not these... it is not (1,0,0,0), (0,1,0,0).

This is the reduced row echelon form of the matrix, the columns of which are the vectors that we are talking about.

So, those vectors, the actual columns from the original matrix... those 2 vectors. We started off with 5 vectors, and we found two of them that actually span the same space.

We threw out the other three; they are not that important. We can describe the entire span with just these 2 vectors.

They form a basis for the span of S.

Again, this is really, really extraordinary.
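
For reference, here is the same procedure carried out in the Python/sympy setup from earlier (still just a sketch, sympy assumed): rref() reports the pivot columns, and those indices tell you which of the ORIGINAL vectors to keep.

    from sympy import Matrix

    # Columns are v1, ..., v5 from the example above.
    A = Matrix([[1, -3, 2, -3, 9],
                [2, 0, 1, 3, 3],
                [-2, -4, 1, -9, 7],
                [1, 3, -1, 6, -6]])

    rref_matrix, pivot_columns = A.rref()
    print(pivot_columns)  # (0, 1): leading entries only in the first two columns
    basis = [A.col(j) for j in pivot_columns]  # the original columns v1 and v2
    print(basis)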

Okay. Another theorem... if S = v1, so on and so forth, all the way to vk, and T = w1 all the way to wn... okay?

So, S is the set of vectors v1 to vk... k could be 13, so we might have 13 vectors in this one... and T is equal to w1 all the way to wn.

So, k and n do not necessarily have to be the same, but here is what the theorem says.

If these 2 sets are both bases for V, then k = n.

In other words, if I have a given vector space, and if I have bases for it, the bases have the same number of vectors.

So, the basis set has the same number of vectors. In other words, I cannot have a vector space that has one basis that is 3 vectors and another that is 5 vectors.

That is not what a basis is. A basis spans the space, and it is linearly independent.

Therefore, if I have 2 bases, they have to have the same number of elements in them.

It makes sense. Okay. Now, because of this, once again, every basis of a given vector space has the same number of vectors in it.

There are an infinite number of bases for a vector space... but of that infinite number, they all have the same number of vectors in them.

Therefore, we define... again, a very, very important definition... the dimension of a non-zero vector space -- dimension, fancy word -- is the number of vectors in a basis for the vector space.

So, read this again: the dimension of a non-zero vector space is the number of vectors in a basis for that space.

So, dimension is kind of a fancy word that a lot of people throw around.

So, we talk about 3-dimensional space, the space that we live in. Well, 3-dimensional space... there are a couple of ways to think about it.

Yes, it means 3-dimensional space because it requires 3 numbers to actually describe a point, (x,y,z)... three coordinates.

However, the actual mathematical definition is: 3-space is 3-dimensional because any basis for 3-space has to be made up of 3 vectors. 5-dimensional space?

Any basis for 5-dimensional space has to be made up of 5 vectors. I cannot have 4 vectors describing 5-dimensional space. It is not going to happen.

Can I have 6 vectors that actually describe it? Yes, I can have 6 vectors that span 5-dimensional space, but that spanning set is not linearly independent.

So... and that is the whole idea. The dimension of a space is the number of vectors that form a basis, and a basis spans the space and is linearly independent.

Okay. Let us see here. P2, which we have used a lot.

That is the vector space of all polynomials of degree less than or equal to 2.

The dimension is 3, and here is why. We will list a basis, and that should tell you.

This is the best part: if you can just list a basis, you can just count the number of vectors... that is how many dimensions that space is.

t², t, and 1. Any linear combination of t², t, and 1 will give you every single possible polynomial of degree less than or equal to 2.

For example, 3t² + 6t - 10. Well, it is 3 × t², 6 × t, -10 × 1.

3t + 2... that is 3 × t, plus 2 × 1... this one is of degree 1, and the space is degree less than or equal to 2.

So, this one has to be in there too. So, P2 has dimension 3.

Pn has dimension n + 1.
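
To make the count concrete, here is a small sketch -- same Python/sympy assumption as before -- of how a polynomial in P2 corresponds to 3 coordinates with respect to the basis t², t, and 1:

    from sympy import Poly, symbols

    t = symbols('t')

    # Coordinates of 3t^2 + 6t - 10 with respect to the basis {t^2, t, 1}.
    p = Poly(3*t**2 + 6*t - 10, t)
    print(p.all_coeffs())  # [3, 6, -10]

    # 3t + 2 lives in P2 as well; pad its coefficients out to degree 2.
    q = Poly(3*t + 2, t)
    coords = [0] * (3 - len(q.all_coeffs())) + q.all_coeffs()
    print(coords)          # [0, 3, 2]

    # Adding polynomials adds coordinate vectors, exactly as in R3.
    print((p + q).all_coeffs())  # [3, 9, -8], i.e. 3t^2 + 9t - 8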

Okay. Now, here is where it gets really, really interesting... sort of just a sideline discussion, something to think about... a little bit of mathematical culture, a little bit of abstraction...

Notice that this P2 has a dimension of 3. Well, our 3-space, our normal 3-space that we live in, also has a dimension of 3.

As it turns out, for all vector spaces of a given dimension, the only difference between the vector spaces is the identity of their elements.

In one vector space, R3, we are talking about points, or vectors... arrows.

In this vector space, where this is a basis, it has a dimension of 3... the elements are actual polynomials.

As it turns out, the identity of the elements is the only thing that is different about those 2 spaces. These two spaces have the exact same algebraic properties.

They behave exactly the same way. In fact, I do not even need to think about it... if I can find myself 15 other vector spaces that have a dimension of 3, the identity of those elements completely does not matter.

In fact, it does not even matter... I can treat it completely symbolically. I can call them whatever I want. I can label them whatever I want.

What is important is the underlying algebraic behavior, and it is the same for every single vector space of a given dimension. That is what is extraordinary; that is what gives mathematics its power.

Once I understand, let us say, R3... and we understand R3 pretty well... we live in this space, we enjoy the world around us, look at what we have done with the world around us.

If I find any other vector space with strange objects in it, if it has a dimension of 3, I know everything about it. I know absolutely everything about it, because it behaves the same way that R3 does.

Again, that is really, really extraordinary... the last thing that I want to leave you with in this particular lesson is that what we have dealt with are finite dimensional vector spaces.

In other words, we know that R3 has an infinite number of vectors in it, but the basis... the dimension... is finite: 3.

That means I only need 3 vectors in order to describe the entire space.

Now, that is not always true. There are infinite dimensional vector spaces that require an infinite number of vectors to actually describe them.

Those of you that go on into higher mathematics, or not even that... those of you who are engineering and physics majors... at some point you will be discussing something called the Fourier series, which is an infinite series of trigonometric polynomials.

sin(x), cos(x), sin(2x), cos(2x), sin(3x), cos(3x), and so on. That is an infinite dimensional vector space.

Okay. So, I will list... let us see... 2 infinite dimensional vector spaces. We, of course, are not going to deal with them.

In linear algebra, we mostly stick with finite dimensional vector spaces, but I do want you to be aware of them.

P, the space of all polynomials... all polynomials... that is an infinite dimensional vector space.

It has an infinite number of vectors in its basis. Not like P2 or R3, which only have 3.

The other one is the space of continuous functions on the real line.

So, the space of continuous functions... you will see it represented like this: C(-∞, ∞)... that is, functions defined on the entire real line.

That space has an infinite number of dimensions. I need an infinite number of functions in order to be able to describe all of the other functions, if I need to do so.

I just wanted to throw that out there. Really, what I wanted you to take away from this is that for a vector space of any given dimension, the identity of the elements is completely irrelevant.

The underlying behavior is what we are concerned with, and the underlying behavior is exactly the same.

Thank you for joining us here at Educator.com, linear algebra... we will see you next time.
