Raffi Hovasapian

Orthogonal Complements, Part I

Table of Contents

Section 1: Linear Equations and Matrices
Linear Systems

39m 3s

Intro
0:00
Linear Systems
1:20
Introduction to Linear Systems
1:21
Examples
10:35
Example 1
10:36
Example 2
13:44
Example 3
16:12
Example 4
23:48
Example 5
28:23
Example 6
32:32
Number of Solutions
35:08
One Solution, No Solution, Infinitely Many Solutions
35:09
Method of Elimination
36:57
Method of Elimination
36:58
Matrices

30m 34s

Intro
0:00
Matrices
0:47
Definition and Example of Matrices
0:48
Square Matrix
7:55
Diagonal Matrix
9:31
Operations with Matrices
10:35
Matrix Addition
10:36
Scalar Multiplication
15:01
Transpose of a Matrix
17:51
Matrix Types
23:17
Regular: m x n Matrix of m Rows and n Columns
23:18
Square: n x n Matrix With an Equal Number of Rows and Columns
23:44
Diagonal: A Square Matrix Where All Entries OFF the Main Diagonal are '0'
24:07
Matrix Operations
24:37
Matrix Operations
24:38
Example
25:55
Example
25:56
Dot Product & Matrix Multiplication

41m 42s

Intro
0:00
Dot Product
1:04
Example of Dot Product
1:05
Matrix Multiplication
7:05
Definition
7:06
Example 1
12:26
Example 2
17:38
Matrices and Linear Systems
21:24
Matrices and Linear Systems
21:25
Example 1
29:56
Example 2
32:30
Summary
33:56
Dot Product of Two Vectors and Matrix Multiplication
33:57
Summary, cont.
35:06
Matrix Representations of Linear Systems
35:07
Examples
35:34
Examples
35:35
Properties of Matrix Operations

43m 17s

Intro
0:00
Properties of Addition
1:11
Properties of Addition: A
1:12
Properties of Addition: B
2:30
Properties of Addition: C
2:57
Properties of Addition: D
4:20
Properties of Addition
5:22
Properties of Addition
5:23
Properties of Multiplication
6:47
Properties of Multiplication: A
7:46
Properties of Multiplication: B
8:13
Properties of Multiplication: C
9:18
Example: Properties of Multiplication
9:35
Definitions and Properties (Multiplication)
14:02
Identity Matrix: n x n matrix
14:03
Let A Be a Matrix of m x n
15:23
Definitions and Properties (Multiplication)
18:36
Definitions and Properties (Multiplication)
18:37
Properties of Scalar Multiplication
22:54
Properties of Scalar Multiplication: A
23:39
Properties of Scalar Multiplication: B
24:04
Properties of Scalar Multiplication: C
24:29
Properties of Scalar Multiplication: D
24:48
Properties of the Transpose
25:30
Properties of the Transpose
25:31
Properties of the Transpose
30:28
Example
30:29
Properties of Matrix Addition
33:25
Let A, B, C, and D Be m x n Matrices
33:26
There is a Unique m x n Matrix, 0, Such That…
33:48
Unique Matrix D
34:17
Properties of Matrix Multiplication
34:58
Let A, B, and C Be Matrices of the Appropriate Size
34:59
Let A Be a Square Matrix (n x n)
35:44
Properties of Scalar Multiplication
36:35
Let r and s Be Real Numbers, and A and B Matrices
36:36
Properties of the Transpose
37:10
Let r Be a Scalar, and A and B Matrices
37:12
Example
37:58
Example
37:59
Solutions of Linear Systems, Part 1

38m 14s

Intro
0:00
Reduced Row Echelon Form
0:29
An m x n Matrix is in Reduced Row Echelon Form If:
0:30
Reduced Row Echelon Form
2:58
Example: Reduced Row Echelon Form
2:59
Theorem
8:30
Every m x n Matrix is Row-Equivalent to a UNIQUE Matrix in Reduced Row Echelon Form
8:31
Systematic and Careful Example
10:02
Step 1
10:54
Step 2
11:33
Step 3
12:50
Step 4
14:02
Step 5
15:31
Step 6
17:28
Example
30:39
Find the Reduced Row Echelon Form of a Given m x n Matrix
30:40
Solutions of Linear Systems, Part II

28m 54s

Intro
0:00
Solutions of Linear Systems
0:11
Solutions of Linear Systems
0:13
Example I
3:25
Solve the Linear System 1
3:26
Solve the Linear System 2
14:31
Example II
17:41
Solve the Linear System 3
17:42
Solve the Linear System 4
20:17
Homogeneous Systems
21:54
Homogeneous Systems Overview
21:55
Theorem and Example
24:01
Inverse of a Matrix

40m 10s

Intro
0:00
Finding the Inverse of a Matrix
0:41
Finding the Inverse of a Matrix
0:42
Properties of Non-Singular Matrices
6:38
Practical Procedure
9:15
Step 1
9:16
Step 2
10:10
Step 3
10:46
Example: Finding Inverse
12:50
Linear Systems and Inverses
17:01
Linear Systems and Inverses
17:02
Theorem and Example
21:15
Theorem
26:32
Theorem
26:33
List of Non-Singular Equivalences
28:37
Example: Does the Following System Have a Non-trivial Solution?
30:13
Example: Inverse of a Matrix
36:16
Section 2: Determinants
Determinants

21m 25s

Intro
0:00
Determinants
0:37
Introduction to Determinants
0:38
Example
6:12
Properties
9:00
Properties 1-5
9:01
Example
10:14
Properties, cont.
12:28
Properties 6 & 7
12:29
Example
14:14
Properties, cont.
18:34
Properties 8 & 9
18:35
Example
19:21
Cofactor Expansions

59m 31s

Intro
0:00
Cofactor Expansions and Their Application
0:42
Cofactor Expansions and Their Application
0:43
Example 1
3:52
Example 2
7:08
Evaluation of Determinants by Cofactor
9:38
Theorem
9:40
Example 1
11:41
Inverse of a Matrix by Cofactor
22:42
Inverse of a Matrix by Cofactor and Example
22:43
More Examples
36:22
List of Non-Singular Equivalences
43:07
List of Non-Singular Equivalences
43:08
Example
44:38
Cramer's Rule
52:22
Introduction to Cramer's Rule and Example
52:23
Section 3: Vectors in Rn
Vectors in the Plane

46m 54s

Intro
0:00
Vectors in the Plane
0:38
Vectors in the Plane
0:39
Example 1
8:25
Example 2
15:23
Vector Addition and Scalar Multiplication
19:33
Vector Addition
19:34
Scalar Multiplication
24:08
Example
26:25
The Angle Between Two Vectors
29:33
The Angle Between Two Vectors
29:34
Example
33:54
Properties of the Dot Product and Unit Vectors
38:17
Properties of the Dot Product and Unit Vectors
38:18
Defining Unit Vectors
40:01
2 Very Important Unit Vectors
41:56
n-Vector

52m 44s

Intro
0:00
n-Vectors
0:58
4-Vector
0:59
7-Vector
1:50
Vector Addition
2:43
Scalar Multiplication
3:37
Theorem: Part 1
4:24
Theorem: Part 2
11:38
Right and Left Handed Coordinate System
14:19
Projection of a Point Onto a Coordinate Line/Plane
17:20
Example
21:27
Cauchy-Schwarz Inequality
24:56
Triangle Inequality
36:29
Unit Vector
40:34
Vectors and Dot Products
44:23
Orthogonal Vectors
44:24
Cauchy-Schwarz Inequality
45:04
Triangle Inequality
45:21
Example 1
45:40
Example 2
48:16
Linear Transformation

48m 53s

Intro
0:00
Introduction to Linear Transformations
0:44
Introduction to Linear Transformations
0:45
Example 1
9:01
Example 2
11:33
Definition of Linear Mapping
14:13
Example 3
22:31
Example 4
26:07
Example 5
30:36
Examples
36:12
Projection Mapping
36:13
Images, Range, and Linear Mapping
38:33
Example of Linear Transformation
42:02
Linear Transformations, Part II

34m 8s

Intro
0:00
Linear Transformations
1:29
Linear Transformations
1:30
Theorem 1
7:15
Theorem 2
9:20
Example 1: Find L (-3, 4, 2)
11:17
Example 2: Is It Linear?
17:11
Theorem 3
25:57
Example 3: Finding the Standard Matrix
29:09
Lines and Planes

37m 54s

Intro
0:00
Lines and Planes
0:36
Example 1
0:37
Example 2
7:07
Lines in R3
9:53
Parametric Equations
14:58
Example 3
17:26
Example 4
20:11
Planes in R3
25:19
Example 5
31:12
Example 6
34:18
Section 4: Real Vector Spaces
Vector Spaces

42m 19s

Intro
0:00
Vector Spaces
3:43
Definition of Vector Spaces
3:44
Vector Spaces 1
5:19
Vector Spaces 2
9:34
Real Vector Space and Complex Vector Space
14:01
Example 1
15:59
Example 2
18:42
Examples
26:22
More Examples
26:23
Properties of Vector Spaces
32:53
Properties of Vector Spaces Overview
32:54
Property A
34:31
Property B
36:09
Property C
36:38
Property D
37:54
Property F
39:00
Subspaces

43m 37s

Intro
0:00
Subspaces
0:47
Defining Subspaces
0:48
Example 1
3:08
Example 2
3:49
Theorem
7:26
Example 3
9:11
Example 4
12:30
Example 5
16:05
Linear Combinations
23:27
Definition 1
23:28
Example 1
25:24
Definition 2
29:49
Example 2
31:34
Theorem
32:42
Example 3
34:00
Spanning Set for a Vector Space

33m 15s

Intro
0:00
A Spanning Set for a Vector Space
1:10
A Spanning Set for a Vector Space
1:11
Procedure to Check if a Set of Vectors Spans a Vector Space
3:38
Example 1
6:50
Example 2
14:28
Example 3
21:06
Example 4
22:15
Linear Independence

17m 20s

Intro
0:00
Linear Independence
0:32
Definition
0:39
Meaning
3:00
Procedure for Determining if a Given List of Vectors is Linearly Independent or Linearly Dependent
5:00
Example 1
7:21
Example 2
10:20
Basis & Dimension

31m 20s

Intro
0:00
Basis and Dimension
0:23
Definition
0:24
Example 1
3:30
Example 2: Part A
4:00
Example 2: Part B
6:53
Theorem 1
9:40
Theorem 2
11:32
Procedure for Finding a Subset of S that is a Basis for Span S
14:20
Example 3
16:38
Theorem 3
21:08
Example 4
25:27
Homogeneous Systems

24m 45s

Intro
0:00
Homogeneous Systems
0:51
Homogeneous Systems
0:52
Procedure for Finding a Basis for the Null Space of Ax = 0
2:56
Example 1
7:39
Example 2
18:03
Relationship Between Homogeneous and Non-Homogeneous Systems
19:47
Rank of a Matrix, Part I

35m 3s

Intro
0:00
Rank of a Matrix
1:47
Definition
1:48
Theorem 1
8:14
Example 1
9:38
Defining Row and Column Rank
16:53
If We Want a Basis for Span S Consisting of Vectors From S
22:00
If We Want a Basis for Span S Consisting of Vectors Not Necessarily in S
24:07
Example 2: Part A
26:44
Example 2: Part B
32:10
Rank of a Matrix, Part II

29m 26s

Intro
0:00
Rank of a Matrix
0:17
Example 1: Part A
0:18
Example 1: Part B
5:58
Rank of a Matrix Review: Rows, Columns, and Row Rank
8:22
Procedure for Computing the Rank of a Matrix
14:36
Theorem 1: Rank + Nullity = n
16:19
Example 2
17:48
Rank & Singularity
20:09
Example 3
21:08
Theorem 2
23:25
List of Non-Singular Equivalences
24:24
List of Non-Singular Equivalences
24:25
Coordinates of a Vector

27m 3s

Intro
0:00
Coordinates of a Vector
1:07
Coordinates of a Vector
1:08
Example 1
8:35
Example 2
15:28
Example 3: Part A
19:15
Example 3: Part B
22:26
Change of Basis & Transition Matrices

33m 47s

Intro
0:00
Change of Basis & Transition Matrices
0:56
Change of Basis & Transition Matrices
0:57
Example 1
10:44
Example 2
20:44
Theorem
23:37
Example 3: Part A
26:21
Example 3: Part B
32:05
Orthonormal Bases in n-Space

32m 53s

Intro
0:00
Orthonormal Bases in n-Space
1:02
Orthonormal Bases in n-Space: Definition
1:03
Example 1
4:31
Theorem 1
6:55
Theorem 2
8:00
Theorem 3
9:04
Example 2
10:07
Theorem 2
13:54
Procedure for Constructing an O/N Basis
16:11
Example 3
21:42
Orthogonal Complements, Part I

21m 27s

Intro
0:00
Orthogonal Complements
0:19
Definition
0:20
Theorem 1
5:36
Example 1
6:58
Theorem 2
13:26
Theorem 3
15:06
Example 2
18:20
Orthogonal Complements, Part II

33m 49s

Intro
0:00
Relations Among the Four Fundamental Vector Spaces Associated with a Matrix A
2:16
Four Spaces Associated With A (If A is m x n)
2:17
Theorem
4:49
Example 1
7:17
Null Space and Column Space
10:48
Projections and Applications
16:50
Projections and Applications
16:51
Projection Illustration
21:00
Example 1
23:51
Projection Illustration Review
30:15
Section 5: Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors

38m 11s

Intro
0:00
Eigenvalues and Eigenvectors
0:38
Eigenvalues and Eigenvectors
0:39
Definition 1
3:30
Example 1
7:20
Example 2
10:19
Definition 2
21:15
Example 3
23:41
Theorem 1
26:32
Theorem 2
27:56
Example 4
29:14
Review
34:32
Similar Matrices & Diagonalization

29m 55s

Intro
0:00
Similar Matrices and Diagonalization
0:25
Definition 1
0:26
Example 1
2:00
Properties
3:38
Definition 2
4:57
Theorem 1
6:12
Example 3
9:37
Theorem 2
12:40
Example 4
19:12
Example 5
20:55
Procedure for Diagonalizing Matrix A: Step 1
24:21
Procedure for Diagonalizing Matrix A: Step 2
25:04
Procedure for Diagonalizing Matrix A: Step 3
25:38
Procedure for Diagonalizing Matrix A: Step 4
27:02
Diagonalization of Symmetric Matrices

30m 14s

Intro
0:00
Diagonalization of Symmetric Matrices
1:15
Diagonalization of Symmetric Matrices
1:16
Theorem 1
2:24
Theorem 2
3:27
Example 1
4:47
Definition 1
6:44
Example 2
8:15
Theorem 3
10:28
Theorem 4
12:31
Example 3
18:00
Section 6: Linear Transformations
Linear Mappings Revisited

24m 5s

Intro
0:00
Linear Mappings
2:08
Definition
2:09
Linear Operator
7:36
Projection
8:48
Dilation
9:40
Contraction
10:07
Reflection
10:26
Rotation
11:06
Example 1
13:00
Theorem 1
18:16
Theorem 2
19:20
Kernel and Range of a Linear Map, Part I

26m 38s

Intro
0:00
Kernel and Range of a Linear Map
0:28
Definition 1
0:29
Example 1
4:36
Example 2
8:12
Definition 2
10:34
Example 3
13:34
Theorem 1
16:01
Theorem 2
18:26
Definition 3
21:11
Theorem 3
24:28
Kernel and Range of a Linear Map, Part II

25m 54s

Intro
0:00
Kernel and Range of a Linear Map
1:39
Theorem 1
1:40
Example 1: Part A
2:32
Example 1: Part B
8:12
Example 1: Part C
13:11
Example 1: Part D
14:55
Theorem 2
16:50
Theorem 3
23:00
Matrix of a Linear Map

33m 21s

Intro
0:00
Matrix of a Linear Map
0:11
Theorem 1
1:24
Procedure for Computing the Matrix: Step 1
7:10
Procedure for Computing the Matrix: Step 2
8:58
Procedure for Computing the Matrix: Step 3
9:50
Matrix of a Linear Map: Property
10:41
Example 1
14:07
Example 2
18:12
Example 3
24:31

Lecture Comments (4)

2 answers

Last reply by: scott ZHANG
Sat Mar 15, 2014 12:47 AM

Post by scott ZHANG on March 11, 2014

You said that if you have two basis vectors that are orthogonal to each other in R(n), they must span the entire space, but let's say I have two lines that are orthogonal to each other in R(3); they don't span the entire R(3) universe?

0 answers

Post by Manfred Berger on June 21, 2013

Are you using the term function to mean invertible function in general?

Orthogonal Complements, Part I

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Orthogonal Complements 0:19
    • Definition
    • Theorem 1
    • Example 1
    • Theorem 2
    • Theorem 3
    • Example 2

Transcription: Orthogonal Complements, Part I

Welcome back to Educator.com and welcome back to linear algebra.

Today we are going to be talking about orthogonal complements.

So, rather than doing a preamble discussion of what it is, let us just jump into some definitions and it should make sense once we actually set it out in a definition form.

Okay. So, let us start with our definition. It is a little long, but nothing strange.

Let w be a subspace of RN, so we are definitely talking about N-space here. Okay.

A vector u which is a member of RN is said to be orthogonal to w, so notice orthogonal to w as a subspace.

Orthogonal to an entire subspace, if it is orthogonal to every vector in that subspace.

Okay. The set of all vectors in RN that are orthogonal to all vectors in w is called the orthogonal complement of w.

And... it is symbolized by w with a little perpendicular mark on the top, and they call it w perp.

The top right. Okay. So, let us look at this definition again.

So, w is a subspace of RN, okay? So it could be dimension 1, 2, 3, all the way up to N, because RN is a subspace of itself.

A vector u in RN is said to be orthogonal to that subspace if it is orthogonal to every vector in that subspace.

So, the set of all vectors that are orthogonal, we call it the orthogonal complement of w.

It is symbolized by that w with a little perpendicular mark on the top right, called w perp.

Let us give a little bit of a picture so that we see what it is we are looking at.

So, let us just deal in R3, and let us say that, so let me draw a plane.

As you know, a plane is 2-dimensional so it is in R3, and then let me just draw some random vectors in this plane. Something like that.

Well, if I have some vector like that, which is perpendicular to this plane, so this plane... let us call that w.

So, that is some subspace of R3, and again, let me make sure that I write it down... so we are dealing with R3.

This two-dimensional plane is a subspace of R3, and every single vector in there is of course... well, it is a vector in the plane.

Then, if I take this vector here, well, every single vector that is perpendicular to it is going to be parallel to this vector, right?

So, when we speak about parallel vectors, we really only speak about 1 vector.

So, as it turns out, if this is w, this vector right here and all of the scalar multiples of it, like shortened versions of it, long versions of it, this is your w perp.

Because this vector, any vector in here, is going to end up being perpendicular to every one of these vectors. This is the orthogonal complement of that.

So, it helps to use this picture working in R3, and working with either dimension 1 or 2, because we can actually picture it.

For something like R4 or R5, I mean, I can go ahead and tell you what it is that you will be dealing with.

So let us say in R4 you have a subspace that is 2-dimensional, that is some kind of plane so to speak in R4.

Well, the orthogonal complement of that is going to be every vector that is perpendicular to those 2 dimensions, and it is actually going to end up being 2-dimensional itself.

The idea is we have this subspace and we have a bunch of vectors that are orthogonal to every vector in that subspace.

The set of all of those vectors that are orthogonal is called the orthogonal complement. That is all that it means.
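
To make this picture concrete, here is a minimal numpy sketch, with a plane of my own choosing (these particular vectors are not from the lecture). The cross product of the two spanning vectors gives a normal vector, and that normal vector, together with all of its scalar multiples, is exactly the w perp of the picture.

    import numpy as np

    # A plane W in R3 spanned by two vectors (a hypothetical choice, not from the lecture).
    w1 = np.array([1.0, 0.0, 2.0])
    w2 = np.array([0.0, 1.0, -1.0])

    n = np.cross(w1, w2)                 # a normal vector; its scalar multiples make up w perp
    print(n)                             # [-2.  1.  1.]
    print(np.dot(n, w1), np.dot(n, w2))  # 0.0 0.0 -> orthogonal to every vector in the plane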

Okay. Let us actually do a little bit of an example here.

So, let us say... well actually, you know what, let us just jump into a theorem and we will get into an example in a minute.

So, theorem... let w be a subspace of RN... okay.

Then, w perp is a subspace of RN.

So, if w is a subspace, w perp, its orthogonal complement, is also a subspace.

We do not have to go through that procedure of checking whether it is a subspace.

And... it is interesting... that the intersection of w and w perp is the 0-vector.

So, again, they are subspaces so they have to include the 0 vector, both of them, but that is the only thing common between the two subspaces w and w perp, its orthogonal complement.

It is the only thing they have in common. They both pass through the origin.

Okay. So, now let us do our example.

Let us see. We will let w be a subspace of, this time we will work in R4, with basis w1, w2.

So, w1, w2, these two vectors form a basis for our subspace w.

And... w1 is going to be 1, 1, 0, 1, and I have just written this vector in horizontal form rather than as a column, without the commas; it does not really matter.

w2 is going to be the vector 0, -1, 1, 1.

So, you have 2 vectors, they form a basis for the subspace w in R4.

Now, our task is to find a basis for the orthogonal complement, w perp. Find a basis for the subspace of all of the vectors that are orthogonal to all of the vectors in w, which has these 2 vectors as a basis.

Okay, well, so, let us just take some random... okay... so we will let u, let us choose u equal to some random vector in R4.

a, b, c, d, we want to be as general as possible... a, b, c, d.

Well, so we are looking for the following. We want... actually, let me see, let this be -- I am sorry -- let this be a random vector in the orthogonal complement.

Okay. So, we are just going to look for some random vector, see if we can find values for a, b, c, d.

We are going to take a vector in the orthogonal complement, and we know that this is going to be true.

We know that because w perp and w are orthogonal to each other, we know that u · w1 = 0.

We know that u · w2... let me make this dot a little more clear, we do not want that... is equal to 0, right?

So, because they are orthogonal complements, we know that they are orthogonal, which means that their dot product is equal to 0.

Well, these are just a couple of equations, so let us actually do this.

So, if I do u · w1, I get a + b + 0 + d = 0.

Then, if I do u · w2, I get 0 - b + c + d = 0.

When we solve this using the techniques that we have at our disposal... I am going to go ahead and do it over here.

So, this is just a homogeneous system: you set up the coefficient matrix, reduced row echelon form, the columns that have... that do not have a leading entry, those are your free variables... r, s, t, u, v, whatever you want.

Then you solve for the other variables that do have leading entries.

When you do this, you end up with the following. You get a, b, c, and d; the vector takes on the form r × (-1, 1, 1, 0) + s × (-2, 1, 0, 1).
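
For completeness, here is the row reduction the lecture does off to the side, with c and d taken as the free variables (that particular labeling is my own choice, but it reproduces exactly these two vectors): from a + b + d = 0 and -b + c + d = 0, the reduced row echelon form gives a = -c - 2d and b = c + d; setting c = r and d = s, the solution is (a, b, c, d) = r(-1, 1, 1, 0) + s(-2, 1, 0, 1).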

So, those two vectors form a basis for the orthogonal complement w perp.

Therefore, we will set it up in set notation; let me just write it again: { (-1, 1, 1, 0), (-2, 1, 0, 1) } is a basis for w perp.

So, that is it. We started with a basis of two vectors in R4.

Then, just by virtue of the fact that the orthogonal complement is going to be orthogonal to every single vector in this subspace (so it is certainly going to be orthogonal to these two), I pick a random vector in that orthogonal complement.

I write my equations... orthogonal just means the dot product equals 0, and I get a homogeneous system.

I solve the homogeneous system and I set it up in a way where I can basically read off the basis for my solution space of this homogeneous system, which is the basis for, in this particular case, based on this problem, the orthogonal complement.
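
For anyone who wants to check this numerically, here is a minimal numpy sketch (my own illustration, not part of the lecture): the orthogonal complement of the row space of a matrix is its null space, so stacking w1 and w2 as rows and computing the null space gives a basis for w perp.

    import numpy as np

    # Rows are the basis vectors of w from the example.
    W = np.array([[1,  1, 0, 1],
                  [0, -1, 1, 1]], dtype=float)

    # The null space of W is the set of vectors orthogonal to every row, i.e. w perp.
    _, s, vt = np.linalg.svd(W)
    rank = int((s > 1e-10).sum())
    perp_basis = vt[rank:]                     # two orthonormal rows spanning w perp
    print(perp_basis.shape)                    # (2, 4): dim of w perp is 4 - dim of w

    # The vectors read off in the lecture are also a valid basis:
    for v in ([-1, 1, 1, 0], [-2, 1, 0, 1]):
        print(W @ np.array(v, dtype=float))    # [0. 0.] each time

The SVD route returns an orthonormal basis, so the printed vectors will not literally be (-1, 1, 1, 0) and (-2, 1, 0, 1), but they span the same 2-dimensional subspace.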

Nice, straightforward, nothing we have not done. We have seen dot products, we have seen homogeneous systems, we have seen bases; nothing here is new.

Now we are just applying it to this new idea of 2 subspaces being orthogonal to each other. Being perpendicular to each other.

Of course, perpendicularity, as you know from your geometric intuition, only makes sense in R2 and R3, which is why we do not use the word perpendicular, we use the word orthogonal, but it is the same thing in some sense.

So, you might have a 6-dimensional subspace being orthogonal to a 3-dimensional subspace, "whatever that means."

Well, geometrically, pictorially, we do not know what that means. We cannot actually picture that. We have no way of representing it graphically.

But we know what it means algebraically. The dot product of two vectors in those spaces is equal to 0.

Okay. One of the things that I would like you to notice: when we had R4, you noticed that our w had dimension 2.

Its basis had 2 vectors, dimension 2... and you noticed when we had w perp, the orthogonal complement, we ended up with 2 vectors as a basis, also dimension 2.

Notice that the dimension of the subspace w plus the dimension of its orthogonal complement added up to 4, the actual dimension of the space. That is not a coincidence.

So, let us write down a theorem... Let w be a subspace of RN.

Then RN, the actual space itself, is made up of the two: w ⊕ w perp. So, let me talk about this thing.

This little plus sign with a circle around it is called a direct sum, and I will speak about it in just a minute.

Okay. Essentially what this means is... we will have to speak a little bit more about it, but one of the things that it means is that w intersect w perp, the only thing they have in common is, like we said before... the 0 vector.

These are both subspaces, so they have to have at least the 0 vector in common. They do not share anything else in common.

Okay. Yet another theorem, and I will talk about the sum in just a moment, but going back to the problem that we just did, this basically says that if I take the subspace w and its orthogonal complement, and if I somehow combine them -- which we will talk about in a minute -- we will actually end up getting the space itself, the 4-dimensional space.

So if I had a 6-dimensional space and I know that I am dealing with a subspace of 2 dimensions, w, I know that the orthogonal complement is going to have dimension 4, because 2 + 4 has to equal 6, or 6 - 2 = 4, however you want to look at it.
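
A quick numerical sanity check of that dimension count, with a 2-dimensional subspace of R6 chosen at random (purely my own illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((2, 6))   # rows span a generic 2-dimensional subspace of R6

    _, s, vt = np.linalg.svd(W)
    rank = int((s > 1e-10).sum())     # dimension of w
    perp = vt[rank:]                  # rows form a basis for w perp
    print(rank, len(perp))            # 2 4 -> the dimensions add up to 6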

Okay. Let us do another theorem here. Just a little bit of an informational theorem, which will make sense.

If w is a subspace of RN, then (w perp) perp = w.

This just says that if you take the orthogonal complement of some subspace and you take the orthogonal complement of that, you are going to end up getting the original subspace.

Nothing new about that; I mean, like a function... if you take the inverse of a function and then you take the inverse of the inverse, you get the function back. That is all it is. Very, very intuitive.
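
One way to see this numerically, reusing the subspace from the earlier example (again, just a sketch of my own): take the orthogonal complement twice and check that you land back on the same row space by comparing orthogonal projectors.

    import numpy as np

    def perp(W, tol=1e-10):
        # Rows of the result form an orthonormal basis for the orthogonal
        # complement of the row space of W.
        _, s, vt = np.linalg.svd(W)
        return vt[int((s > tol).sum()):]

    W = np.array([[1,  1, 0, 1],
                  [0, -1, 1, 1]], dtype=float)

    proj = lambda B: np.linalg.pinv(B) @ B            # projector onto the row space of B
    print(np.allclose(proj(W), proj(perp(perp(W)))))  # True: (w perp) perp = w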

Okay. Now, let us discuss this symbol some more. This ⊕ symbol.

So, when we write... this direct sum symbol -- I am sorry -- when we write w ⊕ w perp, these are subspaces, okay?

We do not... this is a symbol for the addition of subspaces; we are not actually doing the operation of addition.

What this means is, this symbolizes the addition of subspaces. This whole thing is a space.

What this means is that something... it means that if I have some w1 -- no, let me make it a little bit more general, there are going to be too many w's floating around.

So, if I have A ⊕ B, this direct sum symbol between A and B, okay?

It is the space made up of vectors v, such that v is equal to some a + b, where the vector a comes from the space A and the vector b comes from the space B.

So, this symbol, this direct sum symbol... it means if I take some vector in the subspace A... and a vector in the subspace B, and I actually add those vectors like I normally would? I am going to get some vector.

Well, that vector belongs to this space. When I see this symbol, I am talking about a space. In some sense what I have done is I have taken 1 whole space and I have attached another space right to it.

In the case of the example that we did, we had a 2-dimensional subspace, we added a 2-dimensional orthogonal complement to it, and what I got was the entire space R4.

That is what is happening here. That is what this direct sum symbol means. It symbolizes the addition of spaces, the putting together of spaces.

But these are spaces that contain individual vectors.

Okay. Let us see. Let us go a little bit further here. Let us take R4, expand upon this...

Let us let w = ... well, not equal, let us say it has a basis.

Let w have basis vectors (1,0,0,0) and (0,1,0,0).

So, let us say that w is the subspace that has these 2 vectors as a basis.

So, it is a 2-dimensional subspace, and we will let w perp have (0,0,1,0)... (0,0,0,1)... okay, as a basis.

Now, if I take w, direct sum, w perp, well, that is equal to R4... right?

So, a vector in R4... let us say, for example, some random vector (4,3,-2,6), which is a vector in R4... it can be written as a vector from this subspace plus a vector from that subspace.

Just like what we defined, that is what a direct sum means. This w ⊕ w perp means: take a vector from here, add it to a vector from there, and you have a vector in the sum, which happens to be R4.

We can write it as (4,3,0,0)... this vector right here is in the space w.

We can add it to the vector (0,0,-2,6), which is a vector in w perp.

What is nice about this representation, this direct sum representation, is that -- let us see -- this representation is unique.

So, when I write a particular vector as a sum of a vector from each of the 2 individual subspaces, the way that I write it is unique. There is no other way of writing it.
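
In general, the w piece of this split is the orthogonal projection of the vector onto w (projections are taken up in the next lesson). Here is a small numpy sketch of that split for this example (my own illustration):

    import numpy as np

    # w spanned by e1 and e2, w perp spanned by e3 and e4 (the example above).
    W = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)

    v = np.array([4.0, 3.0, -2.0, 6.0])

    P = np.linalg.pinv(W) @ W     # orthogonal projector onto w
    v_w = P @ v                   # the piece of v lying in w
    v_perp = v - v_w              # the piece lying in w perp
    print(v_w, v_perp)            # [4. 3. 0. 0.] [ 0.  0. -2.  6.]

The same two lines give the unique split for the earlier, less convenient basis of w as well.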

Okay. So that gives us a nice basic idea of orthogonal complements to work with.

We will continue on next time some more with orthogonal complements.

Thank you for joining us at Educator.com and we will see you for the next instalment of linear algebra. Take care.
