WEBVTT mathematics/linear-algebra/hovasapian
00:00:00.000 --> 00:00:04.000
Welcome back to educator.com, and welcome back to linear algebra.
00:00:04.000 --> 00:00:12.000
Today we are going to be discussing properties of matrix operations, so we have talked a little bit about matrices, before we talked about the dot product.
00:00:12.000 --> 00:00:21.000
We introduced some linear systems with a couple of examples, and we discovered that we can have either no solution, one solution, or an infinite number of solutions.
00:00:21.000 --> 00:00:35.000
And now we are going to continue to develop some of the tools that we need in order to make linear algebra more understandable and more powerful, which is ultimately what we want: a toolbox from which we can pull in order to develop this really powerful mathematical technique.
00:00:35.000 --> 00:00:46.000
Today we are going to be talking about the properties of matrices; we are going to be talking about properties of addition, properties of matrix multiplication, scalar multiplication, other things like that.
00:00:46.000 --> 00:00:57.000
Most of these properties are going to be familiar to you from what you have been doing with the real numbers ever since you were kids, addition, distributive property, associative property, things like that.
00:00:57.000 --> 00:01:13.000
Most properties carry over; however, not all properties carry over, and there are a few minor subtleties that we want to make sure we understand and pay attention to, so things don't get out of hand, so let's just jump right in.
00:01:13.000 --> 00:01:26.000
Okay, the first thing we are going to be talking about are the properties of addition, properties of matrix addition, so let's go ahead and list them out....
00:01:26.000 --> 00:01:42.000
... We will let A, B, C and D be matrices, more specifically we want them to be actually M by N matrices.
00:01:42.000 --> 00:01:57.000
And again remember that we are using capital letters for matrices, and usually when we talk about vectors, which is an M by 1 or a 1 by N matrix, we will usually use a lower case letter with an arrow on top of it, but capital letters for standard matrices.
00:01:57.000 --> 00:02:09.000
Let A, B, C and D be M by N matrices, okay...
00:02:09.000 --> 00:02:22.000
... A + B = B + A, this is the commutativity property; in other words, it doesn't really matter which order I add them in, just like 5 + 4 is 4 + 5; matrices are the same way.
00:02:22.000 --> 00:02:32.000
And again we are presuming that these are defined, so if you have a 2 by 3, you need a 2 by 3, you can't add matrices of different dimensions.
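If you like, you can check this commutativity property numerically; here is a quick NumPy sketch (the matrices are arbitrary choices, not from the lecture):

```python
import numpy as np

# Two matrices of the same shape (2 x 3), chosen arbitrarily
A = np.array([[1, -2, -3],
              [0,  8,  7]])
B = np.array([[4,  0,  2],
              [1, -5,  3]])

# Matrix addition commutes: A + B equals B + A entry by entry
print(np.array_equal(A + B, B + A))  # True
```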
00:02:32.000 --> 00:02:42.000
B, our second property...
00:02:42.000 --> 00:03:01.000
... Is the associativity property, so it says that if I add B and C, and then add A to it, I can actually do it in a different order: I can add A and B first and then add C to it. Again, very similar to what you understand with the real numbers; it's perfectly true for matrices as well.
00:03:01.000 --> 00:03:10.000
Yeah, okay, I am going to introduce a new symbol here; it's a little bit of shorthand, it's a reversed E with an exclamation point, and it means "there exists a unique".
00:03:10.000 --> 00:03:17.000
that means there is 1, so there is a unique matrix...
00:03:17.000 --> 00:03:27.000
... 0, and I am going to put a little bracket around it to make sure it's clear: this is the symbol for the 0 matrix, which consists of all entries that are 0.
00:03:27.000 --> 00:03:45.000
There exists a unique matrix 0, such that A + 0 matrix equals A, and this serves the same function in matrices as the 0 does in numbers, so in other words 5 + 0 is still 5.
00:03:45.000 --> 00:04:05.000
This is called the additive identity: additive because we are dealing with addition, identity because it leaves A the same, A over here on the left side of the equality sign, A over here on the right side of the equality sign, so it is called the additive identity...
00:04:05.000 --> 00:04:21.000
... Or more formally, these three lines here mean "is equivalent to"; we just call it the 0 matrix. It's probably more appropriate, and we will more often be referring to it, as just the 0 matrix or something like that instead of the additive identity, okay.
00:04:21.000 --> 00:04:34.000
And our final one; okay, once again our little symbol: there exists a unique M by N matrix...
00:04:34.000 --> 00:04:49.000
... We will call it D, such that, let's put it over here, A + D = the 0 matrix....
00:04:49.000 --> 00:05:03.000
We call D "-A", so we refer to D as -A; in other words it's just the reverse, so if I have the number 7, well, -7: 7 + -7 = 0.
00:05:03.000 --> 00:05:18.000
It gives me the additive identity for the real numbers; the same way, if I have some matrix A and I add -A to it, I end up with the 0 matrix. Makes complete sense, and again we are adding corresponding entries, okay.
00:05:18.000 --> 00:05:25.000
Let's do an example...
00:05:25.000 --> 00:05:43.000
... Let's let A = (1, -2, -3, 0, 8 and 7), okay, and we will let -A, which is just the same as -1 times A, well, just reverse everything.
00:05:43.000 --> 00:05:54.000
We have (-1, 2, 3) and then we have (0, -8, -7).
00:05:54.000 --> 00:06:17.000
When we do A + -A, we add corresponding entries 1 + -1 = 0, -2 + 2 = 0, remember we have already taken care of this negative sign in the entries of the matrix, so I don't have to go -2 - -2, I have already taken care of it.
00:06:17.000 --> 00:06:36.000
This -A is just a symbol, these are the entries, so we have -2 and 2 is 0, we have -3 and 3 is 0, and then we go on to form the 2 by 3 0 matrix, there you go.
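The additive-inverse example above is easy to mirror in code; a small NumPy sketch using the same matrix A from the lecture:

```python
import numpy as np

# The matrix A from the example above
A = np.array([[1, -2, -3],
              [0,  8,  7]])

neg_A = -1 * A       # -A is just (-1) times A: every entry is negated
zero = A + neg_A     # add corresponding entries

print(zero)                                    # the 2 x 3 zero matrix
print(np.array_equal(zero, np.zeros((2, 3))))  # True
```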
00:06:36.000 --> 00:06:50.000
Now we are going to move on to the properties of multiplication, let me go back to my black ink here...
00:06:50.000 --> 00:07:15.000
... Okay, once again, A, B, C are matrices of appropriate size; and remember, when we deal with matrix multiplication, "appropriate size" matters when we are multiplying two matrices.
00:07:15.000 --> 00:07:29.000
Let's say, for example (I'll put "e.g." for "for example"), if I have a 3 by 2 matrix, I have to multiply it by a 2 by some-other-number matrix.
00:07:29.000 --> 00:07:47.000
These inside dimensions have to match in order for the multiplication to make sense, and my final matrix is given by the outside dimensions, a 3 by whatever; so this is what we mean by appropriate, and again we are presuming that we can do the multiplication, that the multiplication is defined, in other words.
00:07:47.000 --> 00:08:07.000
A, B and C are matrices of appropriate size, then we have A times BC quantity = AB quantity times C, again this is the associative property for multiplication now, I can multiply B times C then multiply by A.
00:08:07.000 --> 00:08:15.000
It's the same thing as doing A times B first and then multiplying by C, I get the same answer, same as for real numbers.
00:08:15.000 --> 00:08:35.000
B, A times quantity B + C; well, it's exactly what you think it is. This is the distributive property for matrices, and it is equal to A times B + A times C. In other words, I am treating these just like numbers, except they're matrices.
00:08:35.000 --> 00:08:49.000
I am going to reiterate that fact over, and over and over and over again, and later on when we talk about precise definitions of linearity, when we talk about linear maps in more of the abstract, it will actually make a lot more sense as it turns out.
00:08:49.000 --> 00:09:00.000
Matrices and the numbers are actually examples of deeper structures; there is something called a group, which those of you that go on to higher mathematics will discuss, a very beautiful area of mathematics.
00:09:00.000 --> 00:09:21.000
And the nice thing is that you only have to prove it once for one mathematical object called a group, and then you just sort of check to see whether these objects that you run across in your studies fulfill a certain handful of axioms; and then, since you have already done all the work, you don't have to do all the work of proving all the theorems for them again.
00:09:21.000 --> 00:09:42.000
C, quantity A + B, times C, = AC + BC; that's just the distributive property on the other side, okay. Let's do an example here, let's take...
00:09:42.000 --> 00:10:07.000
... A = (5, 2, 3, 2, -3, 4), a 2 by 3, okay, and we'll take B = (2, -1, 1, 0, 0, 2, 2, 2, 3, 0, -1, 3), a 3 by 4.
00:10:07.000 --> 00:10:27.000
Okay, and let's take C, also a little bit of a big matrix here, a 4 by 3: (1, 0, 2, 2, -3, 0, 0, 0, 3 and 2, 1, 0).
00:10:27.000 --> 00:10:38.000
Now I have to warn you from time to time when I write my matrices, I may actually skip these brackets over here, sometimes I'll just write them as a square array of numbers, it's a personal choice on my part.
00:10:38.000 --> 00:10:49.000
As long as the numbers are associated, there should be no confusion; so in case you see that I have forgotten the brackets, it's a completely notational thing. You can actually do mathematics any way that you want to.
00:10:49.000 --> 00:11:00.000
You can use symbols, you can use words; it's the underlying mathematics that's important, not the symbolic representation, of course presuming that the symbolic representation is understandable to somebody else.
00:11:00.000 --> 00:11:15.000
Okay, so we have our three matrices here, and let's go ahead and move forward; we are going to calculate A times BC. Now, I am presuming that you are reasonably familiar with matrix multiplication.
00:11:15.000 --> 00:11:36.000
We will go ahead and just write it out: when we do B times C first, and then multiply by A on the left hand side, we end up with the following matrix, (43, 16, 56, 12, 30 and 8).
00:11:36.000 --> 00:11:52.000
Now, when we do AB first, and then multiply by C on the right hand side, well as it turns out we get ( 43, 16, 56, 12, 30 and 8).
00:11:52.000 --> 00:12:00.000
Now, since this is mostly just arithmetic and adding numbers, I suggest that you actually go ahead and run through this to confirm that it is true; it is true.
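If you would rather run that confirmation on a computer than by hand, here is a NumPy sketch of the associativity check, reading A as 2 by 3, B as 3 by 4, and C as 4 by 3:

```python
import numpy as np

A = np.array([[5,  2, 3],
              [2, -3, 4]])
B = np.array([[2, -1,  1, 0],
              [0,  2,  2, 2],
              [3,  0, -1, 3]])
C = np.array([[1,  0, 2],
              [2, -3, 0],
              [0,  0, 3],
              [2,  1, 0]])

left  = A @ (B @ C)   # multiply BC first, then A on the left
right = (A @ B) @ C   # multiply AB first, then C on the right

print(left)
# [[43 16 56]
#  [12 30  8]]
print(np.array_equal(left, right))  # True: associativity holds
```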
00:12:00.000 --> 00:12:09.000
One thing that you should note here, notice we did BC here, and A on the left hand side.
00:12:09.000 --> 00:12:13.000
Here we did AB and we did C on the right hand side.
00:12:13.000 --> 00:12:23.000
If you notice, when we were writing down the properties for matrix multiplication, we did not say A times B = B times A, the way we did for addition.
00:12:23.000 --> 00:12:43.000
Addition of matrices commutes: A + B = B + A. However, for multiplication of matrices, A times B is not equal to B times A; and generally speaking, consider an M by P matrix times a P by N matrix.
00:12:43.000 --> 00:12:50.000
M by P, P by N, these have to be the same, what you end up with is an M by N matrix.
00:12:50.000 --> 00:13:05.000
Well, if you reverse those two, if you take the P by N matrix and try to multiply it by the M by P matrix, now N and M are not the same.
00:13:05.000 --> 00:13:20.000
In this particular case, the multiplication is not even defined. Now, is it possible that if you switch two matrices, you may actually end up getting something that is equal? It's possible if you are dealing with square matrices, but it's not the general case.
00:13:20.000 --> 00:13:25.000
Commutativity of multiplication does not hold; that's a very unusual property.
00:13:25.000 --> 00:13:35.000
This is one of the properties that's different for matrices than it is for numbers: 5 times 6 = 6 times 5 for numbers, but that's not the case for matrix multiplication.
00:13:35.000 --> 00:13:51.000
Remember: commutativity, I am sorry, multiplication of matrices does not commute, so multiplication of matrices...
00:13:51.000 --> 00:14:00.000
... Does not commute, very important, it has profound consequences which we will deal with later on.
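A tiny NumPy sketch makes the failure of commutativity concrete (the two square matrices are arbitrary choices):

```python
import numpy as np

# Two square matrices, chosen arbitrarily so both products are defined
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(A @ B)   # [[2 1], [4 3]]  -- columns of A swapped
print(B @ A)   # [[3 4], [1 2]]  -- rows of A swapped
print(np.array_equal(A @ B, B @ A))  # False: AB and BA differ
```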
00:14:00.000 --> 00:14:15.000
Okay, let's move on to some other definitions and properties that are associated with multiplication, so let us define an identity matrix.
00:14:15.000 --> 00:14:27.000
Earlier we had talked about a 0 matrix, which is just a matrix with all 0 entries, and we said that it was the additive identity, in other words if you add it to any other matrix, nothing changes, you just get the matrix back.
00:14:27.000 --> 00:14:44.000
In this case an identity matrix is an N by N matrix, so it is square, same number of rows as columns, where all the entries on the main diagonal are 1; so an example of the symbolism is something like this.
00:14:44.000 --> 00:14:53.000
The identity matrix for 2 is a 2 by 2 matrix, (1, 0, 0, 1), this is the main diagonal right here from top left to bottom right.
00:14:53.000 --> 00:15:12.000
This is the skew diagonal; this is the main diagonal. The identity matrix for dimension four is exactly what you think it is: (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1).
00:15:12.000 --> 00:15:24.000
It's a 4 by 4; on the main diagonal, top left to bottom right, everything is a 1, so that's just the standard identity matrix, okay.
00:15:24.000 --> 00:15:30.000
Now...
00:15:30.000 --> 00:15:37.000
... Let A be a matrix....
00:15:37.000 --> 00:15:52.000
... The identity matrix of dimension M, times A; okay, well, first we should actually specify A: let it be a matrix of dimension M by N, so we have an M by N matrix.
00:15:52.000 --> 00:16:08.000
Then the identity matrix of dimension M multiplied by A on the left is the same as A, and is the same as A times the identity matrix of dimension N on the right; and again this has to do with matrix multiplication.
00:16:08.000 --> 00:16:20.000
It needs to be defined, this identity matrix is M by M, A is M by N, therefore the M's cancel and you are left with an M by N matrix and the same thing here.
00:16:20.000 --> 00:16:40.000
A is M by N, let me write it out, M by N; the identity matrix here is N by N; these go away, leaving you an M by N matrix. So when you are multiplying by identity matrices, commutativity can hold.
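The identity-matrix property can be checked like this in NumPy (A here is an arbitrary 2 by 3 matrix, so the left identity must be 2 by 2 and the right identity 3 by 3):

```python
import numpy as np

A = np.array([[1,  3, 2],
              [2, -1, 3]])     # an arbitrary 2 x 3 matrix

I2 = np.eye(2, dtype=int)      # identity on the left: 2 x 2
I3 = np.eye(3, dtype=int)      # identity on the right: 3 x 3

print(np.array_equal(I2 @ A, A))  # True: I2 A = A
print(np.array_equal(A @ I3, A))  # True: A I3 = A
```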
00:16:40.000 --> 00:16:48.000
Next property: let's say we have the matrix A raised to some P power, where P is an integer, 2, 3, 4, 5, 6, 7, the natural numbers.
00:16:48.000 --> 00:16:59.000
That's the same as just A times A times A, just P times...
00:16:59.000 --> 00:17:08.000
... Same thing as for numbers: 4³ is 4 times 4 times 4; the matrix A³ is just A times A times A.
00:17:08.000 --> 00:17:18.000
Again provided that it's defined and in this case A is going to have to be a square matrix, okay.
00:17:18.000 --> 00:17:33.000
Very important definition, just like the definition for numbers: any matrix to the 0 power is equal to the identity matrix, just like, in order to make the mathematics work, we define any number to the 0th power as 1.
00:17:33.000 --> 00:17:41.000
10 to the 0th power is one, 30 to the 0th power is 1, π to the 0th power is 1, Okay.
00:17:41.000 --> 00:18:00.000
A^P times A^Q works in the exact same way: it's A^(P+Q). You just add the exponents; so if you have A² times A³, that's going to be A to the 5th: A times A times A times A times A.
00:18:00.000 --> 00:18:12.000
Just like numbers, there is nothing new here, we are just dealing with matrices, and our last one is A to the P, to the Q, well you know how to handle this with the numbers.
00:18:12.000 --> 00:18:24.000
You just multiply P and Q, same thing: A to the PQ. So if you have (A²)³, that's going to be A⁶: A^(2 times 3) = A⁶.
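These exponent rules can be spot-checked with NumPy's matrix_power (A here is an arbitrary square matrix, since powers require a square matrix):

```python
import numpy as np

A = np.array([[2, 1],
              [0, 3]])   # an arbitrary 2 x 2 matrix

# A^0 is defined to be the identity matrix
print(np.linalg.matrix_power(A, 0))   # the 2 x 2 identity

# A^2 A^3 = A^(2+3) = A^5
lhs = np.linalg.matrix_power(A, 2) @ np.linalg.matrix_power(A, 3)
rhs = np.linalg.matrix_power(A, 5)
print(np.array_equal(lhs, rhs))       # True

# (A^2)^3 = A^(2*3) = A^6
lhs2 = np.linalg.matrix_power(np.linalg.matrix_power(A, 2), 3)
rhs2 = np.linalg.matrix_power(A, 6)
print(np.array_equal(lhs2, rhs2))     # True
```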
00:18:24.000 --> 00:18:38.000
There's nothing new here; you are just dealing with matrices instead of numbers, okay....
00:18:38.000 --> 00:18:55.000
... A couple of things to note and be aware of: if I have A times B, raised to the P power; now, with numbers, you are used to seeing, let's say, 5 times 2, squared; well, let me write it out for you.
00:18:55.000 --> 00:19:09.000
(5 times 2)²: you can go 5² times 2²; in some sense you are sort of distributing this 2 over the product. With matrices, it doesn't work that way.
00:19:09.000 --> 00:19:19.000
It does not equal A to the P times B to the P, and again it has to do with the commutativity property, so this is not generally true.
00:19:19.000 --> 00:19:43.000
It's another property that's not true, when you multiply two matrices and you raise it to some exponential power, it is not the same as raising each one to the exponential power and multiplying them together, that doesn't work with matrices, and again it has some profound consequences for linear systems and for other deeper properties of linear algebraic systems, okay.
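A quick NumPy sketch of this failure, with arbitrary 2 by 2 matrices; (AB)² is ABAB, and the middle B and A cannot be swapped:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

lhs = np.linalg.matrix_power(A @ B, 2)   # (AB)^2 = ABAB
rhs = np.linalg.matrix_power(A, 2) @ np.linalg.matrix_power(B, 2)  # A^2 B^2

print(lhs)
print(rhs)
print(np.array_equal(lhs, rhs))  # False: (AB)^2 != A^2 B^2 in general
```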
00:19:43.000 --> 00:20:06.000
Now, you know that if A times B, if I take two numbers and multiply them and I end up with 0, for A and B members of the real number line (this is the symbol for "is a member of", and this little R with two little bars means the real numbers, just the numbers that you know: positive, negative, everything in between, rational, irrational)...
00:20:06.000 --> 00:20:20.000
Well, this implies that either A = 0 or B = 0; so again, I know from numbers that if I multiply two numbers together and I get 0, one of them has to be 0.
00:20:20.000 --> 00:20:41.000
Let's do an example here to show you that it's not true for matrices: if I take A = (1, 2, 2, 4), and if I take B = (4, -6, -2 and 3), and these are 2 by 2, then AB....
00:20:41.000 --> 00:20:49.000
... I get the 0 matrix, however notice neither A nor B is 0, so again for matrices, it doesn't hold.
00:20:49.000 --> 00:20:59.000
You can multiply two matrices and get the zero matrix; that doesn't mean that one of them is equal to 0. It's not the same, so keep track of these things that are actually different from the real numbers.
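Here is that zero-divisor example as a NumPy check; note that the product only vanishes when B's top-right entry is read as -6, which is how the arithmetic works out:

```python
import numpy as np

# Both matrices are nonzero, yet their product is the zero matrix
A = np.array([[1, 2],
              [2, 4]])
B = np.array([[ 4, -6],
              [-2,  3]])

print(A @ B)
# [[0 0]
#  [0 0]]
```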
00:20:59.000 --> 00:21:12.000
Commutativity, and this property right here, this property where two matrices can both be non-zero and yet multiply to a 0 matrix.
00:21:12.000 --> 00:21:28.000
Another thing that's different is, well, you know the law of cancellation: if I have numbers A, B, C, and A times B = A times C, I just know naturally that I can cancel the A.
00:21:28.000 --> 00:21:52.000
Cancellation property, it implies that B = C, well let's take a matrix, let's do (1, 2, 2, 4) again, let's take B = (2, 1, 3, 2) and let's take a third matrix (-2, 7, 5 and -1).
00:21:52.000 --> 00:22:00.000
Now, if I multiply...
00:22:00.000 --> 00:22:14.000
... If I multiply AB, excuse me, and if I multiply AC, I get the same matrix: I get (8, 5, 16, 10).
00:22:14.000 --> 00:22:22.000
Notice AB, AC but...
00:22:22.000 --> 00:22:42.000
... B and C are not equal, so you can multiply a matrix A by a second matrix B, you can multiply that same matrix A by a third matrix C, you might end up with the same matrix, however that doesn't imply that the second matrix and the third matrix are the same.
00:22:42.000 --> 00:22:55.000
Another property that's different with matrices that does not follow the real numbers, there is no cancellation property...
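The failed-cancellation example can be checked the same way in NumPy, using the three matrices above:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 4]])
B = np.array([[2, 1],
              [3, 2]])
C = np.array([[-2, 7],
              [ 5, -1]])

# AB and AC come out identical, even though B and C differ,
# so A cannot simply be "canceled" from both sides
print(np.array_equal(A @ B, A @ C))  # True: both equal [[8, 5], [16, 10]]
print(np.array_equal(B, C))          # False
```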
00:22:55.000 --> 00:23:06.000
... Let's move forward. Now we are going to be talking about some properties of scalar multiplication; I'll just go ahead and list these out. We won't necessarily do examples of these because they are reasonably straightforward.
00:23:06.000 --> 00:23:21.000
We are just talking about multiplying a matrix by a number, which means multiplying each entry of the matrix by that number; straight, simple arithmetic, no worries, okay.
00:23:21.000 --> 00:23:41.000
We will let R and S, be real numbers and A and B matrices...
00:23:41.000 --> 00:23:58.000
... First property: R times quantity S times A is equal to quantity RS, times A. This says that if I have multiplied a matrix by some scalar S, and then I multiply by another scalar R, I can pretty much just rearrange the multiplication.
00:23:58.000 --> 00:24:09.000
I can multiply the two scalars together, and then multiply by the matrix, pretty intuitive for the most part, nothing strange happening here.
00:24:09.000 --> 00:24:20.000
B, if I have two scalars that I add together, like (5 + 2), times some matrix A, that's equal to... the distributive property is at work here.
00:24:20.000 --> 00:24:33.000
RA + SA; again, nothing particularly strange happening here, we are just listing the properties so that we have some formality to refer to.
00:24:33.000 --> 00:24:50.000
R times (A + B), If I add two matrices together and then multiply by a scalar, same thing, RA + RB, the scalar distributes over both matrices.
00:24:50.000 --> 00:24:59.000
D,...
00:24:59.000 --> 00:25:07.000
... If I have multiplied a scalar times some matrix, and then I decide to multiply by another matrix, I can actually rearrange those.
00:25:07.000 --> 00:25:15.000
I can multiply the two matrices together and then multiply by the scalar, again these are just ways of manipulating things that we might run across.
00:25:15.000 --> 00:25:23.000
You'll do them intuitively, you don’t necessarily have to refer to these, you already know that these are true, you just go ahead and use them.
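All four scalar-multiplication properties can be spot-checked at once; here is a NumPy sketch with arbitrary scalars and matrices:

```python
import numpy as np

r, s = 3, -2                     # arbitrary scalars
A = np.array([[1, 0],
              [2, 5]])
B = np.array([[4, 1],
              [0, -3]])

print(np.array_equal(r * (s * A), (r * s) * A))    # True: r(sA) = (rs)A
print(np.array_equal((r + s) * A, r * A + s * A))  # True: (r+s)A = rA + sA
print(np.array_equal(r * (A + B), r * A + r * B))  # True: r(A+B) = rA + rB
print(np.array_equal(A @ (r * B), r * (A @ B)))    # True: A(rB) = r(AB)
```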
00:25:23.000 --> 00:25:31.000
Okay, so again we don't have to worry about examples here, they are just arithmetic.
00:25:31.000 --> 00:25:45.000
Properties of the transpose; okay, transpose, very important concept. We introduced the transpose before; you remember, the transpose is where you take the columns and the rows of a matrix and you switch them.
00:25:45.000 --> 00:26:00.000
Just as a quick example, let's say we have a matrix with rows (2, 3), (3, 2), (4, 1), something like that; we go ahead and we flip it along the main diagonal.
00:26:00.000 --> 00:26:21.000
Everything that's a row becomes a column, everything that's a column becomes a row; so under transposition, this becomes (2, 3, 4), (3, 2, 1). What I have done is I have read down each column and written it from left to right.
00:26:21.000 --> 00:26:31.000
(2, 3), that's the first row; (3, 2), that's the second row; (4, 1), the third; all I have done is flip it along the main diagonal.
00:26:31.000 --> 00:26:48.000
Okay, so now the properties: we will let R be a member of the real numbers, let's use a little bit of formal mathematics here, and A and B...
00:26:48.000 --> 00:27:01.000
... Are matrices. Then we have: if I have already taken the transpose of A and then take the transpose again, I get A back; it's exactly what you think it is.
00:27:01.000 --> 00:27:15.000
If I go this way, and then if I take the transpose again and I go back that way, I end up with my original matrix A, completely intuitive.
00:27:15.000 --> 00:27:31.000
Okay, if I add A and B and then take the transpose, I can take the transpose of A and add it to the transpose of B, so A + B transpose, is the same as A transpose + B transpose.
00:27:31.000 --> 00:27:45.000
This property is very interesting, something that you could probably expect intuitively, but as it turns out this property becomes the foundation for the definition of a linear function, which we will be more precise about later on.
00:27:45.000 --> 00:27:54.000
There is actually no a priori reason for believing that these should be true, so they have to be proved; in fact, all of these properties need to be proven.
00:27:54.000 --> 00:28:01.000
We won't worry about the proofs, they are all reasonably self-evident, but this one is kind of interesting in and of itself.
00:28:01.000 --> 00:28:18.000
Okay, this next one is very curious: if I multiply two matrices, A times B, and then take the transpose, I don't get A transpose times B transpose; what I get is B transpose times A transpose.
00:28:18.000 --> 00:28:29.000
This one is a very important property, I'll put a couple of stars by it; I mean, all of the properties are important, but this is the one that's most unusual, and it probably trips up students the most.
00:28:29.000 --> 00:28:42.000
When you multiply two matrices and then take the transpose, it's not the same as up here: notice, here with A + B, the order is retained, A transpose first, then B transpose; down here the order is not retained, the order is reversed.
00:28:42.000 --> 00:28:53.000
You have to take the transpose of B first and then multiply by the transpose of A; that's what makes the equality hold.
00:28:53.000 --> 00:29:17.000
And then of course our final one: we have some scalar multiplied by a matrix A; if we take the transpose, that's just the same as the scalar times the transpose of A. So you can take the transpose first and then multiply, as opposed to multiplying first and then taking the transpose, which is why we have the parentheses, okay.
00:29:17.000 --> 00:29:33.000
Most of you who are taking a look at this video, will probably have some sort of a book that you are referring to, I would certainly recommend taking a look at a proof of this property, property C, it's not a difficult proof, all you are doing is...
00:29:33.000 --> 00:29:44.000
... You are literally just taking the entries of A, multiplying by the entries of B, and sort of doing it longhand, so it is somewhat tedious, if I could use that word, but it isn't difficult.
00:29:44.000 --> 00:30:00.000
It's not completely self-evident, and it's a little bit of a surprise when you actually end up with this; that's the interesting thing about mathematics: things show up in the process of proof that your intuition would not lead you to believe are true, which is why we can't just rely on our intuition.
00:30:00.000 --> 00:30:18.000
In mathematics, past a certain point, we have to rely on proof; it's important, and it will become very evident in linear algebra when we discover some very deep, fundamental properties, not just of mathematical structures, but of nature itself, that one would not even believe could be true, and yet they are.
00:30:18.000 --> 00:30:26.000
There are statements about the nature of reality that you would never even think, for a million years, might actually be true.
00:30:26.000 --> 00:30:50.000
Okay, let's do a couple of examples, let's take a matrix A, and we will set it equal to (1, 3, 2, 2, -1, 3) and again forgive me for not putting the brackets around it, and I have a matrix B which is (0, 1, 2, 2, 3, -1).
00:30:50.000 --> 00:31:06.000
Okay, now we want to ask ourselves: we are going to take A times B, and we are going to take the transpose; we want to confirm that it's actually equal to B transpose times A transpose.
00:31:06.000 --> 00:31:17.000
What we are doing here is not a proof, what we are doing here is a confirmation, they are very different, confirmation just confirms something true for a specific case, a proof is true generally.
00:31:17.000 --> 00:31:29.000
Okay, when we multiply AB, okay, I'll let you do the multiplication, this is a 2 by 3, this is a 3 by 2, so we are going to end up with a 2 by 2 matrix.
00:31:29.000 --> 00:31:49.000
Again, it's just arithmetic, so I'll let you guys take care of that: (12, 5, 7, -3). And then we will go ahead and subject this to transposition; what we end up with is (12, 7, 5, -3).
00:31:49.000 --> 00:32:03.000
Notice, since this is a square matrix, the entries along the main diagonal did not change; everything else just switched positions, 7 went to 5's place, 5 went to 7's place; so that's the left hand side, (12, 7, 5, -3).
00:32:03.000 --> 00:32:20.000
Now, if I take the transpose of B: well, B is 3 by 2, so if I take the transpose of B, I end up with a 2 by 3 matrix.
00:32:20.000 --> 00:32:29.000
If I take A and take the transpose of A, A is a 2 by 3, so I get a 3 by 2 matrix.
00:32:29.000 --> 00:32:41.000
When I multiply B transpose times A transpose, a 2 by 3 times a 3 by 2, I get a 2 by 2, so the size matches.
00:32:41.000 --> 00:32:56.000
Now we want to see if the entries match; so, as far as the dimensions are concerned, the product is defined, and when we do the multiplication, sure enough, we end up with (12, 7, 5, -3).
00:32:56.000 --> 00:33:11.000
This confirms the fact that, let's go back up here, A times B, multiplied first and then transposed, is the same as taking the transpose of B first and multiplying it on the left of A transpose.
00:33:11.000 --> 00:33:20.000
Again, this is not random, the order must be maintained in matrix multiplication because matrix multiplication does not commute.
00:33:20.000 --> 00:33:24.000
In fact it's often not even defined.
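Here is the transpose confirmation from the example, done in NumPy with the same A, and reading B as the 3 by 2 matrix with rows (0, 1), (2, 2), (3, -1):

```python
import numpy as np

A = np.array([[1,  3, 2],
              [2, -1, 3]])
B = np.array([[0,  1],
              [2,  2],
              [3, -1]])

print((A @ B).T)
# [[12  7]
#  [ 5 -3]]
print(np.array_equal((A @ B).T, B.T @ A.T))  # True: the order reverses
# Note: A.T @ B.T is a (3 x 2)(2 x 3) product here -- defined, but it is
# a different 3 x 3 matrix altogether, not (AB) transpose
```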
00:33:24.000 --> 00:33:41.000
Okay, let's take a look at what we have done here today, we have talked about properties of matrix addition, so very similar to properties for real numbers, matrix A + B = B + A, matrix addition commutes.
00:33:41.000 --> 00:34:02.000
Quantity A + B, + C equals A + quantity B + C; matrix addition is associative. There exists a unique M by N matrix 0 such that A + the 0 matrix equals the 0 matrix + A, and again we can write it either way because addition is commutative, and it is equal to...
00:34:02.000 --> 00:34:18.000
... Oops, my apologies here, this should be A, so A + 0 matrix leaves it unchanged, so this is called the 0 matrix or the additive identity, we will call it the 0 matrix most of the time.
00:34:18.000 --> 00:34:28.000
For every matrix A, there exists a unique matrix D such that when you add D to A, in other words A + D, you get the 0 matrix.
00:34:28.000 --> 00:34:41.000
We will write D as -A, which is exactly what it is: 7, -7; matrix A, matrix -A; and call it the additive inverse, or we will just refer to it as the negative of A.
00:34:41.000 --> 00:34:52.000
Things like additive inverse, additive identity, these are formal terms that you will talk about, for those of you that go on to study abstract algebra: group theory, ring theory, field theory, things like that.
00:34:52.000 --> 00:34:59.000
Beautiful area of mathematics, by the way; my personal favorite.
00:34:59.000 --> 00:35:14.000
Okay, we talked about properties of matrix multiplication: A times quantity B times C equals quantity A times B, times C; this is the associativity property of multiplication. Notice there is nothing here about commutativity.
00:35:14.000 --> 00:35:21.000
Sorry to keep hammering the point, I know you are probably getting sick of it, but matrix multiplication does not commute; it's amazing.
00:35:21.000 --> 00:35:32.000
How many times do you actually have to mention that? And yet we are in such a habit of, you know, commuting our multiplication that we don't even think twice about it; we have to catch ourselves, and that's why I keep mentioning it.
00:35:32.000 --> 00:35:45.000
The distributive property is in effect: A times quantity B + C is AB + AC, and quantity A + B, times C, is AC + BC.
00:35:45.000 --> 00:35:54.000
Now, when dealing with a square matrix, N by N, same number of rows and columns, we define any matrix to the 0 power to be equal to the identity matrix.
00:35:54.000 --> 00:36:00.000
And again the identity matrix is just that square matrix, where everything on the main diagonal is a 1.
00:36:00.000 --> 00:36:14.000
Think of it as the "one" matrix: multiplying by it is like multiplying by 1. A^P, where P is an integer, is just A multiplied by itself P times.
00:36:14.000 --> 00:36:29.000
We have A^P times A^Q; same as with numbers, you raise A to the sum P + Q; and A to the P, raised to the Qth power: you just multiply P and Q.
00:36:29.000 --> 00:36:37.000
Again, these are all analogous to everything that you do with numbers.
00:36:37.000 --> 00:36:52.000
Scalar multiplication: R and S are real numbers, A and B are matrices. R times quantity SA equals quantity RS, times A; quantity R + S, times A, equals RA + SA; distribution.
00:36:52.000 --> 00:37:05.000
Same thing the other way around: R times quantity A + B, for matrices, is RA + RB; and A times quantity RB equals R times quantity AB.
00:37:05.000 --> 00:37:15.000
We covered a lot of properties, but again, most of them are the same; there are only a couple of differences. Okay, let's...
00:37:15.000 --> 00:37:20.000
... Oh, sorry, let's take a look at the properties of the transpose, probably the most important.
00:37:20.000 --> 00:37:29.000
Okay, let R be a scalar, A and B matrices: A transpose, transposed again, just recovers A.
00:37:29.000 --> 00:37:36.000
A + B quantity transpose is equal to A transpose + B transpose, order is retained.
00:37:36.000 --> 00:37:46.000
A times B transpose = B transpose times A transpose, order is not retained.
00:37:46.000 --> 00:37:49.000
Highlight that one.
00:37:49.000 --> 00:38:01.000
Some number times A, then take the transpose is equal to just taking the transpose of A and then multiplying by that number okay.
00:38:01.000 --> 00:38:07.000
Let's see, let's define...
00:38:07.000 --> 00:38:21.000
(2, 1, 3, -1, 2, 4, 3, 1, 0), that's our matrix C...
00:38:21.000 --> 00:38:28.000
... (1, 1, 2); okay, let me make my E a little bit more clear here, instead of just a couple of random lines.
00:38:28.000 --> 00:38:57.000
(2, -1, 3, -3, 2, 1), and have that. We want to find quantity 3C - 2E, transpose; just a combination of things: we want to multiply C by 3, we want to subtract twice E, and then we want to take the transpose of that matrix, if it's defined, okay.
00:38:57.000 --> 00:39:13.000
Well, we know that if we have the transpose of some quantity, we can deal with it this way, we can take the transpose individually....
00:39:13.000 --> 00:39:29.000
... And we know that when we have a number times a matrix transpose, we can go ahead and just take the transpose of the matrix first and then multiply by the number.
00:39:29.000 --> 00:39:33.000
Transpose...
00:39:33.000 --> 00:39:46.000
... Okay, when we go ahead and do 3, C transpose, minus 2, E transpose: okay, these are 3 by 3 and 3 by 3.
00:39:46.000 --> 00:40:10.000
It's going to end up as a 3 by 3, so let's just go ahead and do it here. C transpose: again, we are flipping it along the main diagonal, so we will end up with (2, -1, 3), (1, 2, 1) and then (3, 4, 0), right.
00:40:10.000 --> 00:40:34.000
And then if we multiply that by 3, we will multiply each of these entries by 3, we end up with (6, -3, 9, 3, 6, 3) 3 times 3 is 9, 3 times 4 is 12, 3 times 0 is 0.
00:40:34.000 --> 00:40:39.000
Okay, so this is our...
00:40:39.000 --> 00:40:59.000
... 3 C transpose. And now if we take E transpose, we will get (1, 2, -3), I am just taking the columns and switching them to rows, and I have (1, -1, 2), and I have (2, 3, 1); so I have taken the transpose.
00:40:59.000 --> 00:41:10.000
Now I multiply that by -2, okay and I end up with...
00:41:10.000 --> 00:41:31.000
... Let's go: -2 times 1 is -2, -2 times 2 is -4, -2 times -3 is 6, -2 times 1 is -2, -2 times -1 is 2, and -2 times 2 is -4.
00:41:31.000 --> 00:41:42.000
Here we have -4 for this entry, then we have -6 for this entry and we have -2 for this entry.
00:41:42.000 --> 00:42:00.000
Notice what I have done here: this is 3 C transpose minus 2 E transpose; I could have just multiplied by 2 and then subtracted, but instead I treated the whole coefficient as a -2 and multiplied through to get this.
00:42:00.000 --> 00:42:08.000
Now, what I am doing is I am actually going to be adding these straight, okay.
00:42:08.000 --> 00:42:28.000
Okay, and the final matrix that we end up with once we add those two matrices together, we end up with (4, -7, 15, 1, 8, -1, 5, 6 and -2).
00:42:28.000 --> 00:43:00.000
This was quantity 3C - 2E, transpose; we treated it as 3 C transpose plus, actually, -2 times E transpose. Again, you can do it either way: you can multiply by 2 and then subtract, or you can treat the -2 as a whole coefficient and do the addition.
00:43:00.000 --> 00:43:05.000
You are doing the same thing; ultimately it's just about what's comfortable for you.
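The final example can be checked in NumPy as well, computing the quantity both ways, and reading E's rows as (1, 1, 2), (2, -1, 3), (-3, 2, 1) so that its transpose matches the one worked out above:

```python
import numpy as np

C = np.array([[ 2, 1, 3],
              [-1, 2, 4],
              [ 3, 1, 0]])
E = np.array([[ 1,  1, 2],
              [ 2, -1, 3],
              [-3,  2, 1]])

# (3C - 2E) transpose, computed two equivalent ways
direct   = (3 * C - 2 * E).T      # combine first, then transpose
termwise = 3 * C.T - 2 * E.T      # transpose each term, then combine

print(direct)
# [[ 4 -7 15]
#  [ 1  8 -1]
#  [ 5  6 -2]]
print(np.array_equal(direct, termwise))  # True
```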
00:43:05.000 --> 00:43:14.000
Okay, and that takes care of the properties of matrix multiplication, addition and transpose and scalar multiplication.
00:43:14.000 --> 00:43:17.000
Thank you for joining us at educator.com.