WEBVTT mathematics/probability/murray
00:00:00.000 --> 00:00:05.200
Hi, welcome back to the probability lectures here on www.educator.com, my name is Will Murray.
00:00:05.200 --> 00:00:09.700
We are going to talk today about moment generating functions.
00:00:09.700 --> 00:00:16.700
Moment generating functions are one of the most confusing topics that people encounter in probability.
00:00:16.700 --> 00:00:20.700
I'm going to try to walk you through them and show you what they are used for.
00:00:20.700 --> 00:00:25.900
You might prepare yourself to be a little confused at first because every time I have taught it,
00:00:25.900 --> 00:00:29.500
my students have always found them to be a little confusing.
00:00:29.500 --> 00:00:32.600
I will try to show you how it works.
00:00:32.600 --> 00:00:36.900
The initial idea I want to talk about is moments.
00:00:36.900 --> 00:00:40.500
We start with a random variable, and it can be discrete or continuous.
00:00:40.500 --> 00:00:45.900
We will talk about moment generating functions for all of the distributions that we have been studying,
00:00:45.900 --> 00:00:55.600
all of the discrete ones, binomial, geometric, and so on, and all of the continuous distributions, uniform and normal, and so on.
00:00:55.600 --> 00:01:03.500
We can talk about moments and we can talk about moment generating functions for all of these distributions.
00:01:03.500 --> 00:01:09.300
The first definition is the Kth moment of Y taken around the mean.
00:01:09.300 --> 00:01:10.600
Let me highlight that.
00:01:10.600 --> 00:01:18.700
The Kth moment of Y taken around the mean is just the expected value of Y^K.
00:01:18.700 --> 00:01:24.600
The Kth there can be 1, 2, 3, and it can be 0, although people do not usually need to look
00:01:24.600 --> 00:01:28.200
at the 0th moment because that is not very illuminating.
00:01:28.200 --> 00:01:30.700
I said mean, but I meant to say origin.
00:01:30.700 --> 00:01:33.600
We are also going to talk about moments around the mean.
00:01:33.600 --> 00:01:39.900
But, it is important here that we are talking about the moments around the origin.
00:01:39.900 --> 00:01:46.200
There is some notation that is sometimes used for this, which is μ sub K prime.
00:01:46.200 --> 00:01:50.700
It is really not obvious why we would use the notation μ sub K prime.
00:01:50.700 --> 00:01:55.200
I’m not going to use that notation in this lecture, but if you are following along
00:01:55.200 --> 00:02:00.900
in your own probability course or in your own probability book, you might see the notation μ sub K prime.
00:02:00.900 --> 00:02:06.000
What that means is the expected value of Y^K.
00:02:06.000 --> 00:02:08.300
Those mean the same thing.
00:02:08.300 --> 00:02:18.200
There is another notation that you might see in your book which is that idea of central moments.
00:02:18.200 --> 00:02:24.500
Instead of taking the moment around the origin, we will talk about taking the moment about the mean.
00:02:24.500 --> 00:02:35.100
Which means, instead of talking about Y^K, you use (Y − μ)^K, where μ is the mean of the original distribution.
00:02:35.100 --> 00:02:42.300
And that is called μ sub K and that is why we have to use μ sub K prime for the one that we are studying.
00:02:42.300 --> 00:02:46.600
I want to emphasize that there are 2 different ideas here.
00:02:46.600 --> 00:02:50.600
There is the moment around the origin and there is the moment around the mean.
00:02:50.600 --> 00:02:55.600
In this lecture, in the probability lectures here on www.educator.com,
00:02:55.600 --> 00:02:59.200
I'm just going to look at the moment taken around the origin.
00:02:59.200 --> 00:03:06.400
It is the more common one, and it is easier to understand the ideas for that one.
00:03:06.400 --> 00:03:09.300
I'm not going to talk anymore about the central moment.
00:03:09.300 --> 00:03:12.200
We will talk only about the moment around the origin.
00:03:12.200 --> 00:03:18.500
I just mention that, in case you see it in your course, you know what the difference is.
00:03:18.500 --> 00:03:22.600
You do not really need to study both of them, you can figure it out.
00:03:22.600 --> 00:03:27.800
If you know one, you can figure out the other one just by doing some computations.
00:03:27.800 --> 00:03:30.000
It is not necessary to study both of them.
00:03:30.000 --> 00:03:36.300
You pick a system and then you follow that, and you can find all the information you need within one system.
00:03:36.300 --> 00:03:40.200
The system we are going to use is the moments around the origin.
00:03:40.200 --> 00:03:43.700
I will not talk anymore about moments about the mean.
00:03:43.700 --> 00:03:47.600
I just included it, in case you see it in your course.
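The moments defined above can be computed directly for a small discrete distribution. Here is a minimal sketch (my own toy example, a fair six-sided die, not from the lecture) computing both the Kth moment about the origin, E(Y^K), and the Kth central moment, E((Y − μ)^K):

```python
# Raw moment about the origin: E[Y^k] = sum over y of y^k * P(Y = y).
def raw_moment(values, probs, k):
    return sum((y ** k) * p for y, p in zip(values, probs))

# Central moment: E[(Y - mu)^k], where mu is the mean of the distribution.
def central_moment(values, probs, k):
    mu = raw_moment(values, probs, 1)
    return sum(((y - mu) ** k) * p for y, p in zip(values, probs))

die_values = [1, 2, 3, 4, 5, 6]
die_probs = [1 / 6] * 6

mean = raw_moment(die_values, die_probs, 1)                 # 3.5
second_raw = raw_moment(die_values, die_probs, 2)           # E[Y^2] = 91/6
second_central = central_moment(die_values, die_probs, 2)   # the variance, 35/12
```

Note that the second central moment is exactly the variance, which is one reason the two systems of moments carry the same information.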
00:03:47.600 --> 00:03:54.800
This still has not told us what the moment generating function is, so let me jump on to the next slide and show you what that is.
00:03:54.800 --> 00:04:01.300
The moment generating function for Y is M sub Y of T.
00:04:01.300 --> 00:04:04.200
That := means it is defined to be.
00:04:04.200 --> 00:04:09.600
It is defined to be the expected value of E^(TY).
00:04:09.600 --> 00:04:15.300
That is not a very illuminating definition, but I do want to highlight it here
00:04:15.300 --> 00:04:22.500
because it is probably the most important definition we are going to have in this whole lecture.
00:04:22.500 --> 00:04:27.900
It is not obvious what it means right now, and I'm not going to clarify it right away.
00:04:27.900 --> 00:04:34.400
I’m just going to throw the definition at you and then we will practice using it to solve some problems.
00:04:34.400 --> 00:04:44.300
M sub Y of T, remember, is defined to be the expected value of E^(TY); that is E like the exponential function.
00:04:44.300 --> 00:04:49.000
The important thing that you need to remember right now is,
00:04:49.000 --> 00:05:03.000
the first one is that the moment generating function is a function of T not of Y.
00:05:03.000 --> 00:05:08.800
When you calculate the moment generating function for a distribution, you should have a function of T.
00:05:08.800 --> 00:05:10.600
You should see a T in your answer.
00:05:10.600 --> 00:05:13.800
By the time you simplify it down, you will not see a Y.
00:05:13.800 --> 00:05:17.100
We will do some examples and you will see how it works out.
00:05:17.100 --> 00:05:22.800
The Y always disappears; you always end up with a function of T.
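The definition can be checked by hand on a tiny example. This is my own sketch (a fair coin counted as Y = 0 or 1, not from the lecture): the MGF of a discrete variable is M_Y(t) = E(e^(tY)), a sum over y in which the y's are summed away, leaving a function of t only.

```python
import math

# M_Y(t) = E[e^{tY}] = sum over y of P(Y = y) * e^{t*y}.
def mgf(values, probs, t):
    return sum(p * math.exp(t * y) for y, p in zip(values, probs))

# A fair coin counted as Y = 0 or 1 (a Bernoulli(1/2) variable):
# M_Y(t) = 1/2 + (1/2) e^t, a function of t with no y in sight.
coin_values, coin_probs = [0, 1], [0.5, 0.5]
at_zero = mgf(coin_values, coin_probs, 0.0)  # always 1, since e^0 = 1
at_one = mgf(coin_values, coin_probs, 1.0)   # 1/2 + e/2
```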
00:05:22.800 --> 00:05:26.100
Here is how you use the moment generating function.
00:05:26.100 --> 00:05:36.900
Once you know it, this first line is kind of trivial, but I included it because it will make the other lines make more sense.
00:05:36.900 --> 00:05:47.300
The expected value of Y⁰ is equal to the moment generating function with 0 plugged in for T.
00:05:47.300 --> 00:05:54.700
The expected value of Y⁰, Y⁰ is always 1 because anything to the 0 is 1.
00:05:54.700 --> 00:05:58.800
That is the expected value of 1 which of course will be 1.
00:05:58.800 --> 00:06:03.100
It is not like we are really learning anything much from the moment generating function,
00:06:03.100 --> 00:06:07.600
because we already knew that the expected value of Y⁰ is 1.
00:06:07.600 --> 00:06:12.300
In the next line, the moment generating function starts to become useful.
00:06:12.300 --> 00:06:16.100
What you do is you take the derivative of the moment generating function.
00:06:16.100 --> 00:06:23.100
And again, you plug in 0 for T and what that tells you is the expected value for your distribution.
00:06:23.100 --> 00:06:31.200
Now we have something useful; we have used the moment generating function to find the mean of the distribution.
00:06:31.200 --> 00:06:38.400
In the next line, what we have done is take another derivative, M double prime.
00:06:38.400 --> 00:06:41.100
We plug in T is equal to 0.
00:06:41.100 --> 00:06:47.000
What that tells us is, the second moment of the distribution E of Y².
00:06:47.000 --> 00:06:54.400
Why is that useful, the reason that is useful is because it helps us to find the variance of the distribution.
00:06:54.400 --> 00:06:59.300
We can use this to find the variance.
00:06:59.300 --> 00:07:04.500
Be careful here, the variance is not the expected value of Y².
00:07:04.500 --> 00:07:07.700
Let me remind you how we calculate the variance.
00:07:07.700 --> 00:07:20.500
We calculate the variance as σ² = E(Y²) − (E(Y))², the expected value of Y² minus the square of the expected value of Y.
00:07:20.500 --> 00:07:26.600
If we can figure out these 2 moments using the moment generating function,
00:07:26.600 --> 00:07:35.600
what we can do is drop in the expected value of Y² here from the MGF.
00:07:35.600 --> 00:07:44.400
We can use the MGF, the moment generating function, to calculate the expected value of Y².
00:07:44.400 --> 00:07:53.400
We can also use the moment generating function to calculate the expected value of Y.
00:07:53.400 --> 00:07:59.900
Both of these ingredients that go into calculating the variance come from the moment generating function.
00:07:59.900 --> 00:08:07.300
That is how we use the moment generating function, is to find these two ingredients to calculate the variance.
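The two rules above, M′(0) = E(Y) and M″(0) = E(Y²), can be confirmed numerically. This is my own sketch (a fair die again, with finite differences standing in for the symbolic derivatives), not part of the lecture:

```python
import math

# M'(0) ~ E[Y] and M''(0) ~ E[Y^2], approximated with central finite
# differences; then Var(Y) = E[Y^2] - (E[Y])^2.
def mgf(values, probs, t):
    return sum(p * math.exp(t * y) for y, p in zip(values, probs))

die = ([1, 2, 3, 4, 5, 6], [1 / 6] * 6)
h = 1e-4  # step size for the finite differences

first_deriv = (mgf(*die, h) - mgf(*die, -h)) / (2 * h)                     # ~ E[Y] = 3.5
second_deriv = (mgf(*die, h) - 2 * mgf(*die, 0.0) + mgf(*die, -h)) / h**2  # ~ E[Y^2] = 91/6
variance = second_deriv - first_deriv ** 2                                 # ~ 35/12
```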
00:08:07.300 --> 00:08:11.800
There are other uses for the moment generating function, later on in statistics
00:08:11.800 --> 00:08:15.700
but I'm not going to get into them right away in this lecture.
00:08:15.700 --> 00:08:23.200
Instead, what I want to do is show you some of the moment generating functions for our favorite distributions.
00:08:23.200 --> 00:08:26.400
We will start with the discrete distribution.
00:08:26.400 --> 00:08:31.300
We have here all our favorite discrete distributions, binomial, geometric,
00:08:31.300 --> 00:08:37.500
negative binomial, hypergeometric, and the Poisson distribution.
00:08:37.500 --> 00:08:41.300
Here are what the moment generating functions turn out to be.
00:08:41.300 --> 00:08:46.900
For binomial, it is (PE^T + 1 − P)^N.
00:08:46.900 --> 00:08:55.500
By the way, the binomial distribution, we often define Q to be 1 – P.
00:08:55.500 --> 00:09:03.100
That term of 1 – P, people often write that as Q, and that simplifies the way
00:09:03.100 --> 00:09:07.200
to write the moment generating function somewhat.
00:09:07.200 --> 00:09:13.700
For the geometric distribution, PE^T / (1 − (1 − P)E^T).
00:09:13.700 --> 00:09:17.300
Again, there is a Q in there, that is equal to 1 – P.
00:09:17.300 --> 00:09:25.400
If you want to simplify this down, you can write this as PE^T / (1 − Q × E^T).
00:09:25.400 --> 00:09:29.200
A little bit simpler to write at the expense of having one more variable.
00:09:29.200 --> 00:09:37.100
The negative binomial distribution is almost the same thing, except that there is an R in the exponent.
00:09:37.100 --> 00:09:39.600
Almost the same as the geometric distribution.
00:09:39.600 --> 00:09:43.200
Again, you can put in a Q for 1 – P, if you like that.
00:09:43.200 --> 00:09:49.000
The hypergeometric distribution has no closed form moment generating function.
00:09:49.000 --> 00:09:54.900
If you try to calculate the moment generating function of a hypergeometric distribution, it just blows up in your face.
00:09:54.900 --> 00:09:57.600
There is no reason to go there, we would not go there.
00:09:57.600 --> 00:10:05.400
The Poisson distribution is much more well behaved; it is E^(λ × (E^T − 1)).
00:10:05.400 --> 00:10:09.900
A couple of things I want to mention about all of these: one is, you might be wondering where these come from,
00:10:09.900 --> 00:10:12.600
how do you calculate these moment generating functions.
00:10:12.600 --> 00:10:18.700
Stay tuned, I will tell you because we will work out a couple of these in the examples.
00:10:18.700 --> 00:10:22.600
Or you can just scroll down right now, if you are bursting with curiosity.
00:10:22.600 --> 00:10:30.900
Check out examples 1 and 3. In example 1, I think, we are going to do the binomial distribution.
00:10:30.900 --> 00:10:33.600
We will calculate the moment generating function.
00:10:33.600 --> 00:10:39.400
For example 3, we are going to take the Poisson distribution and calculate moment generating function.
00:10:39.400 --> 00:10:41.600
You will be able to see where these come from.
00:10:41.600 --> 00:10:50.600
Another thing that I want to point out about this is that nowhere on here do you see the variable Y.
00:10:50.600 --> 00:11:04.100
All of these are functions of T, you see T everywhere here.
00:11:04.100 --> 00:11:12.500
The moment generating function is always a function of T not Y, it is a function of T.
00:11:12.500 --> 00:11:16.700
If you are calculating a moment generating function, if you still have Y on your paper
00:11:16.700 --> 00:11:28.000
then you need to keep going until you can get rid of the Y, and try to simplify it down into a function of T.
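Two rows of that chart are easy to spot-check before moving on. This is my own sketch (parameter values chosen arbitrarily, not from the lecture): the binomial MGF (PE^T + Q)^N with Q = 1 − P, and the Poisson MGF E^(λ(E^T − 1)), each compared against computing E(e^(tY)) directly from the probabilities.

```python
import math

# Direct E[e^{tY}] for binomial(n, p): a finite sum over y = 0..n.
def binomial_direct(n, p, t):
    return sum(math.comb(n, y) * p**y * (1 - p)**(n - y) * math.exp(t * y)
               for y in range(n + 1))

# Direct E[e^{tY}] for Poisson(lam): support is unbounded, so truncate
# the sum far out in the tail.
def poisson_direct(lam, t, terms=100):
    return sum(lam**y / math.factorial(y) * math.exp(-lam + t * y)
               for y in range(terms))

n, p, lam, t = 10, 0.3, 4.0, 0.5
binomial_closed = (p * math.exp(t) + (1 - p)) ** n          # (p e^t + q)^n
poisson_closed = math.exp(lam * (math.exp(t) - 1))          # e^{lam (e^t - 1)}
```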
00:11:28.000 --> 00:11:32.900
These are just the discrete distributions; we also have a number of continuous distributions.
00:11:32.900 --> 00:11:36.100
Let us go ahead and look at those.
00:11:36.100 --> 00:11:44.000
Here are our favorite continuous distributions: uniform, normal, gamma, exponential, Chi square, and the β distribution.
00:11:44.000 --> 00:11:46.900
The uniform distribution is a very simple distribution.
00:11:46.900 --> 00:12:01.500
It has a surprisingly complicated moment generating function, (E^(Tθ2) − E^(Tθ1)) / (T × (θ2 − θ1)).
00:12:01.500 --> 00:12:04.500
I keep saying my θ in the wrong order.
00:12:04.500 --> 00:12:12.300
We are going to calculate that one out by hand, I think that is example 5.
00:12:12.300 --> 00:12:17.300
If you want, you can scroll down and take a look at example 5.
00:12:17.300 --> 00:12:20.500
You will see how we calculate the uniform distribution.
00:12:20.500 --> 00:12:25.000
The others are more difficult, I did not put them into examples.
00:12:25.000 --> 00:12:30.300
The normal distribution is E^(μT + T²σ²/2).
00:12:30.300 --> 00:12:34.400
All of this is in the exponent of the E.
00:12:34.400 --> 00:12:36.700
There is a lot in the exponent there.
00:12:36.700 --> 00:12:44.500
The gamma distribution is (1 − βT)^(−α).
00:12:44.500 --> 00:12:50.400
The next two distributions, remember are actually special cases of the gamma distribution.
00:12:50.400 --> 00:12:58.500
The exponential distribution is just the gamma distribution where we take α equal to 1.
00:12:58.500 --> 00:13:05.900
If you look at the gamma distribution, the moment generating function, and just plug in α = 1,
00:13:05.900 --> 00:13:09.500
you get the moment generating function for the exponential distribution.
00:13:09.500 --> 00:13:11.500
It is quite nice and simple.
00:13:11.500 --> 00:13:21.000
The Chi square distribution is the gamma distribution with α defined to be ν/2.
00:13:21.000 --> 00:13:28.400
ν is the number of degrees of freedom, and β is equal to 2.
00:13:28.400 --> 00:13:38.600
If you take the gamma distribution and you plug in α equal to ν/2 and β equal to 2,
00:13:38.600 --> 00:13:49.400
you get the moment generating function for the Chi square distribution, (1 − 2T)^(−ν/2).
00:13:49.400 --> 00:13:54.500
The β distribution, if you try to calculate the moment generating function,
00:13:54.500 --> 00:13:58.800
you will get into a horrible mess and it just blows up in your face.
00:13:58.800 --> 00:14:04.200
We say that there is no closed formula for the moment generating function of the β distribution.
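The uniform entry in the chart can be sanity-checked numerically. This is my own sketch (θ1 = 1, θ2 = 3 chosen arbitrarily, not from the lecture): the closed form (E^(Tθ2) − E^(Tθ1)) / (T(θ2 − θ1)) compared against integrating e^(ty) times the uniform density 1/(θ2 − θ1) with a simple midpoint rule.

```python
import math

def uniform_mgf_closed(theta1, theta2, t):
    return (math.exp(t * theta2) - math.exp(t * theta1)) / (t * (theta2 - theta1))

def uniform_mgf_numeric(theta1, theta2, t, steps=200_000):
    # Midpoint-rule approximation of the integral of e^{t*y} * f(y) dy,
    # where f(y) = 1/(theta2 - theta1) on [theta1, theta2].
    width = (theta2 - theta1) / steps
    density = 1.0 / (theta2 - theta1)
    total = 0.0
    for i in range(steps):
        y = theta1 + (i + 0.5) * width  # midpoint of the i-th slice
        total += math.exp(t * y) * density * width
    return total
```

The same sum-versus-integral correspondence is exactly the discrete/continuous switch described later in this lecture.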
00:14:04.200 --> 00:14:08.700
By the way, if you are a little rusty on what all these words mean, uniform, normal, gamma,
00:14:08.700 --> 00:14:13.900
exponential, chi square, β, we have separate lectures about each one of these distributions.
00:14:13.900 --> 00:14:17.500
You can go back and you can read up on the uniform distribution.
00:14:17.500 --> 00:14:19.500
You can practice the normal distribution.
00:14:19.500 --> 00:14:22.000
You can study the gamma distribution.
00:14:22.000 --> 00:14:27.000
Of course, the exponential and chi square distribution, those are special cases of gamma distribution.
00:14:27.000 --> 00:14:30.800
You will find those in the lecture on gamma distribution.
00:14:30.800 --> 00:14:33.900
Just scroll up here and you will see the lecture on gamma distribution.
00:14:33.900 --> 00:14:38.000
You will get the exponential and Chi square thrown in there as a bonus.
00:14:38.000 --> 00:14:44.300
There is also a lecture on the β distribution, you can read up all about that.
00:14:44.300 --> 00:14:49.000
The only things that are not in those lectures are the moment generating functions.
00:14:49.000 --> 00:14:51.700
That is what I'm telling you about right now.
00:14:51.700 --> 00:14:54.400
Let us go ahead and jump into some examples, and see how we actually derive
00:14:54.400 --> 00:15:05.200
these moment generating functions, and then see how we can use them to calculate some means and some variances.
00:15:05.200 --> 00:15:08.600
I see we have one more slide before I talk about the examples.
00:15:08.600 --> 00:15:12.900
A couple of useful formulas for the moment generating functions.
00:15:12.900 --> 00:15:20.100
If you have one known random variable Y and you do a linear change of variables.
00:15:20.100 --> 00:15:27.300
If you define Z to be AY + B, := means defined to be.
00:15:27.300 --> 00:15:35.400
If you define Z to be AY + B, then the moment generating function for Z is related
00:15:35.400 --> 00:15:45.100
to the moment generating function for Y, except that there is an A missing in there.
00:15:45.100 --> 00:15:51.100
Let me just go ahead and write that A in there.
00:15:51.100 --> 00:15:59.500
It is just M sub Y of AT, and then × E^(BT).
00:15:59.500 --> 00:16:06.200
That is how you get from the moment generating function of Y to the moment generating function of Z.
00:16:06.200 --> 00:16:10.300
Very useful, by the way, when you are converting normal distributions.
00:16:10.300 --> 00:16:17.900
When you convert to a standard normal variable, you are doing exactly this kind of variable change.
00:16:17.900 --> 00:16:22.000
This is quite useful, when you want to calculate the moment generating function.
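The change-of-variables rule M_Z(t) = e^(bt) M_Y(at) is quick to verify on a toy case. This is my own sketch (Y a fair die, a = 2, b = 3, all chosen arbitrarily, not from the lecture):

```python
import math

def mgf(values, probs, t):
    return sum(p * math.exp(t * y) for y, p in zip(values, probs))

die_values = [1, 2, 3, 4, 5, 6]
die_probs = [1 / 6] * 6
a, b, t = 2.0, 3.0, 0.4

# Z = a*Y + b takes the values 5, 7, 9, 11, 13, 15 with the same probabilities.
z_values = [a * y + b for y in die_values]
direct = mgf(z_values, die_probs, t)                            # M_Z(t) computed directly
via_rule = math.exp(b * t) * mgf(die_values, die_probs, a * t)  # e^{bt} * M_Y(at)
```

Standardizing a normal variable, Z = (Y − μ)/σ, is exactly this kind of change with a = 1/σ and b = −μ/σ.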
00:16:22.000 --> 00:16:27.600
Second useful formula, when Y1 and Y2 are independent variables.
00:16:27.600 --> 00:16:34.800
Z is Y1 + Y2, there you are defining Z to be Y1 + Y2.
00:16:34.800 --> 00:16:36.800
This only works for independent variables.
00:16:36.800 --> 00:16:41.700
When they are independent, you can say that the moment generating function for Z
00:16:41.700 --> 00:16:47.800
is the moment generating function for Y1 × the moment generating function for Y2.
00:16:47.800 --> 00:16:52.700
What moment generating functions do is they convert sums into products.
00:16:52.700 --> 00:17:01.900
That is really not surprising; it is essentially based on the fact that E^(X + Y) is equal to E^X × E^Y.
00:17:01.900 --> 00:17:10.500
Remember, our initial definition of moment generating function was in terms of the expected value of an exponential.
00:17:10.500 --> 00:17:18.400
The fact that moment generating functions convert sums of variables into products of functions,
00:17:18.400 --> 00:17:22.600
converts addition into multiplication, is really not very surprising.
00:17:22.600 --> 00:17:27.100
But, you do have to check that you are talking about independent variables.
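The sums-into-products rule can also be checked concretely. This is my own sketch (not from the lecture): two independent Bernoulli(p) variables sum to a binomial variable with n = 2, and the MGF of the sum matches the product of the individual MGFs.

```python
import math

def mgf(values, probs, t):
    return sum(q * math.exp(t * y) for y, q in zip(values, probs))

p, t = 0.3, 0.8
bern_values, bern_probs = [0, 1], [1 - p, p]

# Distribution of Z = Y1 + Y2, using independence to multiply probabilities:
# P(Z=0) = (1-p)^2, P(Z=1) = 2p(1-p), P(Z=2) = p^2.
z_values = [0, 1, 2]
z_probs = [(1 - p) ** 2, 2 * p * (1 - p), p ** 2]

product_of_mgfs = mgf(bern_values, bern_probs, t) ** 2  # M_{Y1}(t) * M_{Y2}(t)
mgf_of_sum = mgf(z_values, z_probs, t)                  # M_Z(t)
```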
00:17:27.100 --> 00:17:35.000
Let us go on and talk about some examples where we will actually calculate some moment generating functions.
00:17:35.000 --> 00:17:41.600
In example 1, we want to find the moment generating function for the binomial distribution.
00:17:41.600 --> 00:17:47.400
Let me remind you of the probability function for the binomial distribution.
00:17:47.400 --> 00:17:49.000
It has been a while since we studied that.
00:17:49.000 --> 00:17:55.500
If you do not know what the binomial distribution is at all, just check back in the list up above,
00:17:55.500 --> 00:17:59.300
you will see a whole lecture on the binomial distribution.
00:17:59.300 --> 00:18:06.800
The takeaway from that lecture right now is that the probability of a value of Y is equal to N choose Y,
00:18:06.800 --> 00:18:09.500
that is a binomial coefficient.
00:18:09.500 --> 00:18:18.000
P^Y Q^(N−Y), that is for Y ranging between 0 and N.
00:18:18.000 --> 00:18:25.700
It represents the probability of getting Y heads when you flip a coin N times.
00:18:25.700 --> 00:18:30.200
Let us try to figure out the moment generating function for that distribution.
00:18:30.200 --> 00:18:44.100
M sub Y of T, using that definition of moment generating function, is defined to be the expected value of E^(TY).
00:18:44.100 --> 00:18:47.500
How do you find the expected value of a function of Y?
00:18:47.500 --> 00:18:53.300
Here is how you do it, I showed you this in a very early lecture.
00:18:53.300 --> 00:19:06.800
It is the sum over all values of Y of the probability of that particular Y, × that function of Y, E^(TY).
00:19:06.800 --> 00:19:10.000
We need to expand that and figure it out.
00:19:10.000 --> 00:19:12.000
What values of Y are we talking about?
00:19:12.000 --> 00:19:18.500
I read from here that the range of values of Y is from Y equals 0 to N.
00:19:18.500 --> 00:19:21.900
The probability of each Y, I wrote that down right above.
00:19:21.900 --> 00:19:29.600
It is N choose Y × P^Y × Q^(N−Y).
00:19:29.600 --> 00:19:34.700
Now, I have to multiply on this term E^(TY).
00:19:34.700 --> 00:19:40.600
What can I do with this, remember I’m trying to simplify this into a function of T,
00:19:40.600 --> 00:19:44.600
which means I'm trying to get rid of the Y, which means I have to do something clever.
00:19:44.600 --> 00:19:48.700
Here is what I can do, I notice that I have P^Y here.
00:19:48.700 --> 00:19:55.700
Here, I have E^(TY), which I can write as (E^T)^Y.
00:19:55.700 --> 00:20:00.600
I can combine those two factors, that is what I'm going to do.
00:20:00.600 --> 00:20:16.400
It is the sum from Y = 0 to N of N choose Y × (PE^T)^Y × Q^(N−Y).
00:20:16.400 --> 00:20:23.600
If you stare at this very hard, you are supposed to recognize something, to have a small epiphany, if you will.
00:20:23.600 --> 00:20:28.300
In fact, you might want to stop the video right now and stare at this formula,
00:20:28.300 --> 00:20:30.800
and go ahead and have that epiphany.
00:20:30.800 --> 00:20:35.200
I will wait. Did you go ahead and have that epiphany?
00:20:35.200 --> 00:20:39.500
I think it is worth staring at that equation because it is really fun to recognize something.
00:20:39.500 --> 00:20:43.700
What you are supposed to recognize in this formula is the binomial theorem.
00:20:43.700 --> 00:20:46.700
I will remind you what the binomial theorem says.
00:20:46.700 --> 00:21:02.200
It says (A + B)^N is equal to the sum from Y = 0 to N of N choose Y × A^Y × B^(N−Y).
00:21:02.200 --> 00:21:07.100
You might have seen the binomial theorem stated with slightly different variables,
00:21:07.100 --> 00:21:11.000
but it should be the same theorem because it is a universal truth.
00:21:11.000 --> 00:21:14.800
What we have here is exactly that formula.
00:21:14.800 --> 00:21:24.500
We are sort of reverse engineering the binomial theorem now; my A is going to be PE^T, and my B is Q.
00:21:24.500 --> 00:21:27.000
We have a perfect match of the binomial theorem.
00:21:27.000 --> 00:21:38.900
It is (A + B)^N, that is (PE^T + Q)^N.
00:21:38.900 --> 00:21:47.200
Notice here that, we have a function of T.
00:21:47.200 --> 00:21:51.700
T only, there are no Y left anymore.
00:21:51.700 --> 00:21:59.200
The moment generating function is now a function of T, we have solved the problem.
00:21:59.200 --> 00:22:02.300
If you do not like that Q, you might wonder where that Q came from.
00:22:02.300 --> 00:22:05.400
You can always put it back in terms of P.
00:22:05.400 --> 00:22:14.700
You could write this as (PE^T + (1 − P))^N, since Q is 1 – P.
00:22:14.700 --> 00:22:21.400
I think that is the version that I gave you on the chart of moment generating functions a couple of slides ago.
00:22:21.400 --> 00:22:26.300
Now you know how those two correspond to each other.
00:22:26.300 --> 00:22:31.700
We are done with that example, we found the moment generating function for the binomial distribution.
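The epiphany above can be checked numerically. This is my own sketch (n, p, t chosen arbitrarily, not from the lecture): with A = Pe^t and B = Q = 1 − P, the binomial theorem collapses the whole sum into (Pe^t + Q)^n.

```python
import math

n, p, t = 8, 0.4, 0.6
q = 1 - p
a = p * math.exp(t)  # the "A" in the binomial theorem

# sum_{y=0}^{n} C(n, y) * a^y * q^{n-y}  versus  (a + q)^n
expanded_sum = sum(math.comb(n, y) * a**y * q**(n - y) for y in range(n + 1))
collapsed = (a + q) ** n  # the binomial MGF (p e^t + q)^n
```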
00:22:31.700 --> 00:22:34.500
Let me recap the steps we went through.
00:22:34.500 --> 00:22:40.100
First of all, I have reminded myself of the probability function for the binomial distribution.
00:22:40.100 --> 00:22:44.400
Here it is, N choose Y × P^Y × Q^(N−Y).
00:22:44.400 --> 00:22:47.100
Here is the range of Y values involved.
00:22:47.100 --> 00:22:54.700
And then, I used the definition of the moment generating function, found on one of the earlier slides in this lecture.
00:22:54.700 --> 00:22:58.100
It is the expected value of E ⁺TY.
00:22:58.100 --> 00:23:03.300
The expected value of any function, the way you calculate it, is you sum over Y.
00:23:03.300 --> 00:23:06.400
This would be an integral, if you are in a continuous distribution.
00:23:06.400 --> 00:23:09.400
But since binomial is discrete, we are using the sum.
00:23:09.400 --> 00:23:22.900
The probability of Y × that function E^(TY); I expanded P of Y, that is what I did here.
00:23:22.900 --> 00:23:27.200
And then, I noticed that there is a P^Y and an E^(TY).
00:23:27.200 --> 00:23:33.500
I can combine those, if I cleverly write E^(TY) as (E^T)^Y.
00:23:33.500 --> 00:23:37.300
I combined those together as (PE^T)^Y.
00:23:37.300 --> 00:23:43.600
And then, I really had an epiphany, I said look, that is exactly the binomial theorem.
00:23:43.600 --> 00:23:46.800
I reminded myself of the binomial theorem here.
00:23:46.800 --> 00:23:55.300
I noticed how this fits that pattern, and this is exactly (PE^T + Q)^N.
00:23:55.300 --> 00:24:00.700
Notice that it is a function of T; there are no more Y left in it.
00:24:00.700 --> 00:24:05.400
If you do not like the Q, you could always expand it out into 1 – P.
00:24:05.400 --> 00:24:09.500
That was the role that Q played in the binomial distribution.
00:24:09.500 --> 00:24:15.100
Hang onto this moment generating function because we have not really used it for anything yet.
00:24:15.100 --> 00:24:16.900
We just figured out what it was.
00:24:16.900 --> 00:24:25.200
I just justified this formula on the chart at the beginning of this lecture, but I have not used it for anything yet.
00:24:25.200 --> 00:24:32.900
What I'm going to do in the next example is, we will use this formula to calculate the mean of the binomial distribution.
00:24:32.900 --> 00:24:37.100
We will see for the first time what MGF can be good for.
00:24:37.100 --> 00:24:42.300
Do not forget this formula, we are going to use it again right away in example 2.
00:24:42.300 --> 00:24:48.800
In example 2, we are going to use the MGF for the binomial distribution to find the mean of the distribution.
00:24:48.800 --> 00:24:56.000
We calculated the moment generating function for the binomial distribution in the previous example, example 1.
00:24:56.000 --> 00:25:00.500
If you did not just watch example 1, maybe go back and watch it right now.
00:25:00.500 --> 00:25:07.100
What you will find out is that the moment generating function, this is what we calculated in example 1,
00:25:07.100 --> 00:25:14.500
turned out to be (PE^T + Q)^N.
00:25:14.500 --> 00:25:16.500
What does that mean? I have no idea.
00:25:16.500 --> 00:25:18.800
But let me show you how we can use it.
00:25:18.800 --> 00:25:27.300
Remember that, we can calculate the mean of the distribution, the expected value of Y.
00:25:27.300 --> 00:25:33.600
The way you calculate that, now that we have the benefit
00:25:33.600 --> 00:25:45.300
of the moment generating function, is to take M sub Y prime of T at T = 0.
00:25:45.300 --> 00:25:48.900
You take its derivative and then you plug in T = 0.
00:25:48.900 --> 00:25:54.100
This is something that we learned in the second slide, I think, of this lecture.
00:25:54.100 --> 00:25:59.300
If you scroll back a few slides and look at that, you will see where this comes from.
00:25:59.300 --> 00:26:05.100
Let us figure out what the derivative of this is, (PE^T + Q)^N.
00:26:05.100 --> 00:26:12.100
Remember, T is my variable, everything else is a constant P, Q, E, N, those are all constants.
00:26:12.100 --> 00:26:15.500
N is an exponent, I'm going to use the power rule.
00:26:15.500 --> 00:26:17.700
It is time to review your calculus 1.
00:26:17.700 --> 00:26:31.000
The derivative of something to the Nth is N × all that stuff, (PE^T + Q)^(N−1), × the derivative of the stuff inside.
00:26:31.000 --> 00:26:38.900
That is the chain rule; I have to take the derivative of PE^T, which is PE^T, and Q is a constant, so I do not have to do anything about that.
00:26:38.900 --> 00:26:43.600
That is the chain rule; that is why I had to write PE^T on the outside there.
00:26:43.600 --> 00:26:48.800
At T = 0, I got to plug in T = 0.
00:26:48.800 --> 00:27:05.300
If I plug in T = 0, it is N × (P × 1 + Q)^(N−1) × P × 1, because E⁰ is just 1.
00:27:05.300 --> 00:27:08.200
In the parentheses there, I see that I have P + Q.
00:27:08.200 --> 00:27:15.400
Remember that, Q is 1 – P, that means P + Q is equal to 1.
00:27:15.400 --> 00:27:21.700
I have got N × 1, that P + Q magically simplifies into 1.
00:27:21.700 --> 00:27:27.800
N × 1^(N−1) × P × 1.
00:27:27.800 --> 00:27:35.000
1^(N−1) is just 1, and I have got N × P.
00:27:35.000 --> 00:27:40.400
That is the mean of the binomial distribution, you can call it the expected value or the mean,
00:27:40.400 --> 00:27:44.400
I do not care which one you use because they both mean the same thing.
00:27:44.400 --> 00:27:51.900
This is something that we did know years and years ago, when we studied the binomial distribution.
00:27:51.900 --> 00:27:55.900
But, it is nice to have the moment generating function to confirm it.
00:27:55.900 --> 00:28:00.000
The mean of the binomial distribution is N × P.
00:28:00.000 --> 00:28:01.800
Let me recap the steps there.
00:28:01.800 --> 00:28:08.100
I started off with the moment generating function that I calculated back in example 1.
00:28:08.100 --> 00:28:13.200
That comes from example 1; if you did not just watch example 1, then you are missing out
00:28:13.200 --> 00:28:16.600
because you would not know how we derived that.
00:28:16.600 --> 00:28:21.400
Maybe you go back and watch example 1 to see where that came from.
00:28:21.400 --> 00:28:27.200
To find the expected value of any distribution, what you do is take
00:28:27.200 --> 00:28:32.800
the moment generating function take its derivative, and then plug in T = 0.
00:28:32.800 --> 00:28:36.400
We took its derivative, a little bit of calculus 1 coming in here.
00:28:36.400 --> 00:28:41.400
We got the power rule, N × (PE^T + Q)^(N−1).
00:28:41.400 --> 00:28:45.600
The chain rule means you have to multiply on the derivative of the stuff inside.
00:28:45.600 --> 00:28:51.000
That is where the PE^T came from, and the Q just goes away because it is a constant.
00:28:51.000 --> 00:28:58.500
And then, I plugged in T = 0; that is why E^T is 1 here.
00:28:58.500 --> 00:29:05.900
P + Q turns into 1, and that all simplifies down; 1^(N−1) just turns into 1.
00:29:05.900 --> 00:29:14.900
That simplifies down to NP; now I know what the mean of the binomial distribution is.
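Example 2 can also be confirmed numerically. This is my own sketch (n = 12, p = 0.25 chosen arbitrarily, not from the lecture): differentiate M(t) = (Pe^t + Q)^n at t = 0 with a finite difference and compare against the mean n × p.

```python
import math

# The binomial MGF (p e^t + q)^n with q = 1 - p.
def binomial_mgf(n, p, t):
    return (p * math.exp(t) + (1 - p)) ** n

n, p, h = 12, 0.25, 1e-6

# Central-difference approximation of M'(0), which should equal n*p.
numeric_mean = (binomial_mgf(n, p, h) - binomial_mgf(n, p, -h)) / (2 * h)
exact_mean = n * p  # 3.0
```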
00:29:14.900 --> 00:29:19.900
We are going to do this again, something similar with the Poisson distribution.
00:29:19.900 --> 00:29:25.700
If this still does not make sense then you got a chance to see the same kind of process with the Poisson distribution.
00:29:25.700 --> 00:29:29.800
Stick around for examples 3 and 4.
00:29:29.800 --> 00:29:34.700
In example 3, we are going to find the moment generating function for the Poisson distribution.
00:29:34.700 --> 00:29:37.000
It is kind of working from scratch there.
00:29:37.000 --> 00:29:42.100
Let me remind you, first of all, the probability function for the Poisson distribution.
00:29:42.100 --> 00:29:47.300
The probability function for the Poisson distribution, there was a λ parameter in there.
00:29:47.300 --> 00:29:55.000
It is λ ⁺Y/Y! × E^-λ.
00:29:55.000 --> 00:30:02.800
The possible values of Y there can be anything from 0 on up; it is unbounded.
00:30:02.800 --> 00:30:06.700
That is the probability function for the Poisson distribution.
00:30:06.700 --> 00:30:12.700
If you do not remember that, it might look like I just completely brought that in from left field.
00:30:12.700 --> 00:30:21.100
Maybe, what you want to do is re-watch the video about the Poisson distribution which can be found in the same set of lectures.
00:30:21.100 --> 00:30:25.100
Just scroll up, you will see a whole video on the Poisson distribution.
00:30:25.100 --> 00:30:28.700
In particular, you will see this formula in there, you will see where it comes from.
00:30:28.700 --> 00:30:38.400
Now, I want to find the moment generating function for the Poisson distribution, M sub Y of T.
00:30:38.400 --> 00:30:42.700
By definition, this is the definition I gave you earlier in this lecture.
00:30:42.700 --> 00:30:45.200
I highlighted it, you really could not have missed it.
00:30:45.200 --> 00:30:52.100
It is the expected value of E ⁺T × Y.
00:30:52.100 --> 00:30:55.300
How will I calculate the expected value?
00:30:55.300 --> 00:31:02.300
For a discrete distribution, you take the sum over all possible values of Y,
00:31:02.300 --> 00:31:07.900
the probability of each of those values × the function that you are calculating E ⁺TY.
00:31:07.900 --> 00:31:13.700
If this were a continuous distribution, it would be almost the same, except, instead of the sum,
00:31:13.700 --> 00:31:16.400
we would have an integral.
00:31:16.400 --> 00:31:19.400
Also, instead of the P we have an F.
00:31:19.400 --> 00:31:25.900
But it would still be the same basic format, just you might want to get comfortable switching back and forth
00:31:25.900 --> 00:31:30.600
between sums and integrals in your mind, because they really play the same role.
00:31:30.600 --> 00:31:36.700
One is for discrete distributions and one is for continuous distributions.
00:31:36.700 --> 00:31:42.300
I'm going to plug in what P of Y is, it is the sum on Y.
00:31:42.300 --> 00:31:47.500
I guess Y is equal to 0 to infinity, that is coming from this range on Y here.
00:31:47.500 --> 00:31:54.000
P of Y is λ ⁺Y/Y! × E ^-λ.
00:31:54.000 --> 00:31:58.500
I also have this term of E ⁺TY, what can I do with that.
00:31:58.500 --> 00:32:03.900
One thing I notice is that E ^-λ is not really doing anything.
00:32:03.900 --> 00:32:08.600
Because it does not have a Y in it, that means it is constant, I can pull that outside.
00:32:08.600 --> 00:32:15.200
E ^-λ × the sum from Y = 0 to infinity.
00:32:15.200 --> 00:32:19.900
Λ ⁺Y and E ⁺TY, I can combine those.
00:32:19.900 --> 00:32:25.800
E ⁺TY is the same as (E ⁺T) ⁺Y.
00:32:25.800 --> 00:32:34.500
This is (λ E ⁺T) ⁺Y/Y!.
00:32:34.500 --> 00:32:42.200
I do not need to write the E ^-λ because I wrote it outside, and that accounts for all terms here.
00:32:42.200 --> 00:32:47.300
Again, I'm going to pause and let you stare at this for a moment or 2,
00:32:47.300 --> 00:32:54.800
and have an epiphany because there really is a revelation to be made with this formula.
00:32:54.800 --> 00:33:01.700
Do you see the revelation hiding in this formula? Just stare at it, there is something really good here.
00:33:01.700 --> 00:33:07.100
As a hint, I will remind you of the old Taylor series for E ⁺X.
00:33:07.100 --> 00:33:17.100
The Taylor series for E ⁺X is the sum from N = 0 to infinity of X ⁺N/N!.
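If you want to convince yourself of that Taylor series numerically, here is a tiny sketch (again, my own illustration) that sums the first terms of Σ X ⁺N/N! and compares against the exponential function.

```python
import math

def exp_series(x, terms=30):
    """Partial sum of the Taylor series e^x = sum over n of x^n / n!."""
    return sum(x ** n / math.factorial(n) for n in range(terms))

# The partial sum should agree with math.exp to many decimal places.
approx = exp_series(1.0)
```

Thirty terms is far more than enough at x = 1 because the factorial in the denominator shrinks the terms extremely fast.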
00:33:17.100 --> 00:33:24.500
Look at this, we have got the same formula here except that in place of N, we have got Y.
00:33:24.500 --> 00:33:28.800
In place of X, we got λ E ⁺T.
00:33:28.800 --> 00:33:42.500
What we really have here, of course we still have the E ^-λ, is E ^(λ E ⁺T), very nice and simple.
00:33:42.500 --> 00:33:45.600
By the way, notice now, that we have gotten rid of the Y.
00:33:45.600 --> 00:33:51.200
We got it down to a function of T, that is very convenient because that is
00:33:51.200 --> 00:33:53.500
what a moment generating function is supposed to be.
00:33:53.500 --> 00:33:58.000
It is supposed to be a function T not of Y.
00:33:58.000 --> 00:34:04.300
That is essentially the answer right now; I will do a little algebra to simplify it, but we have done the hard part.
00:34:04.300 --> 00:34:09.900
I can combine these into E ^(λ E ⁺T – λ).
00:34:09.900 --> 00:34:16.000
If I factor out the λ, that is E ^(λ × (E ⁺T – 1)).
00:34:16.000 --> 00:34:21.200
That is the moment generating function for the Poisson distribution.
00:34:21.200 --> 00:34:30.600
We are done with that problem.
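Before the recap, here is a numerical check of that derivation (my own illustrative sketch, not part of the lecture): summing E ⁺TY against the Poisson probability function directly should match the closed form E ^(λ(E ⁺T – 1)). The values of λ and T below are hypothetical.

```python
import math

def poisson_mgf_by_definition(t, lam, terms=100):
    """E[e^{tY}] summed directly from the Poisson probability function lam^y * e^-lam / y!."""
    return sum(math.exp(t * y) * lam ** y * math.exp(-lam) / math.factorial(y)
               for y in range(terms))

def poisson_mgf_closed_form(t, lam):
    """The closed form derived above: e^{lam * (e^t - 1)}."""
    return math.exp(lam * (math.exp(t) - 1.0))

# Hypothetical values of lambda and t, just for the comparison.
lam, t = 2.5, 0.4
direct = poisson_mgf_by_definition(t, lam)
closed = poisson_mgf_closed_form(t, lam)
```

Truncating the infinite sum at 100 terms is harmless here because the summands die off factorially, exactly as in the Taylor series argument.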
00:34:30.600 --> 00:34:33.800
To recap the steps there, in case anybody is a little confused.
00:34:33.800 --> 00:34:39.600
Poisson distribution is one we studied earlier, there is another video lecture on the Poisson distribution.
00:34:39.600 --> 00:34:41.600
Just scroll up and you will see it.
00:34:41.600 --> 00:34:45.500
In particular, you will see the probability function for the Poisson distribution.
00:34:45.500 --> 00:34:50.000
There it is right there, λ ⁺Y/Y! × E ^-λ.
00:34:50.000 --> 00:34:58.100
Λ is the parameter that comes in for the Poisson distribution, that you sort of fix ahead of time; it is a constant.
00:34:58.100 --> 00:35:01.500
There is the range of Y, 0 to infinity.
00:35:01.500 --> 00:35:08.900
To find the moment generating function, we take the expected value of E ⁺TY which means we sum the Y,
00:35:08.900 --> 00:35:12.300
of the probability of Y × E ⁺TY.
00:35:12.300 --> 00:35:14.700
And then, I just dropped the probability function in there.
00:35:14.700 --> 00:35:25.700
There is the probability function; I sum over all the values of Y that we are interested in, which came from right here.
00:35:25.700 --> 00:35:31.800
This E ⁺TY, I discovered that I can write it as (E ⁺T) ⁺Y.
00:35:31.800 --> 00:35:34.900
I can combine it with λ ⁺Y.
00:35:34.900 --> 00:35:41.500
I factored out E ^-λ, I can factor that out because there is no Y in there, it is a constant.
00:35:41.500 --> 00:35:48.400
What I realized here is that, this exactly matches my Taylor series formula for E ⁺X.
00:35:48.400 --> 00:35:52.500
What I get here is E ^(λ E ⁺T).
00:35:52.500 --> 00:36:00.400
And then, I did a little algebra to clean that up into E ^(λ × (E ⁺T – 1)).
00:36:00.400 --> 00:36:05.500
Hang onto this moment generating function, we are going to use it again in the next example.
00:36:05.500 --> 00:36:09.600
We are going to find the mean and the variance of the Poisson distribution,
00:36:09.600 --> 00:36:13.600
using the moment generating function.
00:36:13.600 --> 00:36:21.300
Make sure you understand this, and when you are pretty confident with it, go ahead and work on example 4.
00:36:21.300 --> 00:36:28.400
You will see how we use this moment generating function to find the mean and the variance.
00:36:28.400 --> 00:36:34.300
In example 4, we are going to use the moment generating function for the Poisson distribution,
00:36:34.300 --> 00:36:38.100
to find the mean and the variance of the distribution.
00:36:38.100 --> 00:36:50.700
We just calculated in example 3, the moment generating function M sub Y of T is E ^(λ × (E ⁺T – 1)).
00:36:50.700 --> 00:36:53.000
That was the moment generating function.
00:36:53.000 --> 00:36:57.500
If you do not remember how we did that, it means you did not just watch example 3.
00:36:57.500 --> 00:37:02.000
Go back and watch example 3, and then this should make sense.
00:37:02.000 --> 00:37:13.700
There was a fact that I gave you earlier in this lecture, which is that E of Y always comes from the moment generating function.
00:37:13.700 --> 00:37:17.000
You take its derivative and then you plug in 0.
00:37:17.000 --> 00:37:26.700
We will use that to find the mean. E of Y² is the second derivative of the moment generating function.
00:37:26.700 --> 00:37:33.300
You plug in 0, that is not the variance directly but you can use that very quickly to find the variance.
00:37:33.300 --> 00:37:38.700
We are going to take the second derivative of this moment generating function.
00:37:38.700 --> 00:37:44.700
It is going to get a little messy but it is not too bad, especially after we plug in 0, it is really not bad.
00:37:44.700 --> 00:37:54.000
M prime of T is equal to, we have an exponential function, it is just E to all that same stuff.
00:37:54.000 --> 00:38:02.100
E ^(λ × (E ⁺T – 1)) ×, chain rule coming in here, the derivative of all that stuff in the exponent.
00:38:02.100 --> 00:38:07.700
That is λ × E ⁺T - λ × 1.
00:38:07.700 --> 00:38:12.000
Λ × 1 is constant, its derivative just goes away.
00:38:12.000 --> 00:38:18.000
That is it, let me go ahead and take the second derivative while I'm at it.
00:38:18.000 --> 00:38:29.500
M double prime of T, this is going to be nasty.
00:38:29.500 --> 00:38:32.200
We are going to have to use the product rule for it.
00:38:32.200 --> 00:38:35.800
It is not that bad, it is just kind of basic calculus 1 stuff.
00:38:35.800 --> 00:38:40.300
Let me factor out the λ because that is a constant; I will factor it out right now.
00:38:40.300 --> 00:38:42.600
The first × the derivative of the second.
00:38:42.600 --> 00:38:54.300
The first function is E ^(λ × (E ⁺T – 1)), and the second one is E ⁺T.
00:38:54.300 --> 00:39:01.700
I'm ignoring this λ now because I have pulled that to the outside.
00:39:01.700 --> 00:39:03.800
That was the first × the derivative of the second.
00:39:03.800 --> 00:39:07.400
The derivative of E ⁺T is E ⁺T.
00:39:07.400 --> 00:39:10.700
The second function × the derivative of the first one is a little messier.
00:39:10.700 --> 00:39:24.900
The second function is E ⁺T; the derivative of the first one is E ^(λ × (E ⁺T – 1)) × its own derivative, which by the chain rule is λ × E ⁺T.
00:39:24.900 --> 00:39:27.600
All of that multiplied by a λ.
00:39:27.600 --> 00:39:33.000
I could have simplified that, but I do not think it is worth doing.
00:39:33.000 --> 00:39:37.300
Instead, what I'm going to do is plug in 0 to each of these functions.
00:39:37.300 --> 00:39:52.500
Let me go back above; M prime of 0 is E ^(λ × (E⁰ – 1)), and E⁰ is 1.
00:39:52.500 --> 00:40:01.600
So 1 – 1 is 0, and it is E ^(λ × 0) × λ × E⁰.
00:40:01.600 --> 00:40:06.300
E ^(λ × 0) is 1 and E⁰ is 1, so that is just λ.
00:40:06.300 --> 00:40:13.700
M double prime of 0: go through here and plug in 0 everywhere I see a T.
00:40:13.700 --> 00:40:22.500
Λ ×, and in the exponent E ⁺T is E⁰ which is 1, so the exponent becomes λ × 0.
00:40:22.500 --> 00:40:40.500
We get E ^(λ × 0) × 1 + 1 × E ^(λ × 0) × λ × 1, since each E⁰ is 1.
00:40:40.500 --> 00:40:41.600
Let us simplify this down.
00:40:41.600 --> 00:40:51.500
This is λ × (1 + λ), since every E ^(λ × 0) and E⁰ is 1.
00:40:51.500 --> 00:40:56.000
This simplifies down to λ + λ².
00:40:56.000 --> 00:40:58.900
How are we to use all this information?
00:40:58.900 --> 00:41:06.300
Remember, the expected value of Y is M sub Y prime evaluated at T = 0.
00:41:06.300 --> 00:41:20.100
The expected value of Y is M sub Y prime of 0, which we figured out was λ.
00:41:20.100 --> 00:41:24.600
That is λ right there, and that is the mean.
00:41:24.600 --> 00:41:31.300
We figured out the mean of our distribution is λ, very nice to know.
00:41:31.300 --> 00:41:37.600
To find the variance, it is a little more complicated.
00:41:37.600 --> 00:41:46.600
Sigma² is not just M double prime of 0; it is the expected value of Y² minus the square of the expected value of Y.
00:41:46.600 --> 00:42:08.200
M double prime of 0 is E of Y², that is λ + λ²; then minus (E of Y)², and the E of Y we figured out was λ, so that is λ².
00:42:08.200 --> 00:42:11.200
This is very nice, the λ² cancel.
00:42:11.200 --> 00:42:16.800
For the variance, we also get λ, how convenient.
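Those two answers are easy to check numerically. Here is a small sketch (my own illustration, with a hypothetical λ) that approximates M prime of 0 and M double prime of 0 by finite differences and recovers the mean and variance.

```python
import math

def poisson_mgf(t, lam):
    """Moment generating function of the Poisson distribution: e^{lam * (e^t - 1)}."""
    return math.exp(lam * (math.exp(t) - 1.0))

def first_two_moments(mgf, h=1e-4):
    """Approximate M'(0) and M''(0) with central finite differences."""
    m1 = (mgf(h) - mgf(-h)) / (2.0 * h)
    m2 = (mgf(h) - 2.0 * mgf(0.0) + mgf(-h)) / h ** 2
    return m1, m2

lam = 3.0  # hypothetical parameter, just for the check
ey, ey2 = first_two_moments(lambda t: poisson_mgf(t, lam))
variance = ey2 - ey ** 2
# ey should be close to lambda, and variance close to lambda as well
```

The variance line mirrors the formula in the lecture: E of Y² minus the square of E of Y.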
00:42:16.800 --> 00:42:23.300
What we have done is we have calculated the mean and variance of the Poisson distribution,
00:42:23.300 --> 00:42:25.600
based solely on the moment generating function.
00:42:25.600 --> 00:42:32.800
Once you understand the moment generating function, you can find the mean and variance of the distribution.
00:42:32.800 --> 00:42:35.700
Let me show you the steps there again.
00:42:35.700 --> 00:42:41.800
We calculated, first of all the moment generating function, that came from example 3.
00:42:41.800 --> 00:42:47.200
The work here was all done in example 3, and there was some work to be done there.
00:42:47.200 --> 00:42:54.000
And then, we took its derivative; there was no product rule in that, but there was a chain rule.
00:42:54.000 --> 00:43:01.200
We took its second derivative, and there was a big product rule with lots of little chain rules coming in.
00:43:01.200 --> 00:43:11.300
It got a little messy, but when we plugged in 0, all the E⁰ terms turned into 1, and that simplified a lot.
00:43:11.300 --> 00:43:19.800
M prime turned into λ; in the M double prime, all the E⁰ terms turned into 1.
00:43:19.800 --> 00:43:23.000
It simplified down to λ + λ².
00:43:23.000 --> 00:43:33.100
Here is how we use those, remember, I told you on the 2nd slide of this lecture, that M prime is E of Y.
00:43:33.100 --> 00:43:44.800
M prime gives you E of Y which right away is the mean of the distribution, that λ is coming from there.
00:43:44.800 --> 00:43:54.300
The M double prime is the E of Y² which is not the variance yet but it factors into calculating the variance,
00:43:54.300 --> 00:44:00.100
because the variance is E of Y² – (E of Y)².
00:44:00.100 --> 00:44:05.500
That M double prime of 0 is where we got the λ + λ².
00:44:05.500 --> 00:44:13.900
The E of Y also came from up here.
00:44:13.900 --> 00:44:20.600
We plugged that λ in there and got λ², which canceled the λ² from E of Y².
00:44:20.600 --> 00:44:27.600
It just reduced down to the variance of the Poisson distribution is λ.
00:44:27.600 --> 00:44:34.100
Of course, those answers agree with what I told you several lectures ago, when we talked about the Poisson distribution.
00:44:34.100 --> 00:44:48.800
It is really reassuring to have those agree with what we had previously suspected there.
00:44:48.800 --> 00:44:53.500
In example 5, we are going to find the moment generating function for the uniform distribution.
00:44:53.500 --> 00:44:58.700
This is kind of nice because the other examples were both discrete distributions.
00:44:58.700 --> 00:45:02.700
This is the only continuous distribution we are going to calculate.
00:45:02.700 --> 00:45:04.700
The other continuous distributions are kind of messy.
00:45:04.700 --> 00:45:10.100
Even doing this for the uniform distribution, it is a little messier than you might expect,
00:45:10.100 --> 00:45:13.700
considering that the uniform distribution is so simple.
00:45:13.700 --> 00:45:17.800
Let me remind you what the uniform distribution is.
00:45:17.800 --> 00:45:22.900
The density function for the uniform distribution is F of Y is always equal to,
00:45:22.900 --> 00:45:33.500
those three lines mean it is constantly equal to 1/(θ2 – θ1), where Y ranges between θ1 and θ2.
00:45:33.500 --> 00:45:36.900
It is just the constant distribution, that is why it is called uniform.
00:45:36.900 --> 00:45:39.100
Let us find the moment generating function.
00:45:39.100 --> 00:45:51.900
By definition, the moment generating function is := means defined to be, the expected value of E ⁺T × Y.
00:45:51.900 --> 00:46:01.600
The way you calculate the expected value of a function: with the discrete distributions we were studying before, it was a sum.
00:46:01.600 --> 00:46:05.200
For a continuous distribution, it is an integral.
00:46:05.200 --> 00:46:09.300
The integral, this is also a definition of expected value.
00:46:09.300 --> 00:46:17.200
It is the integral of the density function F of Y × whatever function you are trying to find the expected value of,
00:46:17.200 --> 00:46:20.100
in this case E ⁺TY DY.
00:46:20.100 --> 00:46:28.600
And then, you integrate that over your whole range for Y, which in this case is θ1 to θ2.
00:46:28.600 --> 00:46:31.800
Now, we just have to do some calculus.
00:46:31.800 --> 00:46:35.600
This is the integral from θ1 to θ2.
00:46:35.600 --> 00:46:41.700
F of Y is 1/(θ2 – θ1), which is just a constant there.
00:46:41.700 --> 00:46:46.900
E ⁺TY DY, not such a bad integral, really not too bad.
00:46:46.900 --> 00:46:52.900
That 1/(θ2 – θ1) is a constant, so I can pull it out.
00:46:52.900 --> 00:46:57.800
What is the integral of E ⁺TY? Remember, here our variable is Y.
00:46:57.800 --> 00:47:03.700
We are integrating with respect to Y.
00:47:03.700 --> 00:47:10.600
The integral of E ⁺TY, if you do a little substitution there, let me go ahead and do it in my head.
00:47:10.600 --> 00:47:19.000
It is just E ⁺TY × 1/T, that is because we are thinking of T as being constant here.
00:47:19.000 --> 00:47:24.000
Y is the variable of integration, it is just 1/T.
00:47:24.000 --> 00:47:28.500
If you take the derivative of that with respect to Y, you get back to E ⁺TY.
00:47:28.500 --> 00:47:36.800
We want to evaluate that from Y is equal to θ1 to Y is equal to θ2.
00:47:36.800 --> 00:47:45.600
We get, I will combine the T with the θ2 – θ1 in the denominator.
00:47:45.600 --> 00:47:51.200
We are plugging in these values for Y.
00:47:51.200 --> 00:48:02.500
E ^(θ2 T) – E ^(θ1 T), I need parentheses here.
00:48:02.500 --> 00:48:14.200
I could write that over a common denominator: E ^(θ2 T) – E ^(θ1 T).
00:48:14.200 --> 00:48:20.700
We divide that by T × (θ2 – θ1).
00:48:20.700 --> 00:48:25.800
That is my moment generating function for the uniform distribution.
00:48:25.800 --> 00:48:32.800
Notice that this is a function of T now; there is no Y anywhere.
00:48:32.800 --> 00:48:37.100
That is what is supposed to happen with a moment generating function.
00:48:37.100 --> 00:48:43.400
It should always be a function of T, it should not have any Y anywhere in there.
00:48:43.400 --> 00:48:54.700
This is my complete answer here and I'm done with that example, except for a quick recap of the steps there.
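As with the other examples, that closed form is easy to verify numerically (my own sketch, with hypothetical θ1, θ2, and T values): a midpoint-rule approximation of the defining integral should agree with the formula.

```python
import math

def uniform_mgf(t, theta1, theta2):
    """Closed form derived above: (e^{theta2*t} - e^{theta1*t}) / (t * (theta2 - theta1))."""
    return (math.exp(theta2 * t) - math.exp(theta1 * t)) / (t * (theta2 - theta1))

def uniform_mgf_by_integration(t, theta1, theta2, steps=10000):
    """Midpoint-rule approximation of the integral of e^{t*y} * f(y) dy over [theta1, theta2]."""
    width = (theta2 - theta1) / steps
    total = 0.0
    for i in range(steps):
        y = theta1 + (i + 0.5) * width  # midpoint of each subinterval
        total += math.exp(t * y) * width
    return total / (theta2 - theta1)  # the density f(y) = 1/(theta2 - theta1) is constant

# Hypothetical endpoints and t value, just for the comparison.
closed = uniform_mgf(0.5, 1.0, 4.0)
numeric = uniform_mgf_by_integration(0.5, 1.0, 4.0)
```

Because the density is constant, it can be pulled outside the Riemann sum, exactly mirroring the step in the lecture where the 1/(θ2 – θ1) was pulled out of the integral.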
00:48:54.700 --> 00:48:58.900
Just to remind you, we have a whole lecture on the uniform distribution.
00:48:58.900 --> 00:49:06.200
If you do not remember the basic premise of the uniform distribution, you can go back and do a quick review there.
00:49:06.200 --> 00:49:09.800
The density function is 1/(θ2 – θ1).
00:49:09.800 --> 00:49:14.700
In particular, it is constant; that is why I have three lines here, to show that it is always equal to that.
00:49:14.700 --> 00:49:18.500
The range goes from θ1 to θ2.
00:49:18.500 --> 00:49:27.300
The moment generating function, by definition, we learned that in this lecture, it is the expected value of E ⁺TY.
00:49:27.300 --> 00:49:34.300
The expected value of any function is the integral of the density function × that function.
00:49:34.300 --> 00:49:41.100
If this were discrete, we would have the sigma summation sign instead of an integral,
00:49:41.100 --> 00:49:45.500
and we have a probability function P, instead of a density function F.
00:49:45.500 --> 00:49:50.400
It is really the same idea, when you look at these formulas, if you kind of blur your eyes a little bit,
00:49:50.400 --> 00:49:53.300
you should see how they are really the same idea.
00:49:53.300 --> 00:50:01.400
Integrals are like adding things up, and the probability function is the analogue of the density function.
00:50:01.400 --> 00:50:07.300
Instead of the summation of P of Y, we have the integral of F of Y, and then, we still have E ⁺TY.
00:50:07.300 --> 00:50:14.300
F of Y is just 1/(θ2 – θ1); that comes from up above.
00:50:14.300 --> 00:50:16.500
We will pull that out, since it is a constant.
00:50:16.500 --> 00:50:21.000
Now, we have to integrate E ⁺TY, I did a u substitution.
00:50:21.000 --> 00:50:33.400
My u was TY, my DU was T DY, DY was 1/T DU.
00:50:33.400 --> 00:50:36.800
That is where I got that 1/T on the outside there.
00:50:36.800 --> 00:50:41.700
It is the opposite of the chain rule or a substitution.
00:50:41.700 --> 00:50:46.200
We still have E ⁺TY that is because we are integrating with respect to Y, not with respect to T.
00:50:46.200 --> 00:50:57.500
The range on Y goes from θ1 to θ2; I plugged those in, and I still had the 1/(T × (θ2 – θ1)).
00:50:57.500 --> 00:51:02.300
It is still quite complicated considering that it is a uniform distribution,
00:51:02.300 --> 00:51:06.900
you might expect something simpler for the uniform distribution.
00:51:06.900 --> 00:51:14.300
But you end up with this function of T that does represent the moment generating function for the uniform distribution.
00:51:14.300 --> 00:51:18.100
I’m not going to take this one any farther, but if you want to, you could use this
00:51:18.100 --> 00:51:23.700
to find the mean and the variance of the uniform distribution.
00:51:23.700 --> 00:51:27.500
The same as we did in example 4 with the Poisson distribution.
00:51:27.500 --> 00:51:32.300
You can calculate those out, it gets a little messy so I'm not going to do it here.
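If you do want to try it, here is one way to sidestep the messy algebra (my own illustrative sketch, with hypothetical endpoints): approximate M prime of 0 numerically. One wrinkle worth noting is that the closed form has a T in the denominator, so it is undefined at T = 0 itself; a central difference never evaluates there.

```python
import math

def uniform_mgf(t, theta1, theta2):
    """MGF of the uniform distribution on [theta1, theta2], valid for t != 0."""
    return (math.exp(theta2 * t) - math.exp(theta1 * t)) / (t * (theta2 - theta1))

def mean_from_mgf(mgf, h=1e-5):
    """M'(0) by central difference; we never evaluate at t = 0 itself,
    which matters here because the closed form divides by t."""
    return (mgf(h) - mgf(-h)) / (2.0 * h)

theta1, theta2 = 2.0, 6.0  # hypothetical endpoints
mu = mean_from_mgf(lambda t: uniform_mgf(t, theta1, theta2))
# mu should be close to (theta1 + theta2) / 2, the midpoint of the interval
```

The result matches the familiar fact that the mean of a uniform distribution is the midpoint of its interval.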
00:51:32.300 --> 00:51:37.000
Instead, I'm going to wrap up this lecture here on moment generating functions.
00:51:37.000 --> 00:51:42.600
This is part of the probability lecture series here on www.educator.com.
00:51:42.600 --> 00:51:47.300
Next up, we are going to talk about bivariate distributions; we will have a Y1 and a Y2.
00:51:47.300 --> 00:51:52.100
That is another whole chapter of excitement, I hope you will stick around for that.
00:51:52.100 --> 00:51:58.000
You are watching probability lectures on www.educator.com, my name is Will Murray, thank you very much for joining me, bye.