
Moment-Generating Functions

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Moments 0:30
    • Definition of Moments
  • Moment-Generating Functions (MGFs) 3:53
    • Moment-Generating Functions
    • Using the MGF to Calculate the Moments
  • Moment-Generating Functions for the Discrete Distributions 8:22
    • Moment-Generating Functions for Binomial Distribution
    • Moment-Generating Functions for Geometric Distribution
    • Moment-Generating Functions for Negative Binomial Distribution
    • Moment-Generating Functions for Hypergeometric Distribution
    • Moment-Generating Functions for Poisson Distribution
  • Moment-Generating Functions for the Continuous Distributions 11:34
    • Moment-Generating Functions for the Uniform Distributions
    • Moment-Generating Functions for the Normal Distributions
    • Moment-Generating Functions for the Gamma Distributions
    • Moment-Generating Functions for the Exponential Distributions
    • Moment-Generating Functions for the Chi-square Distributions
    • Moment-Generating Functions for the Beta Distributions
  • Useful Formulas with Moment-Generating Functions 15:02
    • Useful Formulas with Moment-Generating Functions 1
    • Useful Formulas with Moment-Generating Functions 2
  • Example I: Moment-Generating Function for the Binomial Distribution 17:33
  • Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution 24:40
  • Example III: Find the Moment-Generating Function for the Poisson Distribution 29:28
  • Example IV: Use the MGF for the Poisson Distribution to Find the Mean and Variance of the Distribution 36:27
  • Example V: Find the Moment-Generating Function for the Uniform Distribution 44:47

Transcription: Moment-Generating Functions

Hi, welcome back to the probability lectures; my name is Will Murray.0000

We are going to talk today about moment generating functions.0005

Moment generating functions are one of the most confusing topics that people encounter in probability.0010

I'm going to try to walk you through them and show you what they are used for.0017

You might prepare yourself to be a little confused at first, because every time I have taught this,0021

my students have always found it a little confusing.0026

I will try to show you how it works.0029

The initial idea I want to talk about is moments.0032

We start with a random variable, and it can be discrete or continuous.0037

We will talk about moment generating functions for all of the distributions that we have been studying,0040

all of the discrete ones, binomial, geometric, and so on, and all of the continuous distributions, uniform and normal, and so on.0046

We can talk about moments and we can talk about moment generating functions for all of these distributions.0055

The first definition is the Kth moment of Y taken around the mean.0063

Let me highlight that.0069

The Kth moment of Y taken around the mean is just the expected value of Y^k.0070

The k there can be 1, 2, 3, and it can even be 0, although people do not usually look0079

at the 0th moment because that is not very illuminating.0084
I said mean just now, but I meant to say origin.0088
I said mean but I meant to say R gen.0088

We are also going to talk about moments around the mean.0091

But, it is important here that we are talking about the moments around the origin.0093

There is some notation that is sometimes used for this, which is μ sub k prime.0100

It is really not obvious at first why we would use the notation μ sub k prime.0106

I’m not going to use that notation in this lecture, but if you are following along0111

in your own probability course or in your own probability book, you might see the notation μ sub k prime.0115

What that means is the expected value of Y^k.0121

Those mean the same thing.0126

There is another notation that you might see in your book which is that idea of central moments.0128

Instead of taking the moment around the origin, we will talk about taking the moment about the mean.0138

That means, instead of taking the expected value of Y^k, you take the expected value of (Y − μ)^k, where μ is the mean of the original distribution.0144

And that is called μ sub K and that is why we have to use μ sub K prime for the one that we are studying.0155
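If you want to see the two kinds of moments side by side, here is a small Python sketch (an illustration added here, not part of the lecture) that computes moments about the origin and moments about the mean for a fair six-sided die:

```python
from fractions import Fraction

# Fair six-sided die as an example distribution: p(y) = 1/6 for y = 1..6
pmf = {y: Fraction(1, 6) for y in range(1, 7)}

def raw_moment(pmf, k):
    """k-th moment about the origin: E[Y^k]."""
    return sum(p * y**k for y, p in pmf.items())

def central_moment(pmf, k):
    """k-th moment about the mean: E[(Y - mu)^k]."""
    mu = raw_moment(pmf, 1)
    return sum(p * (y - mu)**k for y, p in pmf.items())

mu = raw_moment(pmf, 1)       # the mean, E[Y] = 7/2 for a fair die
var = central_moment(pmf, 2)  # second central moment = variance = 35/12
```

Note that the second central moment is the variance, and it equals E(Y²) − μ², which is one way the two systems of moments tie together.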

I want to emphasize that there are 2 different ideas here.0162

There is the moment around the origin and there is the moment around the mean.0166

In this lecture, in the probability lectures here on,0170

I'm just going to look at the moment taken around the origin.0175

That is the more common one, and it is easier to understand the ideas for that one.0179

I'm not going to talk any more about the central moments,0186

that is, the moments around the mean.0189

I just mention that, in case you see it in your course, you know what the difference is.0192

You do not really need to study both of them, you can figure it out.0198

If you know one, you can figure out the other one just by doing some computations.0202

It is not necessary to study both of them.0208

You pick a system and then you follow that, and you can find all the information you need within one system.0210

The system we are going to use is the moments around the origin.0216

I will not talk any more about moments about the mean.0220

I just included it, in case you see it in your course.0224

That covers moments; let me jump on to the next slide and show you what the moment generating function is.0227

The moment generating function for Y is M sub Y of T.0235

That := means it is defined to be.0241

It is defined to be the expected value of e^(tY).0244

That is not a very illuminating definition, but I do want to highlight it here0249

because it is probably the most important definition we are going to have in this whole lecture.0255

It is not obvious what it means right now, and I'm not going to clarify it right away.0262

I’m just going to throw the definition at you and then we will practice using it to solve some problems.0268

M sub Y of t, remember, is defined to be the expected value of e^(tY); that is e as in the exponential function.0274

The important thing that you need to remember right now is,0284

first, that the moment generating function is a function of t, not of Y.0289

When you calculate the moment generating function for a distribution, you should get a function of t.0303

You should see a T in your answer.0309

By the time you simplify it down, you will not see a Y.0310

We will do some examples and you will see how it works out.0314

The Y's always disappear; you always end up with a function of t.0317

Here is how you use the moment generating function.0323

Once you know it, this first line is kind of trivial, but I included it because it will make the other lines make more sense.0326

The expected value of Y⁰ is equal to the moment generating function with 0 plugged in for T.0337

The expected value of Y⁰, Y⁰ is always 1 because anything to the 0 is 1.0347

That is the expected value of 1 which of course will be 1.0355

It is not like we are really learning anything much from the moment generating function,0359

because we already knew that the expected value of Y⁰ is 1.0363

In the next line, the moment generating function starts to become useful.0367

What you do is you take the derivative of the moment generating function.0372

And again, you plug in 0 for T and what that tells you is the expected value for your distribution.0376

Now we have something useful: we used the moment generating function to find the mean of the distribution.0383

In the next line, what we have done is take another derivative, M double prime.0391

We plug in T is equal to 0.0398

What that tells us is, the second moment of the distribution E of Y².0401

Why is that useful? The reason is that it helps us to find the variance of the distribution.0407

We can use this to find the variance.0414

Be careful here, the variance is not the expected value of Y².0419

Let me remind you how we calculate the variance.0424

We calculate the variance as σ² is equal to the expected value of Y² minus (the expected value of Y)².0428

If we can figure out these 2 moments using the moment generating function,0440

what we can do is drop in the expected value of Y² here from the MGF.0446

We can use the MGF, the moment generating function, to calculate the expected value of Y².0455

We can also use the moment generating function to calculate the expected value of Y.0464

Both of these ingredients that go into calculating the variance come from the moment generating function.0473

That is how we use the moment generating function: to find these two ingredients and so calculate the variance.0480
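To make this concrete, here is a quick numerical sketch (an added illustration; a Bernoulli distribution with p = 0.3 is just a convenient example) showing that M′(0) recovers the mean and M″(0) recovers E(Y²), estimated here with finite differences:

```python
import math

# Bernoulli(p): Y is 1 with probability p, 0 with probability 1-p
p = 0.3

def M(t):
    # MGF of Bernoulli: E[e^{tY}] = (1-p)*e^0 + p*e^t
    return (1 - p) + p * math.exp(t)

h = 1e-5
M1 = (M(h) - M(-h)) / (2 * h)          # numerical M'(0), should be E[Y] = p
M2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # numerical M''(0), should be E[Y^2] = p
variance = M2 - M1**2                  # Var(Y) = E[Y^2] - (E[Y])^2 = p(1-p)
```

The variance p(1 − p) then comes out as M″(0) − M′(0)², exactly the recipe on this slide.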

There are other uses for the moment generating function, later on in statistics0487

but I'm not going to get into them right away in this lecture.0492

Instead, what I want to do is show you some of the moment generating functions for our favorite distributions.0496

We will start with the discrete distributions.0503

We have here all our favorite discrete distributions: binomial, geometric,0506

negative binomial, hypergeometric, and the Poisson distribution.0511

Here are what the moment generating functions turn out to be.0517

For binomial, it is (pe^t + (1 − p))^n.0521

By the way, the binomial distribution, we often define Q to be 1 – P.0527

That term of 1 − p, people often write as q, which simplifies the way0535

the moment generating function is written somewhat.0543

For the geometric distribution, it is pe^t/(1 − (1 − p)e^t).0547

Again, there is a Q in there, that is equal to 1 – P.0554

If you want to simplify this down, you can write it as pe^t/(1 − qe^t).0557

A little bit simpler to write at the expense of having one more variable.0565

The negative binomial distribution is almost the same thing, except there is an r in the exponent: (pe^t/(1 − (1 − p)e^t))^r.0569

Almost the same as the geometric distribution.0577

Again, you can put in a Q for 1 – P, if you like that.0579

The hypergeometric distribution has no closed form moment generating function.0583

If you try to calculate the moment generating function of a hypergeometric distribution, it just blows up in your face.0589

There is no reason to go there, we would not go there.0595

The Poisson distribution is much better behaved; its moment generating function is e^(λ(e^t − 1)).0597

Couple of things I want to mention about all of these, one is you might be wondering where these come from,0605

how do you calculate these moment generating functions.0610

Stay tuned, I will tell you because we will work out a couple of these in the examples.0612

Or you can just scroll down right now, if you are bursting with curiosity.0619

Check out examples 1 and 3: in example 1, I think, we are going to do the binomial distribution.0622

We will calculate the moment generating function.0631

For example 3, we are going to take the Poisson distribution and calculate moment generating function.0633

You will be able to see where these come from.0639

Another thing that I want to point out is that nowhere on here do you see the variable Y.0641

All of these are functions of T, you see T everywhere here.0650

The moment generating function is always a function of T not Y, it is a function of T.0664

If you are calculating a moment generating function, if you still have Y on your paper0672

then you need to keep going until you can get rid of the Y, and try to simplify it down into a function of T.0677

These are just the discrete distributions; we also have a number of continuous distributions.0688

Let us go ahead and look at those.0693

Here are our favorite continuous distributions: uniform, normal, gamma, exponential, chi-square, and the β distribution.0696

The uniform distribution is a very simple distribution.0704

It has a surprisingly complicated moment generating function: (e^(tθ2) − e^(tθ1))/(t(θ2 − θ1)).0707

I keep saying my θ in the wrong order.0721

We are going to calculate that one out by hand, I think that is example 5.0724

If you want, you can scroll down and take a look at example 5.0732

You will see how we calculate the uniform distribution.0737

The others are more difficult, I did not put them into examples.0740

The normal distribution: e^(μt + t²σ²/2).0745

All of this is in the exponent of the E.0750

There is a lot in the exponent there.0754

The gamma distribution is (1 − βt)^(−α).0757

The next two distributions, remember are actually special cases of the gamma distribution.0764

The exponential distribution is just the gamma distribution where we take α equal to 1.0770

If you look at the gamma distribution, the moment generating function, and just plug in α = 1,0778

you get the moment generating function for the exponential distribution.0786

It is quite nice and simple.0789

The chi-square distribution is the gamma distribution with α defined to be ν/2,0791

where ν is the number of degrees of freedom, and with β equal to 2.0801

If you take the gamma distribution and you plug in α is equal to ν/2 and β is equal to 2,0808

you get the moment generating function for the chi-square distribution, (1 − 2t)^(−ν/2).0818

The β distribution, if you try to calculate the moment generating function,0829

you will get into a horrible mess and it just blows up in your face.0834

We say that there is no closed-form moment generating function for the β distribution.0839
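As a sanity check on this chart, here is a short sketch (an added illustration; the endpoints θ1 = 1, θ2 = 3 and the value t = 0.7 are arbitrary choices) comparing the uniform distribution's MGF formula against a direct numerical integration of E(e^(tY)):

```python
import math

# Uniform(theta1, theta2); the particular endpoints here are arbitrary
theta1, theta2 = 1.0, 3.0

def mgf_formula(t):
    # The chart's formula: (e^{t*theta2} - e^{t*theta1}) / (t * (theta2 - theta1))
    return (math.exp(t * theta2) - math.exp(t * theta1)) / (t * (theta2 - theta1))

def mgf_numeric(t, n=100_000):
    # E[e^{tY}] = integral of e^{ty} * 1/(theta2 - theta1) dy, by the midpoint rule
    width = (theta2 - theta1) / n
    total = sum(math.exp(t * (theta1 + (i + 0.5) * width)) for i in range(n))
    return total * width / (theta2 - theta1)
```

The two computations should agree to many decimal places for any moderate t.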

By the way, if you are a little rusty on what all these words mean, uniform, normal, gamma,0844

exponential, chi square, β, we have separate lectures about each one of these distributions.0849

You can go back and you can read up on the uniform distribution.0854

You can practice the normal distribution.0857

You can study the gamma distribution.0859

Of course, the exponential and chi square distribution, those are special cases of gamma distribution.0862

You will find those in the lecture on gamma distribution.0867

Just scroll up here and you will see the lecture on gamma distribution.0871

You will get the exponential and Chi square thrown in there as a bonus.0874

There is also a lecture on the β distribution, you can read up all about that.0878

The only things that are not in those lectures are the moment generating functions.0884

That is what I'm telling you about right now.0889

Let us go ahead and jump into some examples, and see how we actually derive0892

these moment generating functions, and then see how we can use them to calculate some means and some variances.0894

I see we have one more slide before I talk about the examples.0905

A couple of useful formulas for the moment generating functions.0908

If you have one known random variable Y and you do a linear change of variables.0913

If you define Z to be aY + b; again, the := means defined to be.0920

If you define Z to be AY + B, then the moment generating function for Z is related0927

to the moment generating function for Y, except that there is an A missing in there.0935

Let me just go ahead and write that A in there.0945

It is just M sub Y of (at), multiplied by e^(bt).0951

That is how you get from the moment generating function of Y to the moment generating function of Z.0959

Very useful, by the way, when you are converting normal distributions.0966

When you convert to a standard normal variable, you are doing exactly this kind of variable change.0970

This is quite useful, when you want to calculate the moment generating function.0978
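Here is a tiny check of that change-of-variables formula (an added sketch; a Bernoulli variable with arbitrary a, b, p, and t is just a convenient test case, since its MGF can be computed both ways by hand):

```python
import math

# Bernoulli(p) example; a, b, p, t chosen arbitrarily for the check
p, a, b, t = 0.4, 2.0, 5.0, 0.3

def M_Y(t):
    # MGF of Bernoulli(p)
    return (1 - p) + p * math.exp(t)

# Direct computation: Z = aY + b takes value b w.p. 1-p and a+b w.p. p
M_Z_direct = (1 - p) * math.exp(t * b) + p * math.exp(t * (a + b))

# Formula from the slide: M_Z(t) = e^{bt} * M_Y(at)
M_Z_formula = math.exp(b * t) * M_Y(a * t)
```

Both expressions expand to the same thing, which is exactly what the slide's formula claims.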

Second useful formula, when Y1 and Y2 are independent variables.0982

Z is Y1 + Y2, there you are defining Z to be Y1 + Y2.0987

This only works for independent variables.0995

But when they are independent, you can say that the moment generating function for Z0997

is the moment generating function for Y1 × the moment generating function for Y2.1002

What moment generating functions do is they convert sums into products.1008

That is really not surprising; it is essentially based on the fact that e^(x + y) is equal to e^x × e^y.1013

Remember, our initial definition of moment generating function was in terms of the expected value of an exponential.1022

The fact that moment generating functions convert sums of variables into products of functions,1030

converts addition into multiplication, is really not very surprising.1038

But, you do have to check that you are talking about independent variables.1042
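A quick check of that product rule (an added sketch; two independent Bernoulli(p) variables, whose sum is binomial with n = 2, make a small worked case):

```python
import math

p, t = 0.25, 0.8

def M_bern(t):
    # MGF of a single Bernoulli(p)
    return (1 - p) + p * math.exp(t)

# Z = Y1 + Y2 with independent Bernoulli(p) parts is Binomial(2, p):
# P(Z=0) = (1-p)^2, P(Z=1) = 2p(1-p), P(Z=2) = p^2
M_Z_direct = ((1 - p)**2
              + 2 * p * (1 - p) * math.exp(t)
              + p**2 * math.exp(2 * t))

# Product rule for independent variables: M_Z(t) = M_Y1(t) * M_Y2(t)
M_Z_product = M_bern(t) ** 2
```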

Let us go on and talk about some examples where we will actually calculate some moment generating functions.1047

In example 1, we want to find the moment generating function for the binomial distribution.1055

Let me remind you of the probability function for the binomial distribution.1061

It has been a while since we studied that.1067

If you do not know what the binomial distribution is at all, just check back in the list up above,1069

you will see a whole lecture on the binomial distribution.1075

The takeaway from that lecture right now is that the probability of a value y is equal to (n choose y),1079

that is a binomial coefficient,1087

times p^y q^(n−y), for y ranging between 0 and n.1089

It represents the probability of getting y heads when you flip a coin n times.1098

Let us try to figure out the moment generating function for that distribution.1106

M sub Y of t, using that definition of the moment generating function, is defined to be the expected value of e^(tY).1110

How do you find the expected value of a function of Y?1124

Here is how you do it, I showed you this in a very early lecture.1127

It is the sum over all values of y of the probability of that particular y, × that function of y, e^(ty).1133

We need to expand that and figure it out.1147

What values of Y are we talking about?1150

I read from here that the range of values of y is from y equals 0 to n.1152

The probability of each y, I wrote that down right above.1158

It is (n choose y) × p^y × q^(n−y).1162

Now, I have to multiply in this term e^(ty).1169

What can I do with this, remember I’m trying to simplify this into a function of T,1175

which means I'm trying to get rid of the Y, which means I have to do something clever.1180

Here is what I can do: I notice that I have p^y here.1184

Here, I have e^(ty), which I can write as (e^t)^y.1189

I can combine those two factors; that is what I'm going to do.1196

It is the sum from y = 0 to n of (n choose y) (pe^t)^y × q^(n−y).1200

If you stare at this very hard, you are supposed to recognize something, to have a small epiphany, if you will.1216

In fact, you might want to stop the video right now and stare at this formula,1223

and go ahead and have that epiphany.1228

I will wait; did you go ahead and have that epiphany?1231

I think it is worth staring at that equation because it is really fun to recognize something.1235

What you are supposed to recognize in this formula is the binomial theorem.1239

I will remind you what the binomial theorem says.1244

It says (a + b)^n is equal to the sum from y = 0 to n of (n choose y) a^y b^(n−y).1247

You might have seen the binomial theorem used with slightly different variables,1262

but it should be the same theorem because it is a universal truth.1267

What we have here is exactly that formula.1271

We are sort of reverse engineering the binomial theorem now; my a is going to be pe^t, and my b is q.1275

We have a perfect match of the binomial theorem.1284

It is (a + b)^n, that is (pe^t + q)^n.1287

Notice here that, we have a function of T.1299

T only, there are no Y left anymore.1307

The moment generating function is now a function of T, we have solved the problem.1312

If you do not like that q, wondering where that q came from,1319

you can always put it back in terms of p.1322

You could write this as (pe^t + (1 − p))^n, since q is 1 − p; all of that is still raised to the nth power.1325

I think that is the version that I gave you on the chart of moment generating functions a couple of slides ago.1335

Now you know how those two correspond to each other.1341

We are done with that example, we found the moment generating function for the binomial distribution.1346

Let me recap the steps we went through.1352

First of all, I have reminded myself of the probability function for the binomial distribution.1354

Here it is: (n choose y) p^y q^(n−y).1360

Here is the range of Y values involved.1364

And then, I used the definition of the moment generating function, found on one of the earlier slides in this lecture.1367

It is the expected value of e^(tY).1375

The expected value of any function is calculated by summing over y.1378

This would be an integral, if you are in a continuous distribution.1383

But since binomial is discrete, we are using the sum.1386

The probability of y × that function e^(ty); I expanded p(y), that is what I did here.1389

And then, I noticed that there is a p^y and an e^(ty).1403

I can combine those, if I cleverly write e^(ty) as (e^t)^y.1407

I combined those together as (pe^t)^y.1413

And then, I really had an epiphany, I said look, that is exactly the binomial theorem.1417

I reminded myself of the binomial theorem here.1423

I noticed how this fits that pattern, and this is exactly (pe^t + q)^n.1427

Notice that, it is a function of T, there are no more Y left in this.1435

If you do not like the Q, you could always expand it out into 1 – P.1441

That was the role that Q played in the binomial distribution.1445

Hang onto this moment generating function because we have not really used it for anything yet.1449

We just figured out what it was.1455

I just justified this formula on the chart at the beginning of this lecture, but I have not used it for anything yet.1457

What I'm going to do in the next example is, we will use this formula to calculate the mean of the binomial distribution.1465

We will see for the first time what MGF can be good for.1473

Do not forget this formula; we are going to use it again right away in example 2.1477
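The derivation above can also be checked numerically (an added sketch; the values of n, p, and t are arbitrary): summing the defining expectation term by term should match the closed form (pe^t + q)^n.

```python
import math

n, p, t = 10, 0.35, 0.5
q = 1 - p

# Definition of the MGF: E[e^{tY}] = sum over y of C(n,y) p^y q^{n-y} e^{ty}
direct = sum(math.comb(n, y) * p**y * q**(n - y) * math.exp(t * y)
             for y in range(n + 1))

# Closed form from the binomial theorem: (p e^t + q)^n
closed = (p * math.exp(t) + q) ** n
```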

In example 2, we are going to use the MGF for the binomial distribution to find the mean of the distribution.1482

We calculated the moment generating function for the binomial distribution in the previous example, example 1.1489

If you did not just watch example 1, maybe go back and watch it right now.1496

What you will find out is that the moment generating function, this is what we calculated in example 1,1500

turned out to be (pe^t + q)^n.1507

What does that mean? I have no idea.1514

But let me show you how we can use it.1516

Remember that, we can calculate the mean of the distribution, the expected value of Y.1519

The way you calculate that, now that we have the machinery1527

of the moment generating function, is to take M sub Y prime of t at t = 0.1533

You take its derivative and then you plug in T = 0.1545

This is something that we learned in the second slide, I think, of this lecture.1549

If you scroll back a few slides and look at that, you will see where this comes from.1554

Let us figure out what the derivative of (pe^t + q)^n is.1559

Remember, T is my variable, everything else is a constant P, Q, E, N, those are all constants.1565

N is an exponent, I'm going to use the power rule.1572

It is time to review your calculus 1.1575

The derivative of something to the nth is n × all that stuff, (pe^t + q)^(n−1), × the derivative of the stuff inside.1578

That is the chain rule; the derivative of the inside is pe^t, and q is a constant, so I do not have to do anything about that.1591

That is the chain rule that made me write pe^t on the outside there.1599

At t = 0, I have got to plug in t = 0.1603

If I plug in t = 0, it is n × (p × 1 + q)^(n−1) × p × 1, because e⁰ is just 1.1609

In the parentheses there, I see that I have P + Q.1625

Remember that, Q is 1 – P, that means P + Q is equal to 1.1628

That p + q magically simplifies into 1, so I have got n ×1635

1^(n−1) × p × 1.1642

1^(n−1) is just 1, and I have got n × p.1648

That is the mean of the binomial distribution, you can call it the expected value or the mean,1655

I do not care which one you use because they both mean the same thing.1660

This is something that we already knew from when we studied the binomial distribution.1664

But, it is nice to have the moment generating function to confirm it.1672

The mean of the binomial distribution is N × P.1676

Let me recap the steps there.1680

I started off with the moment generating function that I calculated back in example 1.1682

That comes from example 1; if you did not just watch example 1, then you are missing out,1688

because you will not know how we derived that.1693

Maybe you go back and watch example 1 to see where that came from.1696

To find the expected value of any distribution, what you do is you can take1701

the moment generating function take its derivative, and then plug in T = 0.1707

We took its derivative, a little bit of calculus 1 coming here.1713

We used the power rule: n × (pe^t + q)^(n−1).1716

The chain rule means you have to multiply by the derivative of the stuff inside.1721

That is where the pe^t came from, and the q just goes away because it is a constant.1725

And then, I plugged in t = 0; that is why e^t became 1 here.1731

p + q turns into 1, and it all simplifies down: 1^(n−1) just turns into 1.1738

That simplifies down to np; now I know what the mean of the binomial distribution is.1746
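The derivative computation above can be sketched in code as a check (an added illustration; n = 12 and p = 0.25 are arbitrary): the hand-computed derivative n(pe^t + q)^(n−1)pe^t, evaluated at 0, should agree with np and with a numerical derivative of M.

```python
import math

n, p = 12, 0.25
q = 1 - p

def M(t):
    # Binomial MGF from example 1
    return (p * math.exp(t) + q) ** n

def M_prime(t):
    # Power rule + chain rule, as in the lecture: n(pe^t + q)^{n-1} * p e^t
    return n * (p * math.exp(t) + q) ** (n - 1) * p * math.exp(t)

mean = M_prime(0.0)                       # should equal n*p
numeric = (M(1e-6) - M(-1e-6)) / 2e-6     # central-difference check of M'(0)
```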

We are going to do this again, something similar with the Poisson distribution.1755

If this still does not make sense, then you will get a chance to see the same kind of process with the Poisson distribution.1760

Stick around for examples 3 and 4.1766

In example 3, we are going to find the moment generating function for the Poisson distribution.1770

It is kind of working from scratch there.1775

Let me remind you, first of all, the probability function for the Poisson distribution.1777

The probability function for the Poisson distribution, there was a λ parameter in there.1782

It is λ^y/y! × e^(−λ).1787

The possible values of y there could be anything from 0 on up; it is unbounded.1795

That is the probability function for the Poisson distribution.1803

If you do not remember that, it might look like I just brought it in from left field.1807

Maybe, what you want to do is re-watch the video about the Poisson distribution which can be found in the same set of lectures.1813

Just scroll up, you will see a whole video on the Poisson distribution.1821

In particular, you will see this formula in there, you will see where it comes from.1825

Now, I want to find the moment generating function for the Poisson distribution, M sub Y of t.1829

By definition, this is the definition I gave you earlier in this lecture.1838

I highlighted it; you really cannot miss it.1843

It is the expected value of e^(tY).1845

How will I calculate the expected value?1852

For a discrete distribution, you take the sum over all possible values of y,1855

of the probability of each of those values × the function that you are calculating, e^(ty).1862

If this were a continuous distribution, it would be almost the same, except, instead of the sum,1868

we would have an integral.1874

Also, instead of the P we have an F.1876

But it would still be the same basic format, just you might want to get comfortable switching back and forth1879

between sums and integrals in your mind, because they really play the same role.1886

One is for discrete distributions and one is for continuous distributions.1890

I'm going to plug in what P of Y is, it is the sum on Y.1897

I guess Y is equal to 0 to infinity, that is coming from this range on Y here.1902

p(y) is λ^y/y! × e^(−λ).1907

I also have this term of e^(ty); what can I do with that?1914

One thing I notice is that e^(−λ) is not really doing anything.1918

Because it does not have a y in it, that means it is constant; I can pull that outside.1924

e^(−λ) × the sum from y = 0 to infinity.1928

λ^y and e^(ty), I can combine those.1935

e^(ty) is the same as (e^t)^y.1940

This is (λe^t)^y/y!.1946

I do not need to write the e^(−λ) because I wrote it outside, and that accounts for all the terms here.1954

Again, I'm going to pause and let you stare at this for a moment or two,1962

and have an epiphany because there really is a revelation to be made with this formula.1967

Do you see the revelation to be had in this formula? Just stare at it; there is something really good there.1975

As a hint, I will remind you of the old Taylor series for e^x.1982

The Taylor series for e^x is the sum from n = 0 to infinity of x^n/n!.1987

Look at this: we have got the same formula here, except that in place of n, we have got y.1997

In place of x, we have got λe^t.2004

What we really have here, of course we still have the e^(−λ), is e^(λe^t), very nice and simple.2009

By the way, notice now, that we have gotten rid of the Y.2022

We got it down to a function of T, that is very convenient because that is2025

what a moment generating function is supposed to be.2031

It is supposed to be a function of t, not of Y.2033

That is essentially the answer right now; I will do a little algebra to simplify it, but we have done the hard part.2038

I can combine these: e^(λe^t − λ).2044

If I factor out the λ in the exponent, that is e^(λ(e^t − 1)).2050

That is the moment generating function for the Poisson distribution.2056

We are done with that problem.2061

To recap the steps there, in case anybody is a little confused.2070

Poisson distribution is one we studied earlier, there is another video lecture on the Poisson distribution.2074

Just scroll up and you will see it.2079

In particular, you will see the probability function for the Poisson distribution.2081

There it is right there: λ^y/y! × e^(−λ).2085

λ is the parameter that comes in for the Poisson distribution, that you sort of fix ahead of time; it is a constant.2090

There is the range of Y, 0 to infinity.2098

To find the moment generating function, we take the expected value of e^(tY), which means we sum over y2101

the probability of y × e^(ty).2109

And then, I just dropped the probability function in there.2112

There is the probability function; I sum over all the values of y that we are interested in, which came from right here.2115

This e^(ty), I discovered that I can write it as (e^t)^y.2126

I can combine it with λ^y.2132

I factored out e^(−λ); I can factor that out because there is no y in there, it is a constant.2135

What I realized here is that this exactly matches my Taylor series formula for e^x.2141

What I get here is e^(λe^t).2148

And then, I did a little algebra to clean that up into e^(λ(e^t − 1)).2152

Hang onto this moment generating function, we are going to use it again in the next example.2160

We are going to find the mean and the variance of the Poisson distribution,2165

using the moment generating function.2169

Make sure you understand this, and when you are pretty confident with it, go ahead and work on example 4.2173

You will see how we use this moment generating function to find the mean and the variance.2181

In example 4, we are going to use the moment generating function for the Poisson distribution,2188

to find the mean and the variance of the distribution.2194

We just calculated in example 3 the moment generating function: M_Y(T) is e^(λ(e^T − 1)).2198

That was the moment generating function.2211

If you do not remember how we did that, it means you did not just watch example 3.2213

Go back and watch example 3, and that should make sense.2217

There was a fact that I gave you earlier in this lecture, which is that E(Y) always comes from the moment generating function:2222

you take its derivative and then you plug in 0.2234

We will use that to find the mean. E(Y²) is the second derivative of the moment generating function;2237

you plug in 0. That is not the variance directly, but you can use it very quickly to find the variance.2247

We are going to take the second derivative of this moment generating function.2253

It is going to get a little messy but it is not too bad, especially after we plug in 0, it is really not bad.2259

M_Y prime of T is equal to: we have an exponential function, so it is just e to all that same stuff,2265

e^(λ(e^T − 1)), ×, with the chain rule coming in here, the derivative of all that stuff in the exponent.2274

That is λ × e^T, minus the derivative of λ × 1.2282

λ × 1 is constant, so its derivative just goes away.2288

That is it, let me go ahead and take the second derivative while I'm at it.2292

M_Y double prime of T, this is going to be nasty.2298

We are going to have to use the product rule for it.2309

It is not that bad, it is just kind of basic calculus 1 stuff.2312

Let me factor out the λ because that is a constant, I factor that right now.2316

The first × the derivative of the second.2320

The first function is e^(λ(e^T − 1)), and the second one is e^T.2322

I'm ignoring this λ now because I have pulled that to the outside.2334

That was the first × the derivative of the second.2342

The derivative of e^T is e^T.2344

The second function × the derivative of the first one is a little messier.2347

The second function is e^T; the derivative of the first one is e^(λ(e^T − 1)) × its exponent's derivative, which by the chain rule is λ × e^T.2351

All of that multiplied by a λ.2365

I could have simplified that, but I do not think it is worth doing.2367

Instead, what I'm going to do is plug in 0 to each of these functions.2373

Let me go back above: M_Y prime of 0 is e^(λ × (e^0 − 1)) × λ × e^0; e^0 is 1,2377

so 1 − 1 is 0, and it is e^(λ × 0) × λ × 1.2392

That e^(λ × 0) is e^0, which is 1, so it is just λ.2401

M_Y double prime of 0: go through here and plug in 0 everywhere I see a T.2406

λ × [e^(λ(e^0 − 1)) × e^0 + e^0 × e^(λ(e^0 − 1)) × λ × e^0].2414

e^0 is 1, and e^(λ(e^0 − 1)) is e^(λ × 0), which is also 1.2422

Let us simplify this down.2440

This is λ × (1 + 1 × λ × 1), that is, λ × (1 + λ).2441

This simplifies down to λ + λ².2451

How are we going to use all this information?2456

Remember, the expected value of Y is M prime evaluated at 0, M prime of 0.2459

The expected value of Y is M_Y prime of 0, which we figured out was λ.2466

That is λ right there, and that is that mean.2480

We figured out the mean of our distribution is λ, very nice to know.2484

To find the variance, it is a little more complicated.2491

σ² is not just M double prime of 0; it is the expected value of Y² minus (the expected value of Y)².2497

This M double prime of 0 is E(Y²), that is λ + λ²; minus (E(Y))², and the E(Y) we figured out was λ, so minus λ².2506

This is very nice, the λ² cancel.2528

For the variance, we also get λ, how convenient.2531
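You can check these two computations with finite differences: approximating M′(0) and M″(0) numerically from the MGF should land near λ and λ + λ². A rough Python sketch (the helper names and the choice λ = 3 are mine, not from the lecture):

```python
import math

def poisson_mgf(lam, t):
    # MGF from example 3: e^(lam * (e^t - 1))
    return math.exp(lam * (math.exp(t) - 1))

def first_deriv(f, t, h=1e-5):
    # central-difference approximation of f'(t)
    return (f(t + h) - f(t - h)) / (2 * h)

def second_deriv(f, t, h=1e-4):
    # central-difference approximation of f''(t)
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

lam = 3.0
m = lambda t: poisson_mgf(lam, t)
mean = first_deriv(m, 0.0)        # approximates E(Y) = lam
ey2 = second_deriv(m, 0.0)        # approximates E(Y^2) = lam + lam^2
variance = ey2 - mean ** 2        # approximates Var(Y) = lam
print(mean, ey2, variance)
```

The numerical values will be off only in the far decimal places, matching the hand computation E(Y) = λ, E(Y²) = λ + λ², Var(Y) = λ.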

What we have done is we have calculated the mean and variance of the Poisson distribution,2537

based solely on the moment generating function.2543

Once you understand the moment generating function, you can find the mean and variance of the distribution.2545

Let me show you the steps there, again.2553

We calculated, first of all the moment generating function, that came from example 3.2556

The work here was all done in example 3, and there was some work to be done there.2562

And then, we took its derivative, which was not too bad; no product rule in that, but there was a chain rule.2567

We took its second derivative, and there was a big product rule, and lots of little chain rules coming in.2574

It got a little messy, but when we plugged in 0, all the e⁰ turned into 1; that simplified a lot.2581

M single prime turned into λ; in M double prime, all the e⁰ turned into 1.2591

It simplified down to λ + λ².2600

Here is how we use those: remember, I told you on the second slide of this lecture that M prime of 0 is E(Y).2603

M prime gives you E of Y which right away is the mean of the distribution, that λ is coming from there.2613

The M double prime of 0 is the E(Y²), which is not the variance yet, but it factors into calculating the variance,2625

because the variance is E(Y²) − (E(Y))².2634

That M double prime of 0 being λ + λ² is where we got this λ + λ².2640

The E of Y also came from up here.2645

We plugged in that λ and squared it, getting λ², which canceled off the λ² from E(Y²).2654

It just reduced down to the variance of the Poisson distribution is λ.2660

Of course, those answers agree with what I told you several lectures ago, when we talked about the Poisson distribution.2667

It is really reassuring to have those agree with what we had previously suspected there.2674

In example 5, we are going to find the moment generating function for the uniform distribution.2689

This is kind of nice, because the other examples were both discrete distributions.2693

This is the only continuous distribution we are going to calculate.2699

The others are kind of messy.2703

Even doing this for the uniform distribution is a little messier than you might expect,2705

considering that the uniform distribution is so simple.2710

Let me remind you what the uniform distribution is.2714

The density function for the uniform distribution is F(Y) identically equal to2718

(those three lines mean it is constantly equal to) 1/(θ2 − θ1), where Y ranges between θ1 and θ2.2723

It is just the constant distribution, that is why it is called uniform.2733

Let us find the moment generating function.2737

By definition, the moment generating function is (the := means defined to be) the expected value of e^(T × Y).2739

The way you calculate the expected value of a function: with the discrete distributions we studied before, it was a sum.2752

For a continuous distribution, it is an integral.2761

The integral, this is also a definition of expected value.2765

It is the integral of the density function F(Y) × whatever function you are trying to find the expected value of,2769

in this case e^(TY), dY.2777

And then, you integrate that over your whole range for Y, which in this case is θ1 to θ2.2780

Now, we just have to do some calculus.2788

This is the integral from θ1 to θ2.2792

F(Y) is 1/(θ2 − θ1); that is just a constant there.2795

e^(TY) dY, not such a bad integral, really not too bad.2802

The answer is 1/(θ2 − θ1); that is a constant, so I can pull it out.2807

What is the integral of e^(TY)? Remember, here our variable is Y.2813

We are integrating with respect to Y.2818

The integral of e^(TY), if you do a little substitution there; let me go ahead and do it in my head.2824

It is just e^(TY) × 1/T, because we are thinking of T as being constant here.2830

Y is the variable of integration, so the factor is just 1/T.2839

If you take the derivative of that with respect to Y, you get back e^(TY).2844

We want to evaluate that from Y is equal to θ1 to Y is equal to θ2.2848

We get, and I will combine the T with the (θ2 − θ1):2857

We are plugging in these values for Y.2865

e^(θ2 × T) − e^(θ1 × T); I need parentheses here.2871

I could write that over a common denominator: e^(θ2T) − e^(θ1T),2882

divided by T × (θ2 − θ1).2894

That is my moment generating function for the uniform distribution.2901

Notice that this is a function of T now; there is no Y anywhere.2906

That is what is supposed to happen with a moment generating function.2913

It should always be a function of T, it should not have any Y anywhere in there.2917

This is my complete answer here and I'm done with that example, except for a quick recap of the steps there.2923
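As a check on that answer, numerically integrating e^(TY)/(θ2 − θ1) over [θ1, θ2] should agree with the closed form. A small Python sketch (the values θ1 = 1, θ2 = 4, T = 0.3 are arbitrary choices of mine):

```python
import math

def uniform_mgf_closed(th1, th2, t):
    # closed form from the example: (e^(th2*t) - e^(th1*t)) / (t * (th2 - th1))
    return (math.exp(th2 * t) - math.exp(th1 * t)) / (t * (th2 - th1))

def uniform_mgf_by_integral(th1, th2, t, n=100_000):
    # midpoint-rule approximation of the integral of e^(t*y)/(th2 - th1) dy
    width = (th2 - th1) / n
    total = 0.0
    for i in range(n):
        y = th1 + (i + 0.5) * width
        total += math.exp(t * y) / (th2 - th1) * width
    return total

print(uniform_mgf_closed(1.0, 4.0, 0.3))
print(uniform_mgf_by_integral(1.0, 4.0, 0.3))  # the two should agree closely
```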

Just to remind you, we have a whole lecture on the uniform distribution.2935

If you do not remember the basic premise of the uniform distribution, you can go back and do a quick review there.2939

The density function is 1/(θ2 − θ1).2946

In particular, it is constant; that is why I have three lines here, to show that it is always equal to that.2950

The range goes from θ1 to θ2.2955

The moment generating function, by definition (we learned that in this lecture), is the expected value of e^(TY).2958

The expected value of any function is the integral of the density function × that function.2967

If this were discrete, we would have a summation sign instead of an integral,2974

and we would have a probability function P instead of a density function F.2981

It is really the same idea, when you look at these formulas, if you kind of blur your eyes a little bit,2985

you should see how they are really the same idea.2990

Integrals are like adding things up, and the probability function is kind of the analogue of the density function.2993

Instead of the summation of P(Y), we have the integral of F(Y), and then we still have e^(TY).3001

F(Y) from above is just 1/(θ2 − θ1); that comes from up above.3007

We will pull that out, since it is a constant.3014

Now, we have to integrate e^(TY); I did a u substitution.3016

My u was TY, my du was T dY, so dY was (1/T) du.3021

That is where I got that 1/T on the outside there.3033

It is the opposite of the chain rule or a substitution.3037

We still have e^(TY); that is because we are integrating with respect to Y, not with respect to T.3042

The range on Y goes from θ1 to θ2; I plugged those in, and I still had the 1/(T × (θ2 − θ1)).3046

It is still quite complicated considering that it is a uniform distribution,3057

you might expect something simpler for the uniform distribution.3062

But you end up with this function of T that does represent the moment generating function for the uniform distribution.3067

I’m not going to take this one any farther, but if you want to, you could use this3074

to find the mean and the variance of the uniform distribution.3078

just the same as we did in example 4, with the Poisson distribution.3084

You can calculate those out, it gets a little messy so I'm not going to do it here.3087
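If you would rather do it numerically, you can difference the MGF at 0 (defining M(0) = 1, since the formula has a removable hole at T = 0) and compare with the known uniform mean (θ1 + θ2)/2 and variance (θ2 − θ1)²/12. A sketch under those assumptions, with θ1 = 0 and θ2 = 6 chosen purely for illustration:

```python
import math

def uniform_mgf(th1, th2, t):
    # MGF from example 5; t = 0 is a removable singularity where M(0) = 1
    if t == 0:
        return 1.0
    return (math.exp(th2 * t) - math.exp(th1 * t)) / (t * (th2 - th1))

th1, th2 = 0.0, 6.0
h = 1e-4
m = lambda t: uniform_mgf(th1, th2, t)
u_mean = (m(h) - m(-h)) / (2 * h)             # central difference, approximates (th1 + th2) / 2
u_ey2 = (m(h) - 2 * m(0.0) + m(-h)) / h ** 2  # approximates E(Y^2)
u_var = u_ey2 - u_mean ** 2                   # approximates (th2 - th1)^2 / 12
print(u_mean, u_var)
```

For these values both the mean and the variance come out near 3, matching (0 + 6)/2 and 6²/12.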

Instead, I'm going to wrap up this lecture here on moment generating functions.3092

This is part of the probability lecture series here on

Next up, we are going to talk about bivariate distributions; we will have a Y1 and Y2.3102

That is another whole chapter of excitement, I hope you will stick around for that.3107

You are watching probability lectures on, my name is Will Murray, thank you very much for joining me, bye.3112