For more information, please see full course syllabus of Probability

### Moment-Generating Functions

- Intro 0:00
- Premise 0:30
- Premise
- Goal 1:40
- Goal Number 1: Find the Full Distribution Function
- Goal Number 2: Find the Density Function
- Goal Number 3: Calculate Probabilities
- Three Methods 2:39
- Method 1: Distribution Functions
- Method 2: Transformations
- Method 3: Moment-Generating Functions
- Review of Moment-Generating Functions 3:04
- Recall: The Moment-Generating Function for a Random Variable Y
- The Moment-Generating Function is a Function of t (Not y)
- Moment-Generating Functions for the Discrete Distributions 4:31
- Binomial
- Geometric
- Negative Binomial
- Hypergeometric
- Poisson
- Moment-Generating Functions for the Continuous Distributions 6:08
- Uniform
- Normal
- Gamma
- Exponential
- Chi-square
- Beta
- Useful Formulas with the Moment-Generating Functions 8:48
- Useful Formula 1
- Useful Formula 2
- How to Use Moment-Generating Functions 10:41
- How to Use Moment-Generating Functions
- Example I: Find the Density Function 12:22
- Example II: Find the Density Function 30:58
- Example III: Find the Probability Function 43:29
- Example IV: Find the Probability Function 51:43
- Example V: Find the Distribution 1:00:14
- Example VI: Find the Density Function 1:12:10

### Introduction to Probability Online Course

### Transcription: Moment-Generating Functions

*Hello, welcome back to the probability lectures here on www.educator.com, my name is Will Murray.*0000

*Today, we are wrapping up a three lecture series on how to find *0006

*the density and distribution functions for functions of random variables.*0010

*We had one lecture on the method of distribution functions, and then the last lecture covered the method of transformations.*0016

*Today, we are going to talk about moment generating functions which is the last of our three methods.*0023

*Let me jump in and tell you the setting here.*0029

*This is going to start out the same as the last two lectures.*0032

*The first few slides are exactly the same as the last two lectures.*0035

*If you have been following along diligently and you watch the last two lectures, *0038

*you do not need to watch these first few slides again.*0042

*It is just going to be a review of the exact same stuff.*0045

*Just setting up the same premise and then we will get into actual moment generating functions, a few slides in.*0048

*The premise here is that we have several random variables Y1, Y2, etc. And then, we have some function of them*0055

*This U is some function of Y1 through YN.*0062

*We might have something like U is Y1² + Y2², something like that.*0065

*We will have some function of these random variables.*0075

*What I said before, I taught you how to calculate the mean and the variance of U,*0080

*that was in the previous series of lectures.*0085

*What I did not teach you before was how to calculate the whole distribution of U.*0087

*The purpose of this lecture and the previous two videos is to teach you how to find the entire distribution function of U.*0093

*Our goal is to find this distribution function: F(u) is the probability that U is less than or equal to some cutoff value u.*0102

*And then, if we can find that F, then we can find the density function just by taking the derivative.*0111

*The density function f(u) is just the derivative of F(u).*0118

*Assuming that we know f and F, it is very easy to calculate probabilities.*0122

*If we want to find the probability that U is in a particular range between A and B, what we will do is,*0127

*if we just know the density function, we could integrate the density function from A to B.*0133

*Or if we know the distribution function, it is even better because we can just do F of B - F of A.*0138

*That is why we want to find these functions, this F and f.*0145

*The point of this lecture and the previous two lectures is to give you various methods for finding this F and this f.*0150
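As a quick numeric sketch of the two routes to a probability described above (my own illustration, not part of the lecture; the exponential distribution with mean β = 2 and the interval (1, 3) are arbitrary choices), integrating the density f from a to b and computing F(b) − F(a) give the same answer:

```python
import numpy as np

beta = 2.0  # mean of an illustrative exponential distribution

def f(u):
    # density function: f(u) = (1/beta) * e^(-u/beta)
    return np.exp(-u / beta) / beta

def F(u):
    # distribution function: F(u) = 1 - e^(-u/beta)
    return 1.0 - np.exp(-u / beta)

a, b = 1.0, 3.0

# Way 1: integrate the density from a to b (trapezoid rule on a fine grid)
grid = np.linspace(a, b, 10_001)
vals = f(grid)
p_from_density = ((vals[:-1] + vals[1:]) / 2.0 * np.diff(grid)).sum()

# Way 2: F(b) - F(a)
p_from_distribution = F(b) - F(a)

print(p_from_density, p_from_distribution)  # both near 0.3834
```

When the distribution function is available in closed form, the second way is clearly cheaper, which is the lecturer's point.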

*These three methods that we have been discussing, the first one was distribution functions.*0161

*If you scroll back up two lectures, you will see the method of distribution functions.*0165

*Transformation function is what we covered in the previous lecture.*0170

*You should be all set to go with that.*0173

*In this lecture, what we are talking about today is the method of moment generating functions.*0176

*That is what I am about to jump into: the method of moment generating functions.*0181

*First, I have to review for you what a moment generating function is.*0185

*There was a whole lecture on moment generating functions, earlier on in the lesson.*0190

*If you scroll back up, you will see that we have a whole lecture here on moment generating functions.*0195

*If you do not know what they are at all, if you did not go through that lecture before, *0200

*you probably want to watch that lecture before you watch this one, because it would not make much sense.*0205

*This is the quick and dirty review of moment generating functions.*0210

*Let me just show you just quickly, remind you of what they are all about.*0214

*By definition, the moment generating function for a random variable Y is the expected value of e^(tY).*0218

*If you work that out, you always end up with a function of T, it is not a function of Y.*0227

*A moment generating function will always be something like (1 − 2t)^(−3), something like that, where it is a function of t.*0232

*You should not see any Y, in the moment generating function.*0242

*We practice calculating some moment generating functions in that earlier lecture, *0247

*that was just specifically dedicated to moment generating functions.*0253

*Let me show you in particular, the moment generating functions for key distributions.*0257

*Because, you really need to remember them or have them somewhere very close by as a reference,*0262

*in order to make all the examples in this lecture work.*0267

*Here are the key moment generating functions; we have discrete distributions and continuous distributions.*0271

*On this slide, I am going to do the discrete ones, and on the next slide, we will do the continuous ones.*0277

*Our distributions are binomial, geometric, negative binomial, hypergeometric, and the Poisson distribution.*0281

*Each one has its own moment generating function.*0289

*The binomial is (pe^t + 1 − p)^n, where p is the probability associated with the binomial distribution and n is the number of trials.*0292

*By the way, this 1- P is often called Q.*0302

*You might see this called (pe^t + q)^n.*0308

*The geometric looks similar: pe^t/(1 − (1 − p)e^t); again, this 1 − p could be written as q.*0312

*If you look at this in some sources and some textbooks, they will just this call 1- P = Q.*0319

*It just means the same thing.*0324

*The negative binomial is the same as the geometric except that it is raised to the r power: [pe^t/(1 − (1 − p)e^t)]^r.*0325

*Again, this 1- P could be a Q.*0331

*Hypergeometric distribution has no closed form, no simple moment generating functions.*0334

*I cannot write down anything for the hypergeometric distribution.*0339

*The Poisson distribution is e^(λ(e^t − 1)).*0343

*Notice that in all of these there is no y anywhere; they are all functions of the variable t.*0349

*You want to be seeing a T, when you are looking at a moment generating function.*0361

*You are not going to see any Y in there.*0365
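As a sanity check on one of these formulas (a sketch of my own, not from the lecture; λ = 3 and t = 0.2 are arbitrary choices), we can estimate E[e^(tY)] for a Poisson variable by simulation and compare it to the closed form e^(λ(e^t − 1)):

```python
import numpy as np

rng = np.random.default_rng(0)

# Poisson MGF from the chart: M(t) = e^(lam * (e^t - 1))
lam, t = 3.0, 0.2
closed_form = np.exp(lam * (np.exp(t) - 1.0))

# Monte Carlo estimate of E[e^(tY)] for Y ~ Poisson(lam)
samples = rng.poisson(lam, size=200_000)
estimate = np.exp(t * samples).mean()

print(closed_form, estimate)  # the two values should agree closely
```

The same simulation check works for any of the distributions on the chart, as long as t stays inside the range where the MGF exists.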

*We also have moment generating functions for the continuous distributions.*0367

*Our favorite ones are the uniform; here is the moment generating function for the uniform distribution.*0371

*The normal is e^(μt + t²σ²/2).*0377

*All of that is in the exponent, by the way.*0383

*That is all in the exponent of the E, it is quite complicated moment generating function there.*0385

*The gamma distribution, which is a whole family, is (1 − βt)^(−α).*0390

*Remember that the exponential and the chi square distributions, *0395

*those should both be considered children of the gamma distribution.*0399

*The exponential distribution is just a gamma with α equal to 1.*0404

*If you remember the moment generating function for the gamma distribution,*0412

*then you can remember the exponential distribution, its moment generating function *0416

*just by taking the α equal to 1 in the gamma distribution.*0421

*Chi square is also a gamma distribution, it is where you take α is equal to ν/2 and β is equal to 2.*0426

*ν is the number of degrees of freedom in the Chi square distribution.*0436

*Again, you can see how, if you start with a function for the gamma distribution,*0445

*you plug in β is equal to 2, there is right there.*0450

*And if you plug in α is equal to ν/2, ν by the way is the Greek letter that looks like a v.*0453

*If you plug in α is ν/2, that is what you get.*0460

*You recognize the moment generating function for the Chi square distribution. *0466

*The beta distribution has no closed-form moment generating function.*0470

*It does not lend itself very easily to the problems that have to do with moment generating functions.*0474
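The family relationships just described can be checked directly (a sketch under my own illustrative parameter choices, t = 0.1 and ν = 4): the gamma MGF (1 − βt)^(−α) turns into the exponential MGF at α = 1 and into the chi-square MGF (1 − 2t)^(−ν/2) at α = ν/2, β = 2, and both agree with simulated expectations:

```python
import numpy as np

# Gamma MGF from the chart: M(t) = (1 - beta*t)^(-alpha), valid for t < 1/beta
def gamma_mgf(t, alpha, beta):
    return (1.0 - beta * t) ** (-alpha)

t = 0.1
beta = 2.0

# Exponential with mean beta is gamma with alpha = 1
exp_mgf = gamma_mgf(t, alpha=1.0, beta=beta)       # (1 - beta*t)^(-1)

# Chi-square with nu degrees of freedom is gamma with alpha = nu/2, beta = 2
nu = 4
chi2_mgf = gamma_mgf(t, alpha=nu / 2.0, beta=2.0)  # (1 - 2t)^(-nu/2)

# Monte Carlo estimates of E[e^(tY)] for comparison
rng = np.random.default_rng(1)
y_exp = rng.exponential(scale=beta, size=200_000)
y_chi2 = rng.chisquare(nu, size=200_000)
print(exp_mgf, np.exp(t * y_exp).mean())
print(chi2_mgf, np.exp(t * y_chi2).mean())
```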

*The point of using moment generating functions to solve problems is that you got to be able*0481

*to recognize these moment generating functions, when you see them in a dark alley or see them out on safari.*0488

*You see a moment generating function, and then you have to say that *0495

*it is the moment generating function for the gamma distribution, or that is the moment generating function for the normal distribution.*0498

*You really want to kind of stare at this chart on the slide and also the one on the previous slide,*0505

*and get these functions into your head.*0511

*Or else, maybe have one of these charts as an easy reference, when you are solving these problems.*0514

*Because, the whole point of this is you have got to be able to recognize these, when you see them in the wild, so to speak.*0520

*Let me show you how that works out, I had not really told you how to solve any problems yet.*0526

*I want to show you how it works.*0532

*You are going to be calculating moment generating functions.*0534

*There are a couple of useful formulas that you are going to need to know.*0537

*One is that if you take a linear function of our random variable, aY + b,*0541

*and then build a new variable Z = aY + b,*0547

*the moment generating function for Z is: you take the moment generating function for Y and,*0553

*wherever you see a t, you change it to at.*0560

*And then, you multiply on this factor e^(bt) on the outside: M_Z(t) = e^(bt) × M_Y(at).*0567

*The time when this is most useful is when Y is a normal variable and you are converting to make Z a standard normal variable.*0571

*That is when you use this formula most often, is when you are converting from a normal variable to a standard normal.*0582

*Here is another very useful formula that we are going to be using in almost every exercise today.*0591

*It is that, if you have two independent variables and it is important that they be independent, *0598

*and you add them together then the moment generating function for the sum is equal to *0604

*the product of the moment generating function of the individual variables.*0611

*That is very convenient because if you want to add two variables, you just multiply their moment generating functions.*0618

*That is very useful, we are going to use that over and over again.*0625

*Moment generating functions convert addition into multiplication.*0628

*It behaves very nicely, as long as your variables are independent.*0633
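The addition-to-multiplication property can be seen numerically (my own sketch; two standard normals and t = 0.3 are illustrative choices, using the standard normal MGF e^(t²/2) that follows from the normal formula on the chart with μ = 0, σ = 1):

```python
import numpy as np

rng = np.random.default_rng(2)
t = 0.3
n = 500_000

# Two independent standard normal samples
y1 = rng.standard_normal(n)
y2 = rng.standard_normal(n)

# MGF of a single standard normal at t: e^(t^2/2)
m_y = np.exp(t ** 2 / 2.0)

# Monte Carlo MGF of the sum Y1 + Y2
m_sum = np.exp(t * (y1 + y2)).mean()

print(m_sum, m_y * m_y)  # MGF of the sum vs product of individual MGFs
```

If y1 and y2 were not independent, the product formula would generally fail; independence is essential.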

*Let me show you now, how we are going to use moment generating functions.*0639

*We will be given a collection of random variables and we want to find the moment generating function of U, M_U(t).*0644

*That can be kind of tricky and we are going to use several different tricks to do that.*0655

*We might use the definition of moment generating function.*0660

*We will often use these formulas on the previous slide, especially the one where it converts addition into multiplication.*0663

*M_(Y1+Y2)(t) will be equal to M_Y1(t) × M_Y2(t).*0672

*That is going to be extremely useful to calculate the new moment generating function.*0682

*What we will do is we will calculate that new moment generating function, *0688

*and then kind of compare it against all the charts that we have of all of our moment generating functions.*0691

*We will make sure that we recognize it as a known distribution.*0697

*If we can, then we will say that is the Poisson distribution with a certain value of λ.*0702

*Or, that is the exponential distribution with a certain value of β.*0707

*And then, we will know what our distribution is.*0712

*There is a lot of pattern recognition involved in using moment generating functions to identify distributions.*0716

*But, we will do some examples and you will see how it works out.*0723

*The first example is actually the trickiest; if you have trouble, if you get bogged down in the first example,*0727

*it is okay if you want to skip to a couple of the later ones.*0733

*And then, maybe come back and analyze the first one because it is the most challenging to understand.*0737

*With example 1, we have a standard normal variable, Y is the standard normal variable.*0746

*We want to find the density function of U which is defined to be Y².*0751

*We want to use moment generating functions for this.*0759

*We want to calculate the moment generating function of U, M_U(t).*0762

*Remember, our U, by definition, is Y².*0770

*We are going to use the definition of the moment generating function here.*0778

*Remember, our definition of the moment generating function: M_Y(t) is just the expected value of e^(tY).*0781

*In this case, we do not have Y, we have Y².*0791

*This is the expected value of e^(tY²).*0794

*To calculate the expected value of a function of a random variable, what you do is you take that function, e^(tY²),*0806

*And then, you multiply it by the density function of that random variable.*0816

*You integrate that over all possible values for Y.*0822

*This integral is going to get a little complicated because if you remember,*0826

*the density function for the normal variable is no joke, it is rather complicated.*0831

*One thing we are given here is that Y is a standard normal variable.*0838

*Standard is sort of a loaded term, when you are studying probability and statistics.*0843

*Standard normal variable means that its mean is 0 and its variance is 1.*0848

*That kind of simplifies some of the equations that we have to deal with, when we are looking at its density function.*0859

*The density function for a standard normal variable, plugging in μ = 0 and σ² = 1, is 1/√(2π)*0866

*(there is actually a σ in there, but I am taking advantage of the fact that σ = 1) × e^(−y²/2).*0875

*Again, I’m simplifying that as I go along.*0890

*The full normal density function would have e^(−(y − μ)²/(2σ²)).*0893

*I have simplified that, taking advantage of the fact that we have a standard normal variable.*0901

*We have e^(tY²) here and we still have to integrate this thing over all possible values of Y.*0906

*By the way, in this case, the possible values are -infinity to infinity because that is my range for a normal variable.*0913

*This looks tricky to solve.*0920

*Let me pull out the 1/√(2π) because that is just a constant.*0924

*I see that I have two factors that both look like e to a multiple of y².*0929

*I have e^(ty²) and e^(−y²/2).*0937

*What I am going to try to do is write this as a single exponential; I am going to try to factor out −y²/2.*0941

*e^(−y²/2 + ty²), that is what I have here.*0954

*I forgot my DY there, there is DY.*0961

*Let me just work inside the integral for the next couple of steps.*0967

*This has e^(−y²/2) in it; I am going to factor that out.*0970

*I have a 1; this t, since I am factoring out the negative sign, becomes −t.*0976

*Since I factored out the ½, it becomes 2t.*0982

*That gives e^(−(y²/2)(1 − 2t)), and I have a reason for doing this, but it is not obvious right now.*0986

*Let me show you where I'm headed with this, I do not want to solve this integral.*0993

*In fact, I know that I cannot solve this integral by any direct means.*0998

*What I'm going to try to do, is to try to compare this integral to the density function that I recognized*1002

*which would be the density function for a different normal variable.*1012

*Let me show you what I mean by that.*1017

*The density function for a nonstandard normal variable would be 1/(σ√(2π))*1019

*(I do not think I need a μ here) × e^(−y²/(2σ²)).*1032

*That is the density for a nonstandard normal variable.*1040

*It is not the same variable that we started out with.*1056

*Because it is a density function, I know what its integral is.*1063

*If I integrate that from - infinity to infinity, the integral of any density function must be 1.*1068

*If I had an integral in that form, then I would know that its integral would be 1.*1077

*What I have here is something that is sort of generically similar to that.*1086

*If I try to arrange my variables carefully, I can make this integral equal to 1 in that form.*1091

*What I'm going to do is, I'm going to figure out what my value should be.*1099

*I want −(y²/2)(1 − 2t) to be equal to −y²/(2σ²).*1103

*I see that the −y²/2 is going to cancel, so that 1 − 2t is equal to 1/σ².*1114

*If I flip both sides, I get that σ² is 1/(1 − 2t).*1124

*σ² is 1/(1 − 2t) and σ is (1 − 2t)^(−1/2).*1134

*That is the σ that we would be talking about, if we want to make this integral that I have here*1147

*match the density function for a nonstandard normal variable.*1153

*Let me arrange, see if I can arrange things to make it work.*1159

*I have 1/√(2π), and then I have the integral of e^(−y²/(2σ²)) dy, if I arrange the exponent that way;*1164

*that was by choosing my σ up above here.*1180

*I will choose my σ to make that work.*1184

*Now, I want to make this match this integral over here.*1185

*It does not quite match it as it is, because it is missing that σ in the denominator.*1188

*I'm going to fudge that σ in there, and in order to balance that, I will have to put a σ on the outside here.*1194

*Let me remember that there is a σ on the outside.*1203

*The whole point is that this is now the density function for a nonstandard normal variable.*1206

*But I know that this density function, the integral of any density function is equal to 1.*1212

*I got to multiply that one by the σ as well.*1219

*σ × all of that is equal to σ × 1.*1222

*What I have got there is that, this whole thing is equal to σ.*1231

*σ, remember, was (1 − 2t)^(−1/2).*1240

*What have I done here? I have just calculated the moment generating function for this variable U.*1248

*I found out that the moment generating function is (1 − 2t)^(−1/2).*1255

*What am I supposed to do with that?*1263

*What I do is I go back and I looked at my charts of common moment generating functions.*1264

*And, I see if I will recognize this moment generating function somewhere on the chart.*1272

*Lo and behold, I do.*1277

*Let me remind you: on the chart, the moment generating function for the Chi square distribution is (1 − 2t)^(−ν/2).*1280

*What I have here is exactly a Chi square distribution with ν = 1.*1295

*That is worth writing down: U has a Chi square distribution with ν = 1 degree of freedom.*1303

*From that, I can figure out the density function of U because I remember that Chi square is the gamma distribution.*1330

*It is just a special case of the gamma distribution with α is ν/2 and β is 2. *1345

*I can look up the density function for the gamma distribution.*1356

*Let me remind you what it was.*1360

*The gamma distribution, the density function for the gamma distribution, *1363

*I will write it in terms of u: u^(α−1) × e^(−u/β), divided by β^α × Γ(α).*1368

*If I plug in all my values here, I am going to plug in u^(α−1).*1389

*α is ν/2 and ν is 1, so α is ½.*1397

*This is u^(½ − 1) = u^(−1/2), and e^(−u/β) with β = 2 is e^(−u/2).*1401

*β^α is 2^(1/2), and Γ(α) is Γ(½).*1411

*There is one thing I need to remember, which is Γ(½).*1428

*That is kind of something that you either need to remember or look up, because it is quite a lot of work to derive from scratch.*1433

*Γ(½), it turns out, is √π; it is kind of a surprising number there.*1442

*That is not something you can easily figure out from the factorial property of the gamma function,*1451

*because ½ is not a whole number.*1456

*It is easy to figure out Γ for whole numbers, but not Γ(½).*1458

*It is quite difficult the first time you work it out; from then on, it is probably worth remembering that Γ(½) is √π.*1463

*What I have here is u^(−1/2) e^(−u/2).*1473

*And then, in the denominator, I have 2^(1/2), which is √2, and √π.*1484

*I’m just going to combine those together, I think, and give myself √2 π.*1493

*The range on the Chi square distribution, it is the same as the range on the gamma distribution.*1500

*It is all u greater than 0 and less than infinity.*1505

*Another way to think about that is to say that, since U is Y² and since Y goes from - infinity to infinity, *1512

*Y² will go from 0 to infinity.*1524

*That is my density function for U.*1527

*A lot of work to find that one, that is probably the hardest one we are going to do though.*1530

*The rest build on this one and we have done the hardest steps in this example number 1.*1534

*Let me remind you how all those steps went.*1541

*We are trying to find the density function of U = Y².*1544

*I used the original definition of moment generating function here.*1548

*The original definition of a moment generating function is the expected value of e^(tY).*1553

*I plugged in my U, which was Y²; that means, instead of tY, I am finding the expected value of e^(tY²).*1560

*That means I am finding the integral of e^(tY²) × the density function.*1570

*The density function for a standard normal variable is that right there.*1576

*Standard normal variable means μ is 0 and σ² = 1.*1580

*I pulled out the 1/√(2π) and combined the exponents.*1586

*I got this thing that is a little messy and is definitely not something I'm going to be able to integrate with any ease.*1590

*The trick to integrating that is to combine those exponent using a little bit of clever factoring and then,*1598

*to try to identify it as a density function for another normal distribution.*1605

*Here is the density function for another normal distribution.*1612

*What I know is that if I integrate the density function for any distribution, I should get 1.*1616

*That is of course, because in any experiment, the total probability is 1.*1624

*In order to make this match this density function, I set my two exponents equal to each other.*1630

*And then I solved and figured out that my σ had to be (1 − 2t)^(−1/2).*1637

*I plugged in that value of σ, I converted this into σ.*1645

*It almost match the density function but there was this one extra factor of σ that I did not have before.*1650

*In order to create that factor of σ in the denominator, I had to multiply it in the numerator *1656

*which meant I also had to multiply it on the other side.*1662

*That density function, if I integrate it, is equal to 1, but then there is one extra factor of σ here, which is left over;*1665

*that σ tracks on down there.*1675

* What I'm left with, everything else drops out very nicely.*1678

*Thanks to the fact that integrating a density function gives you 1.*1682

*I am left with (1 − 2t)^(−1/2), and what I do there is go back and look at my charts of the common moment generating functions.*1685

*Because, what I just calculated was the moment generating function for U.*1698

*I go back and look at my charts, and I say that looks a lot like the moment generating function for Chi square distribution.*1701

*It is the Chi square distribution, if I just take my ν equal to 1.*1711

*1 degree of freedom, I got a Chi square distribution and then, I want to write the density function for that.*1717

*In order to do that, I had to remember that Chi square is a gamma distribution, with special values of α and β.*1722

*α is ν/2 and β is equal to 2.*1730

*My gamma distribution, I wrote down the density function for the gamma distribution, in general.*1733

*I did it in terms of u; when we originally learned it, I gave it to you in terms of y, but our variable now is u.*1740

*It is u^(α−1) e^(−u/β) / (β^α Γ(α)).*1745

*I plugged in my α is ν/2 and my ν is 1.*1752

*I plug in β is equal to 2.*1759

*This all simplified fairly well, except for this Γ(½).*1762

*I am just remembering that Γ(½) is √π.*1766

*That is kind of a lot of work to figure that out, I do not do that work every time.*1771

*I just looked up that value: Γ(½) is √π.*1775

*It is not so obvious; unlike the way that Γ of a whole number is easy to calculate using factorials,*1780

*you have to do a lot of integral work in order to figure out that Γ(½) is √π.*1788

*That is why I did not show you the details of that.*1793

*In the denominator, I combined 2^(1/2) and √π, and I got √(2π).*1795

*And that gave me my density function for U = Y².*1801

*My range, that is the generic range for Chi square distribution.*1807

*But, I also could have figured it out by looking at my original range for Y and then, by figuring out U = Y².*1811

*By the way, this is one reason why we study the Chi square distribution.*1820

*It is because it is very common to look at the square of a standard normal variable, it turns out to have a chi square distribution.*1825

*We are going to use the result from this example again in example 2.*1834

*Make sure you understand this example, or at least make sure that you believe the answer,*1840

*before we move on to example 2.*1845

*I do not want to do all this work again in example 2.*1847

*I’m just going to invoke the answer from this example again, in example 2.*1850
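Before moving on, here is a quick numeric check of Example 1's conclusion (a sketch of my own, not part of the lecture; t = 0.2 is an arbitrary choice satisfying t < ½): simulate Y² for a standard normal Y, estimate its moment generating function at one value of t, and compare to the chi-square formula (1 − 2t)^(−1/2):

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.standard_normal(500_000)
u = y ** 2  # square of a standard normal

t = 0.2  # must satisfy t < 1/2 for the MGF to exist

# Monte Carlo estimate of M_U(t) = E[e^(tU)]
estimate = np.exp(t * u).mean()

# Chi square MGF with nu = 1 degree of freedom: (1 - 2t)^(-1/2)
closed_form = (1.0 - 2.0 * t) ** -0.5

print(estimate, closed_form)  # the two values should agree closely
```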

*Example 2 looks a lot like example 1, except we have two independent standard normal variables,*1859

*instead of the one that we had in example 1.*1865

*We want to find the density function of Y1² + Y2².*1869

*I am going to use the answer from example 1 to help me solve example 2.*1873

*If you have not worked through example 1 of this lecture, then I really recommend going back and looking at example 1.*1878

*It is a lot of work, if you do not want to work through all the details there,*1885

*just make sure that you understand the answer.*1889

*We are going to use that answer as intermediate result here in example 2.*1892

*It will make example 2 a lot less work.*1897

*Let us work out example 2, we want to find the density function of U = Y1² + Y2².*1899

*The way we are going to do that is via moment generating function.*1909

*We are going to find the moment generating function for U, M_U(t), which is equal to M_(Y1²+Y2²)(t),*1912

*since U is Y1² + Y2².*1924

*The lovely thing about moment generating functions, that good property that I gave you on one of the introductory slide*1933

*is that they convert addition into multiplication, when you have independent variables.*1939

*Here, we do have independent variables.*1945

*This is M_(Y1²)(t) × M_(Y2²)(t); this is where the multiplication comes in.*1949

*That is really nice, that our addition converts into multiplication.*1959

*Let me now invoke the answer from example 1.*1966

*From example 1, each of Y1² and Y2² has a Chi square distribution with 1 degree of freedom.*1970

*Let me say with ν =1, I will say it that way.*1992

*If you are wondering where that comes from, you got to go back and watch example 1.*1995

*Example 1 is a lot of work and I cannot redo it here.*2004

*If you trust example 1, it is worth knowing that if you start with the standard normal variable *2008

*and you square it, you get a Chi square distribution.*2014

*We can look up what the moment generating function of a Chi square distribution is.*2018

*From the chart, the moment generating function of a Chi square distribution is (1 − 2t),*2026

*always a function of t, remember, raised to the −ν/2 power.*2041

*In this case, with ν = 1, it is just (1 − 2t)^(−1/2).*2053

*What I have here is a product of two functions of the form (1 − 2t)^(−ν/2):*2061

*(1 − 2t)^(−1/2) and (1 − 2t)^(−1/2); it is not so obvious that the exponent is negative, because*2077

*I let my negative sign run into my fraction line there.*2092

*That is a little more obvious now.*2096

*Let me multiply those two together, if you multiply those then the exponents just add.*2099

*I get (1 − 2t)^(−1); that is my moment generating function for U.*2104

*Let me write the exponent as −ν/2, because then that will make it more obvious that*2119

*the moment generating function that I just discovered for U is again a Chi square distribution.*2127

*Remember, this whole lecture is about pattern recognition.*2138

*You calculate a moment generating function, then you stare at the chart and you try say *2142

*that is the Chi square distribution or that is the exponential distribution.*2146

*This, in fact, is the Chi square distribution with ν equal to 2: −1 is −ν/2, so ν would be 2 there.*2150

*We have a Chi square distribution with 2 degrees of freedom.*2163

*Now, I can find its density function, I will remember that Chi square is a gamma distribution.*2169

*It is a subfamily of the gamma family.*2178

*Let me remind myself of the density function for the gamma distribution.*2182

*f(u) = u^(α−1) × e^(−u/β),*2188

*(that β got a little squashed there; it ended up looking like a Δ)*2199

*divided by β^α × Γ(α).*2202

*That is the density function for a gamma distribution.*2211

*Chi square is γ with α is equal to ν/2 and β is equal to 2.*2215

*I am going to plug those values into my gamma distribution: f_U(u) is u^(α−1).*2229

*α is ν/2; we said ν is equal to 2, and 2/2 − 1 is 0, so that term drops out.*2242

*I will go ahead and write it as u⁰, just in case you are wondering where it went, times e^(−u/2).*2249

*In my denominator, I have β^α, which is 2¹.*2257

*And then, Γ(α) is just Γ(1).*2263

*Γ(1) is 0!, which is just 1.*2268

*Finally, my density function for U is f_U(u); the u⁰ drops out and the Γ(1) drops out.*2273

*It is ½ e^(−u/2), and my range for the Chi square distribution is u from 0 to infinity.*2283

*That is my density function.*2295

*That is officially the end of that problem; let me make a couple of notes about this.*2302

*One note is that you might recognize that density function as an exponential distribution.*2307

*It is in fact an exponential distribution.*2312

*That is not so relevant to this problem because that pattern does not really continue.*2316

*The fact that was an exponential distribution is sort of a fluke of nature on this problem.*2322

*Let me tell you what is not a fluke of nature on this problem,*2327

*which is that we got a Chi square distribution with 2 degrees of freedom*2330

*by adding up the squares of two standard normals that is not a fluke. *2335

*In general, if Y1, Y2, up to YN are independent standard normal variables,*2342

*and U is Y1² + … + YN², then U has a Chi square distribution with N degrees of freedom.*2379

*U has a Chi square distribution with N degrees of freedom.*2391

*Let me say with a ν, ν is the number of degrees of freedom, in this case it will come out to be N.*2398

*That is not so surprising, if you kind of look at this step right here, instead of having two factors, we would get N factors.*2406

*This exponent would turn into N/2, we would just get Chi square distribution with ν = N.*2414

*This does generalize to adding up N independent squares of standard normal,*2424

*what you get is a Chi square distribution with N degrees of freedom.*2431

*That is really a big reason why the Chi square distribution is significant in probability and statistics.*2435

*It is because it kind of flows out of the standard normal distribution.*2442
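As a quick numerical sanity check on this fact, here is a small Python sketch (an editorial illustration, not part of the lecture): it adds up the squares of n independent standard normals many times and confirms that the sample mean lands near n, which is the mean of a Chi square distribution with n degrees of freedom.

```python
import random

def chi_square_sample(n, rng):
    # By the MGF argument above, the sum of squares of n independent
    # standard normals follows a chi-square with n degrees of freedom.
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n))

rng = random.Random(0)
trials = 100_000
sample_means = {}
for n in (2, 5):
    sample_means[n] = sum(chi_square_sample(n, rng) for _ in range(trials)) / trials
    # A chi-square with nu degrees of freedom has mean nu,
    # so the sample mean should land close to n.
```

For n = 2 this is exactly the example above, and the sample mean comes out near 2, consistent with the density ½ × E^(-U/2).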

*Let me recap the steps here.*2448

*We want to find the density function of Y1² + Y2².*2450

*We start out just by that definition U is Y1² + Y2², the definition of moment generating function.*2455

*But quickly, we are going to use this property that we had, I think I called it a useful formula on one of the earlier slides.*2463

*The really useful fact is that when we have independent variables, it converts addition, *2472

*when we are adding the variables into multiplication of moment generating functions.*2477

*What we do is we multiply the moment generating functions for Y1² and Y2².*2483

*We figured out the moment generating functions for each one, back in example 1.*2489

*If the introduction of the Chi square variable suddenly came out of left field for you,*2494

*what you want do is go back and watch example 1.*2501

*You will see where we figured out that the distribution for Y1² is just Chi square with 1 degree of freedom.*2504

*Its moment generating function, we figure that out on the chart earlier on in this lecture.*2514

*It is (1 - 2T)^(-ν/2) with ν = 1; we get that for both of these variables, and multiplying together we get (1 - 2T)^(-1).*2520

*We notice that, that Chi square again, this is sort of pattern recognition.*2531

*That is still a Chi square, the difference is that the exponent is bigger now, we have 2 degrees of freedom.*2535

*If we want to find again the density function, we have to remember that Chi square comes from the gamma distribution.*2542

*I wrote down my formula for the density function for the gamma distribution.*2550

*And then, I plugged in Chi square is gamma distribution with α is ν/2 and β = 2.*2554

*I plugged in those values, I plug in ν = 2, I plug in all those values to my γ density function.*2561

*I simplified it down and got my density function for my U there.*2569

*Of course, the range for Chi square distribution is from 0 to infinity.*2576

*What I noticed along the way is that, this is sort of a pattern with two variables.*2580

*But if we had N variables, we could have just extended this up to N moment generating functions and*2585

*we would have gotten a Chi square distribution with N degrees of freedom.*2592

*That is kind of a good thing to know in probability and statistics, in general, *2596

*which is that if you add up N standard normal variables, squaring each one, *2600

*then you get a Chi square distribution with N degrees of freedom.*2605

*In example 3, we have R independent binomial variables.*2611

*They all represent flipping the same coin.*2616

*The coin comes up heads with probability P.*2619

*P is not necessarily ½, it could be a loaded coin, nobody told us that it is a fair coin.*2622

*Each one represents a different number of flips, N1 through NR.*2628

*What we want do is add these variables together and call it U.*2633

*We want to find the probability function of U.*2637

*Our method that we are exploring in this lecture is moment generating functions.*2641

*We are going to find the moment generating function of U.*2647

*In the meantime, along the way we are going to need the moment generating function of the individual Y.*2651

*I want to find the moment generating function of YI of T.*2657

*I get that just by looking at my chart for moment generating functions.*2661

*The binomial is discrete; if you scroll back a few slides in this lecture,*2667

*you will see the chart for moment generating functions of discrete variables.*2675

*The one for binomial is (PE^T + 1 - P)^N, but the Yi variable is Ni flips.*2680

*I’m going to write N sub I here.*2696

*This is coming from the chart earlier on in this lecture, just scroll back and you will find it.*2698

*It is the discrete distributions.*2707

*Now, we want to find the moment generating function for U.*2710

*That U that we have been given here is the sum of the Yi, Y1 up to YN.*2714

*The lovely thing about moment generating functions is that they convert addition into multiplication.*2723

*You can only do that when you have independent variables, which is what we have here.*2729

*This is M Y1 of T multiplied, this is multiplied now, I’m not adding any more.*2735

*Which is, I do not know why I try to write an addition sign there.*2741

*This is MYN of T, I'm going to plug in the moment generating functions for each one.*2744

*(PE^T + 1 - P)^N1, and I'm going to multiply that all the way through up to (PE^T + 1 - P)^N,*2752

*I called it Y sub N, of course, I should have called it Y sub R.*2772

*There are R in these things, I will try not to reuse the variable N.*2777

*This is N sub R in my exponent.*2782

*The lovely thing about this is, I have got the same base everywhere,*2786

*so I can combine all those factors by adding the exponents.*2790

*This is (PE^T + 1 - P) with the exponents added, N1 up to N sub R.*2794

*Maybe this is obvious, but if it is not obvious, go back and look at your chart of moment generating functions.*2807

*Stare at this and you recognize that it is a binomial distribution, again.*2813

*Remember, that is how moment generating functions work.*2823

*You work out the MGF and then you go back and look at your chart, and you try to recognize it.*2825

*This is binomial with N = N1 + … + NR.*2831

*I know what my probability function for binomial distribution is.*2842

*The probability of any given value of U, this is the discrete probability function.*2846

*We had a whole lecture on the binomial distribution, if you are completely lost with the word binomial,*2851

*just scroll back and you will see our probability function for the binomial distribution.*2857

*I’m going to use U instead of Y, we use Y back then.*2861

*It is N choose U × P^U × Q^(N - U).*2864

*In this case, for P of U, I will fill in what my N is.*2874

*It is (N1 + … + NR) choose U, times P^U and Q^((N1 + … + NR) - U).*2879

*My range here is that U goes between 0 and N, including both of them.*2896

*In this case, U goes between 0 and my N, which is N1 + … + NR.*2903

*That is really not very surprising, it is like you are taking a coin and you are flipping it N1 times.*2913

*And then, you flip it N2 times, and then you flip it N3 times, and you keep on flipping until you finally flip it NR times.*2921

*And what you have really done is you flip it N1 + N2 + N3 up to NR times total.*2931

*You get a binomial distribution where your N is just the sum of all those n.*2937

*This is really not very shocking and it is nice to have a moment generating functions*2943

*to confirm what our intuition probably should have already told us.*2949
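If you want to confirm this with a simulation rather than with moment generating functions, here is a rough Python sketch (an editorial illustration; the flip counts 4, 7, 9 and the loaded-coin value p = 0.3 are made-up choices): it adds independent binomial counts for the same coin and checks that the total behaves like a single binomial with N equal to the sum of the flip counts.

```python
import random

def binomial_sample(n, p, rng):
    # Number of heads in n flips of a coin that lands heads with probability p.
    return sum(rng.random() < p for _ in range(n))

rng = random.Random(1)
p = 0.3                      # a loaded coin, as in the example
ns = [4, 7, 9]               # N1, N2, N3 flips for the three variables
trials = 50_000
total = [sum(binomial_sample(n, p, rng) for n in ns) for _ in range(trials)]

n_total = sum(ns)            # N = N1 + N2 + N3 = 20
mean = sum(total) / trials
var = sum((u - mean) ** 2 for u in total) / trials
# Binomial(N, p) has mean N*p = 6.0 and variance N*p*q = 4.2 here,
# so the sample statistics should land near those values.
```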

*Let me review the steps there.*2955

*We are trying to find the moment generating function for U.*2958

*But, U was Y1 through YR, the sum of Y1 through YR.*2961

*Our useful formula on moment generating functions says that, it converts addition into multiplication.*2966

*I got addition in my subscript here, that converted into multiplication here.*2974

*I had to know the moment generating function for each one of the Yi.*2979

*I looked at the moment generating function for binomial distribution, because I was told that the Yi were binomial.*2983

*The moment generating function for binomial distribution, on my chart, is (PE^T + 1 - P) raised to*2991

*whatever the N is for that distribution.*2998

*In this case, it is N1 through N sub R.*3001

*I use those as my exponents but then all those terms, since they are multiplied together, *3005

*they combine together and we just get one big exponent at the top, N1 + … + NR.*3010

*And then, I looked back at my charts, see if you can identify this moment generating function.*3018

*And of course, that is a binomial, again it is just binomial where your exponent tells you the N.*3026

*N is N1 up to NR, added together.*3033

*I looked at my binomial probability function, this comes back from our earliest lecture on the binomial distribution.*3037

*You can look this up, you will see this formula except you will see a Y instead of U.*3046

*Here our variable is U and here is the range for U.*3051

*I just plug in what N was, N was N1 through NR.*3055

*I plugged that in all the way through here and my range for U goes from 0 to N.*3061

*Again, this is not surprising, this kind of fits what your instinct should tell you because*3067

*you want to think about flipping the same coin N1 times, and then you start all over and flip it N2 times.*3072

*You will keep flipping until you finally flip it NR times, that is just the same as flipping it many times over:*3079

*N1 + N2 + N3, up to NR times.*3087

*It is not surprising that the total number of heads will give you a binomial distribution,*3092

*based on that total number of flips.*3100

*In example 4, we got two independent Poisson variables with means λ 1 and λ 2.*3105

*We want to find the probability function of U which is Y1 + Y2.*3111

*Let me set up what we are going to need here.*3118

*I know I'm going to need the moment generating functions of Y1 and Y2.*3119

*Let me go ahead and write those down.*3127

*I'm looking these up from the chart.*3129

*We did have a whole section on how to calculate moment generating functions.*3131

*You could look that up much earlier in the series of lectures, if you want.*3136

*What we did was we eventually found this chart and I’m not going to calculate these again from scratch.*3141

*M sub Yi is just E^(λi × (E^T - 1)); in the exponent we have λi times E^T - 1.*3146

*That is going to be useful, as I try to calculate the moment generating function of U.*3158

*Let me try to do that, M sub U of T is M sub Y1 + Y2 of T.*3164

*The whole point or one of the really nice features of moment generating functions is that*3173

*they convert addition into multiplication, when you have independent variables.*3179

*We do have independent variables here, so this is M sub Y1 of T × M sub Y2 of T.*3185

*I can plug in what I have found to be the moment generating functions of each one of those variables.*3194

*This is E^(λ1 × (E^T - 1)) × E^(λ2 × (E^T - 1)).*3201

*Of course, since I have similar exponents, I can add them.*3212

*I will factor out the E^T - 1: this is E^((λ1 + λ2) × (E^T - 1)).*3216

*I will go back and I will look at my chart of moment generating functions, and see if I find anything like this.*3227

*Of course, I will find something like this because that is the moment generating function for Poisson distribution.*3234

*The chart tells us this is Poisson with this mean λ is equal to λ 1 + λ 2.*3242

*I know I have a Poisson distribution with means λ 1 + λ 2.*3256

*If I look up my probability function for a Poisson distribution, what it is, is λ^U × E^(-λ), all divided by U!.*3261

*The range there is from U goes from 0 to infinity.*3281

*Let me plug in what λ is: λ is λ1 + λ2, so it is (λ1 + λ2)^U × E^(-λ1 - λ2) divided by U!, where U goes from 0 to infinity.*3287
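Here is a small simulation sketch of this result (an editorial illustration; the means λ1 = 2 and λ2 = 3.5 are arbitrary choices, and the sampler uses Knuth's classic method since the Python standard library does not provide a Poisson generator): it checks that the sum of two independent Poisson variables has mean and variance near λ1 + λ2, as a Poisson with that mean should.

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: multiply uniforms until the product drops below e^(-lam).
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        prod *= rng.random()
        k += 1
    return k

rng = random.Random(2)
lam1, lam2 = 2.0, 3.5        # illustrative means, not from the lecture
trials = 50_000
u = [poisson_sample(lam1, rng) + poisson_sample(lam2, rng) for _ in range(trials)]

mean = sum(u) / trials
var = sum((x - mean) ** 2 for x in u) / trials
# A Poisson variable has mean = variance = lambda, so both sample
# statistics should land near lam1 + lam2 = 5.5.
```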

*This is also not surprising and let me try to explain this.*3316

*Remember what the Poisson distribution models, it models random occurrences.*3322

*A kind of prototypical example of the Poisson distribution is, you are sitting at an intersection on a country road, *3327

*a kind of a not very crowded country road, and you are counting the number of cars that go by this intersection.*3337

*It does not happen very often, every once in a while a car goes by.*3343

*You might say that Y1 is the number of cars through an intersection on a country road.*3348

*The Poisson distribution models that perfectly because you might have a whole bunch of cars, you might not have any cars.*3362

*Maybe, you are calculating this over the course of 1 hour, how many cars go through this one intersection over 1 hour?*3370

*Y2 could be the number of trucks through the same intersection, again, that is going to follow a Poisson distribution.*3377

*Probably, we will have a different mean because depending on the area, you might have more cars or you might have more trucks.*3387

*If it is a rural community, you might have more trucks because people are carrying stuff around their farms.*3393

*If it is an urban community, you might have more cars.*3398

*But any way, you will have different means for the average number of cars and trucks through the intersection.*3402

*What you are really keeping track of, if you look at Y1 + Y2, is the total number of cars and trucks,*3408

*The total number of vehicles through the intersection.*3415

*Again, it is not too surprising, the one we calculate out, the distribution there, *3422

*what we discovered is that is also a Poisson distribution.*3429

*Then, you are just kind of sitting there at that intersection and just every time something with wheels goes through, *3432

*every time a car or truck goes through, you count it as 1.*3438

*It is a Poisson distribution because every once in a while something goes through.*3442

*Sometimes you get a lot of cars and trucks, sometimes you get nothing.*3447

*It is not surprising that, when we calculate the probability function of Y1 + Y2, we end up with the Poisson distribution again.*3451

*Let me recap the steps there.*3459

*We figure out the moment generating function for Poisson distribution.*3462

*I’m being a little charitable when I say I figure that out, I really use the chart that I gave you early on.*3466

*It was in the discrete distributions, earlier on in this lecture.*3474

*If you are really want to know where that comes from, you have to go back and watch the earlier lecture,*3478

*the previous video which covered moment generating functions.*3483

*That is the moment generating function for a single Poisson distribution.*3486

*We want to combine them, we are finding U is Y1 + Y2.*3490

*A very useful formula which showed that, that converts addition into multiplication, *3495

*for a moment generating functions.*3501

*That is because these variables are independent, you can convert addition into multiplication.*3503

*We multiply the two moment generating functions together and it combined very nice and get this λ 1 + λ 2 factoring out.*3509

*If we look back at the chart, that is still the moment generating function for Poisson distribution.*3517

*The only difference is the λ has changed, the new mean is λ 1 + λ 2.*3524

*If you look up the probability function for Poisson distribution, this is something we covered earlier on,*3529

*when we were talking about discrete distributions.*3536

*We had a whole lecture on the Poisson distribution.*3538

*Here is the probability function for Poisson distribution and here is the range.*3541

*I think we used Y before, but now are using U because that is the name of our variable.*3546

*The only difference is that the λ here is λ 1 + λ 2.*3552

*I plug that in everywhere I saw a λ and then I got my probability function for U.*3558

*Again, this is not surprising if you remember what the Poisson distribution measures in real life.*3565

*One way to think about it is, it measures random events that happened with no effect on each other.*3572

*If you are sitting by an intersection, sometimes you see a lot of cars and*3581

*sometimes you see a lot of trucks, and sometimes you do not see anything.*3585

*But, you can have one variable the counts the number of cars, one variable that counts the number of trucks,*3588

*and one variable that counts everything together.*3594

*You are just adding the cars and trucks.*3597

*All three of those are Poisson variables, it is not too surprising when we actually calculate,*3599

*if we add two Poisson variables, the answer is still a Poisson variable.*3608

*In example 5, we have independent normal variables, each one has the same mean and variance.*3616

*Each one has mean μ and variance σ².*3622

*We want to find the distribution of Y ̅, Y ̅ is the average of the variables.*3624

*You can think of it as the mean, but that gets confusing because we also use mean in another sense.*3630

*Y ̅ is 1/N × Y1 + Y2 up to YN.*3636

*We are going to use moment generating functions for this.*3643

*Let me find the moment generating function for any particular normal variable M sub YI.*3645

*I got a normal variable, I have to look this up from the chart look of continuous distributions.*3657

*You will see the moment generating function for a normal variable is E^(μT + σ²T²/2),*3665

*that is all in the exponent there.*3684

*That is the moment generating function for any single variable here.*3686

*I want to find the moment generating function for Y ̅, but I do not think I'm going to find it directly.*3691

*I think I’m going to first find the moment generating function for Y1 through YN.*3695

*I will call that Y, and I will find the moment generating function of that first.*3701

*And then, I will figure out what to do with that 1/N.*3709

*M sub Y of T is, Y is just Y1 up to YN of T.*3712

*Remember, moment generating functions for independent variables which we have here, *3725

*they turn addition into multiplication.*3730

*M sub Y1 of T multiplying up to M sub YN of T.*3733

*That is just E^(μT + σ²T²/2), multiplied together N times.*3740

*It is the same moment generating function every time, E^(μT + σ²T²/2).*3752

*What I get there is E^(μT + σ²T²/2), raised to the Nth power.*3761

*I'm going to go ahead and distribute that in into the exponent.*3772

*That is E^(μNT + σ²T²N/2), all of that is in the exponent there.*3776

*That is the moment generating function for Y but that is not quite what I wanted.*3788

*I wanted Y ̅, let me show you how I can deal with that.*3792

*I noticed that Y ̅ is just the same as Y divided by N, it is 1/N × Y.*3797

*Let me remind you of a really useful property, this is listed in fact as a useful formula earlier on in this video.*3804

*Scroll back and you will see the following formula: the moment generating function of AY + B, M sub (AY + B) of T, is equal to,*3812

*You start with the moment generating function for Y, you plug in AT whenever you saw a T.*3825

*I forgot to include the extra term there, our extra factor is E^(BT) × M sub Y of AT.*3831

*That is one of the useful formulas that we have for moment generating functions.*3841
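This useful formula can also be spot-checked numerically (an editorial aside; the values a = 2, b = 1, t = 0.5 are arbitrary choices). For a standard normal Y, the chart gives M_Y(t) = E^(t²/2), so a Monte Carlo estimate of E[e^(t(aY + b))] should match E^(bt) × M_Y(at).

```python
import math
import random

rng = random.Random(3)
a, b, t = 2.0, 1.0, 0.5      # illustrative values, not from the lecture
trials = 200_000

# Monte Carlo estimate of M_{aY+b}(t) = E[e^{t(aY+b)}] for Y standard normal.
estimate = sum(math.exp(t * (a * rng.gauss(0.0, 1.0) + b))
               for _ in range(trials)) / trials

# The useful formula says this equals e^{bt} * M_Y(at),
# where M_Y(s) = e^{s^2/2} for a standard normal.
formula = math.exp(b * t) * math.exp((a * t) ** 2 / 2)
```

The two numbers agree up to Monte Carlo noise, which is a quick way to convince yourself the formula is stated correctly.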

*In this case, what we have is A is equal to 1/N and B is equal to 0, because we have Y ̅ is 1/NY.*3846

*M sub Y ̅ of T is equal to M sub Y of, our A is 1/N so 1/N × T.*3865

*I'm going to take my moment generating function for Y.*3878

*I'm going to plug in 1/N × T wherever I saw a T before.*3887

*I get E^(μN × (1/N)T + ...), with 1/N × T in place of T, plus the σ² term.*3894

*Wherever I see a T, I have got to put in 1/N × T, so the second term is σ² × ((1/N)T)² × N/2.*3905

*This is actually quite nice because it simplifies: in E^(μN(1/N)T), the N and 1/N cancel, so I get E^(μT + ...), and then I work on the σ² term.*3914

*I have got (T/N)², that is T²/N², × N.*3928

*The N cancels with one of the Ns in the denominator, but not both of them.*3933

*That leaves σ²/N, and then I still have a T² and I still have a 2 there.*3937

*That is all in my exponent, that is my moment generating function for Y ̅.*3947

*What I want to do is go back and look at my chart now *3952

*and see if I recognize that as the moment generating function for any of my known distributions.*3955

*I go back and look at the chart, and what I recognize is that, that is the moment generating function from normal distribution.*3962

*This is the moment generating function for a normal distribution.*3974

*Not quite in the format that was given in the chart though with mean,*3991

*The mean looks good, the mean is μ, that fits the pattern, but the moment generating function*4001

*for the normal distribution was E^(μT + σ²T²/2).*4008

*What I have here is σ²/N × T²/2.*4016

*My variance is slightly different here, instead of σ² by itself, σ²/N.*4020

*That is what my distribution of Y ̅ is: my distribution for Y ̅ is normal, it has mean μ, but its variance is σ²/N.*4031

*It is not the same variance that I started with.*4045

*That is my answer and this is not too surprising because we have a bunch of variables,*4048

*we expect their average to have the same mean as the individual variables.*4055

*However, the average does not have the same variance because we are sampling over more variables.*4060

*It makes the average be less variable, that is the law of large numbers.*4068

*A greater sample size gives smaller variance in the average.*4074
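A short simulation sketch of this result (an editorial addition; μ = 10, σ = 2, n = 8 are arbitrary illustrative values): averaging n independent normals over many trials should give a sample mean near μ and a sample variance near σ²/n, not σ².

```python
import random
import statistics

rng = random.Random(4)
mu, sigma, n = 10.0, 2.0, 8   # illustrative values, not from the lecture
trials = 40_000

# Each trial draws n independent N(mu, sigma^2) values and averages them.
ybars = [statistics.fmean(rng.gauss(mu, sigma) for _ in range(n))
         for _ in range(trials)]

mean = statistics.fmean(ybars)
var = statistics.pvariance(ybars)
# The lecture's result: Y-bar is normal with mean mu and variance sigma^2 / n,
# here 10 and 4/8 = 0.5.
```

Making n bigger shrinks the variance of the average, which is the "bigger samples are more accurate" point from the lecture.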

*This is something that is sort of very fundamental to statistics.*4093

*That is why you try and take bigger samples, when you are trying to understand the population.*4097

*It is because, if you take an average of more samples there will be less variance in your calculations.*4103

*By the way, we did calculate this same example back in the lecture on distribution functions.*4110

*If you go back and look at the lecture on distribution functions,*4117

*you will see the same example and you will see the same answer.*4121

*I’m sorry, it was not the lecture on distribution functions, it was the lecture on linear combinations of random variables.*4126

*It was back in the previous chapter, you will see the same example, same answer, *4133

*but calculated using very different methods.*4137

*We were not using moment generating functions back then.*4140

*Let me review the steps here.*4143

*First of all, I wrote down the moment generating function for a normal variable.*4144

*I got that from the chart, I did not calculate that from scratch.*4150

*And then, I want to find the moment generating function for a particular Y, which was Y1 through YN,*4156

*which means I kind of ignored the 1/N to start with here.*4163

*I just called that stuff inside the parentheses Y, I was not going to even worry about the 1/N until later.*4167

*The point of that is that I have the sum of variables, and moment generating functions convert sums into products.*4174

*It converts addition into multiplication and that is because these variables are independent.*4182

*And then, I filled in what each one of the individual moment generating functions are.*4188

*Since, I’m multiplying them together, I can just raise it up to the Nth power.*4196

*I can distribute that exponent in, there is N in that exponent there.*4200

*I have to figure out what that 1/N on the outside does to it.*4207

*I was using an old property of moment generating functions that the moment generating function for*4211

*AY + B is E^(BT) × M sub Y of AT.*4216

*That was listed in, I think it was called the useful formula on one of the introductory slide of this lecture.*4223

*In this case, my A, my coefficient is 1/N.*4229

*I'm plugging in: in place of T, I'm substituting in 1/N × T.*4233

*There is that 1/NT manifesting itself right there and right there.*4238

*It is very nice on the left, it just cancel off the N, we got that same μ again.*4243

*It does not quite cancel with this N because it gets squared.*4249

*We have an N² in the denominator and an N in the numerator, that is why we still end up with one N in the denominator.*4252

*Once I got that moment generating function, I went back and look at my chart and said do I recognize this.*4260

*I did spot on the chart, it looks a lot like the moment generating function from normal distribution.*4266

*In fact, the μ is the same, the mean is the same, but the difference is that there was no N for the normal distribution.*4271

*What I have to do is change my variance to be σ²/N, *4279

*that would give me this moment generating function here with the σ²/N.*4283

*I still have a normal distribution, I have the same mean as before, that is not surprising if you take a bunch of samples,*4289

*you expect their average to be the same as the average of the population.*4295

*The variance though is lower, the variance of a bunch of samples will be lower than*4300

*the variance of an individual member of the population.*4306

*The variance that we have now is σ²/N.*4311

*Notice that, if you take more samples which means you make N bigger then you will have a lower variance, *4316

*which is really why surveys with many samples are more accurate than surveys *4321

*with sample of few members of the population.*4327

*In examples 6, we are looking at to two independent exponential variables.*4332

*Each one has mean 3 and we want to find the density function of Y1 + Y2.*4338

*Let me remind you of how this works.*4345

*First, we got to know the moment generating function for an exponential variable since, *4347

*everything here is based on moment generating functions.*4354

*M sub YI, the individual ones, I’m going to look up my moment generating function for the exponential variable on my chart, *4358

*that is earlier on in this lecture.*4372

*If you scroll back in this lecture, you will see the moment generating functions for continuous variables.*4375

*The one for the exponential distribution is (1 - βT)^(-1).*4381

*In this case, we are given that the mean is β = 3, so it is (1 - 3T)^(-1).*4390

*We are going to use that when we find the moment generating function for U, *4400

*that is the moment generating function for Y1 + Y2.*4404

*The whole point of moment generating functions or one of the very useful properties *4410

*that they have is that, it converts addition into multiplication.*4414

*M sub Y1 × M sub Y2, that is (1 - 3T)^(-1) × (1 - 3T)^(-1).*4419

*We just get (1 - 3T)^(-2), and we are going to look back in my chart and say, do I recognize this*4434

*as the moment generating function for any of my known distributions.*4442

*If you look back at the chart, you will see that the gamma distribution does have a moment generating function.*4448

*The gamma distribution has a moment generating function of (1 - βT)^(-α).*4456

*What I have here is a gamma distribution with α is 2 and β is 3.*4469

*I can find the density function now as the density function from the gamma distribution.*4477

*Here is the density function for the gamma distribution.*4482

*I learned this way back in one of the earlier videos on the gamma distribution.*4487

*You can look this up, if you do not remember it.*4491

*It is U^(α - 1) × E^(-U/β), divided by β^α × Γ(α).*4494

*In this case, U^(α - 1): α is 2, so this is just U¹, times E^(-U/3).*4507

*β^α is 3², and Γ(α) is Γ(2).*4518

*Γ of 2, remember, is (2 - 1)!, and 1! is just going to be 1.*4524

*That is easy to work out, Γ of a whole number, because it is related to the factorial function.*4530

*Let me simplify that: F sub U of U is U × E^(-U/3) divided by 9, since 3² is 9.*4535

*My range for gamma distribution is U goes from 0 to infinity.*4551

*I found my density function for U.*4557
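To double-check this answer by simulation (an editorial sketch, not part of the lecture): adding two independent exponentials with mean 3 should give a gamma with α = 2 and β = 3, whose mean is αβ = 6 and whose variance is αβ² = 18.

```python
import random
import statistics

rng = random.Random(5)
beta = 3.0                    # mean of each exponential, as in the example
trials = 100_000

# U = Y1 + Y2, the sum of two independent exponentials with mean 3.
# expovariate takes the rate, which is 1/mean.
u = [rng.expovariate(1 / beta) + rng.expovariate(1 / beta)
     for _ in range(trials)]

mean = statistics.fmean(u)
var = statistics.pvariance(u)
# Gamma(alpha=2, beta=3): mean alpha*beta = 6, variance alpha*beta^2 = 18.
```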

*That is it, let me review the steps there.*4566

*It was given that we had exponential variables.*4569

*The first thing I did was, look up the moment generating function for the exponential variable on the chart.*4573

*It is (1 - βT)^(-1) in general, where β is the mean of the exponential distribution; that is 3, in this case.*4579

*We are given that it was 3 and U is Y1 + Y2.*4586

*If I want to calculate its moment generating function, it converts addition into multiplication, *4591

*using the fact that we have independent variables there.*4598

*I multiply together two copies of (1 - 3T)^(-1), and I get (1 - 3T)^(-2).*4602

*I go back and look at the chart, and I'm looking at my continuous distributions.*4610

*I'm saying do I recognize this moment generating function.*4614

*And I say, yes this is the MGF, the moment generating function for the gamma distribution because,*4622

*the moment generating function for the gamma distribution has this form, (1 - βT)^(-α).*4634

*I just recognize that this is the right thing with α = 2 and β = 3.*4640

*I know I got a gamma distribution and I know my formula for gamma distribution,*4646

*my density function for gamma distribution is just given by this.*4651

*This comes from our earlier lecture on the gamma distribution.*4656

*You can go back and look that up, if this formula seems to come out of left field.*4660

*And then, I plugged in my α and my β.*4664

*Remember, the γ of N is just N -1!, if N is a whole number.*4670

*Γ of 2 is just 1 factorial, which is just 1.*4676

*I just simplified everything here and got down to U × E^(-U/3), all divided by 9.*4682

*My range for the gamma distribution is going from 0 to infinity.*4692

*That wraps up our lecture on moment generating functions, this is kind of a long one.*4697

*I really appreciate if you stuck with me through all of that.*4701

*That wraps up this three lecture series on finding distributions of functions of random variables.*4704

*We had one on distribution functions, one on transformations, and now this last one on moment generating functions.*4711

*Next up, we are going to talk about order statistics, I hope you will stay tuned for that.*4718

*This is part of the larger series of probability lectures here on www.educator.com.*4722

*I, as always, I’m your host Will Murray, thank you for joining me today, bye.*4728
