William Murray

Moment-Generating Functions

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championship Wins & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment-Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-Generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-Generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-Generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
  • Discussion

  • Answer Engine

  • Study Guides

  • Download Lecture Slides

  • Table of Contents

  • Transcription


Moment-Generating Functions

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Premise 0:30
    • Premise
  • Goal 1:40
    • Goal Number 1: Find the Full Distribution Function
    • Goal Number 2: Find the Density Function
    • Goal Number 3: Calculate Probabilities
  • Three Methods 2:39
    • Method 1: Distribution Functions
    • Method 2: Transformations
    • Method 3: Moment-Generating Functions
  • Review of Moment-Generating Functions 3:04
    • Recall: The Moment-Generating Function for a Random Variable Y
    • The Moment-Generating Function is a Function of t (Not y)
  • Moment-Generating Functions for the Discrete Distributions 4:31
    • Binomial
    • Geometric
    • Negative Binomial
    • Hypergeometric
    • Poisson
  • Moment-Generating Functions for the Continuous Distributions 6:08
    • Uniform
    • Normal
    • Gamma
    • Exponential
    • Chi-square
    • Beta
  • Useful Formulas with the Moment-Generating Functions 8:48
    • Useful Formula 1
    • Useful Formula 2
  • How to Use Moment-Generating Functions 10:41
    • How to Use Moment-Generating Functions
  • Example I: Find the Density Function 12:22
  • Example II: Find the Density Function 30:58
  • Example III: Find the Probability Function 43:29
  • Example IV: Find the Probability Function 51:43
  • Example V: Find the Distribution 1:00:14
  • Example VI: Find the Density Function 1:12:10

Transcription: Moment-Generating Functions

Hello, welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are wrapping up a three lecture series on how to find0006

the density and distribution functions for functions of random variables.0010

We had one lecture on the method of distribution functions, and then the last lecture covered the method of transformations.0016

Today, we are going to talk about moment generating functions which is the last of our three methods.0023

Let me jump in and tell you the setting here.0029

This is going to start out the same as the last two lectures.0032

The first few slides are exactly the same as the last two lectures.0035

If you have been following along diligently and you watch the last two lectures,0038

you do not need to watch these first few slides again.0042

It is just going to be a review of the exact same stuff.0045

Just setting up the same premise and then we will get into actual moment generating functions, a few slides in.0048

The premise here is that we have several random variables Y1, Y2, etc. And then, we have some function of them.0055

This U is some function of Y1 through YN.0062

We might have something like U is Y1² + Y2², something like that.0065

We will have some function of these random variables.0075

What I said before, I taught you how to calculate the mean and the variance of U,0080

that was in the previous series of lectures.0085

What I did not teach you before was how to calculate the whole distribution of U.0087

The purpose of this lecture and the previous two videos is to teach you how to find the entire distribution function of U.0093

Our goal is to find this distribution function F of U is the probability that U is less than some cutoff value u.0102

And then, if we can find that F, then we can find the density function just by taking the derivative.0111

The density f of u is just the derivative of the distribution function F of u.0118

Assuming that we know f and F, it is very easy to calculate probabilities.0122

If we want to find the probability that U is in a particular range between A and B, what we will do is,0127

if we just know the density function, we could integrate the density function from A to B.0133

Or if we know the distribution function, it is even better because we can just do F of B - F of A.0138

That is why we want to find these functions, this F and f.0145

The point of this lecture and the previous two lectures is to give you various methods for finding this F and this f.0150

These three methods that we have been discussing, the first one was distribution functions.0161

You will see that, if you scroll back up two lectures, you will see the method of distribution functions.0165

Transformation function is what we covered in the previous lecture.0170

You should be all set to go with that.0173

In this lecture, what we are talking about today is the method of moment generating functions.0176

That is what I'm about to jump into is the method of moment generating functions.0181

First, I have to review for you what a moment generating function is.0185

There was a whole lecture on moment generating functions, earlier on in the lesson.0190

If you scroll back up, you will see that we have a whole lecture here on moment generating function.0195

If you do not know what they are at all, if you did not go through that lecture before,0200

you probably want to watch that lecture before you watch this one, because it would not make much sense.0205

This is the quick and dirty and review of moment generating functions.0210

Let me just show you just quickly, remind you of what they are all about.0214

By definition, the moment generating function for a random variable Y is the expected value of e^(TY).0218

If you work that out, you always end up with a function of T, it is not a function of Y.0227

A moment generating function will always be something like (1 - 2T)^(-3), something like that, where it is a function of T.0232

You should not see any Y, in the moment generating function.0242

We practice calculating some moment generating functions in that earlier lecture,0247

that was just specifically dedicated to moment generating functions.0253

Let me show you in particular, the moment generating functions for key distributions.0257

Because, you really need to remember them or have them somewhere very close by as a reference,0262

in order to make all the examples in this lecture work.0267

Here are the key moment generating functions; we have discrete distributions and we have continuous distributions.0271

On this slide, I’m going to do the discrete ones and on the next slide, we will do the continuous ones.0277

Our distributions are binomial, geometric, negative binomial, hypergeometric, and the Poisson distribution.0281

Each one has its own moment generating function.0289

The binomial is (Pe^T + 1 - P)^N, where P is the probability associated with the binomial distribution and N is the number of trials.0292

By the way, this 1- P is often called Q.0302

You might see this called (Pe^T + Q)^N.0308

Geometric looks similar, Pe^T/(1 - (1 - P)e^T); again, this 1 - P could be written as Q.0312

If you look at this in some sources and some textbooks, they will just this call 1- P = Q.0319

It just means the same thing.0324

Negative binomial is the same as the geometric distribution except that it is raised to the R power.0325

Again, this 1- P could be a Q.0331

Hypergeometric distribution has no closed form, no simple moment generating functions.0334

I cannot write down anything for the hypergeometric distribution.0339

The Poisson distribution is e^(λ(e^T - 1)).0343

Notice that in all of these, there is no Y anywhere in here; all are functions of the variable T.0349

You want to be seeing a T, when you are looking at a moment generating function.0361

You are not going to see any Y in there.0365

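As a quick check on the binomial formula above, here is a short Python sketch (not from the lecture; the sample size, seed, and parameter choices are all arbitrary) that compares the closed form (Pe^T + 1 - P)^N against a Monte Carlo estimate of E[e^(TX)]:

```python
import math
import random

def binomial_mgf(t, n, p):
    # Closed form from the chart: (p*e^t + (1 - p))^n
    return (p * math.exp(t) + (1 - p)) ** n

def estimate_mgf(t, n, p, samples=100_000, seed=0):
    # Estimate E[e^(tX)] by simulating X as a sum of n Bernoulli(p) trials.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x = sum(1 for _ in range(n) if rng.random() < p)
        total += math.exp(t * x)
    return total / samples

# n = 10 trials, p = 0.3, evaluated at t = 0.2 (all arbitrary choices)
exact = binomial_mgf(0.2, 10, 0.3)
approx = estimate_mgf(0.2, 10, 0.3)
```

The two numbers should agree to a couple of decimal places; notice there is no Y (or X) left in the closed form, only T.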
We also have moment generating functions for the continuous distributions.0367

Our favorite ones are uniform; here is the moment generating function for the uniform distribution.0371

Normal is e^(μT + T²σ²/2).0377

All of that is in the exponent, by the way.0383

That is all in the exponent of the e; it is quite a complicated moment generating function there.0385

The gamma distribution, that is a whole family, is (1 - βT)^(-α).0390

Remember that the exponential and the chi square distributions,0395

those should both be considered children of the gamma distribution.0399

The exponential distribution is just γ with α equal to 1.0404

If you remember the moment generating function for the gamma distribution,0412

then you can remember the exponential distribution, its moment generating function0416

just by taking the α equal to 1 in the gamma distribution.0421

Chi square is also a gamma distribution, it is where you take α is equal to ν/2 and β is equal to 2.0426

ν is the number of degrees of freedom in the Chi square distribution.0436

Again, you can see how, if you start with a function for the gamma distribution,0445

you plug in β is equal to 2, there it is right there.0450

And if you plug in α is equal to ν/2, ν by the way is the Greek letter that looks like a v.0453

If you plug in α is ν/2, that is what you get.0460

You recognize the moment generating function for the Chi square distribution.0466

The beta distribution has no closed-form moment generating function.0470

It does not lend itself very easily to the problems that have to do with moment generating functions.0474

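Since the exponential and chi-square formulas are supposed to fall out of the gamma formula, that relationship is easy to sanity-check in code. This is a sketch, not from the lecture; the parameter and seed choices are arbitrary, and Python's `random.gammavariate` happens to use the same (α, β) parameterization as the density here:

```python
import math
import random

def gamma_mgf(t, alpha, beta):
    # From the chart: (1 - beta*t)^(-alpha), valid for t < 1/beta
    return (1 - beta * t) ** (-alpha)

def exponential_mgf(t, beta):
    # Exponential is gamma with alpha = 1
    return gamma_mgf(t, 1, beta)

def chi_square_mgf(t, nu):
    # Chi-square is gamma with alpha = nu/2 and beta = 2: (1 - 2t)^(-nu/2)
    return gamma_mgf(t, nu / 2, 2)

# Monte Carlo check of the gamma formula: E[e^(tY)] for Y ~ gamma(2, 1)
rng = random.Random(1)
t, alpha, beta = 0.2, 2.0, 1.0
n = 200_000
approx = sum(math.exp(t * rng.gammavariate(alpha, beta)) for _ in range(n)) / n
exact = gamma_mgf(t, alpha, beta)  # (1 - 0.2)^(-2) = 1.5625
```

The chi-square MGF with ν = 2 and the exponential MGF with β = 2 come out identical, which matches the family relationship described above.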
The point of using moment generating functions to solve problems is that you got to be able0481

to recognize these moment generating functions, when you see them in a dark alley or see them out on safari.0488

You see a moment generating function, and then you have to say that0495

it is the moment generating function for the gamma distribution, or that is the moment generating function for the normal distribution.0498

You really want to kind of stare at this chart on the slide and also the one on the previous slide,0505

and get these functions into your head.0511

Or else, maybe have one of these charts as an easy reference, when you are solving these problems.0514

Because, the whole point of this is you got to be able to recognize these, when you see them in the wild, so to speak.0520

Let me show you how that works out; I have not really told you how to solve any problems yet.0526

I want to show you how it works.0532

You are going to be calculating moment generating functions.0534

There are a couple of useful formulas that you are going to need to know.0537

One is that if you take a linear function of our random variable, AY + B.0541

And then, you build a new variable called Z, that is AY + B.0547

The moment generating function for Z is, you take the moment generating function for Y and0553

just wherever you see a T, you change it to AT.0560

And then, you multiply on this factor E ⁺BT on the outside.0567

The time when this is most useful is when Y is a normal variable and you are converting to make Z a standard normal variable.0571

You use this formula most often when you are converting from a normal variable to a standard normal.0582

Here is another very useful formula that we are going to be using in almost every exercise today.0591

It is that, if you have two independent variables and it is important that they be independent,0598

and you add them together then the moment generating function for the sum is equal to0604

the product of the moment generating function of the individual variables.0611

That is very convenient because if you want to add two variables, you just multiply their moment generating functions.0618

That is very useful, we are going to use that over and over again.0625

Moment generating functions converts addition into multiplication.0628

It behaves very nicely, as long as your variables are independent.0633

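Both formulas can be checked exactly for normal variables, where the MGF has the closed form e^(μt + t²σ²/2). Here is a minimal sketch (not part of the lecture; all the numeric values are arbitrary):

```python
import math

def normal_mgf(t, mu, sigma):
    # From the chart: e^(mu*t + t^2 * sigma^2 / 2)
    return math.exp(mu * t + t * t * sigma * sigma / 2)

# Formula 1: if Z = aY + b, then M_Z(t) = e^(bt) * M_Y(at).
# Standardizing Y ~ N(mu, sigma^2) uses a = 1/sigma, b = -mu/sigma,
# and Z should come out standard normal with MGF e^(t^2/2).
mu, sigma, t = 5.0, 2.0, 0.3
a, b = 1 / sigma, -mu / sigma
lhs = math.exp(b * t) * normal_mgf(a * t, mu, sigma)
rhs = normal_mgf(t, 0, 1)  # standard normal MGF

# Formula 2: for independent Y1, Y2, M_{Y1+Y2}(t) = M_{Y1}(t) * M_{Y2}(t).
# The sum of independent normals is N(mu1 + mu2, sigma1^2 + sigma2^2),
# so the product of the MGFs should match the MGF of the sum directly.
m1, s1, m2, s2 = 1.0, 1.5, -2.0, 0.5
product = normal_mgf(t, m1, s1) * normal_mgf(t, m2, s2)
direct = normal_mgf(t, m1 + m2, math.sqrt(s1 * s1 + s2 * s2))
```

Both comparisons agree to floating-point precision, illustrating how addition of independent variables turns into multiplication of moment generating functions.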
Let me show you now, how we are going to use moment generating functions.0639

We will be given a collection of random variables and we want to find the moment generating function of U, M sub U of T.0644

That can be kind of tricky and we are going to use several different tricks to do that.0655

We might use the definition of moment generating function.0660

We will often use these formulas on the previous slide, especially the one where it converts addition into multiplication.0663

M sub Y1 + Y2 of T will be equal to M sub Y1 of T × M sub Y2 of T.0672

That is going to be extremely useful to calculate the new moment generating function.0682

What we will do is we will calculate that new moment generating function,0688

and then kind of compare it against all the charts that we have of all of our moment generating functions.0691

We will make sure that we recognize it as a known distribution.0697

If we can, then we will say that is the Poisson distribution with a certain value of λ.0702

Or, that is the exponential distribution with a certain value of β.0707

And then, we will know what our distribution is.0712

There is a lot of pattern recognition involved in using moment generating functions to identify distributions.0716

But, we will do some examples and you will see how it works out.0723

The first example is actually the trickiest; if you have trouble, if you get bogged down in the first example,0727

it is okay if you want to skip to a couple of the later ones.0733

And then, maybe come back and analyze the first one because it is the most challenging to understand.0737

With example 1, we have a standard normal variable, Y is the standard normal variable.0746

We want to find the density function of U which is defined to be Y².0751

We want to use moment generating functions for this.0759

We want to calculate the moment generating function of U, M sub U of T.0762

Remember, our U by definition is Y², so this is M sub Y² of T.0770

We are going to use the definition of the moment generating function here.0778

Remember, our definition of moment generating function M sub Y of T is just the expected value of E ⁺TY.0781

In this case, we do not have Y, we have Y².0791

This is the expected value of E ⁺TY², the expected value of E ⁺TY².0794

To calculate the expected value of the function of a random variable, what you do is you take that function E ⁺TY².0806

And then, you multiply it by the density function of that random variable.0816

You integrate that over all possible values for Y.0822

This integral is going to get a little complicated because if you remember,0826

the density function for the normal variable is no joke, it is rather complicated.0831

One thing we are given here is that Y is a standard normal variable.0838

Standard is sort of a loaded term, when you are studying probability and statistics.0843

Standard normal variable means that its mean is 0 and its variance is 1.0848

That kind of simplifies some of the equations that we have to deal with, when we are looking at its density function.0859

The density function for a standard normal variable, plugging in μ = 0 and σ² = 1, is 1/√(2π).0866

There is actually a σ in there, but I'm taking advantage of the fact that σ = 1; so the density is 1/√(2π) × e^(-Y²/2).0875

Again, I’m simplifying that as I go along.0890

The full normal variable density function would have e^(-(Y - μ)²/(2σ²)).0893

I have simplified that, taking advantage of the fact that we have a standard normal variable.0901

So we have the e^(TY²) here and we still have to integrate this thing over all possible values of Y.0906

By the way, in this case, the possible values are -infinity to infinity because that is my range for a normal variable.0913

This integral looks tricky to solve.0920

Let me pull out the 1/√2 π because that is just a constant.0924

I see that I have two functions that both look like E ⁺Y².0929

I got e^(TY²) and e^(-Y²/2).0937

What I'm going to try to do is write this as e to the minus something; I’m going to try to factor out Y²/2.0941

e^(-Y²/2 + TY²), that is what I have here.0954

I forgot my DY there, there is DY.0961

Let me just work inside the integral for the next couple of steps.0967

This is E ⁻Y²/2, I’m going to factor that out.0970

I have a 1; this T, since I’m factoring out the negative sign, becomes -T.0976

Since I factored out the ½, the T becomes a 2T.0982

e^(-(Y²/2)(1 - 2T)); I have a reason for doing this, but it is not obvious right now.0986

Let me show you where I'm headed with this, I do not want to solve this integral.0993

In fact, I know that I cannot solve this integral by any direct means.0998

What I'm going to try to do, is to try to compare this integral to the density function that I recognized1002

which would be a density function for different normal variable.1012

Let me show you what I mean by that.1017

The density function for a nonstandard normal variable would be 1/(σ√(2π)).1019

I do not think I need a μ, but e^(-Y²/(2σ²)).1032

That is the density for a nonstandard normal variable.1040

It is not the same variable that we started out with.1056

Because it is a density function, I know what its integral is.1063

If I integrate that from - infinity to infinity, the integral of any density function must be 1.1068

If I had an integral in that form, then I would know that its integral would be 1.1077

What I have here is something that is sort of generically similar to that.1086

If I try to arrange my variables carefully, I can make this integral equal to 1 in that form.1091

What I'm going to do is, I'm going to figure out what my value should be.1099

I want -(Y²/2)(1 - 2T) to be equal to -Y²/(2σ²).1103

I see that the -Y²/2 is going to cancel, so 1 - 2T is equal to 1/σ².1114

If I flip both sides, I get σ² is 1/(1 - 2T), and my σ would be the square root of that.1124

So σ² is 1/(1 - 2T) and σ is (1 - 2T)^(-1/2).1134

That is the σ that we would be talking about, if we want to make this integral that I have here1147

match the density function for nonstandard normal variable.1153

Let me arrange, see if I can arrange things to make it work.1159

I got 1/√(2π) and then I got the integral of e^(-Y²/(2σ²)) dY, if I arrange this exponent properly,1164

that was by choosing my σ up above here.1180

I will choose my σ to make that work.1184

Now, I want to make this match this integral over here.1185

It does not quite match it as it is, because it is missing that 1/σ in the denominator.1188

I'm going to fudge that σ in there, and in order to balance that, I will have to put a σ on the outside here.1194

Let me remember that there is a σ on the outside.1203

The whole point is that, this is now the density function for a nonstandard variable.1206

But I know that this density function, the integral of any density function is equal to 1.1212

I got to multiply that one by the σ as well.1219

Σ × all that is equal to σ × 1.1222

What I have got there is that, this whole thing is equal to σ.1231

σ, remember, was (1 - 2T)^(-1/2).1240

What have I done here? I have just calculated the moment generating function for this variable U.1248

I found out that the moment generating function is (1 - 2T)^(-1/2).1255

What am I supposed to do with that?1263

What I do is I go back and I looked at my charts of common moment generating functions.1264

And, I see if I will recognize this moment generating function somewhere on the chart.1272

Lo and behold, I do.1277

Let me remind you: on the chart, the moment generating function for the Chi square distribution is (1 - 2T)^(-ν/2).1280

What I have here is exactly a Chi square distribution with ν = 1.1295

That is worth writing down: U has a Chi square distribution with ν = 1 degree of freedom.1303

From that, I can figure out the density function of U because I remember that Chi square is the gamma distribution.1330

It is just a special case of the gamma distribution with α is ν/2 and β is 2.1345

I can look up the density function for the gamma distribution.1356

Let me remind you what it was.1360

The gamma distribution, the density function for the gamma distribution,1363

I will write it in terms of u: it is u^(α - 1) × e^(-u/β), divided by β^α × Γ(α).1368

If I plug in all my values here, I'm going to plug in U ⁺α-1.1389

Α is ν/2 and ν is 1.1397

This is u^(1/2 - 1), which is u^(-1/2); and e^(-u/β) with β = 2 is e^(-u/2).1401

β^α is 2^(1/2), and Γ(α) is Γ(1/2).1411

There is one thing I need to remember which is γ of ½.1428

That is kind of something that you either need to remember or look up, because it is quite a lot of work to derive from scratch.1433

Γ of ½, it turns out that it is √π , it is a kind of a surprising number there.1442

That is not something you can easily figure out from the factorial property of the gamma distribution,1451

because ½ is not a whole number.1456

It is easy to figure out Γ for whole numbers, but not Γ of ½.1458

It is quite difficult the first time you work it out, from then on it is probably worth remembering that γ of ½ is √π.1463

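If you want to confirm Γ(½) = √π without doing the integral, Python's standard library has the gamma function built in. A quick aside, not part of the lecture:

```python
import math

# Gamma(1/2) should be the square root of pi
g_half = math.gamma(0.5)
target = math.sqrt(math.pi)

# The recursion Gamma(x + 1) = x * Gamma(x) also works at half-integers:
# Gamma(3/2) = (1/2) * Gamma(1/2) = sqrt(pi)/2
g_three_halves = math.gamma(1.5)
```

Both values match the closed forms to floating-point precision.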
What I have here is u^(-1/2) e^(-u/2).1473

And then, in the denominator I got 2^(1/2), which is √2, and √π.1484

I’m just going to combine those together, I think, and give myself √(2π).1493

The range on the Chi square distribution, it is the same as the range on the gamma distribution.1500

It is all u greater than 0 and less than infinity.1505

Another way to think about that is to say that, since U is Y² and since Y goes from - infinity to infinity,1512

Y² will go from 0 to infinity.1524

That is my density function for U.1527

A lot of work to find that one, that is probably the hardest one we are going to do though.1530

The rest build on this one and we have done the hardest steps in this example number 1.1534

Let me remind you how all those steps went.1541

We are trying to find the density function of U = Y².1544

I used the original definition of moment generating function here.1548

The original definition of a moment generating function is the expected value of E ⁺T Y.1553

I plugged in my U there was Y², that means instead of the TY, I'm finding the expected value of E ⁺TY².1560

That means, I'm finding the integral of E ⁺TY² × the density function.1570

The density function for a standard normal variable is that right there.1576

Standard normal variable means μ is 0 and σ² = 1.1580

I pulled out the 1 /√2 π, I combine the exponents.1586

I got this thing that is a little messy and is definitely not something I'm going to be able to integrate with any ease.1590

The trick to integrating that is to combine those exponent using a little bit of clever factoring and then,1598

to try to identify it as a density function for another normal distribution.1605

Here is the density function for another normal distribution.1612

What I know is that if I integrate the density function for any distribution, I should get 1.1616

That is of course, because in any experiment, the total probability is 1.1624

In order to make this match this density function, I set my two exponents equal to each other.1630

And then, I solved and I figured out that my σ had to be (1 - 2T)^(-1/2).1637

I plugged in that value of σ, I converted this into σ.1645

It almost matched the density function, but there was this one extra factor of σ that I did not have before.1650

In order to create that factor of σ in the denominator, I had to multiply it in the numerator1656

which meant I also had to multiply it on the other side.1662

That density function, if I integrate that is equal 1 but then there is one extra factor of σ here, which is left over,1665

that σ tracks on down there.1675

What I'm left with, everything else drops out very nicely.1678

Thanks to the fact that integrating a density function gives you 1.1682

I’m left with (1 - 2T)^(-1/2), and what I do there is I go back and look at my charts of the common moment generating functions.1685

Because, what I just calculated was the moment generating function for U.1698

I go back and look at my charts, and I say that looks a lot like the moment generating function for Chi square distribution.1701

It is the Chi square distribution, if I just take my ν equal to 1.1711

1 degree of freedom, I got a Chi square distribution and then, I want to write the density function for that.1717

In order to do that, I had to remember that Chi square was a gamma distribution, with special values of α and β.1722

Α is ν/2 and β is equal to 2.1730

My gamma distribution, I wrote down the density function for the gamma distribution, in general.1733

I did it in terms of U, when we originally learned that, I gave it you in terms of Y but our variable now is U.1740

It is U ⁺α -1, E ⁻U/β, β ⁺α, γ of α.1745

I plugged in my α is ν/2 and my ν is 1.1752

I plug in β is equal to 2.1759

This all simplified fairly well, except for this Γ(½).1762

I'm just remembering the γ of ½ is √π.1766

That is kind of a lot of work to figure that out, I do not do that work every time.1771

I just looked up that value of γ of ½ is √π.1775

It is not so obvious like the way that γ of a whole number is easy to calculate using factorials,1780

you have to do a lot of integrals, in order to figure out that γ of ½ is √π.1788

That is why I did not show you the details of that.1793

In the denominator, I combined 2¹/2 and √π, I got the √2 π.1795

And that gave me my density function for U = Y².1801

My range, that is the generic range for Chi square distribution.1807

But, I also could have figured it out by looking at my original range for Y and then, by figuring out U = Y².1811

By the way, this is one reason why we study the Chi square distribution.1820

It is because it is very common to look at the square of a standard normal variable, which turns out to have a Chi square distribution.1825

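The conclusion of Example 1 lends itself to a quick numerical check: estimate E[e^(TY²)] for a standard normal Y and compare it with (1 - 2T)^(-1/2). A sketch with arbitrary seed and sample size (T must stay below ½ for the MGF to exist):

```python
import math
import random

rng = random.Random(7)
t = 0.1
n = 200_000

# Monte Carlo estimate of the MGF of U = Y^2, Y standard normal
approx = sum(math.exp(t * rng.gauss(0, 1) ** 2) for _ in range(n)) / n

# Chi-square(1) moment generating function from the chart
exact = (1 - 2 * t) ** -0.5
```

The simulated value lands within Monte Carlo error of the chi-square(1) formula, which is exactly the pattern-recognition step used in the example.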
We are going to use the result from this example again in example 2.1834

Make sure you understand this example, or at least make sure that you believe the answer,1840

before we move on to example 2.1845

I do not want to do all this work again in example 2.1847

I’m just going to invoke the answer from this example again, in example 2.1850

Example 2 looks a lot like example 1, except we have two independent standard normal variables,1859

instead of the one that we had in example 1.1865

We want to find the density function of Y1² + Y2².1869

I am going to use the answer from example 1 to help me solve example 2.1873

If you have not worked through example 1 of this lecture, then I really recommend going back and looking at example 1.1878

It is a lot of work, if you do not want to work through all the details there,1885

just make sure that you understand the answer.1889

We are going to use that answer as intermediate result here in example 2.1892

It will make example 2 a lot less work.1897

Let us work out example 2, we want to find the density function of U = Y1² + Y2².1899

The way we are going to do that is via moment generating function.1909

We are going to find the moment generating function for U, M sub U of T; that is equal to the moment generating function of Y1² + Y2².1912

U is Y1² + Y2².1924

The lovely thing about moment generating functions, that good property that I gave you on one of the introductory slides,1933

is that they convert addition into multiplication, when you have independent variables.1939

Here, we do have independent variables.1945

This is M Y1² of T × this is where the multiplication comes in, M Y2² of T.1949

That is really nice, that our addition converts into multiplication.1959

Let me now invoke the answer from example 1.1966

From example 1, each Y², that is Y1² and Y2², has a Chi square distribution with 1 degree of freedom.1970

Let me say with ν =1, I will say it that way.1992

If you are wondering where that comes from, you got to go back and watch example 1.1995

Example 1 is a lot of work and I cannot redo it here.2004

If you trust example 1, it is worth knowing that if you start with the standard normal variable2008

and you square it, you get a Chi square distribution.2014

We can look up what the moment generating function of a Chi square distribution is.2018

From the chart, the moment generating function of a Chi square distribution is (1 - 2T),2026

always a function of T, remember, raised to the -ν/2.2041

In this case, with ν = 1, it is just (1 - 2T)^(-1/2) for each one.2053

What I have here is a product of two functions of the form (1 - 2T)^(-ν/2).2061

(1 - 2T)^(-1/2) and (1 - 2T)^(-1/2); that is not so obvious that it is negative because2077

I let my negative run into my line there, my fraction line.2092

That is a little more obvious now.2096

Let me multiply those two together; if you multiply those, then the exponents just add.2099

I get (1 - 2T)^(-1), that is my moment generating function for U.2104

Let me write this as -ν/2 because then, that will make it more obvious that2119

the moment generating function that I just discovered for U, is again a Chi² distribution.2127

Remember, this whole lecture is about pattern recognition.2138

You calculate a moment generating function, then you stare at the chart and you try say2142

that is the Chi square distribution or that is the exponential distribution.2146

This in fact is the Chi square distribution with -ν/2 equal to -1, so ν would be 2 there.2150

We have a Chi square distribution with 2 degrees of freedom.2163

Now, I can find its density function, I will remember that Chi square is a gamma distribution.2169

It is a sub family of the γ family.2178

Let me remind myself of the density function for the gamma distribution.2182

F of u = u^(α - 1) × e^(-u/β).2188

That β got a little squashed there, it ended up looking like a Δ.2199

e^(-u/β), and divided by β^α × Γ(α).2202

That is the density function for a gamma distribution.2211

Chi square is γ with α is equal to ν/2 and β is equal to 2.2215

I'm going to plug in those values into my gamma distribution, F sub U of U is U ⁺α -1.2229

α is ν/2; we said ν is equal to 2, and 2/2 - 1 is 0, so that term drops out.2242

I will go ahead and write it as U⁰, just in case you are wondering where it went, E ⁻U/2.2249

In my denominator, I got β ⁺α is 2¹.2257

And then, γ of α is just γ of 1.2263

Γ of 1 is 0 factorial, which is just 1.2268

Finally, my density function for my U is F sub U of U which is the U⁰ drops out, the γ of 1 drops out.2273

It is (1/2)e^(-u/2), and my range for the Chi square distribution is u from 0 to infinity.2283

That is my density function.2295

That is officially the end of that problem; let me make a couple of notes about this.2302

One note is that you might recognize that density function as an exponential distribution.2307

It is in fact an exponential distribution.2312

That is not so relevant to this problem because that pattern does not really continue.2316

The fact that was an exponential distribution is sort of a fluke of nature on this problem.2322

Let me tell you what is not a fluke of nature on this problem,2327

which is that we got a Chi square distribution with 2 degrees of freedom2330

by adding up the squares of two standard normals; that is not a fluke.2335

In general, let me say we have Y1 through YN; if Y1, Y2, up to YN are independent standard normals,2342

and U is Y1² + Y2² + ... + YN², then U has a Chi square distribution with N degrees of freedom.2379

U is Chi² distribution with N degrees of freedom.2391

Let me say with a ν, ν is the number of degrees of freedom, in this case it will come out to be N.2398

That is not so surprising, if you kind of look at this step right here, instead of having two factors, we would get N factors.2406

This exponent would turn into -N/2; we would just get a Chi square distribution with ν = N.2414

This does generalize to adding up N independent squares of standard normal,2424

what you get is a Chi square distribution with N degrees of freedom.2431

That is really a big reason why the Chi square distribution is significant in probability and statistics.2435

It is because it kind of flows out of the standard normal distribution.2442

Let me recap the steps here.2448

We want to find the density function of Y1² + Y2².2450

We start out just by that definition U is Y1² + Y2², the definition of moment generating function.2455

But quickly, we are going to use this property that we had, I think I called it a useful formula on one of the earlier slides.2463

The really useful fact is that when we have independent variables, it converts addition,2472

when we are adding the variables into multiplication of moment generating functions.2477

What we do is we multiply the moment generating functions for Y1² and Y2².2483

We figured out the moment generating functions for each one, back in example 1.2489

If the introduction of the Chi square variable suddenly came out of left field for you,2494

what you want to do is go back and watch example 1.2501

You will see where we figured out that the distribution for Y1² is just Chi square with 1 degree of freedom.2504

Its moment generating function, we figure that out on the chart earlier on in this lecture.2514

It is (1 - 2T) ⁻ν/2; with ν = 1 here, we get (1 - 2T) ⁻¹/2 for each of these variables, and multiplying them together gives (1 - 2T)⁻¹.2520

We notice that, that Chi square again, this is sort of pattern recognition.2531

That is still a Chi square, the difference is that the exponent is bigger now, we have 2 degrees of freedom.2535

If we want to find again the density function, we have to remember that Chi square comes from the gamma distribution.2542

I wrote down my formula for the density function for the gamma distribution.2550

And then, I plugged in Chi square is gamma distribution with α is ν/2 and β = 2.2554

I plugged in those values, I plug in ν = 2, I plug in all those values to my γ density function.2561

I simplified it down and got my density function for my U there.2569

Of course, the range for Chi square distribution is from 0 to infinity.2576

What I noticed along the way is that, this is sort of a pattern with two variables.2580

But if we had N variables, we could have just extended this up to N moment generating functions and2585

we would have gotten a Chi square distribution with N degrees of freedom.2592

That is kind of a good thing to know in probability and statistics, in general,2596

which is that if you add up N standard normal variables, squaring each one,2600

then you get a Chi square distribution with N degrees of freedom.2605
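This generalization can be spot-checked by simulation. Here is a hedged NumPy sketch (my own illustration, not part of the lecture; N = 5 and the sample count are made-up values), using the fact that a Chi square with ν degrees of freedom has mean ν and variance 2ν:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                     # number of independent standard normals
samples = 200_000

# U = Y1^2 + ... + Yn^2 for independent standard normals Y1..Yn.
y = rng.standard_normal((samples, n))
u = (y ** 2).sum(axis=1)

# A Chi square with nu degrees of freedom has mean nu and variance 2*nu,
# so we expect roughly 5 and 10 here.
print(u.mean(), u.var())
```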

In example 3, we have R independent binomial variables.2611

They all represent flipping the same coin.2616

The coin comes up heads with probability P.2619

P is not necessarily ½, it could be a loaded coin, nobody told us that it is a fair coin.2622

Each one represents a different number of flips, N1 through NR.2628

What we want to do is add these variables together and call it U.2633

We want to find the probability function of U.2637

Our method that we are exploring in this lecture is moment generating functions.2641

We are going to find the moment generating function of U.2647

In the meantime, along the way we are going to need the moment generating function of the individual Y.2651

I want to find the moment generating function of YI of T.2657

I get that just by looking at my chart for moment generating functions.2661

The binomial is discrete; if you scroll back a few slides in this lecture,2667

you will see the chart for moment generating functions of discrete variables.2675

The one for the binomial is (PE ⁺T + 1 - P) ⁺N, but the Yi variable is Ni flips.2680

I’m going to write N sub I here.2696

This is coming from the chart earlier on in this lecture, just scroll back and you will find it.2698

It is the discrete distributions.2707

We want to find my moment generating function for U.2710

That U we have been given here is the sum of the Yi, Y1 up to YN.2714

The lovely thing about moment generating functions is that they convert addition into multiplication.2723

You can only do that when you have independent variables, which is what we have here.2729

This is M Y1 of T multiplied, this is multiplied now, I’m not adding any more.2735

Which is, I do not know why I try to write an addition sign there.2741

This is MYN of T, I'm going to plug in the moment generating functions for each one.2744

(PE ⁺T + 1 - P) ⁺N1; I'm going to multiply that all the way through up to (PE ⁺T + 1 - P) ⁺N,2752

I called it Y sub N, of course, I should have called it Y sub R.2772

There are R in these things, I will try not to reuse the variable N.2777

This is N sub R in my exponent.2782

The lovely thing about this is, I got the same base everywhere,2786

I can just combine all those exponents; you add the exponents.2790

This is PE ⁺T + 1- P, I add the exponents N1 up to N sub R.2794

Maybe this is obvious, but if it is not obvious, go back and look at your chart of moment generating functions.2807

Stare at this and you recognize that it is a binomial distribution, again.2813

Remember, that is how moment generating functions work.2823

You work out the MGF and then you go back and look at your chart, and you try to recognize it.2825

This is binomial with N = N1 + ... + NR.2831

I know what my probability function for binomial distribution is.2842

The probability of any given value of U, this is the discrete probability function.2846

We had a whole lecture on the binomial distribution, if you are completely lost with the word binomial,2851

just scroll back and you will see our probability function for the binomial distribution.2857

I’m going to use U instead of Y; we used Y back then.2861

It is N choose U × P ⁺U × Q ⁺N- U.2864

In this case P of U, I will fill in my N is.2874

It is (N1 + ... + NR) choose U, × P ⁺U × Q ⁺(N1 + ... + NR) - U.2879

My range here is that U goes between 0 and N, including both of them.2896

In this case, U goes between 0 and my N is N1 up to NR.2903

That is really not very surprising; it is like you are taking a coin and you are flipping it N1 times.2913

And then, you flip it N2 times and then you flip it N3 times, and you keep on flipping until you finally flip it NR times.2921

And what you have really done is you flip it N1 + N2 + N3 up to NR times total.2931

You get a binomial distribution where your N is just the sum of all those Ni.2937

This is really not very shocking and it is nice to have a moment generating functions2943

to confirm what our intuition probably should have already told us.2949

Let me review the steps there.2955

We are trying to find the moment generating function for U.2958

But, U was Y1 through YR, the sum of Y1 through YR.2961

Our useful formula on moment generating functions says that, it converts addition into multiplication.2966

I got addition in my subscript here, that converted into multiplication here.2974

I had to know the moment generating function for each one of the Yi.2979

I looked at the moment generating function for binomial distribution, because I was told that the Yi were binomial.2983

The moment generating function for binomial distribution, on my chart is PE ⁺T + 1- P ^,2991

whatever the N is for that distribution.2998

In this case, it is N1 through N sub R.3001

I use those as my exponents but then all those terms, since they are multiplied together,3005

they combine together and we just get one big exponent at the top, N1 + ... up to NR.3010

And then, I looked back at my charts, see if you can identify this moment generating function.3018

And of course, that is a binomial, again it is just binomial where your exponent tells you the N.3026

N is N1 up to NR, added together.3033

I looked at my binomial probability function, this comes back from our earliest lecture on the binomial distribution.3037

You can look this up, you will see this formula except you will see a Y instead of U.3046

Here our variable is U and here is the range for U.3051

I just plug in what N was, N was N1 through NR.3055

I plugged that in all the way through here and my range for U goes from 0 to N.3061

Again, this is not surprising, this kind of fits what your instinct should tell you because3067

you want to think about flipping the same coin N1 ×, and then you start all over and flip it N2 ×.3072

You will keep flipping until you finally flip it NR ×, that is just the same as flipping it many times over.3079

N1 + N2 + N3, up to NR ×.3087

It is not surprising that the total number of heads will give you a binomial distribution,3092

based on that total number of flips.3100
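Here is a small simulation sketch of example 3's conclusion (my own illustration, not from the lecture; the values of P and the Ni are made up): adding independent binomials with the same P gives a binomial on the total number of flips.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3                       # the coin need not be fair
ns = [4, 7, 9]                # N1, N2, N3 flips for each variable
samples = 200_000

# U = Y1 + Y2 + Y3, each Yi binomial(Ni, p), all independent.
u = sum(rng.binomial(n, p, samples) for n in ns)

big_n = sum(ns)               # 20 flips total
# Binomial(N, p) has mean N*p and variance N*p*(1-p),
# so we expect roughly 6.0 and 4.2 here.
print(u.mean(), u.var())
```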

In example 4, we got two independent Poisson variables with means λ 1 and λ 2.3105

We want to find the probability function of U which is Y1 + Y2.3111

Let me set up what we are going to need here.3118

I know I'm going to need the moment generating functions of Y1 and Y2.3119

Let me go ahead and write those down.3127

I'm looking these up from the chart.3129

We did have a whole section on how to calculate moment generating functions.3131

You could look that up much earlier in the series of lectures, if you want.3136

What we did was we eventually found this chart and I’m not going to calculate these again from scratch.3141

M sub YI is just E ⁺λI × (E ⁺T - 1); that whole product is in the exponent.3146

That is going to be useful, as I try to calculate the moment generating function of U.3158

Let me try to do that, M sub U of T is M sub Y1 + Y2 of T.3164

The whole point or one of the really nice features of moment generating functions is that3173

they convert addition into multiplication, when you have independent variables.3179

We do have independent variables here; this is M sub Y1 of T × M sub Y2 of T.3185

I can plug in what I have found to be the moment generating functions of each one of those variables.3194

This is E ⁺λ 1 × E ⁺T-1 × E ⁺λ 2 × E ⁺T-1.3201

Of course, since I have similar exponents, I can add them.3212

I will factor out the E ⁺T - 1; this is E ⁺(λ1 + λ2) × (E ⁺T - 1).3216

I will go back and I will look at my chart of moment generating functions, and see if I find anything like this.3227

Of course, I will find something like this because that is the moment generating function for Poisson distribution.3234

The chart tells us this is Poisson with this mean λ is equal to λ 1 + λ 2.3242

I know I have a Poisson distribution with means λ 1 + λ 2.3256

If I look up my probability function for a Poisson distribution, what it is, is λ ⁺U × E ⁻λ all divided by U!.3261

The range there is from U goes from 0 to infinity.3281

Let me plug in what λ is: λ is λ1 + λ2, so this is (λ1 + λ2) ⁺U × E ⁻(λ1 + λ2), divided by U!, where U goes from 0 to infinity.3287

This is also not surprising and let me try to explain this.3316

Remember what the Poisson distribution models, it models random occurrences.3322

A kind of prototypical example of the Poisson distribution is, you are sitting at an intersection on a country road,3327

a kind of a not very crowded country road, and you are counting the number of cars that go by this intersection.3337

It does not happen very often, every once in a while a car goes by.3343

You might say that Y1 is the number of cars through an intersection on a country road.3348

The Poisson distribution models that perfectly because you might have a whole bunch of cars, you might not have any cars.3362

Maybe, you are calculating this over the course of 1 hour, how many cars go through this one intersection over 1 hour?3370

Y2 could be the number of trucks through the same intersection, again, that is going to follow a Poisson distribution.3377

Probably, we will have a different mean because depending on the area, you might have more cars or you might have more trucks.3387

If it is a rural community, you might have more trucks because people are carrying stuff around their farms.3393

If it is an urban community, you might have more cars.3398

But any way, you will have different means for the average number of cars and trucks through the intersection.3402

What you are really keeping track of, if you look at Y1 + Y2, is the total number of cars and trucks.3408

The total number of vehicles through the intersection.3415

Again, it is not too surprising, the one we calculate out, the distribution there,3422

what we discovered is that is also a Poisson distribution.3429

Then, you are just kind of sitting there at that intersection and just every time something with wheels goes through,3432

every time a car or truck goes through, you count it as 1.3438

It is a Poisson distribution because every once in a while something goes through.3442

Sometimes you get a lot of cars and trucks, sometimes you get nothing.3447

It is not surprising that, when we calculate the probability function of Y1 + Y2, we end up with the Poisson distribution again.3451

Let me recap the steps there.3459

We figure out the moment generating function for Poisson distribution.3462

I’m being a little charitable when I say I figure that out, I really use the chart that I gave you early on.3466

It was in the discrete distributions, earlier on in this lecture.3474

If you are really want to know where that comes from, you have to go back and watch the earlier lecture,3478

the previous video which covered moment generating functions.3483

That is the moment generating function for a single Poisson distribution.3486

We want to combine them, we are finding U is Y1 + Y2.3490

A very useful formula which showed that, that converts addition into multiplication,3495

for a moment generating functions.3501

That is because these variables are independent, you can convert addition into multiplication.3503

We multiply the two moment generating functions together and it combined very nice and get this λ 1 + λ 2 factoring out.3509

If we look back at the chart, that is still the moment generating function for Poisson distribution.3517

The only difference is the λ has changed, the new mean is λ 1 + λ 2.3524

If you look up the probability function for Poisson distribution, this is something we covered earlier on,3529

when we were talking about discrete distributions.3536

We had a whole lecture on the Poisson distribution.3538

Here is the probability function for Poisson distribution and here is the range.3541

I think we used Y before, but now we are using U because that is the name of our variable.3546

The only difference is that the λ here is λ 1 + λ 2.3552

I plug that in everywhere I saw a λ and then I got my probability function for U.3558

Again, this is not surprising if you remember what the Poisson distribution measures in real life.3565

One way to think about it is, it measures random events that happened with no effect on each other.3572

If you are sitting by an intersection, sometimes you see a lot of cars and3581

sometimes you see a lot of trucks, and sometimes you do not see anything.3585

But, you can have one variable the counts the number of cars, one variable that counts the number of trucks,3588

and one variable that counts everything together.3594

You are just adding the cars and trucks.3597

All three of those are Poisson variables, it is not too surprising when we actually calculate,3599

if we add two Poisson variables, the answer is still a Poisson variable.3608
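The cars-and-trucks story can be checked numerically. This is a hedged sketch (not from the lecture; the λ values are made-up illustrations), using the fact that a Poisson variable has mean and variance both equal to λ:

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2 = 2.5, 4.0         # mean cars and mean trucks per hour
samples = 200_000

# Cars and trucks through the intersection, counted separately, then added.
u = rng.poisson(lam1, samples) + rng.poisson(lam2, samples)

# A Poisson variable has mean and variance both equal to lambda,
# so the total should show both near lam1 + lam2 = 6.5.
print(u.mean(), u.var())
```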

In example 5, we have independent normal variables, each one has the same mean and variance.3616

Each one has mean μ and variance σ².3622

We want to find the distribution of Y ̅, Y ̅ is the average of the variables.3624

You can think of it as the mean, but that gets confusing because we also use mean in another sense.3630

Y ̅ is 1/N × Y1 + Y2 up to YN.3636

We are going to use moment generating functions for this.3643

Let me find the moment generating function for any particular normal variable M sub YI.3645

I got a normal variable, I have to look this up from the chart look of continuous distributions.3657

You will see the moment generating function for a normal variable is E ⁺μ T + σ² T²/2,3665

that is all in the exponent there.3684

That is the moment generating function for any single variable here.3686

I want to find the moment generating function for Y ̅, but I do not think I'm going to find it directly.3691

I think I’m going to first find the moment generating function for Y1 through YN.3695

I will call that Y, and I will find the moment generating function of that first.3701

And then, I will figure out what to do with that 1/N.3709

M sub Y of T is, Y is just Y1 up to YN of T.3712

Remember, moment generating functions for independent variables which we have here,3725

they turn addition into multiplication.3730

M sub Y1 of T multiplying up to M sub YN of T.3733

That is just E ⁺μT + σ²T²/2, multiplied together N times.3740

It is the same moment generating function every time, E ⁺μt + σ² T²/2.3752

What I get there is (E ⁺μT + σ²T²/2) raised to the Nth power.3761

I'm going to go ahead and distribute that in into the exponent.3772

That is E ⁺μNT + σ²T²N/2; all of that is in the exponent there.3776

That is the moment generating function for Y but that is not quite what I wanted.3788

I wanted Y ̅, let me show you how I can deal with that.3792

I noticed that Y ̅ is just the same as Y divided by N, it is 1/N × Y.3797

Let me remind you of a really useful property, this is listed in fact as a useful formula earlier on in this video.3804

Scroll back and you will see the following formula, that M sub AY + the moment generating function of AY + B of T is equal to,3812

You start with the moment generating function for Y, you plug in AT whenever you saw a T.3825

I forgot to include the extra term there, our extra factor is E ⁺BT × M sub Y of AT.3831

That is one of the useful formulas that we have for moment generating functions.3841

In this case, what we have is A is equal to 1/N and B is equal to 0, because we have Y ̅ is 1/NY.3846

M sub Y ̅ of T is equal to M sub Y of, our A is 1/N so 1/N × T.3865

I'm going to take my moment generating function for Y.3878

I'm going to plug in 1/N × T wherever I saw a T, wherever I saw AT before.3887

I get E ⁺μN × (1/N)T + σ²,3894

and everywhere I see a T, I put in 1/N × T, so that is (1/N × T)² × N/2.3905

This is actually quite nice because it simplifies: the N and 1/N cancel, so I get E ⁺μT +, and then the σ².3914

I have got (T/N)², that is T²/N² × N.3928

The N cancels with one of the N in the denominator but not both of them.3933

Σ²/N and then, I still have a T² and I still have a 2 there.3937

That is all in my exponent, that is my moment generating function for Y ̅.3947

What I want to do is go back and look at my chart now3952

and see if I recognize that as the moment generating function for any of my known distributions.3955

I go back and look at the chart, and what I recognize is that, that is the moment generating function from normal distribution.3962

This is the moment generating function for a normal distribution.3974

It is not quite in the format that was given in the chart, though.3991

The mean looks good, the mean is μ, that fits the pattern, but the moment generating function4001

for the normal distribution was E ⁺μT + σ²T²/2.4008

What I have here is σ²/N × T²/2.4016

My variance is slightly different here, instead of σ² by itself, σ²/N.4020

That is what my distribution of Y ̅ is: my distribution for Y ̅ is normal, it has mean μ, but its variance is σ²/N.4031

It is not the same variance that I started with.4045

That is my answer and this is not too surprising because we have a bunch of variables,4048

we expect their average to have the same mean as the individual variables.4055

However, the average does not have the same variance because we are sampling over more variables.4060

It makes the average be less variable, that is the law of large numbers.4068

A greater sample size gives smaller variance in the average.4074

This is something that is sort of very fundamental to statistics.4093

That is why you try and take bigger samples, when you are trying to understand the population.4097

It is because, if you take an average of more samples there will be less variance in your calculations.4103

By the way, we did calculate this same example back in the lecture on distribution functions.4110

If you go back and look at the lecture on distribution functions,4117

you will see the same example and you will see the same answer.4121

I’m sorry, it was not the lecture on distribution functions, it was the lecture on linear combinations of random variables.4126

It was back in the previous chapter, you will see the same example, same answer,4133

but calculated using very different methods.4137

We were not using moment generating functions back then.4140

Let me review the steps here.4143

First of all, I wrote down the moment generating function for a normal variable.4144

I got that from the chart, I did not calculate that from scratch.4150

And then, I want to find the moment generating function for a particular Y, which was Y1 through YN,4156

which means I kind of ignored the 1/N to start with here.4163

I just called that stuff inside the parentheses Y, I was not going to even worry about the 1/N until later.4167

The point of that is that I have the sum of variables, and moment generating functions convert sums into products.4174

It converts addition into multiplication and that is because these variables are independent.4182

And then, I filled in what each one of the individual moment generating functions are.4188

Since, I’m multiplying them together, I can just raise it up to the Nth power.4196

I can distribute that exponent in, there is N in that exponent there.4200

I have to figure out what that 1/N on the outside does to it.4207

I was using an old property of moment generating functions that the moment generating function for4211

AY + B is E ⁺BT × M sub Y of AT.4216

That was listed in, I think it was called the useful formula on one of the introductory slide of this lecture.4223

In this case, my A, my coefficient is 1/N.4229

I’m plugging in, in place of T I’m substituting in 1/NT.4233

There is that 1/NT manifesting itself right there and right there.4238

It is very nice on the left, it just cancels off the N; we got that same μ again.4243

It does not quite cancel with this N because it gets squared.4249

We have an N² in the denominator and an N in the numerator, which is why we still end up with one N in the denominator.4252

Once I got that moment generating function, I went back and look at my chart and said do I recognize this.4260

I did spot on the chart, it looks a lot like the moment generating function from normal distribution.4266

In fact, the μ is the same, the mean is the same, but the difference is that there was no N for the normal distribution.4271

What I have to do is change my variance to be σ²/N,4279

that would give me this moment generating function here with the σ²/N.4283

I still have a normal distribution, I have the same mean as before, that is not surprising if you take a bunch of samples,4289

you expect their average to be the same as the average of the population.4295

The variance though is lower, the variance of a bunch of samples will be lower than4300

the variance of an individual member of the population.4306

The variance that we have now is σ²/N.4311

Notice that, if you take more samples which means you make N bigger then you will have a lower variance,4316

which is really why surveys with many samples are more accurate than surveys4321

with samples of only a few members of the population.4327
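A quick simulation sketch of this result (my own illustration, not from the lecture; μ, σ, and N are made-up values): the sample average should keep mean μ but shrink its variance to σ²/N.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n = 10.0, 2.0, 16
samples = 100_000

# Y-bar = average of n independent normals, each with mean mu, variance sigma^2.
y = rng.normal(mu, sigma, (samples, n))
ybar = y.mean(axis=1)

# Expect mean mu = 10 and variance sigma^2 / n = 4 / 16 = 0.25:
# same mean as the population, but a much smaller variance.
print(ybar.mean(), ybar.var())
```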

In example 6, we are looking at two independent exponential variables.4332

Each one has mean 3 and we want to find the density function of Y1 + Y2.4338

Let me remind you of how this works.4345

First, we got to know the moment generating function for an exponential variable since,4347

everything here is based on moment generating functions.4354

M sub YI, the individual ones, I’m going to look up my moment generating function for the exponential variable on my chart,4358

that is earlier on in this lecture.4372

If you scroll back in this lecture, you will see the moment generating functions for continuous variables.4375

The one for the exponential distribution is (1 - βT) to the -1.4381

In this case, we are given that the mean β = 3, so it is (1 - 3T)⁻¹.4390

We are going to use that when we find the moment generating function for U,4400

that is the moment generating function for Y1 + Y2.4404

The whole point of moment generating functions or one of the very useful properties4410

that they have is that, it converts addition into multiplication.4414

M sub Y1 × M sub Y2, that is 1 -3T⁻¹ × 1 – 3T⁻¹.4419

We just get 1 -3T⁻², we are going to look back in my chart and say do I recognize this4434

as the moment generating function for any of my known distributions.4442

If you look back at the chart, you will see that the gamma distribution does have a moment generating function.4448

The gamma distribution does have a moment generating function of 1- β T ⁻α.4456

What I have here is a gamma distribution with α is 2 and β is 3.4469

I can find the density function now as the density function from the gamma distribution.4477

Here is the density function for the gamma distribution.4482

I learned this way back in one of the earlier videos on the gamma distribution.4487

You can look this up, if you do not remember it.4491

It is U ⁺α -1 × E ⁻U/β divided by β ⁺α × γ of α.4494

In this case, U ⁺α -1, α is 2 so this is just U¹ × E ⁻U/3.4507

Β ⁺α is 3² and γ of α is γ of 2.4518

Γ of 2, remember, is (2 - 1)!, and 1! is going to be 1.3524

That is easy to work out, γ of a whole number because it is related to the factorial function.4530

Let me simplify that, F sub U of U is UE ⁻U/3 divided by 3² is 9.4535

My range for gamma distribution is U goes from 0 to infinity.4551

I found my density function for U.4557

That is it, let me review the steps there.4566

It was given that we had exponential variables.4569

The first thing I did was, look up the moment generating function for the exponential variable on the chart.4573

It is (1 - βT)⁻¹ in general, where β is the mean of the exponential distribution; that is 3, in this case.4579

We are given that it was 3 and U is Y1 + Y2.4586

If I want to calculate its moment generating function, it converts addition into multiplication,4591

using the fact that we have independent variables there.4598

I multiply together two copies of 1 -3 T⁻¹, I get 1 -3T⁻².4602

I go back and look at the chart, and I'm looking at my continuous distributions.4610

I'm saying do I recognize this moment generating function.4614

And I say, yes this is the MGF, the moment generating function for the gamma distribution because,4622

the moment generating function for the gamma distribution has this form, 1- β T ⁻α.4634

I just recognize that this is the right thing with α = 2 and β = 3.4640

I know I got a gamma distribution and I know my formula for gamma distribution,4646

my density function for gamma distribution is just given by this.4651

This comes from our earlier lecture on the gamma distribution.4656

You can go back and look that up, if this formula seems to come out of left field.4660

And then, I plugged in my α and my β.4664

Remember, the γ of N is just N -1!, if N is a whole number.4670

Γ of 2 is just 1 factorial, which is just 1.4676

I just simplified everything here and I got down to UE ⁻U/3, they are all divided by 9.4682

My range for the gamma distribution is going from 0 to infinity.4692
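Example 6's conclusion can also be checked by simulation. This sketch (not from the lecture) uses the fact that a gamma distribution with α = 2 and β = 3 has mean αβ = 6 and variance αβ² = 18:

```python
import numpy as np

rng = np.random.default_rng(4)
samples = 200_000

# U = Y1 + Y2, each exponential with mean beta = 3 (NumPy's scale parameter).
u = rng.exponential(3.0, samples) + rng.exponential(3.0, samples)

# Gamma with alpha = 2, beta = 3 has mean alpha*beta = 6
# and variance alpha*beta^2 = 18.
print(u.mean(), u.var())
```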

That wraps up our lecture on moment generating functions, this is kind of a long one.4697

I really appreciate if you stuck with me through all of that.4701

That wraps up this three lecture series on finding distributions of functions of random variables.4704

We had one on distribution functions, one on transformations, and now this last one on moment generating functions.4711

Next up, we are going to talk about order statistics, I hope you will stay tuned for that.4718

This is part of the larger series of probability lectures here on www.educator.com.4722

As always, I’m your host Will Murray; thank you for joining me today, bye.4728
