  William Murray

Moment-Generating Functions

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
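As a companion to this lecture's key formulas, here is a minimal Python check of C(n, r) and P(n, r); the values n = 5, r = 2 are illustrative, not taken from the lecture:

```python
from math import comb, perm

# Number of ways to pick r things from n possibilities:
#   unordered, without replacement -> combinations C(n, r)
#   ordered, without replacement   -> permutations P(n, r)
n, r = 5, 2
unordered = comb(n, r)   # 5! / (2! * 3!) = 10
ordered = perm(n, r)     # 5! / 3! = 20
```

Note that P(n, r) = C(n, r) · r!, since each unordered choice can be arranged in r! orders.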
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
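The two-event inclusion/exclusion formula from this lecture, |A ∪ B| = |A| + |B| − |A ∩ B|, can be sanity-checked with Python sets; the choice of multiples of 2 and 3 below is a hypothetical example, not one of the lecture's:

```python
# A = multiples of 2, B = multiples of 3, among 0..49
A = set(range(0, 50, 2))
B = set(range(0, 50, 3))

# Inclusion/exclusion: |A union B| = |A| + |B| - |A intersect B|
union_size = len(A) + len(B) - len(A & B)
```

The subtraction corrects for the multiples of 6, which would otherwise be counted twice.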
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
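Bayes' Rule for two choices, as covered in this lecture, can be sketched in a few lines of Python. The diagnostic-test numbers below (1% prevalence, 95% sensitivity, 5% false-positive rate) are hypothetical and not taken from the lecture's examples:

```python
def bayes_two_choices(p_a, p_b_given_a, p_b_given_not_a):
    """P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A')P(A')]."""
    numerator = p_b_given_a * p_a
    return numerator / (numerator + p_b_given_not_a * (1 - p_a))

# Hypothetical test: 1% prevalence, 95% sensitivity, 5% false-positive rate
posterior = bayes_two_choices(0.01, 0.95, 0.05)
```

Even with a fairly accurate test, the posterior here is only about 16%, the kind of counterintuitive answer Bayes' Rule problems often turn on.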
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
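The indicator-variable and linearity-of-expectation ideas in this lecture can be illustrated with a short Python check; three fair coin flips is an assumed setup, not necessarily the lecture's example:

```python
from itertools import product

# X = number of heads in 3 fair coin flips.
outcomes = list(product("HT", repeat=3))   # 8 equally likely outcomes
direct = sum(seq.count("H") for seq in outcomes) / len(outcomes)

# Linearity of expectation: X = I1 + I2 + I3 with E[Ii] = 1/2 each
via_linearity = 3 * 0.5
```

Both computations give 1.5; the indicator approach avoids enumerating the distribution at all.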
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
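"Most Useful Way to Calculate Variance" above refers to the shortcut V(Y) = E(Y²) − E(Y)². Here is a minimal sketch checking it on a fair six-sided die (an assumed example, not from the lecture):

```python
# Fair die: compute variance via V(Y) = E[Y^2] - E[Y]^2
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6
mu = sum(v * p for v in values)        # E[Y] = 3.5
ey2 = sum(v * v * p for v in values)   # E[Y^2] = 91/6
variance = ey2 - mu ** 2               # 35/12, approx 2.917
sd = variance ** 0.5                   # standard deviation
```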
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
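Markov's inequality, P(Y ≥ a) ≤ E(Y)/a for a nonnegative random variable, is simple enough to encode directly; the numbers below are illustrative, not from the lecture's examples:

```python
def markov_bound(mean, a):
    # Upper bound on P(Y >= a) for nonnegative Y with E[Y] = mean, a > 0.
    # A probability can never exceed 1, so cap the bound there.
    return min(1.0, mean / a)

# E.g. if the average value is 10 (in some units),
# at most 25% of outcomes can be >= 40.
bound = markov_bound(10, 40)
```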
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
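Tchebysheff's inequality and its reverse form, as listed above, reduce to one-liners; k = 2 below is only an illustration:

```python
def tcheby_bound(k):
    # P(|Y - mu| >= k*sigma) <= 1/k^2, for any distribution with finite variance
    return 1.0 / k ** 2

def tcheby_reverse(k):
    # Reverse form: at least 1 - 1/k^2 of the distribution
    # lies within k standard deviations of the mean
    return 1.0 - 1.0 / k ** 2
```

For k = 2 this guarantees at least 75% of any distribution lies within two standard deviations of its mean, regardless of its shape.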
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
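The binomial formula and its mean np from this lecture can be verified numerically; the parameters n = 10, p = 0.3 are illustrative, not taken from the examples:

```python
from math import comb

def binomial_pmf(y, n, p):
    # P(Y = y) = C(n, y) * p^y * (1-p)^(n-y)
    return comb(n, y) * p ** y * (1 - p) ** (n - y)

n, p = 10, 0.3
total = sum(binomial_pmf(y, n, p) for y in range(n + 1))        # should be 1
mean = sum(y * binomial_pmf(y, n, p) for y in range(n + 1))     # should be n*p
```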
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
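The geometric formula p(y) = q^(y−1)·p and its mean 1/p, covered above via the geometric series from Calculus II, can be checked with a partial sum; p = 0.25 is an assumed value, not from the lecture:

```python
def geometric_pmf(y, p):
    # P(Y = y) = q^(y-1) * p, for y = 1, 2, 3, ...
    # (Y counts trials until the first success)
    return (1 - p) ** (y - 1) * p

p = 0.25
# Partial sum approximates the mean E[Y] = 1/p = 4; the tail is negligible here.
approx_mean = sum(y * geometric_pmf(y, p) for y in range(1, 2000))
```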
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
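The negative binomial formula and its mean r/p from this lecture admit the same kind of numeric check; r = 4, p = 0.5 are illustrative values, not from the examples:

```python
from math import comb

def neg_binomial_pmf(y, r, p):
    # P(Y = y) = C(y-1, r-1) * p^r * q^(y-r):
    # the y-th trial delivers the r-th success, y = r, r+1, ...
    return comb(y - 1, r - 1) * p ** r * (1 - p) ** (y - r)

r, p = 4, 0.5
# Partial sum approximates the mean E[Y] = r/p = 8; the tail is negligible here.
approx_mean = sum(y * neg_binomial_pmf(y, r, p) for y in range(r, 400))
```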
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
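The hypergeometric formula and its mean nr/N, listed above, can be checked on a small committee-style setup in the spirit of Example I (the numbers N = 10, r = 4, n = 3 are assumed, not the lecture's):

```python
from math import comb

def hypergeom_pmf(y, N, r, n):
    # y successes in a sample of size n, drawn without replacement
    # from N items of which r count as successes
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

N, r, n = 10, 4, 3   # e.g. a committee of 3 drawn from 10 people, 4 of them women
total = sum(hypergeom_pmf(y, N, r, n) for y in range(min(r, n) + 1))
mean = sum(y * hypergeom_pmf(y, N, r, n) for y in range(min(r, n) + 1))
# mean should equal n*r/N = 1.2
```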
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E(Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
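The Poisson formula λ^y·e^(−λ)/y! and the fact that its mean and variance both equal λ (confirmed analytically in Examples III and IV) can be verified numerically; λ = 2 is an assumed value:

```python
from math import exp, factorial

def poisson_pmf(y, lam):
    # P(Y = y) = lam^y * e^(-lam) / y!
    return lam ** y * exp(-lam) / factorial(y)

lam = 2.0
support = range(60)   # the tail beyond y = 60 is negligible for lam = 2
mean = sum(y * poisson_pmf(y, lam) for y in support)
var = sum((y - mean) ** 2 * poisson_pmf(y, lam) for y in support)
```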
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for a Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
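The chart lookups in this lecture can be reproduced in code: Φ(z) is expressible through the error function, which Python's standard library provides. Below is a sketch of Example I's question, the chance that a standard normal variable lands between 1 and 2:

```python
from math import erf, sqrt

def standard_normal_cdf(z):
    # Phi(z) via the error function -- a stand-in for the standard normal chart
    return 0.5 * (1 + erf(z / sqrt(2)))

# P(1 <= Z <= 2) = Phi(2) - Phi(1), approximately 0.1359
prob = standard_normal_cdf(2) - standard_normal_cdf(1)
```

For a nonstandard normal variable Y, first standardize with z = (y − μ)/σ and then apply the same function.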
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment-Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-Generating Function for the Uniform Distribution
44:47
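The binomial MGF m(t) = (q + pe^t)^n, derived in Example I, and the fact that m′(0) recovers the mean (Example II) can be checked numerically with a finite difference; n = 8, p = 0.4 are illustrative values:

```python
from math import exp

def binomial_mgf(t, n, p):
    # m(t) = E[e^(tY)] = (q + p*e^t)^n for the binomial distribution
    return ((1 - p) + p * exp(t)) ** n

n, p, h = 8, 0.4, 1e-6
# First moment = m'(0); approximate the derivative with a central difference.
mean_est = (binomial_mgf(h, n, p) - binomial_mgf(-h, n, p)) / (2 * h)
# mean_est should be close to n*p = 3.2
```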
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Example I: Calculate E(Y₁ + Y₂)
4:39
Example II: Calculate E(Y₁Y₂)
14:47
Example III: Calculate E(U₁) and E(U₂)
19:33
Example IV: Calculate E(Y₁) and E(Y₂)
22:50
Example V: Calculate E(2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E(Y₁), E(Y₂), and E(Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V(U₁) and V(U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-Generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-Generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
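The Central Limit Theorem in practice, as this lecture applies it, can be demonstrated with a small simulation: sample means of a uniform (decidedly non-normal) population, once standardized, behave like a standard normal variable. The sample size n = 50 and trial count are arbitrary choices for the demo:

```python
import random

random.seed(0)                       # reproducible demo
n, trials = 50, 2000
mu, sigma = 0.5, (1 / 12) ** 0.5     # mean and sd of Uniform(0, 1)

def sample_mean():
    return sum(random.random() for _ in range(n)) / n

# Standardize each sample mean; by the CLT roughly 95% should land
# in (-1.96, 1.96), even though the population itself is not normal.
zs = [(sample_mean() - mu) / (sigma / n ** 0.5) for _ in range(trials)]
frac_within = sum(-1.96 < z < 1.96 for z in zs) / trials
```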

### Moment-Generating Functions

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Moments 0:30
• Definition of Moments
• Moment-Generating Functions (MGFs) 3:53
• Moment-Generating Functions
• Using the MGF to Calculate the Moments
• Moment-Generating Functions for the Discrete Distributions 8:22
• Moment-Generating Functions for Binomial Distribution
• Moment-Generating Functions for Geometric Distribution
• Moment-Generating Functions for Negative Binomial Distribution
• Moment-Generating Functions for Hypergeometric Distribution
• Moment-Generating Functions for Poisson Distribution
• Moment-Generating Functions for the Continuous Distributions 11:34
• Moment-Generating Functions for the Uniform Distributions
• Moment-Generating Functions for the Normal Distributions
• Moment-Generating Functions for the Gamma Distributions
• Moment-Generating Functions for the Exponential Distributions
• Moment-Generating Functions for the Chi-square Distributions
• Moment-Generating Functions for the Beta Distributions
• Useful Formulas with Moment-Generating Functions 15:02
• Useful Formulas with Moment-Generating Functions 1
• Useful Formulas with Moment-Generating Functions 2
• Example I: Moment-Generating Function for the Binomial Distribution 17:33
• Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution 24:40
• Example III: Find the Moment Generating Function for the Poisson Distribution 29:28
• Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution 36:27
• Example V: Find the Moment-generating Function for the Uniform Distribution 44:47

### Transcription: Moment-Generating Functions

Hi, welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

We are going to talk today about moment generating functions.0005

Moment generating functions are one of the most confusing topics that people encounter in probability.0010

I'm going to try to walk you through them and show you what they are used for.0017

You might prepare yourself to be a little confused at first because every time I taught it,0021

it is my students who always found them to be a little confusing.0026

I will try to show you how it works.0029

The initial idea I want to talk about is moments.0032

We start with a random variable, and it can be discrete or continuous.0037

We will talk about moment generating functions for all of the distributions that we have been studying,0040

all of the discrete ones, binomial, geometric, and so on, and all of the continuous distributions, uniform and normal, and so on.0046

We can talk about moments and we can talk about moment generating functions for all of these distributions.0055

The first definition is the Kth moment of Y taken around the mean.0063

Let me highlight that.0069

The Kth moment of Y taken around the mean is just the expected value of Y^K.0070

The K there can be 1, 2, 3; it can also be 0, although people do not usually look0079

at the 0th moment because that is not very illuminating.0084

I said mean, but I meant to say origin.0088

We are also going to talk about moments around the mean.0091

But, it is important here that we are talking about the moments around the origin.0093

There is some notation that is sometimes used for this, which is μ sub K prime.0100

It is really not obvious why we would use the notation μ sub K prime.0106

I'm not going to use that notation in this lecture, but if you are following along0111

in your own probability course or in your own probability book, you might see the notation μ sub K prime.0115

What that means is the expected value of Y^K.0121

Those mean the same thing.0126

There is another notation that you might see in your book which is that idea of central moments.0128

Instead of taking the moment around the origin, we will talk about taking the moment about the mean.0138

Which means, instead of talking about Y^K, you do (Y − μ)^K, where μ is the mean of the original distribution.0144

And that is called μ sub K and that is why we have to use μ sub K prime for the one that we are studying.0155

I want to emphasize that there are 2 different ideas here.0162

There is the moment around the origin and there is the moment around the mean.0166

In this lecture, in the probability lectures here on www.educator.com,0170

I'm just going to look at the moment taken around the origin.0175

The origin is the more common one, and it is easier to understand the ideas for that one.0179

I'm not going to talk any more about the central moment,0186

that is, the moment around the mean.0189

I just mention that, in case you see it in your course, you know what the difference is.0192

You do not really need to study both of them, you can figure it out.0198

If you know one, you can figure out the other one just by doing some computations.0202

It is not necessary to study both of them.0208

You pick a system and then you follow that, and you can find all the information you need within one system.0210

The system we are going to use is the moments around the origin.0216

I just included it, in case you see it in your course.0224

That tells us what the moments are; as for the moment generating function, let me jump on to the next slide and show you what that is.0227

The moment generating function for Y is M sub Y of T.0235

That := means it is defined to be.0241

It is defined to be the expected value of e^(TY).0244

That is not a very illuminating definition, but I do want to highlight it here0249

because it is probably the most important definition we are going to have in this whole lecture.0255

It is not obvious what it means right now, and I'm not going to clarify it right away.0262

I’m just going to throw the definition at you and then we will practice using it to solve some problems.0268

M sub Y of T, remember, is defined to be the expected value of e^(TY); that is e as in the exponential function.0274

The important thing that you need to remember right now is0284

that the moment generating function is a function of T, not of Y.0289

When you calculate the moment generating function for a distribution, you should get a function of T.0303

By the time you simplify it down, you will not see a Y.0310

We will do some examples and you will see how it works out.0314

The Y always disappears; you always end up with a function of T.0317

Here is how you use the moment generating function.0323

Once you know it, this first line is kind of trivial, but I included it because it will make the other lines make more sense.0326

The expected value of Y⁰ is equal to the moment generating function with 0 plugged in for T.0337

The expected value of Y⁰, Y⁰ is always 1 because anything to the 0 is 1.0347

That is the expected value of 1 which of course will be 1.0355

It is not like we are really learning anything much from the moment generating function,0359

because we already knew that the expected value of Y⁰ is 1.0363

In the next line, the moment generating function starts to become useful.0367

What you do is you take the derivative of the moment generating function.0372

And again, you plug in 0 for T and what that tells you is the expected value for your distribution.0376

Now we have something useful: we have used the moment generating function to find the mean of the distribution.0383

In the next line, what we have done is take another derivative, M double prime.0391

We plug in T is equal to 0.0398

What that tells us is, the second moment of the distribution E of Y².0401

Why is that useful? The reason is that it helps us to find the variance of the distribution.0407

We can use this to find the variance.0414

Be careful here, the variance is not the expected value of Y².0419

Let me remind you how we calculate the variance.0424

We calculate the variance as σ² = E(Y²) − (E(Y))², the expected value of Y² minus the square of the expected value of Y.0428

If we can figure out these 2 moments using the moment generating function,0440

what we can do is drop in the expected value of Y² here from the MGF.0446

We can use the MGF, the moment generating function, to calculate the expected value of Y².0455

We can also use the moment generating function to calculate the expected value of Y.0464

Both of these ingredients that go into calculating the variance come from the moment generating function.0473

That is how we use the moment generating function: to find these two ingredients to calculate the variance.0480
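The recipe described above (differentiate the MGF, plug in T = 0, then combine the two moments into the variance) can be checked numerically. This is an editor's sketch, not part of the lecture: it builds the MGF of a binomial variable directly from the definition E[e^(tY)], with made-up parameters n = 5, p = 0.3, and approximates the derivatives at 0 with finite differences.

```python
from math import comb, exp

# Illustrative (made-up) parameters: Y ~ binomial(n = 5, p = 0.3).
n, p = 5, 0.3

def mgf(t):
    # M_Y(t) = E[e^(tY)] = sum over y of P(Y = y) * e^(ty)
    return sum(comb(n, y) * p**y * (1 - p)**(n - y) * exp(t * y)
               for y in range(n + 1))

# Approximate M'(0) and M''(0) with central finite differences.
h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # ~ M'(0) = E[Y]
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ~ M''(0) = E[Y^2]
variance = m2 - m1**2                        # Var(Y) = E[Y^2] - (E[Y])^2

print(m1, variance)  # close to np = 1.5 and np(1 - p) = 1.05
```

The numerical derivatives land on the known binomial mean np and variance np(1 − p), which is exactly the relationship the slide is describing.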

There are other uses for the moment generating function, later on in statistics0487

but I'm not going to get into them right away in this lecture.0492

Instead, what I want to do is show you some of the moment generating functions for our favorite distributions.0496

Here we have all our favorite discrete distributions: binomial, geometric,0506

negative binomial, hypergeometric, and Poisson distribution.0511

Here are what the moment generating functions turn out to be.0517

For the binomial, it is (Pe^T + (1 − P))^N.0521

By the way, the binomial distribution, we often define Q to be 1 – P.0527

That term of 1 − P, people often write as Q, and that simplifies the way0535

to write the moment generating function somewhat.0543

For the geometric distribution, it is Pe^T/(1 − (1 − P)e^T).0547

Again, there is a Q in there, that is equal to 1 – P.0554

If you want to simplify this down, you can write it as Pe^T/(1 − Qe^T).0557

A little bit simpler to write at the expense of having one more variable.0565

The negative binomial distribution is almost the same thing, except that there is an R in the exponent.0569

Almost the same as the geometric distribution.0577

Again, you can put in a Q for 1 – P, if you like that.0579

The hypergeometric distribution has no closed form moment generating function.0583

If you try to calculate the moment generating function of a hypergeometric distribution, it just blows up in your face.0589

There is no reason to go there; we will not go there.0595

The Poisson distribution is much more well behaved; it is e^(λ(e^T − 1)).0597

Couple of things I want to mention about all of these, one is you might be wondering where these come from,0605

how do you calculate these moment generating functions.0610

Stay tuned, I will tell you because we will work out a couple of these in the examples.0612

Or you can just scroll down right now, if you are bursting with curiosity.0619

Check out examples 1 and 3: in example 1, I think, we are going to do the binomial distribution.0622

We will calculate the moment generating function.0631

For example 3, we are going to take the Poisson distribution and calculate moment generating function.0633

You will be able to see where these come from.0639

Another thing that I want to point out about this is that nowhere on here do you see the variable Y.0641

All of these are functions of T, you see T everywhere here.0650

The moment generating function is always a function of T not Y, it is a function of T.0664

If you are calculating a moment generating function, if you still have Y on your paper0672

then you need to keep going until you can get rid of the Y, and try to simplify it down into a function of T.0677

These are just the discrete distributions; we also have a number of continuous distributions.0688

Let us go ahead and look at those.0693

Here are our favorite continuous distributions: uniform, normal, gamma, exponential, Chi-square, and the β distribution.0696

The uniform distribution is a very simple distribution.0704

It has a surprisingly complicated moment generating function: (e^(Tθ2) − e^(Tθ1))/(T(θ2 − θ1)).0707

I keep saying my θs in the wrong order.0721

We are going to calculate that one out by hand, I think that is example 5.0724

If you want, you can scroll down and take a look at example 5.0732

You will see how we calculate the uniform distribution.0737

The others are more difficult, I did not put them into examples.0740

The normal distribution is e^(μT + T²σ²/2).0745

All of this is in the exponent of the e.0750

There is a lot in the exponent there.0754

The gamma distribution is (1 − βT)^(−α).0757

The next two distributions, remember, are actually special cases of the gamma distribution.0764

The exponential distribution is just the gamma distribution where we take α equal to 1.0770

If you look at the gamma distribution, the moment generating function, and just plug in α = 1,0778

you get the moment generating function for the exponential distribution.0786

It is quite nice and simple.0789

The Chi square distribution is the gamma distribution with α defined to be ν/2.0791

ν is the number of degrees of freedom, and β is equal to 2.0801

If you take the gamma distribution and you plug in α equal to ν/2 and β equal to 2,0808

you get the moment generating function for the Chi-square distribution, (1 − 2T)^(−ν/2).0818

The β distribution, if you try to calculate the moment generating function,0829

you will get into a horrible mess and it just blows up in your face.0834

We say that there is no closed formula in moment generating function for the β distribution.0839

By the way, if you are a little rusty on what all these words mean, uniform, normal, gamma,0844

exponential, chi square, β, we have separate lectures about each one of these distributions.0849

You can go back and you can read up on the uniform distribution.0854

You can practice the normal distribution.0857

You can study the gamma distribution.0859

Of course, the exponential and chi square distribution, those are special cases of gamma distribution.0862

You will find those in the lecture on gamma distribution.0867

Just scroll up here and you will see the lecture on gamma distribution.0871

You will get the exponential and Chi square thrown in there as a bonus.0874

There is also a lecture on the β distribution, you can read up all about that.0878

The only things that are not in those lectures are the moment generating functions.0884

That is what I'm telling you about right now.0889
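As a quick sanity check on the chart, the gamma MGF (1 − βT)^(−α) really does produce the familiar gamma mean αβ and variance αβ² when we differentiate it numerically at T = 0. This is an editor's sketch with made-up parameters α = 3, β = 2, not part of the lecture.

```python
# Editor's sketch; alpha and beta are illustrative values, not from the lecture.
alpha, beta = 3.0, 2.0

def M(t):
    # Gamma MGF from the chart: (1 - beta*t)^(-alpha), valid for t < 1/beta.
    return (1 - beta * t)**(-alpha)

# Numerically differentiate at t = 0 with central differences.
h = 1e-5
mean = (M(h) - M(-h)) / (2 * h)              # M'(0) = alpha * beta
second = (M(h) - 2 * M(0) + M(-h)) / h**2    # M''(0) = E[Y^2]
var = second - mean**2                       # alpha * beta^2

print(mean, var)  # close to 6.0 and 12.0
```

Setting α = 1 here reproduces the exponential case, and α = ν/2 with β = 2 reproduces the Chi-square case, just as the chart says.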

Let us go ahead and jump into some examples, and see how we actually derive0892

these moment generating functions, and then see how we can use them to calculate some means and some variances.0894

I see we have one more slide before I talk about the examples.0905

A couple of useful formulas for the moment generating functions.0908

Suppose you have one known random variable Y and you do a linear change of variables.0913

If you define Z to be AY + B, := means defined to be.0920

If you define Z to be AY + B, then the moment generating function for Z is related0927

to the moment generating function for Y, except that there is an A missing in there.0935

Let me just go ahead and write that A in there.0945

It is just M sub Y of AT, times e^(BT).0951

That is how you get from the moment generating function of Y to the moment generating function of Z.0959

Very useful, by the way, when you are converting normal distributions.0966

When you convert to a standard normal variable, you are doing exactly this kind of variable change.0970

This is quite useful, when you want to calculate the moment generating function.0978

Second useful formula, when Y1 and Y2 are independent variables.0982

Z is Y1 + Y2, there you are defining Z to be Y1 + Y2.0987

This only works for independent variables.0995

When they are independent, you can say that the moment generating function for Z0997

is the moment generating function for Y1 times the moment generating function for Y2.1002

What moment generating functions do is they convert sums into products.1008

That is really not surprising; it is essentially based on the fact that e^(X+Y) is equal to e^X × e^Y.1013

Remember, our initial definition of moment generating function was in terms of the expected value of an exponential.1022

The fact that moment generating functions convert sums of variables into products of functions,1030

converts addition into multiplication, is really not very surprising.1038

But, you do have to check that you are talking about independent variables.1042
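Both useful formulas can be verified by brute force on small toy distributions. This is an editor's sketch: the point probabilities below are made up for illustration, and the check computes E[e^(tZ)] directly by enumeration and compares it with the shortcut formulas.

```python
from math import exp

# Made-up toy distributions for illustration: Y1 and Y2 are independent,
# with the point probabilities below.
p1 = {0: 0.2, 1: 0.5, 2: 0.3}
p2 = {0: 0.6, 1: 0.4}

def mgf(dist, t):
    # M(t) = E[e^(tY)] for a finite discrete distribution.
    return sum(prob * exp(t * y) for y, prob in dist.items())

t, a, b = 0.7, 3.0, 2.0

# Formula 1: Z = aY + b  =>  M_Z(t) = e^(bt) * M_Y(at)
mz_direct = sum(prob * exp(t * (a * y + b)) for y, prob in p1.items())
mz_formula = exp(b * t) * mgf(p1, a * t)

# Formula 2: Z = Y1 + Y2, independent  =>  M_Z(t) = M_Y1(t) * M_Y2(t)
msum_direct = sum(q1 * q2 * exp(t * (y1 + y2))
                  for y1, q1 in p1.items() for y2, q2 in p2.items())
msum_formula = mgf(p1, t) * mgf(p2, t)

print(abs(mz_direct - mz_formula), abs(msum_direct - msum_formula))
```

Both differences come out at floating-point zero, confirming the change-of-variables rule and the sums-into-products rule on this example.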

Let us go on and talk about some examples where we will actually calculate some moment generating functions.1047

In example 1, we want to find the moment generating function for the binomial distribution.1055

Let me remind you of the probability function for the binomial distribution.1061

It has been a while since we studied that.1067

If you do not know what the binomial distribution is at all, just check back in the list up above,1069

you will see a whole lecture on the binomial distribution.1075

The take away from that lecturer right now, is that the probability of a value of Y is equal to N choose Y,1079

that is a binomial coefficient.1087

P^Y Q^(N−Y), and that is for Y ranging between 0 and N.1089

It represents the probability of getting Y heads when you flip a coin N times.1098

Let us try to figure out the moment generating function for that distribution.1106

M sub Y of T, using that definition of the moment generating function, is defined to be the expected value of e^(TY).1110

How do you find the expected value of a function of Y?1124

Here is how you do it, I showed you this in a very early lecture.1127

It is the sum over all values of Y of the probability of that particular Y, times that function of Y, e^(TY).1133

We need to expand that and figure it out.1147

What values of Y are we talking about?1150

I read from here that the range of values of Y is from Y equals 0 to N.1152

The probability of each Y, I wrote that down right above.1158

It is N choose Y × P^Y × Q^(N−Y).1162

Now, I have to multiply in this term e^(TY).1169

What can I do with this? Remember, I'm trying to simplify this into a function of T,1175

which means I'm trying to get rid of the Y, which means I have to do something clever.1180

Here is what I can do: I notice that I have P^Y here.1184

Here, I have e^(TY), which I can write as (e^T)^Y.1189

I can combine those two factors; that is what I'm going to do.1196

It is the sum from Y = 0 to N of N choose Y times (Pe^T)^Y × Q^(N−Y).1200

If you stare at this very hard, you are supposed to recognize something, to have a small epiphany, if you will.1216

In fact, you might want to stop the video right now and stare at this formula,1223

and go ahead and have that epiphany.1228

I will wait. Did you have that epiphany?1231

I think it is worth staring at that equation because it is really fun to recognize something.1235

What you are supposed to recognize in this formula is the binomial theorem.1239

I will remind you what the binomial theorem says.1244

It says (A + B)^N is equal to the sum from Y = 0 to N of N choose Y A^Y B^(N−Y).1247

You might have seen the binomial theorem stated with slightly different variables,1262

but it should be the same theorem because it is a universal truth.1267

What we have here is exactly that formula.1271

We are sort of reverse engineering the binomial theorem now: my A is going to be Pe^T, my B is Q.1275

We have a perfect match of the binomial theorem.1284

It is (A + B)^N, that is, (Pe^T + Q)^N.1287

Notice here that we have a function of T.1299

Of T only; there is no Y left anymore.1307

The moment generating function is now a function of T, we have solved the problem.1312

If you do not like that Q, and wonder where that Q came from,1319

you can always put it back in terms of P.1322

You could write this as (Pe^T + (1 − P))^N, since Q is 1 − P, with all of it still raised to the Nth power.1325

I think that is the version that I gave you on the chart of moment generating functions a couple of slides ago.1335

Now you know how those two correspond to each other.1341

We are done with that example, we found the moment generating function for the binomial distribution.1346

Let me recap the steps we went through.1352

First of all, I have reminded myself of the probability function for the binomial distribution.1354

Here it is: N choose Y P^Y Q^(N−Y).1360

Here is the range of Y values involved.1364

And then, I used the definition of the moment generating function, found on one of the earlier slides in this lecture.1367

It is the expected value of e^(TY).1375

The expected value of any function: the way you calculate it is you sum over Y.1378

This would be an integral, if you are in a continuous distribution.1383

But since binomial is discrete, we are using the sum.1386

The probability of Y times that function e^(TY); I expanded P of Y, that is what I did here.1389

And then, I noticed that there is a P^Y and an e^(TY).1403

I can combine those, if I cleverly write e^(TY) as (e^T)^Y.1407

I combined those together as (Pe^T)^Y.1413

And then, I really had an epiphany, I said look, that is exactly the binomial theorem.1417

I reminded myself of the binomial theorem here.1423

I noticed how this fits that pattern, and this is exactly (Pe^T + Q)^N.1427

Notice that it is a function of T; there is no Y left in this.1435

If you do not like the Q, you could always expand it out into 1 – P.1441

That was the role that Q played in the binomial distribution.1445

Hang onto this moment generating function because we have not really used it for anything yet.1449

We just figured out what it was.1455

I just justified this formula on the chart at the beginning of this lecture, but I have not used it for anything yet.1457

What I'm going to do in the next example is, we will use this formula to calculate the mean of the binomial distribution.1465

We will see for the first time what MGF can be good for.1473

Do not forget this formula; we are going to use it again right away in example 2.1477
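One way to double-check the derivation is to compute E[e^(tY)] directly from the binomial probability function and compare it with the closed form (Pe^T + Q)^N at a few values of T. This is an editor's sketch; the parameters n = 6, p = 0.4 are arbitrary illustration values, not from the lecture.

```python
from math import comb, exp

# Arbitrary illustration parameters (not from the lecture).
n, p = 6, 0.4
q = 1 - p

def mgf_direct(t):
    # E[e^(tY)] computed straight from the binomial probability function.
    return sum(comb(n, y) * p**y * q**(n - y) * exp(t * y)
               for y in range(n + 1))

def mgf_closed(t):
    # The closed form derived above: (p e^t + q)^n
    return (p * exp(t) + q)**n

# The two agree (to floating-point precision) at every t we try.
for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs(mgf_direct(t) - mgf_closed(t)) < 1e-9
```

The direct sum and the closed form match, which is exactly what the binomial-theorem step guarantees.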

In example 2, we are going to use the MGF for the binomial distribution to find the mean of the distribution.1482

We calculated the moment generating function for the binomial distribution in the previous example, example 1.1489

If you did not just watch example 1, maybe go back and watch it right now.1496

What you will find out is that the moment generating function, this is what we calculated in example 1,1500

turned out to be (Pe^T + Q)^N.1507

What does that mean? I have no idea.1514

But let me show you how we can use it.1516

Remember that we can calculate the mean of the distribution, the expected value of Y.1519

The way you calculate it, now that we have the machinery1527

of the moment generating function, is to take M sub Y prime of T at T = 0.1533

You take its derivative and then you plug in T = 0.1545

This is something that we learned in the second slide, I think, of this lecture.1549

If you scroll back a few slides and look at that, you will see where this comes from.1554

Let us figure out what the derivative of this is: (Pe^T + Q)^N.1559

Remember, T is my variable, everything else is a constant P, Q, E, N, those are all constants.1565

N is an exponent, I'm going to use the power rule.1572

It is time to review your calculus 1.1575

The derivative of something to the Nth is N times all that stuff, (Pe^T + Q)^(N−1), times the derivative of the stuff inside.1578

By the chain rule, the derivative of the inside is Pe^T; Q is a constant, so I do not have to do anything with that.1591

That is the chain rule; that is why I had to write Pe^T on the outside there.1599

At T = 0: I have got to plug in T = 0.1603

If I plug in T = 0, e^T is just 1, so it is N × (P × 1 + Q)^(N−1) × P × 1.1609

In the parentheses there, I see that I have P + Q.1625

Remember that, Q is 1 – P, that means P + Q is equal to 1.1628

That P + Q magically simplifies into 1, so I have got1635

N × 1^(N−1) × P × 1.1642

1^(N−1) is just 1, and I have got N × P.1648

That is the mean of the binomial distribution, you can call it the expected value or the mean,1655

I do not care which one you use because they both mean the same thing.1660

This is something that we did know years and years ago, when we study the binomial distribution.1664

But, it is nice to have the moment generating function to confirm it.1672

The mean of the binomial distribution is N × P.1676

Let me recap the steps there.1680

I started off with the moment generating function that I calculated back in example 1.1682

That comes from example 1; if you did not just watch example 1, then you are missing out,1688

because you would not know how we derived that.1693

Maybe you go back and watch example 1 to see where that came from.1696

To find the expected value of any distribution, what you do is you take1701

the moment generating function, take its derivative, and then plug in T = 0.1707

We took its derivative, a little bit of calculus 1 coming here.1713

We used the power rule: N × (Pe^T + Q)^(N−1).1716

The chain rule means you have to multiply by the derivative of the stuff inside.1721

That is where the Pe^T came from, and the Q just goes away because it is a constant.1725

And then, I plugged in T = 0; that is why I got e^T is 1 here.1731

P + Q turns into 1, and that all simplifies down; 1^(N−1) just turns into 1.1738

That simplifies down to NP, now, I know what the mean of the binomial distribution is.1746

We are going to do this again, something similar with the Poisson distribution.1755

If this still does not make sense, then you will get a chance to see the same kind of process with the Poisson distribution.1760

Stick around for examples 3 and 4.1766
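Before moving on, the derivative worked out in example 2, N(Pe^T + Q)^(N−1) · Pe^T, can be spot-checked numerically. This is an editor's sketch with illustrative parameters n = 8, p = 0.25 (not from the lecture); plugging T = 0 into the derivative gives exactly np = 2.

```python
from math import exp

# Illustrative parameters (not from the lecture).
n, p = 8, 0.25
q = 1 - p

def M(t):
    # Binomial MGF: (p e^t + q)^n
    return (p * exp(t) + q)**n

def M_prime(t):
    # The derivative worked out above: n (p e^t + q)^(n-1) * p e^t
    return n * (p * exp(t) + q)**(n - 1) * p * exp(t)

print(M_prime(0.0))  # 2.0, which is n*p

# Spot-check the derivative formula against a central difference.
h = 1e-6
for t in (0.0, 0.3, 1.0):
    assert abs((M(t + h) - M(t - h)) / (2 * h) - M_prime(t)) < 1e-5
```

The central-difference check confirms the power-rule-plus-chain-rule computation at several values of T, not just at 0.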

In example 3, we are going to find the moment generating function for the Poisson distribution.1770

It is kind of working from scratch there.1775

Let me remind you, first of all, the probability function for the Poisson distribution.1777

The probability function for the Poisson distribution, there was a λ parameter in there.1782

It is λ^Y/Y! × e^(−λ).1787

The possible values of Y there could be anything from 0 on up; it is unbounded.1795

That is the probability function for the Poisson distribution.1803

If you do not remember that, it might look like I just completely pulled that in from left field.1807

Maybe, what you want to do is re-watch the video about the Poisson distribution which can be found in the same set of lectures.1813

Just scroll up, you will see a whole video on the Poisson distribution.1821

In particular, you will see this formula in there, you will see where it comes from.1825

Now, I want to find the moment generating function for the Poisson distribution, M sub Y of T.1829

By definition, this is the definition I gave you earlier in this lecture.1838

I highlighted it, you really would not miss it.1843

It is the expected value of e^(TY).1845

How will I calculate the expected value?1852

For a discrete distribution, you take the sum over all possible values of Y,1855

of the probability of each of those values times the function that you are calculating, e^(TY).1862

If this were a continuous distribution, it would be almost the same, except, instead of the sum,1868

we would have an integral.1874

Also, instead of the P we have an F.1876

But it would still be the same basic format, just you might want to get comfortable switching back and forth1879

between sums and integrals in your mind, because they really play the same role.1886

One for discrete distributions and one is for continuous distributions.1890

I'm going to plug in what P of Y is; it is the sum on Y,1897

I guess from Y equals 0 to infinity; that is coming from this range on Y here.1902

P of Y is λ^Y/Y! × e^(−λ).1907

I also have this term of e^(TY); what can I do with that?1914

One thing I notice is that e^(−λ) is not really doing anything.1918

Because it does not have a Y in it, it is a constant; I can pull that outside.1924

That gives e^(−λ) times the sum from Y = 0 to infinity.1928

λ^Y and e^(TY): I can combine those.1935

e^(TY) is the same as (e^T)^Y.1940

This is (λe^T)^Y/Y!.1946

I do not need to write the e^(−λ) because I wrote it outside, and that accounts for all the terms here.1954

Again, I'm going to pause and let you stare at this for a moment or two,1962

and have an epiphany, because there really is a revelation to be made with this formula.1967

Do you see the revelation in this formula? Just stare at it; there is something really good.1975

As a hint, I will remind you of the old Taylor series for E ⁺X.1982

The Taylor series for e^X is the sum from N = 0 to infinity of X^N/N!.1987

Look at this: we have got the same formula here, except that in place of N, we have got Y.1997

In place of X, we have got λe^T.2004

What we really have here (of course, we still have the e^(−λ)) is e^(λe^T); very nice and simple.2009

By the way, notice now, that we have gotten rid of the Y.2022

We got it down to a function of T, that is very convenient because that is2025

what a moment generating function is supposed to be.2031

It is supposed to be a function T not of Y.2033

That is essentially the answer right now; I will do a little algebra to simplify it, but we have done the hard part.2038

I can combine these: e^(λe^T − λ).2044

If I factor the λ out of the exponent, that is e^(λ(e^T − 1)).2050

That is the moment generating function for the Poisson distribution.2056

We are done with that problem.2061

To recap the steps there, in case anybody is a little confused.2070

Poisson distribution is one we studied earlier, there is another video lecture on the Poisson distribution.2074

Just scroll up and you will see it.2079

In particular, you will see the probability function for the Poisson distribution.2081

There it is right there: λ^Y/Y! × e^(−λ).2085

λ is the parameter that comes with the Poisson distribution, that you fix ahead of time; it is a constant.2090

There is the range of Y, 0 to infinity.2098

To find the moment generating function, we take the expected value of e^(TY), which means we sum over Y2101

the probability of Y times e^(TY).2109

And then, I just dropped the probability function in there.2112

There is the probability function; I sum over the range of Y that we are interested in, which came from right here.2115

This e^(TY), I discovered that I can write as (e^T)^Y.2126

I can combine it with λ^Y.2132

I factored out e^(−λ); I can factor that out because there is no Y in there, it is a constant.2135

What I realized here is that this exactly matches my Taylor series formula for e^X.2141

What I get here is e^(λe^T).2148

And then, I did a little algebra to clean that up into e^(λ(e^T − 1)).2152

Hang onto this moment generating function, we are going to use it again in the next example.2160

We are going to find the mean and the variance of the Poisson distribution,2165

using the moment generating function.2169

Make sure you understand this, and when you are pretty confident with it, go ahead and work on example 4.2173

You will see how we use this moment generating function to find the mean and the variance.2181
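The closed form e^(λ(e^T − 1)) can also be checked against a truncated version of the defining sum. This is an editor's sketch; λ = 2.3 is an arbitrary illustration value, not from the lecture.

```python
from math import exp, factorial

lam = 2.3  # arbitrary illustration value for the Poisson parameter

def mgf_direct(t, terms=100):
    # E[e^(tY)] = sum over y of (lam^y / y!) e^(-lam) e^(ty), truncated;
    # 100 terms is far more than enough for the tail to be negligible here.
    return sum(lam**y / factorial(y) * exp(-lam) * exp(t * y)
               for y in range(terms))

def mgf_closed(t):
    # The closed form derived above: exp(lam * (e^t - 1))
    return exp(lam * (exp(t) - 1))

# The truncated sum and the closed form agree at every t we try.
for t in (-1.0, 0.0, 0.5, 1.5):
    assert abs(mgf_direct(t) - mgf_closed(t)) < 1e-9 * mgf_closed(t)
```

The agreement is exactly the Taylor-series collapse from example 3, carried out numerically instead of symbolically.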

In example 4, we are going to use the moment generating function for the Poisson distribution,2188

to find the mean and the variance of the distribution.2194

We just calculated in example 3 the moment generating function: M sub Y of T is e^(λ(e^T − 1)).2198

That was the moment generating function.2211

If you do not remember how we did that, it means you did not just watch example 3.2213

Go back and watch example 3; that should make sense.2217

There was a fact that I gave you earlier in this lecture, which is that E of Y always comes from the moment generating function.2222

You take its derivative and then you plug in 0.2234

We will use that to find the mean; E of Y² is the second derivative of the moment generating function.2237

You plug in 0, that is not the variance directly but you can use that very quickly to find the variance.2247

We are going to take the second derivative of this moment generating function.2253

It is going to get a little messy but it is not too bad, especially after we plug in 0, it is really not bad.2259

M′(t) is equal to, we have an exponential function, so it is just e to all that same stuff,2265

e^(λ(e^t − 1)), ×, chain rule coming in here, the derivative of all that stuff in the exponent.2274

That exponent is λ × e^t − λ × 1.2282

λ × 1 is a constant, its derivative just goes away, so the chain-rule factor is λ × e^t.2288

That is it, let me go ahead and take the second derivative while I'm at it.2292

M″(t), this is going to be nasty.2298

We are going to have to use the product rule for it.2309

It is not that bad, it is just kind of basic calculus 1 stuff.2312

Let me factor out the λ because that is a constant, I factor that right now.2316

The first × the derivative of the second.2320

The first function is e^(λ(e^t − 1)), and the second one is e^t.2322

I'm ignoring this λ now because I have pulled that to the outside.2334

That was the first × the derivative of the second.2342

The derivative of e^t is e^t.2344

The second function × the derivative of the first one is a little messier.2347

The second function is e^t; the derivative of the first one is e^(λ(e^t − 1)) × the derivative of its exponent, which by the chain rule is λ × e^t.2351

All of that multiplied by a λ.2365

I could have simplified that, but I do not think it is worth doing.2367

Instead, what I'm going to do is plug in 0 to each of these functions.2373

Let me go back above: M′(0) is e^(λ(e⁰ − 1)) × λe⁰, and e⁰ is 1.2377

So 1 − 1 is 0, and it is e^(λ × 0) × λ × e⁰.2392

Since e^(λ × 0) and e⁰ are both 1, that is just λ.2401

M″(0): go through here and plug in 0 everywhere I see a t.2406

That gives λ ×, and each e^(λ(e^t − 1)) becomes e^(λ(e⁰ − 1)); e⁰ is 1,2414

so e^(λ × 0) is 1. The first term is 1 × 1, plus the second term, e⁰ × e^(λ × 0) × λ × e⁰, where every e⁰ is 1.2422

Let us simplify this down.2440

This is λ × (1 + 1 × λ), that is λ × (1 + λ).2441

This simplifies down to λ + λ².2451

How are we to use all this information?2456

Remember, the expected value of Y is M′ evaluated at 0, that is M′(0).2459

The expected value of Y is M′(0), which we figured out was λ.2466

That is λ right there, and that is the mean.2480

We figured out the mean of our distribution is λ, very nice to know.2484

To find the variance, it is a little more complicated.2491

σ² is not just M″(0); it is the expected value of Y² minus the square of the expected value of Y: σ² = E(Y²) − (E(Y))².2497

M″(0) is E(Y²), which is λ + λ²; minus (E(Y))², and the E(Y) we figured out was λ, so that is λ².2506

This is very nice, the λ² terms cancel.2528

For the variance, we also get λ, how convenient.2531

What we have done is we have calculated the mean and variance of the Poisson distribution,2537

based solely on the moment generating function.2543

Once you understand the moment generating function, you can find the mean and variance of the distribution.2545

Let me show you the steps there, again2553

We calculated, first of all the moment generating function, that came from example 3.2556

The work here was all done in example 3, and there was some work to be done there.2562

And then, we took its derivative; there was no product rule in that, but there was a chain rule.2567

We took its second derivative, and there was a big product rule, with lots of little chain rules coming in.2574

It got a little messy, but when we plugged in 0, all the e⁰ factors turned into 1, and that simplified things a lot.2581

M′(0) turned into λ; in M″(0), all the e⁰ factors turned into 1.2591

It simplified down to λ + λ².2600

Here is how we use those. Remember, I told you on the second slide of this lecture that M′(0) is E(Y).2603

M′(0) gives you E(Y), which right away is the mean of the distribution; that λ is coming from there.2613

M″(0) is E(Y²), which is not the variance yet, but it factors into calculating the variance,2625

because the variance is E(Y²) − (E(Y))².2634

That E(Y²) is where we got the λ + λ².2640

The E of Y also came from up here.2645

We plugged that λ in there and got λ², which canceled off the λ² from E(Y²).2654

It just reduced down to the variance of the Poisson distribution is λ.2660

Of course, those answers agree with what I told you several lectures ago, when we talked about the Poisson distribution.2667

It is really reassuring to have those agree with what we had previously suspected there.2674
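As one more cross-check of my own (a sketch, not part of the lecture), you can approximate M′(0) and M″(0) with finite differences and recover the same mean and variance; λ = 3 is an arbitrary example value, and the function name is mine.

```python
import math

lam = 3.0  # example Poisson parameter (an arbitrary choice for illustration)

def mgf(t):
    # Poisson MGF from example 3: M(t) = e^(λ(e^t − 1))
    return math.exp(lam * (math.exp(t) - 1.0))

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)                # central difference ≈ M′(0) = E(Y)
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h ** 2  # central difference ≈ M″(0) = E(Y²)
variance = m2 - m1 ** 2                          # σ² = E(Y²) − (E(Y))²

print(m1, variance)  # both should come out close to λ = 3
```

The numeric derivatives land on λ for the mean and λ for the variance, matching the hand calculation above.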

In example 5, we are going to find the moment generating function for the uniform distribution.2689

This is kind of nice because the other examples were both discrete distributions.2693

This is the only continuous distribution we are going to calculate.2699

The others are kind of messy.2703

Even doing this for the uniform distribution, it is a little messier than you might expect,2705

considering that the uniform distribution is so simple.2710

Let me remind you what the uniform distribution is.2714

The density function for the uniform distribution is f(Y), identically equal to,2718

those three lines mean it is constantly equal to, 1/(θ2 − θ1), where Y ranges between θ1 and θ2.2723

It is just the constant distribution, that is why it is called uniform.2733

Let us find the moment generating function.2737

By definition, the moment generating function is (the := means defined to be) the expected value of e^(tY).2739

The way you calculate the expected value of a function: with the discrete distributions we were studying before, it was a sum.2752

For a continuous distribution, it is an integral.2761

The integral, this is also a definition of expected value.2765

It is the integral of the density function f(Y) × whatever function you are trying to find the expected value of,2769

in this case e^(tY), dY.2777

And then, you integrate that over your whole range for Y, which in this case is θ1 to θ2.2780

Now, we just have to do some calculus.2788

This is the integral from θ1 to θ2.2792

f(Y) is 1/(θ2 − θ1), that is just a constant there.2795

e^(tY) dY, not such a bad integral, really not too bad.2802

The answer starts with 1/(θ2 − θ1); that is a constant, I can pull it out.2807

What is the integral of e^(tY)? Remember, our variable here is Y.2813

We are integrating with respect to Y.2818

The integral of e^(tY), if you do a little substitution there, let me go ahead and do it in my head.2824

It is just e^(tY) × 1/t; that is because we are thinking of t as being constant here.2830

Y is the variable of integration, so we just pick up a 1/t.2839

If you take the derivative of that with respect to Y, you get back e^(tY).2844

We want to evaluate that from Y = θ1 to Y = θ2.2848

We get, and I will combine the t with the θ2 − θ1,2857

we are plugging in these values for Y:2865

e^(θ2 × t) − e^(θ1 × t); I need parentheses here.2871

I can write that over a common denominator: e^(θ2·t) − e^(θ1·t),2882

all divided by t × (θ2 − θ1).2894

That is my moment generating function for the uniform distribution.2901

Notice that this is a function of t now; there is no Y anywhere.2906

That is what is supposed to happen with a moment generating function.2913

It should always be a function of T, it should not have any Y anywhere in there.2917

This is my complete answer here and I'm done with that example, except for a quick recap of the steps there.2923

Just to remind you, we have a whole lecture on the uniform distribution.2935

If you do not remember the basic premise of the uniform distribution, you can go back and do a quick review there.2939

The density function is 1/(θ2 − θ1).2946

In particular, it is constant; that is why I have three lines here, to show it is identically equal to that.2950

The range goes from θ1 to θ2.2955

The moment generating function, by definition, we learned that in this lecture, is the expected value of e^(tY).2958

The expected value of any function is the integral of the density function × that function.2967

If this were discrete, we would have a sigma-sign summation instead of an integral,2974

and a probability function p instead of a density function f.2981

It is really the same idea, when you look at these formulas, if you kind of blur your eyes a little bit,2985

you should see how they are really the same idea.2990

Integrals are like adding things up, and the probability function is kind of the analogue of the density function.2993

Instead of the summation of p(Y), we have the integral of f(Y), and then we still have the e^(tY).3001

f(Y) is just 1/(θ2 − θ1); that comes from up above.3007

We will pull that out, since it is a constant.3014

Now, we have to integrate e^(tY); I did a u substitution.3016

My u was tY, my du was t dY, so dY was (1/t) du.3021

That is where I got that 1/t on the outside there.3033

It is the reverse of the chain rule, via a substitution.3037

We still have e^(tY); that is because we are integrating with respect to Y, not with respect to t.3042

The range on Y goes from θ1 to θ2; I plugged those in, and I still had the 1/(t × (θ2 − θ1)).3046

It is still quite complicated considering that it is a uniform distribution,3057

you might expect something simpler for the uniform distribution.3062

But you end up with this function of T that does represent the moment generating function for the uniform distribution.3067

I’m not going to take this one any farther, but if you want to, you could use this3074

to find the mean and the variance of the uniform distribution.3078

The same that we did in example 4, with the Poisson distribution.3084

You can calculate those out, it gets a little messy so I'm not going to do it here.3087
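If you want to double-check the uniform MGF formula numerically, here is a short sketch of my own (not part of the lecture): a midpoint-rule integration of e^(tY)/(θ2 − θ1) should agree with the closed form. The function names and the sample parameters are just illustrative.

```python
import math

def uniform_mgf_closed(t, th1, th2):
    # From example 5: M(t) = (e^(θ2·t) − e^(θ1·t)) / (t·(θ2 − θ1)), valid for t ≠ 0
    return (math.exp(th2 * t) - math.exp(th1 * t)) / (t * (th2 - th1))

def uniform_mgf_numeric(t, th1, th2, n=20000):
    # Midpoint-rule approximation of the integral from θ1 to θ2
    # of e^(ty) × 1/(θ2 − θ1) dy, the definition of E[e^(tY)]
    width = (th2 - th1) / n
    total = 0.0
    for i in range(n):
        y = th1 + (i + 0.5) * width
        total += math.exp(t * y)
    return total * width / (th2 - th1)

print(uniform_mgf_numeric(0.7, 1.0, 4.0))  # should match uniform_mgf_closed(0.7, 1.0, 4.0)
```

The agreement confirms the antiderivative e^(tY)/t and the evaluation at the endpoints θ1 and θ2.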

Instead, I'm going to wrap up this lecture here on moment generating functions.3092

This is part of the probability lecture series here on www.educator.com.3097

Next up, we are going to talk about bivariate distributions; we will have a Y1 and a Y2.3102

That is another whole chapter of excitement, I hope you will stick around for that.3107

You are watching probability lectures on www.educator.com, my name is Will Murray, thank you very much for joining me, bye.3112
