William Murray

Poisson Distribution

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48

Lecture Comments (7)

2 answers

Last reply by: Dr. William Murray
Sat Feb 27, 2016 10:30 AM

Post by YILEI GE on February 24, 2016

Hi, am I wrong if I use Markov's inequality for Example 5?

1 answer

Last reply by: Dr. William Murray
Fri May 30, 2014 4:04 PM

Post by Carl Scaglione on May 29, 2014

Dr. Murray,

On this slide, referring to the last entered equation, I see the following:

e^(-lambda) * lambda * f^prime (lambda) = ... .

In your entry, lambda is missing which was multiplied through from the previous equation, but it requires an explanation.  I do not see that it influences the outcome, i.e.  Expected value = lambda, but its absence is notable.  

1 answer

Last reply by: Dr. William Murray
Mon Mar 31, 2014 10:57 PM

Post by Burhan Akram on March 27, 2014

In the last example, you could have easily solved it with the quadratic formula and you would get around 3 :)

Poisson Distribution

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Poisson Distribution 0:18
    • Poisson Distribution: Definition
  • Formula for the Poisson Distribution 2:16
    • Fixed Parameter
    • Formula for the Poisson Distribution
  • Key Properties of the Poisson Distribution 5:30
    • Mean
    • Variance
    • Standard Deviation
  • Example I: Forest Fires 6:41
  • Example II: Call Center, Part A 15:56
  • Example II: Call Center, Part B 20:50
  • Example III: Confirming that the Mean of the Poisson Distribution is λ 26:53
  • Example IV: Find E (Y²) for the Poisson Distribution 35:24
  • Example V: Earthquakes, Part A 37:57
  • Example V: Earthquakes, Part B 44:02

Transcription: Poisson Distribution

Hi, these are the probability lectures here on www.educator.com, my name is Will Murray.0000

We have been working through the discrete distributions.0004

We are going to talk about the last distribution, which is the Poisson distribution.0008

After that, we will get into continuous distributions in some future lectures.0013

Let us jump right into Poisson.0017

The Poisson distribution describes events that occur randomly and independently.0021

The way we think about this is, every once in a while something happens and0027

there is sort of no connection between different instances of this event.0032

The typical example of the Poisson distribution is, if you are working in a call center.0037

Maybe you are working tech support for Apple, or something like that.0043

You are just waiting for the phone to ring, and every so often0046

you get a phone call from someone in the world asking you for help in some problem.0050

There is really no connection between the number of phone calls that you get from 1 hour to the next.0056

It might be that this hour you get 2 phone calls, the next hour you get 5 phone calls,0062

and the hour after that, you get no phone calls at all.0068

The random variable that we are keeping track of, is the number of calls that you get in an hour.0071

There are a lot of different physical phenomena that can be modeled by the Poisson distribution.0079

Other typical examples would be, if you are sitting by the side of a country road and0085

just keeping track of the number of cars that go by per unit of time.0090

Maybe in 1 hour, there are 3 cars that go by.0095

In the next hour, there are 7 cars that go by.0098

In the next hour, there are 4 cars that go by.0101

There is no connection between one car and the next.0104

You are just keeping track of how many cars there are in any given unit of time.0109

Another example would be the number of earthquakes that strike a particular region per unit of time.0115

All these things are modeled by the Poisson distribution.0121

It counts the number of occurrences of a random event,0124

when there is no connection between one instance of the event and the next instance.0128

Let us go ahead and learn some formulas for the Poisson distribution.0133

There is one parameter that you have to keep track of, and it is traditional to use λ for this parameter.0137

That is the Greek letter λ right there.0143

λ is the average number of occurrences of the event per unit of time; it does not have to be a whole number.0146

For example, if you are working in a call center and after many days, you kept track of the average number of calls.0152

On average, maybe you get 5 calls per hour; that average of 5 calls per hour would be λ.0159

It does not mean you are going to get 5 calls every hour.0162

Some hours you might get no calls at all.0165

Some hours you might get 15 calls, but that is the long-term average.0166

The only parameter that goes into the Poisson distribution is that value λ.0172

Like I said, it does not have to be a whole number.0176

The probability distribution is P(Y) = λ^Y × E^(-λ)/Y!.0179

That is for whole number values of Y.0189

It could be as little as 0, because if you are working in your call center and you are waiting for calls to come in,0193

it could be that you go a whole hour and get no calls at all.0202

You could get one call, you could get 2 calls, you could get hundreds of thousands of calls,0205

if you are particularly unlucky that hour.0211

The range for Y is any whole number between 0 and infinity.0213

This formula for the Poisson distribution is a little difficult to memorize.0222

It is something you do need to memorize, if you are going to take probability.0226

I often remember it based on the Taylor series.0230

I remember the Taylor series for E ⁺λ.0234

You probably learned this using X when you first learned about Taylor series, but I'm going to use λ.0238

Let me remind you of the Taylor series for E ⁺X in the form that you probably know.0247

E ⁺X is the sum from N equals 0 to infinity of X ⁺N/N!.0252

We tend to use slightly different variables in probability, instead of N we use Y.0261

The X here is taking the place of λ, it is becoming λ.0268

If we change that according to those variables, then E ⁺λ is the sum from Y equals 0 to infinity.0274

X^N becomes λ^Y and N! becomes Y!.0283

If you look at that old formula for Taylor series, that is exactly the formula that we have for the Poisson probability distribution.0287

That is how you can remember that part, λ ⁺Y/Y!.0303

If you can remember the Taylor series for E ⁺λ.0307

That E ⁻λ, you want to think about that as a constant being multiplied on.0311

That E ⁻λ, λ is a fixed parameter, that is just a constant, that is independent of Y.0316
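
As a quick side check (not from the lecture itself), here is a minimal Python sketch of this probability function; the helper name poisson_pmf is just an illustrative choice. Summing it over many values of Y shows the Taylor series fact that the probabilities add up to 1.

    from math import exp, factorial

    def poisson_pmf(y, lam):
        # P(Y = y) = lam**y * e**(-lam) / y!, for y = 0, 1, 2, ...
        return lam**y * exp(-lam) / factorial(y)

    lam = 5.0
    # By the Taylor series for e**lam, these probabilities sum to e**(-lam) * e**lam = 1.
    print(sum(poisson_pmf(y, lam) for y in range(200)))   # ~1.0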

Just like every distribution, we have to figure out its mean, variance, and its standard deviation.0326

The key properties of the Poisson distribution here.0335

The mean is always the same as the expected value, those are synonymous.0338

The expected value and mean, those are always the same.0342

For the Poisson distribution, it is just λ by itself.0346

Of course, that should not be surprising, that should not be difficult to remember,0350

because we set this up knowing ahead of time that you average λ calls per hour.0355

The fact that the mean comes up to be λ is not all surprising, and in fact, it should look like that.0362

The slightly more surprising result is that, the variance also comes out to be λ.0368

Σ² here, the V of Y is also λ.0374

That is kind of nice to remember, that was not obvious.0378

If you do the calculus to calculate the variance, that it does come out to be λ.0382

Remember, the standard deviation is always the square root of the variance.0388

In this case, since, the variance is λ, the standard deviation is just the square root of λ.0393
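
Again as a side check rather than part of the lecture, a few lines of Python (the names are illustrative) confirm these three properties numerically for one sample value of λ.

    from math import exp, factorial

    def pmf(y, lam=5.0):
        # Poisson probability lam**y * e**(-lam) / y!
        return lam**y * exp(-lam) / factorial(y)

    mean = sum(y * pmf(y) for y in range(200))                    # ~5.0, i.e. lam
    variance = sum(y**2 * pmf(y) for y in range(200)) - mean**2   # ~5.0, i.e. lam again
    print(mean, variance, variance**0.5)                          # standard deviation ~ sqrt(5)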

Let us go ahead and talk about some examples of the Poisson distribution.0401

Our first example, we have California averaging 6 major forest fires per year.0407

Somebody has done a study, maybe over the last 50 years,0412

and figured out that on average California has 6 major forest fires per year.0415

They were interested in the chance that there will be exactly 4 fires in this coming year.0420

Also, what is the chance that there will be at least 4 fires?0427

That is a typical Poisson distribution problem because forest fires,0432

just every once in a while kind of essentially randomly, a forest fire happens.0436

There is not a lot of connection between one forest fire and the next.0442

Let me remind you of the basic formula for the Poisson probability distribution.0447

That was P(Y) = E^(-λ) × λ^Y/Y!.0452

Remember, the λ is the parameter that represents the average number of occurrences per unit time.0462

In this case, we have been told that California averages 6 forest fires per year.0469

That is the value of λ that we are going to use, λ = 6.0474

The first question that we have to answer is, what the chance is that there will be exactly 4 fires this year?0478

That means, Y is equal to 4, we want to find the probability of 4.0485

That is E^(-λ), which is E^(-6).0490

λ is 6, λ^Y is 6⁴, and Y! is 4!.0494

This is a fairly amenable fraction to simplification.0503

Let me go ahead and simplify that a bit.0508

E⁻⁶, I will just put that in the denominator is E⁶.0510

4! Is 24, 6⁴ is 6 × 6 × 6 × 6.0515

I can simplify that a bit, I can cancel one of these 6 with the 24 and get 4 here.0522

4 is 2 powers of 2, I can cancel two of the 6 down to 3.0529

It is just taking some 2’s out of there.0533

I'm left with, in the denominator E⁶.0536

In the numerator, 3 × 3 × 6 is 9 × 6 = 54.0540

That is my probability that there will be exactly 4 fires in the next year.0549

I could also calculate a decimal approximation of that and I did that with my calculator.0558

I just did 54 divided by E⁶ and it came out to be about 13.4%.0564

If somebody wants to know, what is the chance of getting exactly 4 forest fires next year,0571

it will come out to be exactly 13.4%, or approximately 13.4%.0577
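
A quick numerical double-check of that arithmetic (not from the lecture itself):

    from math import exp, factorial

    p_exactly_4 = 6**4 * exp(-6) / factorial(4)   # lam = 6, Y = 4
    print(p_exactly_4, 54 / exp(6))               # both ~0.1339, about 13.4%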

The second part of this question is, what is the chance that there will be at least 4 fires next year?0584

That is the probability that Y is greater than or equal to 4.0592

I think the easier way to think about that is to reverse the question and say,0598

what is the probability that there will be 3 or fewer fires?0603

Because we can just add up from 0 to 3, and then subtract that from 1.0607

This is 1 minus the probabilities of there being fewer fires; there could be no fires at all.0611

So, P of 0, and let me put this in parentheses.0618

P of 0 + P of 1 + P of 2 + P of 3, the probability that there will be anywhere from 0 through 3 fires, next year.0623

That is the opposite of what we are calculating.0635

We can use that to find the probability that there will be 4 or more fires.0638

We are going to use this formula, this generic formula for the Poisson distribution.0643

Notice that there is E^(-λ) and that is independent of Y.0648

I’m going to factor that out.0653

This is 1 – E⁻⁶.0655

The other terms are just λ ⁺Y/Y!.0663

In this case, for Y = 0, 6⁰/0! + 6¹/1! + 6²/2! + 6³/3!.0669

And that we can simplify quite a bit: 6⁰/0! is 1, so we get 1 + 6 + 6²/2 + 6³/3!, and 3! is 6.0688

By the way, what you should notice here is that, we are essentially finding the first few terms of the Taylor series for E⁶.0705

This is really the Taylor series for E⁶, 1 + 6 + 6²/2! + 6³/3!.0715

If you recognize that, that can really help you check your work when you do a Poisson distribution problem,0733

because the Poisson distribution is very closely connected to the Taylor series for E ⁺X.0738

Let me simplify this; the fractions work out fairly nicely here.0747

I will put the E⁻⁶ as the denominator.0753

I think it will work a little more nicely there, E⁶ in the denominator, that is 7 +, 6² is 36 divided by 2 is 18.0756

6³ divided by 6, we would have a cancellation of a 6 from both top and bottom there.0767

Instead of needing to calculate 6³, it is just 6² after the cancellation.0774

6² is 36.0779

What we get there is 1 minus: 7 + 18 is 25, plus 36 is 61.0783

1 - 61/E⁶ is our exact probability there.0792

I did calculate that into a decimal.0798

My decimal approximation, when I put that out in my calculator was 84.9%.0802

84.9% is approximately the probability that there will be at least 4 forest fires in California next year.0810

It is pretty high there, I do expect to see at least 4 forest fires in California, probably more.0820

That is not surprising, since the average number of fires per year is 6.0828

Let me recap where everything came from here.0832

It came from this master Poisson probability distribution formula P(Y) = E^(-λ) × λ^Y/Y!.0835

The λ here is the average number of occurrences, which in this case was 6, 6 fires per year, that is where we got the λ.0846

We plug that 6 in here for λ everywhere, and then it is a question of figuring out what values of Y you are interested in.0856

First, we are figuring out exactly 4 fires.0864

That is why I plugged in Y is equal to 4.0867

I plugged that into the formula, simplified it down, and I got the probability to be about 13.4%.0871

At least 4 fires means the probability that Y will be greater than or equal to 4, that is hard to calculate directly.0879

Instead, I flipped it around and calculated the probability that Y will be less than or equal to 3.0887

That is what I'm doing here, P of 0, P of 1, P of 2, P of 3,0893

that is the probability that you will have no fires, or 1 fire, or 2 fires, or 3 fires.0897

And then, I subtracted that from 1 to get the probability of being greater than or equal to 4.0903

For each one of these, I used this Poisson distribution formula again.0908

Since there was E^(-λ) in all of these and that is a constant, I just factored that out.0915

That is what this E⁻⁶ is doing out here.0920

And then I went through the other values of Y and drop them in, 0, 1, 2, and 3.0923

It looks like I forgot the point on my factorial there.0929

I drop those values in and what I noticed was that this turns into the Taylor series for E⁶.0935

That helps me check my work a little bit.0943

It is not really how we calculate it, but it helps me check my work.0945

And then, I just simplify the fractions and reduced it down to a percent, 84.9%.0948
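
The complement calculation can be double-checked the same way (again, a side check, not part of the lecture):

    from math import exp, factorial

    lam = 6
    p_at_most_3 = sum(lam**y * exp(-lam) / factorial(y) for y in range(4))   # P(0) + P(1) + P(2) + P(3)
    print(1 - p_at_most_3, 1 - 61 / exp(6))       # both ~0.8488, about 84.9%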

On our second example, we have a call center receiving 2 calls per minute, on average.0957

First, we are going to use Markov’s inequality to estimate the chance that fewer than 5 calls will come in, in the next minute.0965

It looks a little strange, let me write that down because the word in, occurs twice.0975

That is saying 5 calls will come in, in the next minute.0978

In the second part, we are going to find the exact chance that fewer than 5 calls will come in, in the next minute.0983

Let me remind you what Markov’s inequality was.0991

That was something that we learned several lessons ago; there is a video here on Markov’s inequality.0994

If you do not remember that, just scroll up and you will see a whole video on Markov's inequality, that can help you out there.1000

I will give you the short version of that, right now.1006

It says that, let me go ahead and label this.1011

This is for Markov’s inequality, says that the probability that Y is bigger than or equal1015

to some constant A is less than or equal to the expected value of Y divided by that value of A.1025

In this case, we are interested in the probability that fewer than 5 calls will come in.1035

That means A is equal to 5, the probability that Y is greater than or equal to 5.1041

The problem actually asked for fewer than 5, we will turn this around in a moment.1046

I want to calculate this up.1051

E of Y, that is the average value, the expected value, the mean.1053

In this case, that is μ is equal to 2 because we have been told that this call center averages 2 calls per minute.1058

This is less than or equal to 2 divided by, in this case, 5.1068

We found that the probability that Y is greater than or equal to 5 is at most 2/5.1076

If we flip that around, the probability that Y is less than 5 must be at least 1 - 2/5, which is 3/5.1085

If you convert that to a percentage, that is 60%.1100

Our answer here is, the probability that Y is less than 5 is greater than 60%.1104

That is what Markov’s inequality told us.1115

I'm going to go ahead and answer the second part of this problem on the next slide.1118

Let me recap what we did with this part of the problem.1123

First of all, I did not use anything to do with the Poisson distribution here.1126

There is Poisson in this first part of the problem, I was just using Markov’s inequality.1131

If you want a refresher course in Markov’s inequality, we got a whole video on Markov’s inequality in this series.1136

Just go back and check out that video Markov’s inequality and you can get plenty of practice with that.1143

The short version of that is this formula.1148

The probability that Y is bigger than any constant A is less than or equal to the expected value of Y divided by A.1151

In this case, our A is this 5 right here.1161

The expected value of Y is the average number of calls per minute.1165

That is E of Y right there, we are given that in the problem.1169

That is where we get the value of 2 and the value of 5, in my answer here.1174

But what Markov’s inequality tells us is, it gives us a bound for the probability of Y being greater than 5.1180

In order to find the probability of Y being less than 5, you have to reverse that and reverse the inequality.1188

Instead of 2/5, we get 1 - 2/5.1198

Instead of less than or equal to, it turns into greater than or equal to.1200

1 – 2/5 simplifies down to 3/5 or 60%.1205

Our answer is really the probability that Y is less than 5 is greater than 60%.1209

It is at least 60%.1216

If I'm making my plans for my call center and maybe I only have 5 lines on my telephone,1219

and I’m worried that maybe I will get more than 5 calls per minute.1226

I can say for sure that the probability of Y being less than 5 is at least 60%.1231

I got a 60% chance that I would not over load the 5 lines on my telephone.1236

There is no Poisson distribution coming in yet, that is coming in the answer to part B.1241

I will go ahead and jump to a new slide to do that.1246
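
Spelled out as a tiny calculation (a side note, not from the lecture), the Markov bound is:

    expected_calls = 2                    # E(Y), the average number of calls per minute
    a = 5
    markov_bound = expected_calls / a     # Markov: P(Y >= 5) <= 2/5
    print(1 - markov_bound)               # so P(Y < 5) >= 0.6, at least 60%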

In example 2, we have already found the probability of Y being less than 5 using Markov,1251

but we also want to calculate exactly the probability of Y being less than 5.1259

Now, we are going to use the Poisson distribution formula for that.1263

Let me remind you what the Poisson distribution formula is.1270

P(Y) is equal to E^(-λ), that is the constant part, times λ^Y/Y!.1273

In this case, we want the probability that Y is less than 5.1284

In other words, we are going to get fewer than 5 calls that mean 0 through 4 calls.1287

The probability that Y is less than 5, we can get that by adding up the probabilities of getting 0 calls,1294

1 call, 2 calls, 3 calls, or 4 calls: P of 0 + P of 1 + P of 2 + P of 3 + P of 4.1301

We have been told that our average number of calls per minute is 2.1311

I’m going to fill in λ is equal to 2 everywhere.1315

And that E^(-λ), that is a constant; it does not depend on Y.1319

I will factor that out.1323

This is E⁻².1324

I’m going to plug in the values of Y being 0, 1, 2, 3, and 4.1328

That is 2⁰/0! + 2¹/1! + 2²/2! + 2³/3! + 2⁴/4!.1335

Let me go ahead and make my denominator E².1356

2⁰/0! Is 1, + 2, + I’m going to keep that expanded 2²/2! + 2³/3! + 2⁴/4!.1360

The reason I left that a little bit expanded was to kind of remind you that,1378

the Poisson distribution is related to the Taylor series expansion of E ⁺λ.1385

What we really have here is the beginning of the Taylor series, the Taylor polynomial, for E^λ; in this case, that is E².1391

1 + 2 + 2²/2! + 2³/3!, 2⁴/4!.1405

That helps you check that you have done your arithmetic right,1412

if it starts to look like the Taylor series for E ⁺X or E², in this case.1415

Now, let me start simplifying this: 1 + 2 is 3, plus 2²/2!, that is 4/2, which is 2.1421

2³/3! is 8/6, and for 2⁴/4!, 2⁴ is 16 and 4! is 24, so that is 16/24, all over E²; that gives (5 + 8/6 + 16/24)/E², where 8/6 is 4/3 and 16/24 is 2/3.1428

In 5 + 4/3 + 2/3, the 4/3 + 2/3 is 6/3, that is 2, all over E².1453

This is 7/E² which I did some calculations that I have written down somewhere.1459

7/E² on my calculator simplified down to about 94.7%.1466

That means, it is very likely that you will have fewer than 5 calls.1474

If you are worried about getting more than 5 calls in your call center, maybe you have 5 lines on your telephone,1480

you can rest assured that with 95% chance, you will have fewer than 5 calls.1489

Notice that, with a Markov estimation, we got that the probability was greater than 60%.1495

That was the answer we got, that was on the previous slide.1504

You can scroll back and see that, if you do not remember how we got that.1506

It did not say it is equal to 60%, it said that the probability was greater than 60%.1510

This checks out, because 94.7% certainly is greater than 60%, but it is a much more precise answer.1518

Markov’s inequality just gives you a very rough check, something you can calculate easily and quickly,1525

but it would not be the most accurate estimate.1532

Using the Poisson distribution, we got an exact answer at the cost of having to do a few more calculations.1536

To remind you how we got those calculations, we used this Poisson formula P(Y) = E^(-λ) × λ^Y/Y!.1543

λ was 2 because the average number of calls per minute is 2; that was given in the stem of the problem.1554

I dropped λ = 2 in, everywhere.1560

To find the values of Y, I want all the values of Y less than 5 including 0.1564

I ran through and I plugged all these values of Y into that formula and I factored out the E⁻²1569

because that was constant for all of them, that do not depend on Y.1577

Then, it was just a matter of simplifying down the fractions to 7/E², and that converted to 94.7%.1580

Along the way, I have this check that the expansion I got looked exactly like the Taylor series expansion for E ⁺X, if X is equal to 2.1590

That is my exact probability of getting fewer than 5 calls,1601

of getting anywhere from 0 through 4 calls in our call center in the next minute.1606
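
As a side check (not from the lecture), the exact value and the earlier Markov bound can be compared directly:

    from math import exp, factorial

    lam = 2
    p_fewer_than_5 = sum(lam**y * exp(-lam) / factorial(y) for y in range(5))
    print(p_fewer_than_5, 7 / exp(2))     # both ~0.9473, well above the Markov bound of 0.6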

Example 3, we are going to use the definition of expected value to confirm that the mean of the Poisson distribution is λ.1615

I told you at the beginning of this particular video that the mean is λ.1624

That is not in doubt, if you trust me there.1630

In example 3, we are actually going to figure that out from scratch using the definition of expected value.1632

Let me remind you what that definition was.1639

The expected value of a discrete random variable; the Poisson distribution is discrete, it is not continuous.1641

It is the sum over all possible values of Y of Y × the probability of that particular value of Y.1650

In this case, the values that a Poisson variable can take are 0 to infinity.1660

In this case, we have the sum from Y equal 0 to infinity of Y.1669

The Poisson distribution formula is E^(-λ) × λ^Y/Y!.1676

Let me go ahead and factor out the E^(-λ) there because that is constant; it does not depend on Y.1686

We get E^(-λ) × the sum from Y = 0 to infinity of Y × λ^Y/Y!.1693

Now, that looks a lot like the Taylor series for E ⁺X.1704

Let me remind you what the Taylor series for E ⁺X is.1709

E ⁺X is the sum from N equals 0 to infinity of X ⁺N/N!.1712

What we have here is something very like that.1720

The only difference here, it is different variables but that is no big deal.1723

The difference here is that, there is this extra multiple of Y on the outside.1727

We have to figure out a good way to handle that.1733

Let me show you how I can think about this.1735

I'm going to define F of λ to be E ⁺λ.1737

I'm going to expand that into a Taylor series because basically,1745

the reason I'm doing that is because I see here that there is a λ ⁺Y.1751

The λ is taking the place of the X, in our original Taylor series.1755

I will expand that out into a Taylor series; E^λ is the sum, and I will put it in terms of Y.1759

Y = 0 to infinity of λ ⁺Y/Y!.1767

That looks a bit like what we are looking for but it does not have this extra factor of Y.1776

Let me show you how we can get that.1781

What we will do is take the derivative with respect to λ.1783

We will get F prime of λ.1792

The derivative of E ⁺X is just E ⁺X.1797

The derivative of E ⁺λ is just E ⁺λ.1800

It looks like I did not do anything there but I really did d by d λ.1803

I just did not change the function.1807

Now, I have to take the derivative of this series, and this is where you have to remember

that we are taking the derivative with respect to λ.

The Y! is just a constant, but for the derivative of λ^Y we can use the power rule to get Y × λ^(Y-1).

That is using the power rule from calculus.

It is just a little strange, because you are probably not used to using λ as your variable.

You are not used to having Y in the exponent instead of n.

It is still the same rules from calculus.

I see that I have got something that is a little bit closer to the series that I'm looking for.

I got the factor of Y, which is very key.

I lost one of my λ's, because I have λ^(Y-1) instead of λ^Y.

I'm going to bump it up by another power of λ.

I will multiply by λ to bump it up.

I get λ × F′(λ), which is λ × e^λ.

I'm going to bring this λ inside the sum from Y = 0 to infinity of Y.

Since I have one more λ now, I get λ^Y instead of λ^(Y-1), with Y! in the denominator.

Now, I essentially have the series that I was looking for.

I got the right series; I just need that factor of e^(-λ) on the outside.

I will multiply by e^(-λ).

e^(-λ) × λ × F′(λ) is equal to e^(-λ) × λ × e^λ, which is equal to e^(-λ) × the sum from Y = 0 to infinity of Y × λ^Y/Y!.

This is perfect because, on the right-hand side, I have exactly the expression that I started out with up here.

That is E(Y) right there.

In the middle term, this e^(-λ) and this e^λ cancel each other out.

That is just e^λ divided by e^λ.

Those cancel each other out, and I just get λ there.

That is exactly what I wanted to show: remember, the expected value,

or the mean, of a Poisson variable is λ.

That is what I have done.

You can really trust that; you do not have to take my word from the initial slide.

You can really understand for yourself why the mean comes out to be λ.
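
Since that derivation went by quickly in words, here is the same chain of steps written out compactly, so you can check it against what we just did:

$$
\begin{aligned}
E(Y) &= \sum_{y=0}^{\infty} y\,\frac{e^{-\lambda}\lambda^{y}}{y!}
      \;=\; e^{-\lambda}\sum_{y=0}^{\infty} y\,\frac{\lambda^{y}}{y!},\\
F(\lambda) &= e^{\lambda} \;=\; \sum_{y=0}^{\infty}\frac{\lambda^{y}}{y!},
\qquad
F'(\lambda) \;=\; e^{\lambda} \;=\; \sum_{y=0}^{\infty} y\,\frac{\lambda^{y-1}}{y!},\\
\lambda F'(\lambda) &= \lambda e^{\lambda} \;=\; \sum_{y=0}^{\infty} y\,\frac{\lambda^{y}}{y!},
\qquad\text{so}\qquad
E(Y) \;=\; e^{-\lambda}\cdot\lambda e^{\lambda} \;=\; \lambda.
\end{aligned}
$$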

Let me remind you of the steps we went through there.

I used the definition of expected value here: the sum on Y of Y × P(Y).

That was our definition of the mean, or the expected value, of a discrete random variable.

I expanded that out; for the P(Y), I just remembered the formula for the Poisson distribution.

I plugged that in.

The e^(-λ) was a constant, so I pulled it out here.

I had this kind of complicated series and I did not really know what to do with it.

I went back and looked at my e^x formula.

I was trying to sort of build up this series here.

I built it up step by step; I started out with e^λ.

I expanded that out as a Taylor series.

I took its derivative; the reason I did that was to get that extra factor of Y on the outside,

because I knew that I could produce it using the power rule.

I needed to build that, because I had it inside the original series here.

That worked pretty nicely; it gave me the factor of Y, but then I lost a power of λ from the exponent there.

I multiplied that power of λ back on, and that bumped the λ^(Y-1) back up to λ^Y.

That pretty much matches my series, except for the e^(-λ).

I multiplied by e^(-λ) everywhere; there it is.

That gave me the exact series that I wanted for E(Y).

Over here, the e^(-λ) and the e^λ cancel each other out, and I just simplified down to λ.

We get that the mean of the Poisson random variable is λ.

This is not surprising at all, because we started out by assuming that we knew the average number of calls per hour.

We assumed that the average was λ.

This is not surprising, but it just kind of checks all our assumptions here, and it checks our arithmetic.
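
If you would rather check this numerically than follow the Taylor series argument, here is a small Python sketch; the truncation at 60 terms and the helper name are my own choices.

```python
import math

# Numerically approximate E(Y) = Σ y · P(Y = y) for a Poisson variable with λ = 2.
def poisson_pmf(y, lam):
    return math.exp(-lam) * lam**y / math.factorial(y)

lam = 2
mean = sum(y * poisson_pmf(y, lam) for y in range(60))  # the tail beyond 60 is negligible
print(mean)  # ≈ 2.0, matching the claim that E(Y) = λ
```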

Let us keep going with this in Example 4.

In Example 4, we have been asked to find E(Y²), the expected value of Y², for the Poisson distribution.

Now, this time it does not say we have to calculate that from scratch.

I'm going to go ahead and use some of the information from some earlier slides.

In particular, I know that the expected value of the Poisson distribution is λ.

I also know the variance: V(Y), which is the same as σ²,

those are two different notations for the same thing,

is the expected value of Y² minus the square of the expected value of Y, that is, E(Y²) - [E(Y)]².

That is true for any distribution, not just Poisson.

It is a very useful thing to remember.

For the Poisson distribution, the variance is equal to λ.

We were given that on the third slide of this video;

if you scroll back to the key properties slide, you will see that the variance is λ.

Let me fill in what I know here.

E(Y²), then, is equal to the variance plus [E(Y)]², and the variance is λ.

I already said that the expected value of Y is λ.

So that is λ + λ² there.

That is what we were supposed to find.

We are going to be using this, by the way, in the next example.

I want you to hang onto this and remember it; it will be useful for the next example.

But in the meantime, let me just remind you of the steps there.

I'm really using the basic key properties of the Poisson distribution.

The expected value of Y is λ, and the variance is also λ.

But also remember that, for any distribution, the variance is always equal to the expected value of Y² minus the square of the expected value of Y: V(Y) = E(Y²) - [E(Y)]².

I plugged that in; since the expected value of Y is λ, the [E(Y)]² term gives λ² here.

So I get that the expected value of Y² is λ + λ².
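
Here is a quick numerical spot-check of that result in Python, again with λ = 2 and a truncated sum of my own choosing.

```python
import math

# Check E(Y²) = λ + λ² for a Poisson variable with λ = 2.
def poisson_pmf(y, lam):
    return math.exp(-lam) * lam**y / math.factorial(y)

lam = 2
second_moment = sum(y**2 * poisson_pmf(y, lam) for y in range(60))
print(second_moment)    # ≈ 6.0
print(lam + lam**2)     # λ + λ² = 6, in agreement
```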

Hang onto this; we will be using it in the next example, Example 5.

We have got something cooked up that is going to require us to use the expected value of Y².

In Example 5 here, we are told that California averages 2 major earthquakes per decade.

Y is going to represent the number of major earthquakes in the next decade.

We have got a cost function here; the cost of damages depends on how many earthquakes you have.

In this case, the cost is 2Y² + 5Y + 10.

We want to find the expected cost: how much do we expect, on average, to pay in damages per decade

for earthquakes in California?

We also want to find the probability that the damages will cost more than $40,000,000.

What is the probability that next year we will have to spend more than $40,000,000 on earthquake damages?

I guess not next year, but next decade.

Let us find the expected cost first.

The expected cost here, the expected value of the cost, is the expected value of what we are given.

We are given that the cost is 2Y² + 5Y + 10, so we want E(2Y² + 5Y + 10).

What we are really going to use very heavily here is linearity of expectation.

Expectation is always linear, so I can expand this out into 2 × the expected value of Y².

I cannot bring that square outside, but I can bring the 2 outside, and I can separate out the different terms:

plus 5 × the expected value of Y, plus the expected value of 10.

The expected value of 10 is just 10; there is not much to say there.

Then there is the 2 × the expected value of Y².

I figured out in Example 4 that the expected value of Y² was λ² + λ.

That was in Example 4; if you did not just watch it, then you might want to go back and look at Example 4

and see how we figured that out.

I'm not going to recalculate it now, but you can go back and check the earlier part of the video on Example 4,

if you have not just watched it.

This is 2 × (λ² + λ) + 5 × the expected value of Y.

Since this is a Poisson distribution, that is 5 × λ, plus 10.

This is 2λ² + 2λ + 5λ + 10; the 2λ + 5λ is 7λ, so we get 2λ² + 7λ + 10.

Let me go ahead and work that out with λ = 2.

The reason I'm using λ = 2 is that we have been told that California averages 2 major earthquakes per decade.

So λ = 2 here.

This is 2 × λ², which is 2 × 4, that is 8; plus 7 × 2, which is 14; plus 10.

8 + 14 is 22, plus 10 is 32.

That tells me that my expected cost, the expected amount that I should average

spending on earthquake damages per decade, is, I guess, in millions of dollars here.

This is $32,000,000; that means, in the long term, California can expect

to spend an average of $32,000,000 on earthquake damage each decade.

If you want to do some planning based on how much it would cost to retrofit buildings,

or how much it would cost to draw up different earthquake safety plans,

you want to budget against the expected amount of damages, which is going to be $32,000,000.
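
Here is a compact Python sketch of that expected-cost calculation, using only linearity of expectation and the two Poisson facts from Examples 3 and 4; the variable names are my own.

```python
# Expected cost of earthquake damages per decade, in millions of dollars.
lam = 2                      # California averages 2 major earthquakes per decade
E_Y = lam                    # E(Y) = λ, from Example 3
E_Y2 = lam + lam**2          # E(Y²) = λ + λ², from Example 4

# Cost is C = 2Y² + 5Y + 10, so by linearity E(C) = 2·E(Y²) + 5·E(Y) + 10.
expected_cost = 2 * E_Y2 + 5 * E_Y + 10
print(expected_cost)         # 32, i.e. $32,000,000 per decade
```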

Let me do part B of this problem on the next slide here.

Let us just quickly recap here what the expected cost was:

the expected value of what we were given, the cost function 2Y² + 5Y + 10.

We expanded that out using linearity of expectation.

It is very useful to use linearity of expectation.

We had that the expected value of Y² was λ² + λ.

That came from Example 4; you can go back and watch the video on Example 4 if you do not know where that came from.

The expected value of Y is just λ; we proved that in Example 3.

I also gave you that answer back in one of the first slides of this lecture.

Of course, the expected value of the constant 10 is just 10.

Then I expanded that out, and I got 8 + 14 + 10.

That was using the value of λ = 2 that I got from the stem of the problem, that we are averaging two earthquakes per decade.

That gave me 32 as the answer.

Apparently, our units here are millions of dollars.

We are talking about $32,000,000 for our expected cost for earthquake damage next decade.

Let us go ahead and jump to the next slide.

We will find the probability that the damages will cost more than $40,000,000.

That is what we are going to calculate here in Example 5: the probability that the damages will cost more than $40,000,000.

We already found the expected cost on the previous slide.

We are already done with that part.

In this case, with the cost function here, we want the probability that C is greater than 40.

Let me write that as C being at least 40; since Y is a whole number, the cost never lands exactly on 40, so it comes to the same thing.

I want to convert that into values for Y; let us see how that works out.

The cost was 2Y² + 5Y + 10.

I want to see when that is greater than or equal to 40.

Let me try to simplify this as much as I can, but it is not going to simplify terribly well.

Let me subtract 10 from both sides: 2Y², and I forgot my 5Y there,

so 2Y² + 5Y is greater than or equal to 30.

I want to figure out which values of Y I should be looking at there, and then I can find the probabilities.

Remember that Y is a whole number, because with Poisson random variables you always get whole numbers.

In this case, we are counting the number of earthquakes we might have in the next decade.

We are going to have a whole number of earthquakes.

You cannot have half an earthquake.

In this case, let us figure out which values of Y give us 2Y² + 5Y greater than or equal to 30.

There is not a good way to solve that for whole numbers.

For lack of anything better, I will use trial and error.

Let me just plug in some values of Y.

If Y = 0, we get 2Y² + 5Y = 0, which is not greater than 30.

That certainly does not work there.

Let me try Y = 1; we get 2 + 5. Is that greater than 30?

No, it certainly is not.

Let us try Y = 2: 2Y² is 2 × 4, which is 8, plus 5Y, which is 10.

Is that greater than 30? No, that is not greater than 30, because that is 18.

Let us try Y = 3: 2Y² is 2 × 9, which is 18, plus 5 × Y, which is 15. Is 18 + 15 greater than 30?

Yes, that is greater than 30, because that is 33.

So we need Y = 3 or higher; essentially, if we have 3 or more earthquakes next decade, then our cost goes above $40,000,000.

We want to find the probability of having 3 or more earthquakes next decade.
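
The same trial-and-error search is easy to do by machine; here is a minimal Python sketch (the cost function name is my own):

```python
# Find the smallest whole number of earthquakes Y whose damage cost
# C = 2Y² + 5Y + 10 (in millions of dollars) reaches $40,000,000.
def cost(y):
    return 2 * y**2 + 5 * y + 10

y = 0
while cost(y) < 40:
    y += 1
print(y, cost(y))   # 3 43: with 3 or more earthquakes, the cost exceeds $40,000,000
```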

Let me try to calculate that out.

We want the probability that Y is greater than or equal to 3.

If we have 3 or more earthquakes, we bust the budget.

If we have fewer than 3 earthquakes, then we will stay under budget,

assuming we budgeted $40,000,000 for earthquake repair.

It is hard to calculate P(Y ≥ 3) directly.

What is easier is to calculate the probability of Y being less than 3.

Let me calculate the probability of 0, 1, or 2 earthquakes, and

then I will subtract that from 1 to get the probability of there being 3 or more earthquakes.

I should have written p(2) there; that is a 2.

Now, I have to calculate these probabilities.

Let me remind you of the Poisson distribution formula.

The Poisson distribution formula tells me that P(Y) = e^(-λ) × λ^Y/Y!.

In this case, our λ = 2.

That e^(-λ) is going to be constant for all these terms.

So this is 1 minus e^(-2) times everything else; that is the constant part factored out.

I'm going to fill in the different values of Y, with 2 being λ:

2⁰/0! + 2¹/1! + 2²/2!.

This is 1 minus, with e² as a denominator there,

2⁰/0!, which is 1, plus 2, plus 4/2, which is 2.

I get 1 - 5/e²; that is my exact probability that there will be 3 or more earthquakes,

which means we will bust our budget of $40,000,000 for damages.

I did throw that into a calculator, and I found the decimal.

It worked out to about 32.3%.

If I have budgeted $40,000,000 for earthquake repair in California,

and I'm worried about the prospect of going over budget, I should be worried, because there is a 30% chance,

actually a little more than a 30% chance, that I will go over budget,

that I will have 3 or more earthquakes, which means I will be spending more than $40,000,000 on earthquake repair.
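
Here is the same complement calculation in Python, as a quick check; the names are my own.

```python
import math

# P(Y ≥ 3) = 1 - [P(0) + P(1) + P(2)] for a Poisson variable with λ = 2.
def poisson_pmf(y, lam):
    return math.exp(-lam) * lam**y / math.factorial(y)

lam = 2
p_under = sum(poisson_pmf(y, lam) for y in (0, 1, 2))  # P(Y < 3)
p_over = 1 - p_under                                   # P(Y ≥ 3)
print(p_over)               # ≈ 0.323, about 32.3%
print(1 - 5 / math.e**2)    # the exact value 1 - 5/e², for comparison
```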

Let me recap the steps here.

We are interested in whether C is bigger than or equal to 40.

I plugged in my formula for C here; there is my cost being bigger than or equal to 40.

That simplified down a little bit to 2Y² + 5Y greater than or equal to 30.

I was trying to figure out which values of Y would make that true; that is because Y is a discrete random variable.

We are not looking for a real number for Y, we are looking for whole numbers for Y.

Since there is not an easy way to solve that, I just used trial and error.

I plugged in some different values of Y until I figured out which values of Y would make it true.

I plugged these Y's in here.

The first one that worked was Y = 3.

Of course, all the Y's after that will make it true as well.

So we are trying to calculate the probability that Y is bigger than or equal to 3.

That is not an easy thing to do directly with the Poisson distribution.

Instead, what you want to do is calculate the probability that Y is less than 3,

which means the probability of getting 0, 1, or 2, and then you subtract it from 1.

I used my Poisson distribution formula.

In particular, I noticed that one of these factors is constant.

I factored that out right here; that is the e^(-2).

For the other terms, I was just plugging in the different values of Y:

Y = 0, 1, 2.

All of those gave me pretty simple numbers to simplify down.

That is my exact answer, 1 - 5/e².

I got a decimal approximation of 32.3%; that is the probability that

there will be 3 or more earthquakes in California next decade.

In turn, that tells me the probability that my cost is going to go above $40,000,000 in damages next decade.

That wraps up our lecture here on the Poisson distribution, and that also wraps up our chapter on discrete probability distributions.

The next chapter is on continuous probability distributions.

We will be talking about integrals instead of summations.

That will be a lot of fun, and I hope you will stick around for it.

In the meantime, you are watching the probability lecture series here on www.educator.com.

My name is Will Murray; thank you for joining me, bye.
