  William Murray

Section 1: Probability by Counting
Experiments, Outcomes, Sample Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for a Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48

## Questions & Answers

**Post by Carlos Morales, January 3, 2015:** If you are dealing one card at a time from a shuffled deck, how many cards would it take before an ace comes up (no replacement)? I tried replacing P(y) = 1 in Exercise 1, but my answer makes no sense. *(1 answer; last reply by Dr. William Murray, Sun Jan 4, 2015, 7:30 PM)*

**Post by Ikze Cho, September 3, 2014:** How would one figure out the probability of winning in less than y trials? *(1 answer; last reply by Dr. William Murray, Fri Sep 5, 2014, 12:44 PM)*

### Geometric Distribution

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Geometric Distribution 0:22
• Geometric Distribution: Definition
• Prototypical Example: Flipping a Coin Until We Get a Head
• Geometric Distribution vs. Binomial Distribution.
• Formula for the Geometric Distribution 2:13
• Fixed Parameters
• Random Variable
• Formula for the Geometric Distribution
• Key Properties of the Geometric Distribution 6:47
• Mean
• Variance
• Standard Deviation
• Geometric Series 7:46
• Recall from Calculus II: Sum of Infinite Series
• Application to Geometric Distribution
• Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace 13:02
• Example I: Question & Solution
• Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey 16:32
• Example II: Mean
• Example II: Standard Deviation
• Example III: Rolling a Die 22:09
• Example III: Setting Up
• Example III: Part A
• Example III: Part B
• Example III: Part C
• Example III: Summary
• Example IV: Job Interview 35:16
• Example IV: Setting Up
• Example IV: Part A
• Example IV: Part B
• Example IV: Summary
• Example V: Mean & Standard Deviation of Time to Conduct All the Interviews 41:13
• Example V: Setting Up
• Example V: Mean
• Example V: Variance
• Example V: Standard Deviation
• Example V: Summary

### Transcription: Geometric Distribution

Hi and welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are going to learn about the geometric distribution.0005

It looks a lot like the binomial distribution in the initial setup because it is describing a similar situation.0009

I will try to make it clear how the geometric distribution is actually different from the binomial distribution.0016

The idea of the geometric distribution is that you have a sequence of trials.0023

Each one of these trials can have two outcomes, you think of those as being success or failure.0028

Very typically, you think of this as being a sequence of coin flips, you are flipping a coin,0034

but it can really be anything where there are 2 possible outcomes.0039

For example, each year you want to know if the Yankees are going to win the World Series.0043

Each year, either they win the World Series or they do not win the World Series.0048

There is a yes or no outcome every single time you run the trial.0053

The key point about the geometric distribution is that you continue the trials indefinitely until you get the first success.0057

For example, if you are flipping a coin, you would keep flipping a coin over and over again0069

until you get the first head and then you would stop.0076

Or if you are tracking the Yankees winning the World Series, you wait and wait and wait as many years it takes,0080

until the Yankees win the World Series for the first time and then you stop.0086

The difference between that and the binomial distribution is that, we do not know the number of trials in advance0091

and we stop after we get the first success.0098

Remember, the binomial distribution, we have a fixed number of trials that we decide ahead of time.0100

We say, I'm going to flip this coin 10 times and I will count the number of heads; that was the binomial distribution.0106

This is the geometric distribution when we say, I'm going to flip this coin as long as it takes until I get the first head.0112

That is the difference between those two.0121

You want to be really careful, when you are studying a new situation, about whether you are talking about the geometric distribution or the binomial distribution.0122

I still have not given you any formulas for the geometric distribution.0134

The way it works is you have a fixed parameter which is the probability of success on each trial.0137

If you are flipping a fair coin then P would just be ½.0144

If you are tracking the Yankees winning World Series, and you figure that each year,0147

they have a 10% chance of winning the World Series then P would be 1/10.0152

Q is going to be the probability of failure that is always 1 – P.0157

Q is just dependent on P.0162

You do not really need to know Q ahead of time because as long as you know P, you can work out what Q is.0165

The random variable you are keeping track of is the number of trials that you have to take, in order to get that first success.0170

That is another different aspect between the geometric distribution and binomial distribution.0178

In the binomial distribution, Y was the number of successes that you get in total.0182

In the geometric distribution, Y is the number of times it takes to get the first success.0189

We are ready to actually get the formula for the geometric distribution.0197

The probability that it takes exactly Y trials is equal to Q^(Y - 1) × P.0201

Here, Y can be any number starting from 1, and it can be arbitrarily large; that is why I put Y < ∞ there.0209

This should be fairly easy to remember because what this really represents is: Q^(Y - 1) means you have to fail on the first Y - 1 trials.0216

If you want to get the first head on your 6th coin flip, that means the first 5 coin flips have to come up tails.0229

What you are really doing here is your failing on the first Y -1 trials.0238

Your first 5 coin flips have to be tails, and then this 1 power of P at the end means you succeed on the Yth trial.0249

If you are flipping a coin, that means on the 6th time you flip the coin, you finally get a head.0262

There is a P probability of succeeding on the Yth trial.0267
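That formula is easy to sketch in a few lines of Python (the function name here is my own, not from the lecture):

```python
def geometric_pmf(y, p):
    """P(Y = y) = q^(y-1) * p: fail on the first y - 1 trials, then succeed on trial y."""
    q = 1 - p
    return q ** (y - 1) * p

# Fair coin: the probability that the first head comes on flip 3
# is (1/2)^2 * (1/2) = 1/8.
print(geometric_pmf(3, 0.5))  # 0.125
```

Summing this over all y from 1 upward gives 1, as any probability distribution must.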

Word of warning here, we have the same bad notation here that we had for a lot of our other distributions0278

and a lot of our other formulas in probability, which is that we are using P to represent two different things here.0285

This P right here and this P right here are 2 different p's.0295

That P, remember, is the probability of success on 1 trial, probability of success on any given trial.0301

If you are flipping a coin and it is a fair coin, that P is ½.0316

If you are rolling a die and you are trying to get a 6, then that P would be 1/6.0320

This P represents the probability of Y, your random variable, having the value of little y: the probability of y trials overall.0325

That is unfortunate that people use the letter P for many different things.0348

It is a curse of the word probability starting with the letter P.0353

When you study probability, people tend to overuse letter P.0356

Everything is called P.0360

Unfortunately, you have to keep track of these and never use lowercase P for both of these, these are 2 different uses of the letter P.0362

Just keep track of that.0372

But having kept track of that, it should be pretty easy to remember this formula for the geometric distribution0374

because you just remember that you keep flipping a coin until you get your first head,0380

which means you have to fail on all the previous tries and then succeed on try number Y.0385

You fail Y - 1 times, that is what the Q^(Y - 1) gives you, and you succeed on the very last time.0393

That is why there is 1 power of P to represent that final success there.0401

Let us keep going with this.0407

There is a couple key properties that we need for any distribution.0408

I will just list them out here.0412

The mean, remember, that is the same as the expected value.0414

Mean and expected value are the same thing.0416

The mean or the expected value for the geometric distribution is just 1/P.0425

The variance for the geometric distribution is Q/P².0430

Remember that Q is 1 - P, so you might see that written as (1 - P)/P²; those mean the same thing.0435

Standard deviation is always the square root of the variance.0446

It is just the square root of Q/P², which simplifies down to √Q/P.0450

You can also write that as √(1 - P)/P; that would also be a legitimate way to express the standard deviation.0456
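These three formulas can be bundled into a small Python helper (my own sketch, not from the lecture):

```python
import math

def geometric_stats(p):
    """Return (mean, variance, standard deviation) for the geometric distribution."""
    q = 1 - p
    mean = 1 / p               # expected number of trials until the first success
    variance = q / p ** 2
    sd = math.sqrt(variance)   # standard deviation is always sqrt of the variance
    return mean, variance, sd

# Fair coin: on average 2 flips until the first head.
print(geometric_stats(0.5))
```

For p = 1/2 this gives a mean of 2, a variance of 2, and a standard deviation of √2.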

There is a very useful fact that I want you remember from calculus0468

because it comes up a lot when you are doing geometric distribution problems.0473

That fact is the sum of an infinite geometric series.0477

Let me remind you of the formula for the sum of an infinite geometric series.0481

We covered this in calculus 2; if you do not remember this, you might want to go back and review this section of calculus 2,0485

because we use it a lot in probability, the sum of the geometric series.0493

What we learn back in calculus 2 is, if you have a series A + AR + AR², that is a geometric series0497

because to get through each term to the next, you are multiplying by R, you are multiplying by R each time.0506

I did not write this down on the initial slide but this only works if the common ratio has absolute value less than 1.0514

In this case, the sum of the geometric series is given by A/(1 - R).0525

I think that formula is not very useful to remember.0534

I think it is a much more useful to remember the sum of geometric series in words.0538

The way I remember it is it is the first term divided by 1 - the common ratio.0543

That first term was A and the common ratio is the amount you are multiplying by to get from each term to the next.0549

The reason I like that formula better is because it avoids a lot of special cases.0557

If you look in your calculus book, you might see different special cases for the sum of AR^N.0562

When N starts at 0, we might see 1 formula there.0570

You might see another formula when N starts at 1 for AR^N, a different formula.0573

You end up having to memorize all these different formulas depending on subtle differences in how the sum is presented.0580

The formula in words is always the same, no matter how you give the series.0587

If you remember that one, you will never go wrong.0594

I really like this formula the first term / 1 - the common ratio, that is what I tend to remember.0596

When I apply that to adding up the sum of the geometric series, it always works.0605
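The "first term over 1 minus the common ratio" rule is easy to check numerically; the numbers below are my own example:

```python
a, r = 2.0, 0.5                # first term and common ratio, |r| < 1
closed_form = a / (1 - r)      # first term / (1 - common ratio)

# Truncation of the infinite sum a + a*r + a*r^2 + ...; 200 terms is plenty here.
partial_sum = sum(a * r ** n for n in range(200))

print(closed_form)  # 4.0
```

The truncated sum agrees with the closed form to well within floating-point precision.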

Let me give you an example of how that comes up, when we are studying the geometric distribution.0611

A lot of times, when you are studying a probability problem,0617

you want to say what is the probability that it will take at least a certain number of trials to achieve success?0621

If I'm flipping a coin, what is the probability that I will have to flip it at least 3 times before I see my first head?0628

If I'm waiting for the Yankees to win the World Series, what is the probability that it will take at least 10 years for them to win the World Series,0636

or what is the probability that they will win sometime in the next 10 years?0644

The way we want to calculate that is, we want to add up all the values that are bigger than or equal to Y.0649

We want to add up if we want to find the probability that it is at least little Y.0659

We look at the probability of little Y, plus the probability of little Y + 1, plus little Y + 2, and so on; we add that up.0664

I’m just going to use my formula for the geometric distribution: P of Y is equal to Q^(Y - 1) × P.0673

I fill that in for P of Y, and then for P of Y + 1 I just increment the exponent by 1: Q^Y P, then Q^(Y + 1) P, and so on.0684

That is a geometric series.0695

What we are doing each time is we are multiplying each term by Q.0697

We got a geometric series.0703

I use my first term/ 1 - common ratio formula.0705

The first term here is Q^(Y - 1), fill that in.0710

The common ratio is Q, that is why I get 1 - Q in the denominator.0716

1-Q is just P, that was because Q itself was 1 – P, 1 – Q is P.0722

That cancels with the P in the numerator and we get just Q^(Y - 1).0733

That is our probability that we will go at least Y trials until we get our success.0741

Another way to think about that is that in order for the experiment to last for Y trials,0748

say 10 trials, it means that you have to fail for the first 9 times that you run the experiment.0754

If you are going to run an experiment for 10 trials or more, that means you have to fail 9 times.0762

Q was the probability of failure; you have to fail for the first Y - 1 times, that is why you get Q^(Y - 1) there.0771
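The tail formula can be confirmed numerically against the series it came from (the function name is my own):

```python
def tail_probability(y, p):
    """P(Y >= y) = q^(y-1): the first y - 1 trials must all be failures."""
    return (1 - p) ** (y - 1)

# Compare the closed form with the series q^(y-1)p + q^y p + ..., truncated far out.
p = 1 / 6
closed = tail_probability(3, p)
series = sum((1 - p) ** (k - 1) * p for k in range(3, 600))
print(abs(closed - series) < 1e-12)  # True
```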

Let us see how this plays out in some examples.0781

In the first example, we are going to draw cards from a deck until we get an ace.0784

We got a 52 card deck here, we just start pulling out cards until we get an ace.0789

A key question you should always ask about selection like this is do we replace the cards back in the deck before we draw the next one?0794

What I have told you in the problem is that we do replace them; that really changes the answer.0803

You want to be very sure that you understand in probability questions, are you drawing with replacement or without replacement?0809

We are going to draw until we get an ace.0818

We are being asked: what is the chance that we will draw exactly 3 times here?0819

This is the geometric distribution because we are doing a sequence of independent trials.0827

We are drawing a card, putting it back, trying another card, or possibly the same card.0832

Putting it back, drawing again, putting the card back.0836

We want to keep doing this until we get an ace and then we stop.0839

We want to find out how many times we are going to draw until we get that first ace?0844

Let us first of all figure out what our parameter is.0850

Our P is our probability of getting an ace when we draw a card from a 52 card deck.0853

There are 4 aces in there and there is 52 cards total, that is 1/13.0861

We have a 1/13 chance of success anytime we draw a card.0867

Q is always 1 – P, that is 12/13.0871

That is the chance we will fail on any particular drawing.0876

We have a 12/13 chance of getting something other than an ace.0880

Let me write down the formula for the geometric distribution.0886

P of Y is Q^(Y - 1) × P.0890

In this case, our Y is equal to 3 because we want to know the chance that we will have to draw exactly 3 times, P of 3.0895

In this case, Q is 12/13, I will fill that in, 12/13.0905

Y -1 is 3 -1 that is 2, P is 1/13.0911

I think I can simplify that a bit, it is 12²/13³.0920

I do not think I’m going to go ahead and multiply those out because I do not think the numbers will get any more illuminating there.0925

I will leave that as my answer.0933

You could multiply that out and get a decimal, if you wanted to.0936

Of course, there will be a number between 0 and 1, a fairly low number.0939

I do not think it will be very revealing, the decimal that you get.0944

That is our answer for example 1 here.0949

Just to recap how we did that, we figured out this is a geometric distribution0953

because we are running independent trials until we get a success.0957

The probability of success is, there are 4 aces out of 52 cards, it is 1/13.0961

Q was 1 – P, that is 12/13.0968

I will just drop those into my probability distribution formula.0971

The Y here is 3 because we want to draw exactly 3 times until we get an ace.0975

I filled in Q, I filled in Y - 1, and I filled in P, and then I simplified that down to get an answer.0982
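The Example I answer can be double-checked by simulation (a sketch of my own, not part of the lecture):

```python
import random

p = 4 / 52                   # probability of drawing an ace, with replacement
exact = (1 - p) ** 2 * p     # 12^2 / 13^3: two non-aces, then an ace

# Monte Carlo estimate: count how often the first ace comes on exactly draw 3.
random.seed(1)
trials = 200_000
hits = 0
for _ in range(trials):
    draws = 1
    while random.random() >= p:  # failure: draw again
        draws += 1
    if draws == 3:
        hits += 1

print(exact)          # 144/2197 ≈ 0.0655
print(hits / trials)  # should land close to the exact value
```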

In example 2, we have the Akron Arvacks and they are competing in the northwestern championship of pin the tail on the donkey.0993

It looks like they have a 10% chance of winning the championship each year.1002

We want Y to be the number of years until they next win.1007

We are sitting there in Akron and we are hoping that the Arvacks are going to bring home the trophy this year.1011

We want to know how many years we might have to wait until we see that trophy.1018

We want to find the mean and standard deviation of that.1023

This is a geometric distribution once again, because we are waiting for the first success.1025

We are waiting for our home team to bring home that first trophy.1030

Our P, that is our probability of them winning in any given year, it is 10%.1036

I will write that as 1/10.1042

Our Q, the probability of failure , if they are not bringing home the trophy that is 1 – P.1044

Q is always 1 – P, that is 9/10, 1 -1/10.1051

I gave you the expected value formula which is the same as the mean in one of the earlier slides of this lecture.1059

It is always 1/P, that is 1/(1/10), and that is exactly 10.1068

We want to express that in years.1077

That is not at all surprising; that is one of the more intuitive results in probability.1079

The mean of the geometric distribution, if they have a 10% chance of bringing home the trophy in any given year1087

then on average, you expect to wait about 10 years until you bring home a trophy.1097

That is not surprising, about once every 10 years, they will bring home a trophy.1104

If you sit down the wait, on average it will take you about 10 years.1110

You will be waiting about 10 years to see them bring home a trophy.1114

The sigma² is the variance.1117

The first thing you figure out there was the mean.1121

Now, we are going to figure out the variance.1124

That is not what the problem is asking.1126

The problem is actually asking for standard deviation but the standard deviation is the square root of the variance.1128

Once you figure out the variance, it is very easy to find the standard deviation, we just take the square root.1134

Let us find the variance first.1139

The variance sigma² is Q/P²; that is 9/10 divided by P², where P is 1/10.1141

P² is 1/100.1153

If I do the flip on the fraction then I get 100 × 9/10.1156

100/10 cancels to 10, I get 10 × 9 which is 90.1163

The standard deviation, once you found the variance that is very easy.1168

You just take the square root of the variance, √90.1177

I can simplify that a little bit; I can pull a 9 out of the square root and it will turn into a 3.1183

I get 3 √10 and I just threw that into my calculator before I started this.1189

What I came up with was that 3√10 is 9.487.1198

Our units there are years.1204

The standard deviation in the waiting time is 9.487 years.1208

That just means if you are waiting for Akron to bring home a trophy and pin the tail on the donkey,1214

you expect to wait about 10 years on average but the standard deviation is almost another 9 1/2 years on either side.1220

You could easily be waiting 20 years, for example, before they bring home their first trophy.1230

Let me remind you how we did each step there.1237

First identify that this was a geometric distribution because we are waiting for them to bring home their first trophy.1240

We are waiting for the first success in a sequence of trials.1246

Each year they go out and play the championship, and there is a 10% chance they win everything.1250

That 10% is actually the probability; that is why I get P = 1/10, that came from the 10% there.1258

The Q is always 1 – P, I got the 9/10.1269

I just read off the mean and variance, and standard deviation formulas from one of the earlier slides in this lecture.1272

You can scroll back and you will see where those formulas come from.1281

The mean is 1/P that is 10 years.1284

We expect to wait about 10 years before we are going to bring home a trophy.1287

Very intuitive result, by the way.1291

The variance is Q/P², I filled in my Q and my P, simplified that down to 90.1293

The units on that would be years² which really is not very meaningful.1300

I did not write those in.1305

In fact, what we are looking for is the standard deviation.1306

That is the square root of the variance; you always find the standard deviation by taking the square root of the variance.1310

You always can find it that way.1317

That simplifies down to 3√10, and as a decimal, 9.487 years is the standard deviation there.1320
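Plugging the Example II numbers into the formulas from earlier is a quick check:

```python
import math

p = 0.10                      # 10% chance of winning in any given year
mean = 1 / p                  # expected wait: 1/p = 10 years
variance = (1 - p) / p ** 2   # q/p^2 = 90, in years squared
sd = math.sqrt(variance)      # 3*sqrt(10) ≈ 9.487 years

print(round(mean, 3), round(variance, 3), round(sd, 3))  # 10.0 90.0 9.487
```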

In example 3 here, we are going to make things a little more complicated.1331

We are setting up a game between you and your friend and it is a fairly simple game but it will give us some interesting probability.1334

You just take turns rolling a dice and you get to roll first and then your friend rolls, and then you roll, then your friend rolls.1343

Whoever rolls a 6 wins and I guess the person has to pay them some money or something.1350

You are both trying to roll a 6.1357

If you a roll 6 right away, then you win.1359

If you fail to roll a 6 and then your friend rolls a 6, then your friend wins.1362

And then, you just keep going back and forth until the first person rolls a 6.1366

What is the chance that you will win on your third roll?1376

In exactly your third roll, you win the game.1379

What is the chance that your friend will get to roll 3 times or more?1382

What is the chance that you will win overall and that includes you winning on the first roll, maybe on your second roll, third roll, and so on.1387

Several different questions here.1394

This is a geometric distribution because you can think of you and your friend rolling a die together,1396

you are going to keep rolling the dice until the first 6 comes up, whether it is you or your friend rolls the 6,1404

you are going to keep rolling the dice until the first 6 comes up.1409

And then, the game is over, you make the payoff or somebody has to wash the dishes.1412

You do not roll it anymore.1419

That is definitely a geometric distribution because you are waiting for the first success.1420

This is geometric; the probability P is the chance of getting a 6 on any given roll, that is 1/6.1426

Our Q is 1 – P, that is 5/6.1436

I did not really leave myself space on the slide; hang on to these because I’m going to go on to the next slide and answer the questions.1450

The first question is that you will win on your 3rd roll.1459

Let me remind you that you get to go first and you roll first and then your friend goes, and then you roll and then your friend goes,1464

and then you roll and so on, like that.1469

If you are going to win on your third roll, that means that we have to get that 6 on the 5th turn of the game.1476

That is exactly what would happen, in order for you to win on your 3rd roll of the game.1487

What we are really asking here is the probability that Y is equal to 5.1495

Let me remind you of the formula for probability distribution.1501

For the geometric distribution, P of Y is Q^(Y - 1) × P.1505

We already said on the previous slide that our P for this game is 1/6,1514

because that is probability of getting a 6 on any particular roll.1520

Our Q is just 1 – P, that is 5/6.1524

In this case, P of 5 is Q^(Y - 1), that is 5/6 to the Y - 1; Y is 5 here, so that is (5/6)⁴, times P which is 1/6.1527

I can simplify that down a little bit into 5⁴/6⁵.1541

That is the probability that you will win on exactly your 3rd roll of the game.1549

I did not find the decimal for that, it will be a pretty small number because1554

it is not very likely that you will win exactly on the 3rd roll of the game.1558

What is the probability that your friend will roll 3 times or more?1562

What that is really asking is: for your friend to roll 3 times or more, his 3rd roll would be the 6th roll of the game.1568

In order to get to that turn, you have to have 6 rolls or more.1579

What we are really asking here is the probability that Y is greater than or equal to 6.1587

We work that out using a geometric series on one of the earlier slides.1597

That is the probability; let me just remind you, the probability that Y1602

is greater than or equal to any particular value of little y is Q^(y - 1).1609

We worked that out using a geometric series; you can also think about that as you have to fail y - 1 times,1617

in order to get the success on the yth turn or later.1626

In this case, our Q is 5/6.1632

Our Y -1 is 6 -1, that is 5.1638

That does not really simplify so I’m just going to leave that in that form.1644

That is our chance that the game will run 6 turns or more in total, which will give your friend a chance to roll 3 times or more.1648

What is the probability that you will win this game?1661

That means you can win on any turn.1665

This is going to be a little more complicated.1667

The probability that you will win means that the game ends either on the first turn or on the third turn,1669

because if it ends on the third turn that means you rolled last and you win, or it ends on the 5th turn, and so on.1680

It really means: what is the probability that this game is going to go an odd number of turns?1686

The probability of 1 + the probability of 3, we will fill in each of these, + the probability of 5, and so on.1694

That is the probability that you will win this game and that the game will be won on either the first turn,1702

or the 3rd turn, or the 5th turn, or so on.1709

Let me fill that in.1712

The probability of 1, for each one of those, I'm going to use my geometric distribution formula.1714

Q^(Y - 1) × P; I’m just going to leave it in terms of P and Q for now.1721

I will fill in what P and Q are in a moment.1728

Q^(Y - 1)P when Y is equal to 1 is Q⁰P, that is P; plus, when Y = 3, that is Q²P.1732

Y = 5 will give us Q⁴P; for Y = 7, I will fill in one more, plus Q⁶P, and so on.1746

What I see here is a geometric series, this is a geometric series with,1760

I want to figure out what the common ratio is between each term.1777

I see that to get from the first term to the next one, I’m multiplying by Q².1780

To get from that term to the next one, I’m multiplying by Q² and so on.1786

It is Q² every time, that is why it is a geometric series.1791

My common ratio is Q².1794

I can use my formula for the sum of the geometric series.1798

Remember my formula, the best way to remember it is in words, first term/ 1 - the common ratio.1804

In this case, the first term is P and the common ratio is Q², so we get P/(1 - Q²).1818

I think I’m going to fill in the actual numbers for the P and the Q².1826

P, where was that? It is up here: 1/6.1831

For 1 - Q²: Q was 5/6,1838

Q² is 25/36.1842

I want to simplify that a bit, we will multiply top and bottom by 36.1846

Let me separate those.1853

Top and bottom by 36 there, so 36 × the top 36 × the bottom.1855

I will simplify things on the bottom.1860

That will give me a 6 on the top, and on the bottom I will get 36 - 25 = 11, so it simplifies down to 6/11.1863

That is your chance of winning the game.1877

Your chance of winning the game is 6 /11.1879

Notice, by the way, that is a little bit more than ½.1882

6/12 would be ½, 6/11 is slightly more than ½.1885

That makes complete sense because you have a slight advantage in this game.1892

The advantage is that you got to roll first.1898

You are a little more likely to get a 6 before your friend is.1900

It is not a big advantage because 6s are not all that likely when you roll the die.1906

In the long run, it would not really make a difference who got to roll first.1910

But you get to pick up a slight advantage by rolling first and that is why 6/11 is slightly bigger than ½.1914

Let me go to the steps again, make sure that everybody was able to follow them.1923

The key thing here is we are playing this game where you roll and then your friend rolls, and then you roll and then your friend rolls.1927

You want to keep track of the turns of the game: it is sort of odd-even, odd-even, between you and your friend.1934

If you are going to win on your third roll, your third roll is the 5th roll of the game.1941

That is why I put P of 5 here, and then I use my formula for the geometric distribution to get Q^(Y - 1).1947

The Y was 5; that, times P, I just simplified down to 5⁴/6⁵.1956

If your friend is going to roll 3 times or more, that means your friend has to get to his third roll.1965

His third roll is the 6th roll of the game that is why we are looking at the probability that Y is greater than or equal to 6.1972

We worked out the probability generically for the geometric distribution on one of the earlier slides.1981

You can go back and check it out, if you have not watched it recently, but that probability is Q^(Y - 1).1986

Another way to think about that, we worked it out using geometric series before but you can also think about that as,1994

in order for the game to last 6 or more turns, it means we have to fail on the first 5 turns.2001

Fail means we have to not roll a 6 on the first 5 turns.2008

The chance of not rolling a 6 is 5/6, and to last 6 turns, we have got to roll something else 5 times in a row.2014

That is where that answer comes from.2024

Finally, we are asking what is the probability the you will win?2026

You get to roll on all odd turns, the first, the third, the fifth, and so on.2031

We are asking what is the probability that the game is won on the first turn or on the third turn, or on the fifth turn?2035

Adding up those probabilities and each one of those, I use my geometric distribution formula.2045

The cool thing is that when I wrote all those out, I noticed that I had a geometric series.2052

I had a common ratio of Q² that let me use my geometric series formula.2058

By the way, I reminded you of that, the geometric series formula earlier on in this lecture.2064

You scroll back a couple of slides, you will see that.2069

The first term over 1 minus the common ratio: the first term is P, and the common ratio was Q².2072

Then I filled in the numbers for P and Q, that came from up here.2078

Fill the numbers for P and Q down here, did a little bit simplifying with the fractions and it came down to 6 /11.2083

I did not convert that into a decimal but one thing I know for sure is that that is a little bit bigger than 50%,2090

which makes sense because you get to roll first, you are a little bit more likely than your friend to win this game.2098

It is a very plausible answer that it can come out to be a little bit over 50%.2106
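The 6/11 answer can be reproduced in a few lines (variable names are my own):

```python
p = 1 / 6          # chance of rolling a 6 on any given roll
q = 1 - p

# You win on rolls 1, 3, 5, ...: p + q^2*p + q^4*p + ... = p / (1 - q^2)
your_win = p / (1 - q ** 2)
print(your_win)        # 6/11 ≈ 0.5455
print(your_win > 0.5)  # True: rolling first gives you a slight edge
```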

Let us move on to example 4, here we have a company that is interviewing applicants for jobs.2116

They have a job opening and they are interviewing their applicants.2122

In the general population, 10% of the applicants actually possess the right skills.2127

Maybe they have to have knowledge of a certain computer applications, for example.2132

Or they have to have studied probability.2137

Only 10% of applicants for this job actually are qualified for the job.2139

The company is just going to interview people over and over again,2144

until they find 1 person who is qualified for the job, and then they are going to hire that person.2147

We are asking here the probability that they will interview exactly 10 applicants2154

which essentially means that the first 9 people will be bombs and the 10th person is that qualified person,2159

and they are going to hire the 10th person.2166

Part B here, we are going to calculate the probability that they will interview at least 10 applicants.2171

Maybe, they will have to interview 50 people before they find the perfect person for that job but it is at least 10.2177

This is a geometric distribution because we have a sequence of trials, ultimately ending in 1 success.2184

As soon as we get 1 success, as soon as we find 1 person who is qualified, we hire that person and then we send everybody else away.2192

We stop the interview process right there.2200

This is a geometric distribution, let me fill in my parameters here.2203

The probability of any given person being a worthy applicant for the job is 10%, that is 1/10.2208

The Q is always 1 – P, in this case that is 9/10.2218

Let me write down some of the formulas that we had earlier on in the lecture because those would be useful for this.2224

P of Y is Q^(Y - 1) × P, and the probability that Y is greater than or equal to any particular value is just Q^(Y - 1).2231

Let us go ahead and work that out.2247

In this case, for part A, we want to know what is the probability that we will interview exactly 10 applicants?2250

I have 9 failures and the 10th person is the perfect person for the job.2257

That is the probability of getting exactly 10, and from my Q^(Y - 1) P formula, that is Q⁹ × P; our Q was 9/10.2263

Let me raise that to the 9th power and multiply it by a single power of P, 1/10; that is 9⁹ over 10⁹ times another 10, which is 9⁹/10¹⁰.2279

I did not try to find the decimal for that, it would again be quite small.2296

If you plug that into your calculator, it should be a very small decimal2299

because it is quite unlikely that we would have to interview exactly 10 applicants.2304

Most likely, we will find somebody earlier than that or probably later than that.2309

Let us find the probability that they will interview at least 10 applicants.2316

That is the probability that Y is greater than or equal to 10.2321

Y is the number of people that we interview.2327

Using our formula up here, the Q^(Y - 1), that is Q⁹, which is (9/10)⁹.2330

I did not simplify that, but it will be a bit bigger, 10 × bigger than our previous answer.2344

That is our probability that we will interview at least 10 applicants.2360

That is our answer for both of these.2366
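To double-check both answers, here is a short Python sketch using exact fractions (a numerical check with illustrative variable names, not part of the lecture):

```python
from fractions import Fraction

p = Fraction(1, 10)      # chance that any given applicant is qualified
q = 1 - p                # 9/10

# Part A: exactly 10 interviews means 9 failures and then 1 success.
part_a = q ** 9 * p      # 9^9 / 10^10

# Part B: at least 10 interviews means the first 9 all fail.
part_b = q ** 9          # (9/10)^9, exactly 10 times part A

print(float(part_a))     # roughly 0.0387
print(float(part_b))     # roughly 0.3874
```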

By the way, we are going to hang onto this example for the next problem.2368

Do not let these numbers and the whole situation completely slip your mind.2373

In the meantime, let me quickly remind you where everything came from here.2377

In part A, we want to find the probability that they will interview exactly 10 applicants.2382

It is a geometric distribution, so I’m using my geometric distribution formula right here, Q^(Y - 1) × P.2387

Our Y here is 10 and we have Q⁹ × P.2397

The values of P and Q, I got the P from this 10% right here,2404

that is the probability that any given applicant will be a success and Q is just 1 – that, that is 9/10.2409

Drop those numbers in and simplify it down.2416

In part B, we want the probability of interviewing at least 10 applicants.2419

At least 10 means the probability that it will be greater than or equal to 10.2425

Using the formula we derived back on one of the earlier slides, near the beginning of this video, it is Q^(Y - 1), that is Q⁹.2430

The way to think about that is, in order to interview at least 10 applicants,2443

that means the first 9 applicants are failures.2448

Each one of those 9 people has a 9/10 chance of being a failure, that means we have to see 9 failures in a row,2451

in order to ensure that we end up talking to at least 10 people.2459

Like I said, hang onto these numbers for the next example because2463

we are going to use the same scenario for the next example, for example 5.2468

In example 5, we are going to look back at the company from example 4.2475

If you have not just watched example 4, you really need to go back and watch that one before example 5 will make sense.2479

Check out example 4: there was a company interviewing applicants for a job opening and2489

each applicant has a 10% chance of being selected.2495

We interview and interview until we find a good one and then we keep that person, and we stop interviewing.2499

What we are doing in example 5 is we are keeping track of how long this procedure will take.2507

Apparently, it takes 3 hours to interview an unqualified applicant and 5 hours to interview a qualified applicant.2512

All of these people that do not meet the qualifications are going to take 3 hours each for us to figure out2531

that they are actually bombs and do not deserve to be here.2538

And then, when we finally get somebody that we think is qualified, we are going to interview them2543

for an extra 2 hours just to make sure that they really are the right person for this job.2547

We want to calculate the mean and the standard deviation of the time to conduct all the interviews.2551

How long do we expect this interview process to take at this company?2557

Let me show you how to set that up, we have not really seen a problem like this before.2563

This is our first one.2567

Let me set up a variable that represents time here, T is going to be the time.2571

Remember, Y is the number of applicants that we speak to.2578

Remember, the deal here is we are going to keep interviewing until we find somebody good.2592

That means if we find somebody good on the 16th try, we interviewed 15 people2600

who did not measure up and then number 16 was the good one.2606

In general, for T, we have all the people who do not measure up; there are Y - 1 of them.2610

Each one of those people costs us 3 hours.2618

They cost us 3 hours to find out that those people did not actually deserve the job.2622

The last person, the person who is good that we actually want to give the job to,2627

we have to do some extra scrutiny on that person.2634

It took us 5 hours to interview her because we wanted to make extra sure that she was really qualified for the job.2637

The total time is 3 × (Y - 1) + 5.2644

We can simplify that a bit, that is 3 Y - 3 + 5 which is 3 Y + 2.2648

That is the total time that it takes and we want to find the expected value, the mean of that, and the standard deviation.2661

Let me calculate first the expected value and the variance of Y because those are going to be useful intermediate steps.2669

Let me remind you here what our P was.2680

Our P was 1/10 for this problem.2683

That is because 10% of the applicants have the right qualifications.2687

Our Q is always 1 – P, it is 9/10 here.2693

Our expected value of Y, what we learned at the beginning of this lecture is that it is always 1/P.2702

In this case, that is 1/(1/10),2712

which is just 10.2718

Let me go ahead and find variance of Y because that is going to be useful as a steppingstone to finding the standard deviation.2724

The variance of Y is always Q/P².2732

Again, it is coming from one of the first slides in this lecture.2736

You can scroll back and you can find that.2740

In this case, the Q is 9/10, P² is 1/100.2743

If we do a flip on the denominator, flip that up, we get 900/10, that is 90.2750

That is the variance of Y.2758
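These two intermediate formulas are easy to check in Python (a minimal sketch, with illustrative names):

```python
p = 0.1            # probability that any given applicant is qualified
q = 1 - p          # 9/10

mean_Y = 1 / p     # E(Y) = 1/P = 10 interviews on average
var_Y = q / p**2   # V(Y) = Q/P^2 = (9/10) / (1/100) = 90
```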

What we really want is the mean and standard deviation of T.2761

Let me go ahead and figure those out for T.2766

We want the expected value of T.2769

T was 3 Y + 2 and it is time to remember some properties of expectation.2774

In particular, expectation is linear.2784

You can write this as 3 × E of Y + 2.2786

We can pull the 3 out front and the + 2 outside because expectation is linear.2792

This is 3 × E of Y; E of Y was 10, and 3 × 10 + 2 is 32.2795

That is the expected amount of time to conduct all these interviews.2805

Our unit here is hour, let me go ahead and fill that in.2809

32 hours is the expected amount of time to conduct all these interviews.2812

The variance is more complicated and I want to remind you of an old rule in probability.2821

The variance of AY + B is equal to A² × the variance of Y.2828

There is no B in the answer.2840

That is an old rule in probability that is very useful and we are going to invoke it right here, the variance of AY + B.2842

The B does not affect it, that is shifting all the data over.2849

It does not affect how much they vary.2854

You get A² × V of Y.2856

The variance of T here is the variance of 3Y + 2; if I use my A equal to 32858

and my B equal to 2, then I get A² × V of Y.2872

A² is 9, so 9 × the variance of Y.2877

I figured out the variance of Y up here; it was 90.2882

That is 9 × 90 which is 810, that is the variance.2885

It is not the standard deviation, we are trying to find the standard deviation.2892

This was variance, this is the mean that we found up above here.2896

What I really want is the standard deviation.2903

The standard deviation is the square root of the variance of T which is √810.2909

I can factor 81 out of that, it is a perfect square so I get 9 × √10.2920

I did calculate the decimal for that: 9√10 is about 28.46.2928

Since this is a standard deviation, my units there would be hours.2939
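The whole chain of arithmetic for T can be sketched in a few lines of Python (illustrative names only):

```python
import math

mean_Y, var_Y = 10, 90            # from the geometric formulas: 1/P and Q/P^2

mean_T = 3 * mean_Y + 2           # linearity of expectation: E(3Y + 2) = 3 E(Y) + 2
var_T = 3 ** 2 * var_Y            # V(AY + B) = A^2 V(Y); the + 2 drops out
sd_T = math.sqrt(var_T)           # sqrt(810) = 9 * sqrt(10)

print(mean_T, var_T, round(sd_T, 2))   # 32 810 28.46
```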

If you are a company and you are planning this interview process, you know that on average,2946

about 1 in 10 applicants is going to have the right skills.2951

On average, it will take about 32 hours to find the right employee for the job.2954

The standard deviation on that figure is 28.46 hours.2962

I guess that was an approximation, I should be clear that that was a calculator approximation right there.2969

Let me show you how I did that.2977

It was one of the trickier problems here.2979

What I want to do was write a formula for the amount of time it takes to conduct all the interviews.2981

Think of Y as being the number of applicants and T is the time we spend on them.2989

What we are doing is all the unqualified people that we speak to, we spent 3 hours each on them.2995

Remember, the last person to talk to is the person we hire.3002

As soon as we find a good person, we hire her.3006

That means all the previous people were unqualified, that is Y -1 people.3010

We spend 3 hours on each that is why we have Y - 1 there.3017

The last person cost us 5 hours because we want to give that last person some extra scrutiny3021

and make sure that that person really is qualified for the job.3029

We get 3 × (Y - 1) + 5, which simplifies down to 3Y + 2.3032

We want to find the mean, and the variance, and standard deviation of T of 3 Y + 2.3038

To find that, we need the mean and variance of Y itself.3044

Using those formulas that I gave you on one of the first slides in this lecture, the mean and the variance are 1/P and Q/P².3050

The P was 1/10; that was coming from example 4, the previous slide.3061

Because 10% of the applicants are qualified, your chance of getting a qualified applicant is 1/10.3067

The Q is just 1 - P, which is 9/10.3076

We drop those numbers in here and you get the mean and variance of Y as being 10 and 90.3078

To find the mean and the variance of T, to find the mean, we are going to use linearity.3086

The mean is linear, it is 3 × the mean of Y + 2.3093

3 × 10 + 2 is 32 hours.3096

The variance is not linear, we have this rule right here which tells us what to do with linear expressions in the variance.3100

The variance of 3Y + 2, the 2 is the B, but it turns out that had no effect on the answer3108

because there is no B up here, it is just A² × V of Y.3118

It is 9 × the variance of Y; that 90 right here comes in, and 9 × 90 gives us 810.3121

And the standard deviation is always the square root of the variance.3132

We take √810, and I simplified that down and found the decimal, 28.46 hours.3135

That is our mean and our standard deviation for the time to conduct all these interviews,3145

if you are in this company planning for how long it might take to fill your next job opening.3151
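If you would like to see these numbers come out of an experiment, here is a hypothetical Monte Carlo sketch (my own addition, not from the lecture) that simulates the interview process many times:

```python
import random

random.seed(0)                     # fixed seed so the run is reproducible
p = 0.1                            # chance an applicant is qualified
times = []

for _ in range(200_000):
    y = 1                          # interview applicants until one succeeds
    while random.random() >= p:
        y += 1
    times.append(3 * y + 2)        # 3 hours per failure, 5 for the hire: 3Y + 2

mean_t = sum(times) / len(times)
print(round(mean_t, 1))            # should land near the 32-hour mean
```

With this many simulated searches, the sample mean should sit within a small fraction of an hour of the theoretical 32.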

That is our last example, that wraps up the geometric distribution.3158

You are watching the probability lectures here on www.educator.com.3163

My name is Will Murray, thank you very much for watching, bye.3167
