  William Murray

Hypergeometric Distribution


Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48


### Hypergeometric Distribution

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Hypergeometric Distribution 0:11
• Hypergeometric Distribution: Definition
• Random Variable
• Formula for the Hypergeometric Distribution 1:50
• Fixed Parameters
• Formula for the Hypergeometric Distribution
• Key Properties of Hypergeometric 6:14
• Mean
• Variance
• Standard Deviation
• Example I: Students Committee 7:30
• Example II: Expected Number of Women on the Committee in Example I 11:08
• Example III: Pairs of Shoes 13:49
• Example IV: What is the Expected Number of Left Shoes in Example III? 20:46
• Example V: Using Indicator Variables & Linearity of Expectation 25:40

### Transcription: Hypergeometric Distribution

Hi and welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are going to be discussing the glamorously named hypergeometric distribution.0006

Let me tell you about the situation where you would use the hypergeometric distribution.0011

I set it up in terms of picking a committee of women and men.0016

The idea is that you have a larger group, you have a big group of N people.0021

There is a capital N and a lowercase n in the hypergeometric distribution.0029

Make sure you do not get those mixed up.0033

You have N people total, R women, and N - R men.0035

What you are going to do is you are going to form a committee from this larger group.0043

Your committee is going to have n, that is the number of men and women you are going to put on your committee.0048

We want to emphasize here that this is an unordered choice.0058

You are going to just grab a group of people, it does not matter which order you are grabbing them in.0061

You are not going to have a chair of the committee, you are not going to have any special positions.0066

You are going to have a group of people, you can think of it may be as a team, a sports team.0071

It is without replacement meaning you cannot pick the same person twice.0077

You grab this group of people and then the question is, how many women did you end up with on your committee, out of all the possible men and women?0081

More specifically, what are the chances of getting exactly y women on your committee?0092

Our random variable here represents the number of women that you end up with on our committee.0100

Let us go ahead and look at all the parameters, there is a lot of them, and let us figure out the formula.0106

There are a lot of parameters here. N is the total number of people that you are looking at.0113

That is the number of people that are available to be selected for your committee.0119

R is the number of women available and that means that all that remain are men, that is N - R is the number of men available.0125

And then n is the number of people we are going to pick.0135

When I look at this large pool, let me draw a little Venn diagram here.0140

This large pool of N people, there are N people available overall.0145

R of them are women, N - R of them are men.0152

We are going to create our committee of n people and that means that,0157

we want to find the probability of Y of those people being women, which means that n - Y of those people are men.0163

The probability distribution formula looks very complicated but0173

I'm going to try to persuade you that it is actually a very easy formula to remember,0178

if you can remember this situation that we are describing.0182

The probability formula is p(Y) = (R choose Y) × (N - R choose n - Y),0187

all divided by (N choose n).0194

I want to emphasize that these are all binomial coefficients.0199

These are combinations, you will use the factorial formula to simplify these.0202

That looks like a very difficult formula to remember but it is not, and here is why.0208

The denominator just represents: remember, there are N people total and you are choosing n of them.0214

This is the total number of ways to choose your committee.0221

There is N people total and you are choosing n of those people to be on your committee.0234

If you are going to disregard gender, you are just making a choice of n people out of the total number of people.0243

Suppose you take gender into account and suppose you want to get exactly Y women on your committee.0250

You have a fixed number of women that you want to get on your committee.0256

Then you will look at all the women in the room and you would choose exactly Y of them to be on your committee.0260

There are R women and you are choosing Y of them to be on your committee.0266

You are making a choice of Y people out of R women available.0274

Then, after you have chosen your women, you look around at all the men and you choose the number of men you need.0280

How many men do you need?0286

If you want to get Y women, that means you need n - Y men; and how many men are available?0288

We said there is N – R, the number of men available.0299

This term really represents you choosing the men to be on your committee.0303

You have a certain number of ways you can pick the women.0310

You can have a certain number of ways you can pick the men.0313

You multiply those together, that gives you the total number of ways to pick your committee that has exactly Y women.0316

And then, you divide that by the total number of ways to pick your committee, if you do not pay any attention to gender at all.0325

That is actually, I think that is a fairly easy formula to remember, even though it looks very complicated.0332

It is definitely one of the most complicated probability distribution formulas.0339

This Y here, the range for Y, you could have as few as 0 people, 0 women on your committee.0344

It is a bit complicated here because the most women you can have on your committee would be n,0350

because that is the size of the committee, or R because that is the number of women available.0359

Whichever one of those is smaller, that is the maximum possible number of women you can have on your committee.0364
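The formula and the range for Y described above can be sketched in Python using `math.comb` for the binomial coefficients (a minimal illustration; the function name `hypergeom_pmf` is my own, not from the lecture):

```python
from math import comb

def hypergeom_pmf(y, N, R, n):
    """P(Y = y): chance of exactly y women on a committee of n people,
    drawn without replacement from N people of whom R are women."""
    return comb(R, y) * comb(N - R, n - y) / comb(N, n)

# The probabilities over the valid range of y (0 up to min(n, R)) sum to 1.
N, R, n = 33, 12, 7
total = sum(hypergeom_pmf(y, N, R, n) for y in range(0, min(n, R) + 1))
```

Note that `math.comb(a, b)` returns 0 when b exceeds a, so values of y that would require more men than are available contribute nothing to the sum.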

We need to get a couple of properties down with the hypergeometric distribution.0373

The most useful one is the mean, which you remember is the same as expected value.0377

The expected value of the hypergeometric distribution is n × R/N.0384

n is the size of your committee, R is the number of women available,0392

and N is the total number of people in the room that you are choosing from.0398

The variance is really kind of a nasty formula, I do not recommend memorizing it.0403

I do not use it very often but I wanted to record it for posterity, in case you do need it.0409

These are actual fractions, let me emphasize, these are not binomial coefficients.0415

This is just what it turns out to be.0422

Like I said, I do not really think there is a lot of intuition to be gained from this variance.0424

I do not think it is worth memorizing that formula.0434

The standard deviation, of course, is just the square root of the variance.0436

It is always the square root of the variance.0440

I just took the variance formula and took the square root of it, to get the standard deviation.0442
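These three properties can be bundled into a short sketch (my own helper; the variance here is written in the standard textbook form n(R/N)(1 − R/N)(N − n)/(N − 1), which is algebraically the same quantity the slide records):

```python
from math import sqrt

def hypergeom_stats(N, R, n):
    """Mean, variance, and standard deviation of a hypergeometric
    random variable: N people total, R women, committee of n."""
    p = R / N                                   # fraction of women in the pool
    mean = n * p                                # the n*R/N from the slide
    var = n * p * (1 - p) * (N - n) / (N - 1)   # finite-population correction factor
    return mean, var, sqrt(var)

mean, var, sd = hypergeom_stats(33, 12, 7)      # Example I parameters, for instance
```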

Let us go ahead and jump into some examples here.0449

In example 1, we have 33 students in a class, 12 women and 21 men.0452

We are going to pick a committee; maybe we are going to do a group project and 7 students are going to be in the group.0459

I will pick 7 students at random, what is the chance that we will get exactly 5 women working on that project?0465

This is a hypergeometric distribution, let me set up the parameters here.0471

N is the total number of people available, that is 33.0475

R is the number of women in the room, that is 12.0482

That means that N - R is the number of men available, that is 21.0486

The number of people on our committee is 7 and we are interested in the chance that we are going to end up with Y,0499

with 5 women on our committee; that is the value of Y, so Y is 5.0507

That is because we want our committee to have exactly 5 women.0512

Let me write down the formula for the hypergeometric distribution.0516

P of Y is R choose Y, that is where we pick the women, × N - R (the men available) choose n - Y (the men on our committee), ÷ N choose n,0520

that is the total number of ways we could have chosen this committee, or this group of students to do a project.0536

I will just drop the numbers in.0541

R is 12, Y is 5, N - R is 21, n - Y is 7 - 5 which is 2, N is 33, and n is 7.0543

I'm going to leave that as a fraction like that, I did not bother to work it out to a decimal.0565

It would be a fairly small number, if you actually worked out the numbers, it should be pretty small.0572

But it would be a load of factorials that I just did not want to calculate.0577

I did not think it would be very illuminating but it would be pretty small,0582

because if you pick 7 people at random from a class like this,0587

the chance of getting 5 women is very low because there are more men than women in this class.0590
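If you do want the decimal, evaluating the fraction is quick with Python's `math.comb` (a sketch, not part of the lecture):

```python
from math import comb

# Example I: 33 students (12 women, 21 men), committee of 7, exactly 5 women.
p = comb(12, 5) * comb(21, 2) / comb(33, 7)
# comb(12, 5) = 792, comb(21, 2) = 210, comb(33, 7) = 4272048,
# so p = 166320/4272048, roughly 0.039 -- small, as the lecture predicts.
```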

Let me recap where those came from.0597

First, I set up all my parameters: N, R, n, N - R, and Y.0601

Then I just use the probability distribution formula for the hypergeometric distribution.0607

This is the formula, I know it looks difficult to remember but if you kind of think about what each one of those factors represents,0612

it is really not hard to remember the formula.0619

I think this formula kind of makes intuitive sense, if you think about the R choose Y0625

means you are picking Y women from R available women.0631

N - R is the number of men available and n - Y is the number of men you want.0637

We multiply those together and N choose n is the number of ways of choosing your committee in the first place.0644

We drop the numbers in for each one of those and we just give that as our answer.0652

That is our chance that the committee will contain exactly 5 women.0656

We are going to hang onto these numbers for the next example.0660

Remember the basic setup of this example and we will go ahead and take a look at that.0663

Example 2 was referring back to example 1.0670

In example 1, we were picking students from a class and we are picking a committee of 7 students, maybe a group project in a class.0673

Let me just remind you of the parameters from example 1.0683

We had N was the number of students in the class, 33.0686

R was the number of women in the class; I got this from example 1, there were 12 women in the class.0692

n was the number of people that we are picking to be on our committee, that is 7.0698

The expected number of women is the expected value of our random variable Y.0706

Y is the number of women on our committee.0712

We have a formula for the expected value of a hypergeometric random variable, the mean.0723

E of Y is n × r/N.0734

In this case, n is 7, r is 12, and N is 33, so we get 7 × 12/33.0740

I guess we could simplify that, 12 and 33, you can take out a 3 from each of those.0751

7 × 4/11, that is 28/11.0757

Our units here are women, that is the total number of women we expect on our committee.0763

Obviously, you cannot have fractions of women but on average, if we did this many times,0769

we would expect to see, on average, 28/11 women, which is a little less than 3.0775

A little less than 3 women on the committee, on average.0782
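The arithmetic above can be double-checked with exact fractions (a minimal sketch using Python's standard `fractions` module):

```python
from fractions import Fraction

# Example II: E(Y) = n*R/N with n = 7, R = 12, N = 33.
expected = Fraction(7 * 12, 33)     # Fraction reduces 84/33 automatically
# expected is 28/11, about 2.55 women on average
```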

To recap here, I got these parameters from example 1.0788

Example 1 setup how many people there were in the room, how many women, how many men,0793

how many people we are picking on our committee.0797

I got this formula for the mean from the third slide at the beginning of the lecture.0800

If you scroll back a couple of slides, you will see this mean formula.0806

I will just drop the numbers in and I simplified that down to a certain number of women.0809

Of course, in real life, we will either have 1 woman, or 2 women, or 3 women.0816

On average, we will have a bit fewer than 3 women on our committee.0823

In example 3 here, you open up your shoe closet and you do a shoe inventory.0831

It looks like you have 10 pairs of shoes in your closet.0837

You have lots of pairs of shoes in your closet.0840

You are getting ready to move to a new apartment.0843

You are in a hurry, you grab the nearest box you see and you start throwing your shoes in.0846

You are not really keeping track of which shoe matches up which.0852

You are just throwing them all in, you will unpack them after you move.0855

You start throwing your shoes in and you get 13 shoes in the box, and it is full.0859

You seal up the box and then you start to wonder, how many left shoes are in the box and how many right shoes are in the box?0866

In particular, what is the probability that there are exactly 5 left shoes and 8 right shoes in the box?0873

This is a hypergeometric distribution because if you think about it, it is just like selecting women and men to be on a committee.0879

You have a certain number of left shoes and right shoes in your closet.0893

You grab some and put them in the box, it is just like selecting women and men to be on your committee.0894

Let me set up the parameters here for the hypergeometric distribution.0900

N is the total number of people in a room, or in this case, it is the total number of shoes in the closet,0905

before you start packing them.0913

Shoes in the closet, counting both left and right.0915

Let us say we got 10 pairs, there are 20 of those.0920

R is the number of left handed shoes.0926

Left handed shoes sounds a little strange, I will just say left shoes.0930

There are 10 left shoes in your closet, assuming that all your pairs match up.0936

Let me go ahead and calculate N - R, that is the number of right shoes, and that is 20 - 10, which is still 10.0942

n is the number of shoes that you have chosen randomly, when you throw them in a box.0960

The number in the box and that is given to us to be 13.0968

Y is the number of left shoes that we are interested in.0979

Y is 5, 5 left shoes, because we are curious about the likelihood that there are exactly 5 left shoes in the box.0983

Let me go ahead and remind you of the formula for the hypergeometric distribution.0998

P of Y, it is not hard to remember if you think about what these things are measuring.1003

It is R choose Y because it is the number of left shoes available, the number that you are interested in,1009

× the number of right shoes available, that is N – R.1017

N - R choose n - Y, that is the number of right shoes that should be in the box, ÷1022

all the possible ways of choosing your shoes, that is N choose n.1028

I will just fill in all the numbers here.1035

R is 10, Y is 5, N - R is 10, n - Y is 13 – 5, that is 8.1037

N was 20 and n was 13, 20 choose 13.1053

That is the number of ways that you could have chosen those 13 shoes.1075

Again, I did not bother to simplify this down because it will be a lot of factorials.1080

I think I will just leave it that way.1088

If you want to simplify that down, you could just calculate a bunch of factorials,1093

and then do some arithmetic there and get a decimal answer.1098
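If you would rather not grind through the factorials by hand, a quick sketch does the arithmetic (my own check, not from the lecture):

```python
from math import comb

# Example III: 20 shoes (10 left, 10 right), 13 thrown in the box,
# probability of exactly 5 left shoes (and hence 8 right shoes).
p = comb(10, 5) * comb(10, 8) / comb(20, 13)
# comb(10, 5) = 252, comb(10, 8) = 45, comb(20, 13) = 77520,
# so p = 11340/77520, roughly 0.146
```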

Let me recap and show you where each one of those values came from.1104

Each one of these numbers, these parameters for the problem came from somewhere in the problem.1108

N is the total number of shoes available in the closet.1114

They were 10 pairs which means they were 20 shoes available.1118

R is the number of left shoes.1122

We figure this analogously to picking a committee of people from a group of women and men.1125

Instead, we are picking a box of shoes from a group of left and right shoes.1133

R is the number of left shoes that we just picked.1137

We picked R to be the number of left shoes.1143

We could have switched the role of left shoes and right shoes, and it really would not matter,1147

we would end up getting the same answer here.1150

The number of left shoes: since there are 10 pairs, there are exactly 10 left shoes, which makes1155

the number of right shoes 20 - 10, which is 10.1161

That is easy to figure out as well.1166

The number of shoes in the box total is 13, that is where that 13 came from, that is n right there.1168

Y is the number of left shoes that we are interested in.1179

We want to find the probability of getting 5 left shoes; that 5 came from that number right there.1183

You could switch the roles of left shoes and right shoes.1189

You could have kept track of right shoes instead, and that would give you the same answer.1192

The probability of that Y, I just wrote down the formula for the hypergeometric distribution.1197

I do remember this, even though this is kind of a complicated formula,1203

it is not hard to remember when you think about what each one of these things is counting and what each one represents physically.1206

I just dropped in all the parameters, r, y, N, n.1214

We got some number that you could simplify to a fraction or to a decimal but it did not seem to me to be that relevant.1221

We are going to hang onto this example and we are going to keep using this example in example 4.1232

Remember these numbers and we will look in another aspect of this in the next example.1242

Example 4, this refers back to example 3.1248

If you have not just watched example 3, go back and watch example 3.1251

Or at least, read the setup before you look at example 4 and that will make sense.1255

Remember back then, we have a shoe closet which has 10 pairs of shoes.1261

We start throwing the shoes into a box at random because we are getting ready to move and we are in a hurry.1266

We are not going to bother to keep the left shoe with its corresponding right shoe.1271

We just throw our shoes into the box and it turns out that there are 13 shoes in the box.1276

I'm curious about how many left shoes there might be in the box?1281

This is again a hypergeometric distribution, let me remind you of the parameters that we had in example 3.1287

This was coming from example 3.1294

N was the total number of shoes, that is 20 shoes total in your closet.1296

r is the number of left shoes; there are 10 left shoes, which means that there are 10 right shoes.1302

n is the number of shoes in the box which we said back in example 3, we said the box fills up when you got 13 shoes in there.1313

Our n is 13.1323

I want to know the expected number of left shoes in the box.1325

Remember, we sealed up the box, we cannot go and count.1329

Let us try to find the expected number of our random variable here.1332

Y is the number of left shoes in the box.1338

We want to find the expected number of left shoes, E of Y, the expected number of left shoes.1351

We have a formula for the expected value of the hypergeometric distribution.1358

Let me remind you what it was.1363

It is the same as the mean.1365

It is n × r/N, that is in this case, n is 13, r is 10.1368

I’m just reading these from up above.1380

N is 20, the 10 and the 20 simplify down to 13/2.1382

13/2 which is 6.5 left shoes.1390

It makes perfect sense in one way, and in another sense it is absurd, because you cannot have half a shoe.1400

You are not cutting your shoes in half.1406

It does not really mean that we open the box, there will be 6.5 left shoes in there.1408

You will find some whole number of shoes; you might find 4 left shoes, you might find 7 left shoes.1416

You will not find 6.5 left shoes.1422

What it does mean is that if you pack many boxes and there are 13 shoes in each one,1425

on average, over the long run you will expect to find 6 1/2 left shoes per box.1433

On average, if you add up all the left shoes and divide by the number of boxes.1440

Of course, that makes sense because if you have 13 shoes,1446

remember that in your shoe closet, half of the shoes were left and half of the shoes were right.1451

On average, you expect to see half of them being left shoes.1458

If you have 13 shoes total then on average you expect to see 6 1/2 left shoes.1461

Let me recap that problem.1468

We took these parameters from example 3.1470

If these numbers look strange to you, just go back and read the setup in example 3.1474

You will see that we had 20 shoes in the closet, 10 left shoes, 10 right shoes.1479

We took 13 of them, we threw them into a box.1485

The mean of the number of left shoes, using the hypergeometric distribution, is n × r/N.1490

That formula came from our slide about means and standard deviations, earlier on in this lecture.1499

I think it was the third slide of this video.1506

You can scroll back and see where that comes from.1508

I just drop the numbers in 13, 10, and 20.1511

Simplify down to 6.5 left shoes which of course, does not make sense because you will find a whole number of shoes in the box.1513

But as an expected value, as an average value, it makes perfect sense because out of 13 shoes,1522

you can expect half of them to be left shoes and half of them to be right shoes.1528

You would expect in the long run, an average of 6 1/2 left shoes in the box.1532
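As a sanity check (my own verification, not from the lecture), the mean formula agrees with summing y × p(y) over the whole distribution:

```python
from math import comb

# Example IV: verify E(Y) = n*R/N by summing y * P(Y = y) directly.
N, R, n = 20, 10, 13
mean = sum(y * comb(R, y) * comb(N - R, n - y) / comb(N, n)
           for y in range(0, min(n, R) + 1))
# mean comes out to 6.5, matching n*R/N = 13*10/20
```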

Example 5 here is a little more theoretical, it is asking us to use indicator variables and linearity of expectation1541

to prove that the expected value of a hypergeometric random variable is n × r/N.1549

This one is a little more theoretical, we are going to prove this value.1559

We cannot just pull it from the earlier slide.1562

Let me show you how this works out.1565

Remember the premise of the hypergeometric distribution.1568

We are calculating a random variable that represents the number of women on a committee of n people,1572

n was the number of people on our committee.1593

We have several parameters here.1597

N is the total number of people in the room that we are going to pick from, total number of people.1602

Among those total number of people, R is the number of women and N - R is the number of men in the room.1614

N – r is the number of men.1629

We are going to pick a committee of n people and we want to find the expected number of women.1632

There is a very clever way to do this which is to set up indicator variable.1639

Let me show you what I mean by indicator variables.1642

Let me define Y1, which by definition is an indicator variable.1645

Let us consider that we are going to pick these people to be on our committee one by one.1652

We look around the room and say I want you, you, you, and you, to be on the committee.1658

We are picking these people one by one.1663

Y1 is going to be an indicator variable that tells us whether the first person on that committee is a woman or not.1665

Y1 is defined to be, one if we get a woman on the first pick.1675

We pick our first person to be on the committee, Y1 is an indicator variable.1693

It is going to be a one if it is a woman, 0 if it is a man.1698

It is a little strange, but we say Y1 is the number of women we get on the first choice.1707

We either get one woman or we get a man, that is 0 women.1712

We will define Y2 to be one, if we get a woman on the second pick.1716

The second person we look at.1730

If that is a woman, we say Y2 was going to be 1.1732

If it is a man, we say Y2 is going to be 0.1736

Let us keep on going and we are picking n people to be on this committee.1742

We go to Yn here, we define our indicator variables.1746

There is one variable for each person on this committee.1752

What that means is Y is the total number of women on the committee.1756

What that means is it is the number of women we got on the first pick, which is either 1 or 0.1768

Plus the number of women we got on the second pick, up to Yn.1775

The total number of women: we can count the number of women just by counting all the 1's we got from those indicator variables.1779

That breaks down into a sum of these indicator variables.1787

In order to find the expected value of Y, the expected number of women, it is the same as the expected value of Y1 + Y2, up to Yn.1791

We can use linearity of expectations.1806

This is where we are going to use linearity right here, linearity of expectation, very important here.1809

These variables are not independent but linearity of expectation does not require that.1819

Even though these variables are not independent, if you get a woman on the first pick,1825

you are less likely to get a woman on the second because there are fewer women to pick from now.1829

Even though they are not independent, you can still use linearity of expectation.1834

That is the glorious thing about linearity of expectation.1838

It breaks up in the expected value of each of these indicator variables.1843

What is the expected value of each of these indicator variables?1848

Let us think about that, I will give you good way to think about that.1853

If you think about just Y1, we pick one person out of the crowd.1856

The original definition of expected value is, you look at all the possible values of that variable1865

and you multiply each value × the probability of getting that value.1873

This is going back to the original definition of expected value.1879

I covered this in one of the very early lectures on probability.1885

You can go back and look at some of those early lectures on probability and you will see this.1890

What are the possible values of these indicator variables?1893

There is only 0 and 1 because we setup here that the indicator variable is always going to be 0 or 1.1897

This expands out into 0 × the probability of 0 + 1 × the probability of 1.1907

What is the probability that indicator variable is going to come up 0?1919

It is the probability that we get a man because the indicator variable was 0 if we get a man + 1 ×1923

the probability that we get a woman when we make our first pick.1934

I do not care about the 0, the probability of getting a woman.1940

How many people were there in the room?1947

There were N people in the room and R of those people are women.1950

This is exactly r/N, that is the expected value of one of those indicator variables.1959

It is just r/N.1967

We can say that all of those indicator variables, they all have the same expected value.1972

Each one of these is r/N and there are n of these variables.1980

What we get here is n × r/N.1993

That is the expected value of our random variable.2000

That is the expected number of women on our committee.2006

That checks with the value of the mean that I gave you way back on the third slide of this video.2011

That is really where that number comes from, now you have the derivation to back it up.2019

Now, you hopefully understand it yourself.2023
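A small simulation makes the indicator-variable argument concrete (my own check, using Python's `random.sample` for drawing without replacement): the average number of women across many committees settles near n × R/N.

```python
import random

# Draw many committees of n people without replacement from a pool of
# N people, R of them women, and average the number of women per committee.
random.seed(0)
N, R, n = 33, 12, 7                      # Example I parameters, for instance
population = [1] * R + [0] * (N - R)     # 1 marks a woman, 0 a man

trials = 20000
total_women = sum(sum(random.sample(population, n)) for _ in range(trials))
average = total_women / trials           # close to 7*12/33, about 2.55
```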

In case that did not make sense, a quick recap here.2026

N was the total number of people in the room.2029

r is the number of women which leaves N - R to be the number of men left over.2033

We are going to pick a committee of n people and Y is the number of women we get on our committee.2039

One way to break that down is to look at our picks one by one.2048

We pick this person and then that person and then that person and then that person, to be on our committee.2051

For each person we set up an indicator variable, that is going to be 1 if we get a woman and 0 if we get a man.2057

Each person has their own indicator variable and that means the total number of women2065

is just the sum of all these indicator variables.2071

It is the sum of all the women that we got when we made each one of these picks.2075

The expected value of the sum, here is where we use linearity of expectation.2081

That is kind of a big deal in probability, let me highlight that to break that up into the expected value of each of these indicator variables.2089

We can calculate the expected value of these indicator variables, we just say the only possible values they can take are 0 and 1.2098

Using our original definition for expected value, we have 0 × the probability of 0, 1 × the probability of 1.2107

We really only need to calculate the probability of 1, which means the probability that we get a woman,2115

when we pick a certain person from this room.2121

There are R women in the room and N total people in the room, that probability is R/N.2124

We fill that in for each of our expected values here, it is the same for every indicator variable.2137

We are adding up a bunch of r/N.2148

We are adding up n of them and we get n × r/N as our answer.2152

That checks with the mean of the hypergeometric random variable that I gave you back earlier on in this lecture.2162

That is our last example problem and that wraps up our lecture here on the hypergeometric distribution.2171

You are watching the probability videos here on www.educator.com.2179

My name is Will Murray, thank you for joining us, see you next time, bye.2184
