William Murray

Sampling from a Normal Distribution

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
Sampling from a Normal Distribution

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Setting 0:36
    • Setting
  • Assumptions and Notation 2:18
    • Assumption Forever
    • Assumption for this Lecture Only
    • Notation
  • The Sample Mean 4:15
    • Statistic We'll Study: The Sample Mean
    • Theorem
  • Standard Normal Distribution 7:03
    • Standard Normal Distribution
  • Converting to Standard Normal 10:11
    • Recall
    • Corollary to Theorem
  • Example I: Heights of Students 13:18
  • Example II: What Happens to This Probability as n → ∞ 22:36
  • Example III: Units at a University 32:24
  • Example IV: Probability of Sample Mean 40:53
  • Example V: How Many Samples Should We Take? 48:34

Transcription: Sampling from a Normal Distribution

Hello, welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are going to talk about sampling from a normal distribution, which is really starting to get into statistics.0006

Sometimes it is still considered as a topic in probability.0013

We are going to go ahead and talk about it.0017

We are not using the Central Limit Theorem,0020

I have another lecture on the Central Limit Theorem that is going to come after this one.0022

If you are looking for Central Limit Theorem, just skip ahead and look for that video.0026

In the meantime, we are going to learn how to take samples from a normal distribution0030

without using the Central Limit Theorem.0035

Let us see what this is all about.0037

First of all, I have to tell you what it means to take samples.0040

The idea is that, we have some kind of population of stuff and that can be almost anything.0042

The example I have here is, we have, maybe students in the university and the students are all different heights.0048

We are going to select some students at random and measure their heights.0054

That is taking a bunch of samples, it could be that you are testing,0060

for example, a soda machine and you are testing whether it dispenses the right amount of soda.0064

You fill up 10 cups of soda and you measure how much they put in each cup.0070

That is the same kind of idea, you are taking a bunch of samples of the population.0075

The population that we are studying has a mean μ and a variance σ².0080

We do not always know those, in this lecture we are going to need to know the variance but we will not always know the mean.0085

A lot of examples that we will be studying, we would not know the mean of the population.0091

As I mentioned, we are going to take samples.0098

If we are talking about students and different heights, then our Y1 would represent the first student that we sample.0100

We meet a student at random and then we measure how tall that person is.0106

We meet another student at random and measure how tall that person is, and that is Y2.0110

We meet another student at random, and so on, until we meet our last student which is the YN.0115

We measure the height of that last sample student.0123

That is what it means to take samples, you have a population,0126

you select some of them at random and you measure whatever quantity you are interested in0129

for each one of those random selections.0135

Let me show you some assumptions that we need to get started.0138

The assumption that we are going to make for this lecture and for the next one, is that our samples are independent.0142

If we happen to find a couple of tall students, it does not make it more or less likely0149

that the student after that will be extra tall or extra short.0154

The samples are totally independent of each other, and that lets us use some of our theorems about random variables.0157

There is a buzz phrase that people use in probability and statistics, it is independent identically distributed.0163

That is often shortened to IID because that phrase is used so often.0172

IID means independent identically distributed random variables.0177

The independent part is the assumption we just made.0182

Identically distributed means that they are coming from the same population.0185

We are taking students from the same university, or we are measuring how much a soda machine dispenses0189

and we are taking samples from the same soda machine, that kind of thing.0197

The assumption for this lecture only, for this lecture on normal population, is that our population has a normal distribution.0201

That is the key difference between this lecture and the next lecture.0211

The next lecture is on the central limit theorem.0214

Up till now, everything else is the same.0216

In the next lecture, we will not need to assume that the population is normally distributed.0218

In this lecture, we are assuming normal distribution because we are not using the central limit theorem.0224

The notation that we use for that normal distribution is that,0230

we say each of our samples has a normal distribution and it has mean μ and variance σ².0234

That is the notation that we use for a normal distribution.0243

The first variable is always the mean and the second variable is always the variance there.0249

What we are going to do is take all of our samples, we will measure the heights of each student.0257

And then, we will take the average of the heights that we measure, that is called the sample mean.0263

We often use Y ̅ as a notation for the sample mean.0268

It just means the average, it just means we take all the quantities that you measure0271

and you add them up, and you divide by N.0275
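
In symbols, with N samples Y1 through YN, that is just the following formula (a restatement of the sentence above):

    \bar{Y} = \frac{1}{N} \sum_{i=1}^{N} Y_i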

The idea is, we are going to use the sample mean and ask questions about how it relates to the actual mean of the population.0280

It is not necessarily true that the average of the students that I study is equal to the average of the entire student population.0289

The examples in this lecture all have to do with questions about how close those 2 means are to each other.0298

What is the mean of the students we study, that is the sample mean.0305

What is the mean of the entire population?0309

That was the population mean which is not the same as the sample mean.0312

The population mean, remember we were saying was μ.0319

That is what we mean by the population mean, it is the average of all the students at the entire university.0324

The sample mean is just the mean of the few students that we select to study and actually measure 1 by 1.0331

The theorem that we are going to be using here is, the assumption we already made is that0341

each one of the variables Yi has a normal distribution with mean μ and variance σ².0347

Then, the sample mean also has a normal distribution.0355

It also has mean μ but its variance is σ²/N.0359

It actually has a smaller variance than the individual random variables, representing the individual measurements.0366

Just to summarize that in words, Y ̅ the sample mean is a normally distributed random variable0377

with mean μ and variance σ²/N.0383

This is the key to the whole idea of sampling which is that, as you take more samples that means N grows.0387

N is the number of samples that you take.0396

If you take more samples, it shrinks the variance of your mean,0397

which means that your sample mean is more likely to be accurate.0403

It is less variable, that is why it is more accurate to take a survey with a large number of people0409

than with a small number because it shrinks the variance.0415

This is the mathematical principle that guarantees that.0419
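
As a quick numerical sketch of this principle (not from the lecture; it assumes NumPy is available, and the population numbers are made up purely for illustration), you can simulate many sample means and watch their spread shrink like σ/√N:

```python
import numpy as np

# Sketch: the sample mean of N independent Normal(mu, sigma^2) draws
# behaves like a Normal(mu, sigma^2 / N) variable, so its standard
# deviation shrinks like sigma / sqrt(N).  Parameters are made up.
rng = np.random.default_rng(0)
mu, sigma = 70.0, 4.0
for N in (4, 16, 64):
    # 100,000 simulated samples of size N, one sample mean per row
    sample_means = rng.normal(mu, sigma, size=(100_000, N)).mean(axis=1)
    print(N, round(sample_means.std(), 3), round(sigma / np.sqrt(N), 3))
```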

What we are going to do is, we are going to use the fact that0426

we have a normal distribution on this sample mean to ask questions about probabilities.0430

The whole point of asking questions about probabilities is that,0440

we can convert normal distributions to standard normal distributions.0445

I will show you the equation for that on the next slide.0449

Let me just mention that we can answer questions about standard normal distributions using charts.0452

That is what I got on this slide, right here, a standard normal distribution.0459

We often use the variable Z for a standard normal distribution.0463

What that means is that, it is a normal distribution with mean 0 and variance 1 or standard deviation 1.0467

That is what the standard normal distribution means.0477

We got mean 0 and variance 1.0481

The whole point of standard normal distributions is that, there are standard charts that you can look up probabilities on.0486

We will be using this to solve the problems later on in the lecture.0492

What all these little numbers represent on the chart is0496

the probability of being above a certain cut off for a standard normal distribution.0499

For example, suppose you want to know the probability that Z is greater than 0.93.0506

I would look at 0.9 here and I see the second decimal place is 0.03.0520

I see that, that number on my chart there is 0.1762.0527

My answer here would be 0.1762.0534

That is how we look up to a probability for a standard normal distribution.0539

It is also frequently asked, what is the probability that Z is less than something or between two bounds.0546

What you have to do is always keep this picture in mind.0555

If you want to find the probability that Z is less than something,0560

what you do is you use this chart to find the probability that Z is greater than that cutoff, then do 1 - that probability.0564

Let me write that down, the probability that Z is less than some cutoff is equal to 1 –0573

the probability that Z is greater than some cutoff.0580

We will see some examples of that in the exercises.0584

You can also talk about the probability of Z being between 2 cutoffs.0588

You can also flip these numbers around to get probabilities for negative values of Z.0592

We will practice all of that in the examples.0599

Remember, this is just for a standard normal distribution, when we have mean 0 and standard deviation 1.0602
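
If you prefer software to the printed chart, the same tail areas can be computed directly; here is a small sketch (assuming SciPy is available) that reproduces the 0.1762 lookup for Z greater than 0.93:

```python
from scipy.stats import norm

# P(Z > 0.93): the upper-tail area the chart gives as 0.1762
print(norm.sf(0.93))       # survival function = 1 - cdf, approximately 0.1762

# P(Z < 0.93) = 1 - P(Z > 0.93), the flip described above
print(1 - norm.sf(0.93))   # approximately 0.8238, same as norm.cdf(0.93)
```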

Let me show you now what you do for a nonstandard normal distribution.0609

If you have a nonstandard normal distribution, what you do is you take any normal distribution.0614

To make it standard normal, you subtract off its mean and you divide by its standard deviation.0623

And then, what you get, which we are going to call Z, is a normal distribution that is standard.0630

The mean is 0 and the standard deviation is 1.0638

Let me remind you of that theorem we had.0642

We had that theorem that said that Y ̅ is a normal distribution.0644

Its mean is μ and its variance is σ²/N.0649

That means its mean is μ and the standard deviation, remember is always the square root of the variance.0657

Standard deviation is the square root of σ²/N, which is σ/√N.0663

What we do to convert Y ̅ to a standard normal is, we do Y ̅ - its mean.0676

Y ̅ - μ divided by its standard deviation, which is σ divided by √N.0683

Since that is a fraction in the denominator, I’m going to do a little flip.0692

I will get √N × (Y ̅ - μ) divided by σ.0698

That, according to my theorem is a standard normal distribution.0708

If you call that Z then that is a standard normal distribution.0713

It means I can use those charts, the chart that I showed you on the previous slide, to look up probabilities for Z.0718

That is the way you play this game.0725

Often, the way the examples pan out is you are asked something about Y ̅.0727

Often, you are asked about the relationship between Y ̅ and μ.0733

How likely is it that Y ̅ and μ are within 1 unit of each other, something like this.0737

You are trying to solve something about Y ̅ – μ.0743

The trick here is to take Y ̅ – μ; you are asked how likely it is that Y ̅ and μ are within ½ unit of each other, something like that.0747

The trick to solving these things is to convert it to a standard normal.0760

The way you convert to standard normal is you multiply by √N/σ.0764

That is a standard normal variable and you can look up probabilities for that on the charts.0773

We will get a lot of practice of that in the exercises, but I want you to see the general idea first,0780

which is that you start with Y ̅ - μ and then you multiply by √N/σ to convert it into a standard normal variable.0787
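
Here is a minimal sketch of that conversion as a reusable helper (the function name and the use of SciPy are my own choices for illustration, not the lecture's):

```python
import math
from scipy.stats import norm

def prob_within(delta, sigma, n):
    """P(|Ybar - mu| <= delta) when Ybar is the mean of n independent
    samples from a Normal(mu, sigma^2) population (sketch only)."""
    z = math.sqrt(n) * delta / sigma   # convert the bound to standard normal units
    return 1 - 2 * norm.sf(z)          # 1 minus the two tails beyond +/- z
```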

I will go ahead and practice some exercises with that.0796

The first example, we are going to measure the heights of students at a particular university.0800

We are given that they are normally distributed, that means we can use the theorems0805

that we have learned from this lecture.0809

There is a standard deviation of 4 inches, that is going to be the σ that we are going to use later on.0812

We are going to measure the heights of 9 students.0819

We are going to sample 9 students, we are just going to go out in the crowd at the university0822

and grab the first 9 students that we see.0826

We are going to measure how tall they are.0829

The question is, what is the probability that their mean,0831

that means the mean of the students that we study, is within 2 inches of the global mean.0835

That is the mean of all the students at the university.0841

I will just clarify the difference between those 2 means here.0846

When I say the global mean, that is all students at the university,0849

all the students at enormous state university that you attend, or maybe you do not.0855

Their mean, when I say their mean that means the sample mean of the students that we include in our sample.0864

That is those 9 students that we are going to study.0871

The global mean, the notation we have been using for that is μ, the sample mean is Y ̅.0875

The question is, what is the probability that those 2 means are within 2 inches of each other?0884

Within 2 inches of each other, that means + or -2 inches in either direction.0891

We are trying to solve the question of Y ̅ – μ, the sample mean - the global mean.0896

We are going to put absolute values on that to capture being within 2 inches in either direction.0906

The absolute value of A - B means the distance from A to B.0910

What is the probability that the distance from the sample mean to the global mean is less than or equal to 2?0915

That is what we are trying to solve, we are trying to find the probability of that.0923

But remember, the trick here is to convert to a standard normal distribution.0927

The way you convert to a standard normal distribution is, you multiply both sides of the inequality involving Y ̅ – μ.0933

We multiply the left side by √N/σ.0942

I will do the same thing on the right, 2 × √N/σ.0953

Let me fill in now what I know, I know that √N, N = 9, that is the number of students that we surveyed here.0960

√N would then be 3, that is 2 × 3.0972

The σ is the standard deviation, that is 4 inches.0982

2 × 3/4, that is a very easy thing to simplify, that is just 3/2, that is 1.5.0992

The point of that is that, that was a standard normal variable.1001

We are asking about the absolute value of Z being less than or equal to 1.5.1005

We are trying to find the probability that a standard normal variable will be less than or equal to 1.5 in absolute value.1012

Let me draw a little graph here and show you what kind of area we are looking for.1019

We will actually look it up on the next slide.1024

What we are looking for here, I will draw my standard normal variable centered at 0 because it has mean 0.1027

That is supposed to be symmetric.1034

Here is -1.5 and I'm looking for that probability in between those two bounds right there.1036

That is not exactly what the chart will tell me directly.1048

What the chart will tell me is, if I have a particular Z value in mind, remember what the chart tells me.1052

It will tell me the probability that Z is bigger than that value.1060

I have to figure out from that, what my probability is that Z is between -1.5 and 1.5.1065

The probability that the absolute value of Z is less than or equal 1.5.1073

I see I can get it by taking the total probability of 1 and subtracting off two copies of that tail probability, because there is a bottom tail and a top tail there.1080

It is 1 - 2 × the probability that Z is greater than 1.5.1089

That is something that I will be able to look up on my chart, on the next slide.1096

I just wanted to make sure that you understand where it is going to come from.1100

I will look that up and that will give me my answer to this example.1104

Let me recap the steps here, before I turn the page.1109

I'm setting up a sample mean Y ̅ and a global mean μ.1113

I want them to be within 2 inches of each other.1120

That means, I want their difference to be less than 2 in absolute value.1124

And then, I'm kind of building up my standard normal variable by multiplying that by √N/σ.1128

Remember, the whole point was √N/σ × (Y ̅ - μ) is the standard normal variable.1135

I’m building up that expression here, I multiply by √N/σ.1150

My N was 9, √N was 3, my σ was 4 because that was the standard deviation given to me.1155

I can simplify that down to 1.5, I want the probability that a standard normal variable1163

will be less than 1.5 in absolute value, which really means between -1.5 and 1.5.1168

Which means to calculate that area, I am going to have to look at the tail and subtract off two copies of the tail,1176

because there is a top tail and bottom tail there.1183

That is why I'm going to multiply this probability by 2, the probability that I will figure out on the next slide.1186

Let me go ahead and figure out that probability using the chart.1193

What we figured out is that we want the probability that Z is less than or equal to 1.5 in absolute value.1199

This is coming from the previous slide, you can scroll back and watch it if you like.1208

That is 1 - 2 × the probability that Z is greater than or equal to 1.5.1212

I need to find 1.5 on my chart, there it is right there 1.5, it is 1.50.1220

I’m going to take this number right here, 0.0668.1228

1 - 2 × 0.0668; 0.0668 × 2 is 0.1336.1234

It is 1 - 0.1336, and 1 - 0.1336 is 0.8664.1257

If you wanted to estimate that as a percentage then that is about 87%.1271

That is the answer to my problem.1277

If I survey these 9 students and measure their heights,1281

there is an 87% chance that my sample mean will be within 2 inches of the global mean of the population.1285

That is how likely I am to get an accurate estimate, when I survey 9 students.1294

If I want to make it more accurate, I will survey more students because that would increase the value of N in my calculations.1302

Just to recap what we did on this slide, we figured out on the previous slide that we want Z to be less than 1.5 in absolute value.1310

We figured out that, to get Z less than 1.5, what we can do is look at these two tails1320

and cut off the two tails that represent the probability of Z being bigger than 1.5.1327

That is why I have 1 - 2 × the probability of Z being bigger than 1.5.1335

Then I found 1.50 on my chart, 1.50 and there it is 0.0668.1340

Plug in that number, do a little calculation, and I got my answer of 87%.1349
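
As a check on that arithmetic, here is a short sketch using SciPy in place of the printed chart (my own verification, not part of the lecture):

```python
import math
from scipy.stats import norm

# Example I: n = 9 students, sigma = 4 inches, bound of 2 inches
z = math.sqrt(9) * 2 / 4             # = 1.5
print(round(norm.sf(z), 4))          # tail area, approximately 0.0668
print(round(1 - 2 * norm.sf(z), 4))  # approximately 0.8664, about 87%
```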

In our second example here, we are given that Y1 through YN1358

are independent identically distributed random variables, that is what that phrase IID means.1363

It means independent identically distributed.1371

It comes up so often in probability that people just use that abbreviation for it.1373

Each Yi has a normal distribution with mean μ and variance σ².1378

We are told that σ² is 64 and N is 36.1386

N is the number of samples that we are going to be taking.1390

We want to find the probability that the sample mean Y ̅ is within 1 unit of the global mean μ.1393

We are also asked, what happens to this probability as N goes to infinity?1401

Let us think about that, remember Y ̅ is the sample mean.1408

It is the average of the samples and we want that to be close to μ which means we want Y ̅ - μ to be small.1412

We want it to be less than 1.1421

What I would like to do is build up a standard normal variable so I can use my theorem here.1428

I'm going to multiply just like before, by √N/σ.1436

I will put √N/σ on the right hand side; I have got to do the same thing to the left and right hand sides.1444

The point of that was that, that will give me a standard normal variable.1451

I’m putting absolute values on the Z because I have the absolute values on Y ̅ – μ there.1456

In this case, we are given that N is 36, √N = 6.1460

Let me say that Z is less than or equal to 1 × √N/σ; √N is 6, σ² is 64, and that tells me that σ would be 8.1469

We are given the variance, instead of the standard deviation that time.1483

It is a quick maneuver to go from the variance to the standard deviation.1487

And the standard deviation is always the square root of the variance.1491

In this case, we get 8, and 6/8 is ¾.1493

I need to find the probability that a standard normal variable is going to be less than or equal to ¾.1498

Let me draw a little picture here.1510

There is ¾, we want it to be between ¾ and - ¾.1518

We are looking for that area right there.1523

Just like in example 1, the way we can calculate that area is by finding the area outside there, the tail area.1526

And then, subtracting off two copies of the tail.1533

The probability that Z is less than or equal to ¾ is equal to 1 - 2 × the tail area, 2 × the probability that Z is bigger than ¾.1539

I got a standard normal distribution setup on the next slide.1561

But I think what I'm going to do is just tell you what the answer is right now,1567

I can go ahead and use that space on this slide to show you the rest of the calculations.1571

I will justify this number on the next slide, but what we are going to find is we will look up 0.75 on the next slide.1576

We will figure out that the probability, the tail area of being bigger than that is 0.2266.1584

That is what I looked up on the next slide.1595

This is 1 - 2 × 0.2266; 2 × 0.2266 is 0.4532.1599

1 - 0.4532 is 0.5468 or approximately 55% there.1613

That is the probability that you are going to be within 1 unit of the global mean μ, when you take these samples.1623

The second part of this question here says, what happens to this probability as N goes to infinity.1635

Let us think about it, as N goes to infinity that means your N is getting bigger and bigger.1641

Which, if we trace through these calculations, means that √N would get very big.1647

That in turn, would make that number very big, get bigger and bigger.1655

That number is big, which means what you are doing is you are kind of moving these goal posts farther and farther out.1660

This ¾ would get replaced by a bigger number.1674

You would be looking at a wider range of your normal distribution.1680

That probability would get bigger and bigger.1686

Another way to think about it is that, the probability of being in the tail would get smaller and smaller.1690

The probability of being in the tail would get smaller.1696

The overall probability would get bigger and bigger.1702

In fact, as N goes to infinity that probability would go to 11709

because you encompass more and more of the area, showing that you have more and more probability.1714

The probability goes to 1, as N goes to infinity.1720

That should make sense to you, it is kind of the precise version of saying that as you take more samples,1724

your probability of being accurate is higher and higher and in fact, it approaches 1.1733

If you take infinitely many samples then you are guaranteed to get an accurate average.1738

Let me recap the steps here, before I jump onto the next slide and show you where that one number comes from.1745

I still owe you that 0.2266, that part is still mysterious.1751

I started out with Y ̅ - μ being less than or equal to 1.1755

That came from this phrase here, Y is within 1 unit or Y ̅ is within 1 unit of μ.1760

I want to build up a standard normal variable.1769

I multiply by the √N/σ.1772

I multiply both sides there by √N/σ.1775

The N was 36 which means the √N is 6, that is √N right there.1778

I was told σ² was 64, that tells me σ is 8, that is where that 8 comes from.1786

6/8 collapses to 3/4 which means I'm looking at the region between - ¾ and ¾.1796

The clever way to calculate that using the chart is, to find the tail region bigger than ¾1805

because that is what our chart will do for us.1811

The probability of being in that tail region, this is the part I’m going to look up on the chart on the next slide, is 0.2266.1814

That is the only part that should not make sense yet, until you see the next slide.1822

And then, when I plugged in 0.2266 and I simplified the calculations,1826

it just reduced down to 0.5468, or just about 55%; that is my probability.1831

I went through and traced the role of N in those calculations.1838

I noticed that, if you put in a bigger and bigger N, we get a bigger cutoff for the bounds of the region here.1842

When you extend the bounds outwards, it means your tails are getting smaller.1849

The tails are what we subtract, so the total probability gets bigger and bigger, and goes to 1 as N goes to infinity.1855

That should conform with your intuition because it means that, as you take more and more samples,1865

you are more and more likely to get accurate estimations of the global mean.1873

That is sort of reassuring that it worked out that way.1878

Let me show you where this number comes from, this 0.2266, that is the one part that I have not shown you yet.1881

That comes from this chart right here of the normal table, we are trying to find the probability that Z was greater than ¾.1887

Of course, ¾ in decimal is 0.75.1899

I just have to find 0.75 on this chart, here 0.7 over here and here is the 0.05.1904

I find their intersection, there it is 0.2266, that is the answer that I plugged into my calculations on the previous slide.1915

That was the only missing piece of the puzzle on the previous slide.1930

Everything else is supposed to make sense and you are supposed to understand the answer to example 2 now.1934

In example 3, we have students at a university and we are going to keep track of the number of units that they have taken.1946

It turns out that they have taken an average of 70 units, but their standard deviation is 20 units.1954

Most students are probably somewhere between 50 units and 90 units, averaging around 70 units.1960

We are going to assume that this is a normal distribution.1966

We are going to sample 9 students.1970

We are just going to go out and meet 9 students in the quad.1972

We are going to say, how many units have you taken?1977

We will do that 9 times and then we will calculate the average, just of those 9 students that we have met.1981

We do not have the time to meet every student in the university,1989

we will just sample 9 of them and calculate the average unit load of those 9 students.1992

The question is, are we likely to get an average between 67 and 73 units?1997

Let me show you how you think about that.2005

67 is 3 units down from the mean, that is 70 -3.2008

73 is 3 units up from the mean.2015

What we are really asking is, what is our chance that we will be within 3 units of the global mean.2019

The global mean is 70 units, the mean of the students that we are surveying is Y ̅.2027

We want Y ̅ - μ here to be 3 units, to be less than or equal to 3 units.2034

That is to get Y ̅ between 67 and 73, that is what we want to study there.2043

Remember, the whole point is that we want to convert this into a standard normal variable.2054

Our standard normal variable is always √N/σ × (Y ̅ – μ).2060

I'm going to multiply on some factors of √N/σ.2068

√N/σ × (Y ̅ - μ) is less than or equal to 3 × √N/σ.2075

The whole point of that was that it gives me a standard normal variable.2086
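
For reference, the fact being used in all of these examples, written in standard notation, is that for N independent samples from a normal population with mean μ and standard deviation σ, the sample mean Y ̅ is itself normal and can be standardized:

\[
\bar{Y} \sim N\!\left(\mu,\ \frac{\sigma^2}{N}\right)
\qquad\Longrightarrow\qquad
Z \;=\; \frac{\sqrt{N}\,\bigl(\bar{Y}-\mu\bigr)}{\sigma} \;\sim\; N(0,1).
\]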

I want to find now the probability that a standard normal variable will be within this range.2092

√N, what is my N, that is the number of students that I'm sampling, that is 9.2098

√N is going to be 3, that is coming from the √9 there, that is not the 3 that I found up above.2103

The σ is the standard deviation of the population which is given to me to be 20 units.2113

That is going to be 20 and my Z should be less than or equal to 3 × 3/20.2123

9/20, I can convert that into a decimal, I think it is going to be useful; 1/20 is 0.05.2134

9 of those is 0.45.2141

We really want to find the probability that Z in absolute value is less than 0.45.2147

Let me draw a picture of what I'm trying to calculate here.2155

It is always very useful to draw pictures of these normal distributions.2158

It helps to keep track of what you are looking up on the table.2162

In this case, I wanted to be between -0.45 and 0.45.2165

That was not drawn very symmetrically there; it should be symmetric because 0.45 is the same distance on either side.2172

I'm looking for that area in between there.2182

The probability that the absolute value of Z is less than 0.45.2186

Another way to find that would be the probability that -0.45 is less than Z, is less than 0.45.2196

That is not something that the table will tell me directly.2206

Remember, the table will tell me how big, how much area I have in the tail of the distribution.2208

What I will do is, I will find the area in the tail from the table.2215

It looks like I have to subtract 2 tails there.2220

1 - 2 × the probability of Z being bigger than 0.45, that is what I'm going to look up on the next slide2223

and actually convert that into an answer.2234

Let me just to go over the steps again quickly for this slide.2238

I was given that I'm looking for unit total between 67 and 73.2241

If I want Y ̅ to be between 67 and 73, that is + or -3 from the mean of 70.2246

It is the same as saying Y ̅ - μ is less than 3.2255

I wanted to convert that into a standard normal variable.2259

I multiplied both sides by √N/σ, in order to build up my standard normal formula.2263

My Z is now less than or equal to 3 × √N/σ.2271

I was given that N = 9, where did that come from?2277

That is the number of students that you sample.2279

My standard deviation is 20, that is my σ right there and the 9 was the N.2283

I plug those values in and I get the absolute value of Z is less than 0.45.2292

And then, I did a quick little picture to see what kind of area I’m measuring.2297

I see that the way to measure that area is really to measure the tails, and then subtract the 2 tails from 1.2303

That is what I'm going to carry over onto my next slide and solve it out using a normal chart.2310

This is kind of the rest of example 3; we are going to use the normal chart.2317

We figured out that, we are looking for the probability that Z is less than 0.45.2321

The absolute value of Z is less than 0.45 which is 1 - 2 × the probability that Z is greater than 0.45.2326

Remember, that is what we are finding with this normal chart.2340

It will tell you the amount of area in the tail there, the probability in the tail.2342

I need to find 0.45 on this chart.2347

Here is 0.4, here is 0.05; I see 0.3264 at the intersection of that row and column.2350

1 - 2 × 0.3264, and 2 × 0.3264 is 0.6528.2359

That is 0.3472 or approximately 35%, that is the probability that the 9 students2388

that I survey will have their average unit load somewhere between 67 and 73.2404

What I really calculated there, in other words the steps in the middle, was the probability that Y ̅ is between 67 and 73.2412

Most of this was done on the previous page.2426

Most of the dirty work was done on the previous page.2427

All that I did on this page was use the chart to find the probability that Z, in absolute value, was less than 0.45.2429

In order to figure that out, I subtracted off 2 tails here.2438

I looked up the value of the area in that tail, and then I just did2442

a little simplification with the numbers and reduced it down to 35%.2447
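
For anyone following along with software, here is a short sketch of my own (assuming SciPy is available) that reproduces the Example 3 numbers:

```python
# Check of Example 3: 9 students, sigma = 20 units, within 3 units of mu = 70.
from math import sqrt
from scipy.stats import norm

n = 9          # students sampled
sigma = 20.0   # population standard deviation in units
d = 3.0        # want the sample mean within 3 units of the mean

z = d * sqrt(n) / sigma      # 3 * 3 / 20 = 0.45
prob = 1 - 2 * norm.sf(z)    # about 0.3472, roughly 35%
print(z, prob)
```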

In example 4, we are going to take 6 samples from a normally distributed population with variance 0.67.2455

We want to find the probability that the sample mean,2463

the average of our samples will be within 0.5 units of the population mean.2465

Let us calculate that out.2471

The sample mean is Y ̅, that is the average of the samples that you have taken.2474

The population mean is always μ.2481

Even though, you do not know exactly what the value of μ is, we always call the population mean μ.2484

The probability that they will be within 0.5 units of each other.2490

We want Y ̅ - μ to be less than 0.5 in absolute value.2495

Let me build up my standard normal variable, as usual.2503

I will multiply this by √N/σ, giving √N/σ × (Y ̅ – μ).2508

That should be less than or equal to 0.5 × √N/σ.2517

I actually rigged the numbers for this one, it is supposed to work out fairly well.2522

Let me show you how it works out.2526

My N was 6, there is N = 6 there.2528

I rigged this 0.67 to be equal or very close to 2/3.2532

I hope it actually turns out to work.2540

That was the variance, that was σ².2543

What we have here is 0.5 × √6 divided by σ; σ by itself will be √(2/3).2549

The whole point of this was that, this was Z that is supposed to be a standard normal variable.2561

We want Z in absolute value to be less than or equal to 0.5 × √6 divided by √(2/3); I can simplify that.2567

If I do a little flip on the denominator, I will get 0.5 × √(6 × 3/2).2581

That is 0.5 × √9, which is 0.5 × 3, and that is 3/2 or 1.5.2591

What I'm really looking for is the probability that the absolute value of Z will be less than 1.5.2599

Since I know I have a chart that will tell me the tail, the area in the tail of the distribution.2608

That is what the chart will tell me, the area in the tail.2615

What I want is the probability that the absolute value of Z is less than 1.5, which means that2618

Z is between -1.5 and 1.5.2628

The way I can figure that out is I can subtract off 2 tails.2634

1 -2 × the probability that Z is bigger than 1.5.2639

That is really all I need to do for now, I’m going to look at a standard normal chart on the next slide.2646

And, I will go ahead and finish that calculation.2651

Let me recap the steps here.2654

We want Y ̅ – μ.2656

Y ̅ is the sample mean and μ is the population mean.2657

Those mean different things.2662

The population mean means the entire population.2664

Sample mean means just the samples that we are looking at.2666

We want them to be within 0.5 units of each other.2671

I said it is less than 0.5, and I cannot do much with that until I convert it to a standard normal variable.2676

That is what I'm doing here, multiplying both sides by √N/σ.2682

I know what N is, it is 6, so √N is √6.2688

I know what σ is, it is √(2/3); it came from there.2690

I plug those in, it worked out fairly nicely, and I got 3/2 or 1.5.2699

We are going to find the probability of being between -1.5 and 1.5.2705

The way that I’m going to do that is by finding the probability of being in the tail and then subtracting off 2 of those tails.2710

I will work that out on the next slide.2720

Let me just mention right now, before I bury this slide, that we are going to use the same setup in example 5.2722

The only difference is we are going to change the number of samples, in order to get a better probability.2731

Make sure you understand this, before we move on to example 5,2737

because example 5 would not make sense, unless you understand example 4 here.2741

Let me just flip over to our chart of the normal distribution and we will finish this problem.2748

On the previous page, I had solved it down to the probability is 1 -2 × the probability that Z is bigger than 1.5.2753

I solved it out into a matter of finding the probability of the tail of the distribution.2769

I need to find 1.5, there it is 1.50, the probability is 0.0668.2776

This is 1 -2 × 0.0668 and just a little computation is all we have to do here.2788

That is 1 – 0.1336; 0.0668 × 2 is 0.1336.2799

If I simplify that, 1 -0.1336 is 0.8664 and that is approximately 87%.2813

If you take 6 samples and you want to find the probability that Y ̅ - μ was less than 0.5, that is what we just calculated here.2826

We found the probability that our sample mean will be within 0.5 units of the global mean, it worked out to 87%.2848

Most of the work there was on the previous page.2855

I just kind of brought it down to looking at one number in the chart, and then just plug it in.2858

That one number was the probability that Z is bigger than 1.5.2863

We found that probability from the chart here, drop it into the computation, and reduced it down to 0.8664 or 87%.2869
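
Here is the corresponding sketch for Example 4 (again my own check, assuming SciPy is available), using the variance 0.67 ≈ 2/3 from the problem:

```python
# Check of Example 4: 6 samples, variance about 2/3, within 0.5 units of mu.
from math import sqrt
from scipy.stats import norm

n = 6               # samples
sigma = sqrt(2/3)   # standard deviation, from variance 0.67 ~ 2/3
d = 0.5             # desired accuracy

z = d * sqrt(n) / sigma      # 0.5 * sqrt(9) = 1.5
prob = 1 - 2 * norm.sf(z)    # about 0.8664, roughly 87%
print(z, prob)
```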

We are going to reuse this scenario in example 5.2878

I really want to make sure that example 4 makes sense to you, before you go on to example 5.2882

In particular, what we are going to do in example 5 is, we are going to try to raise that probability to 95% in example 5.2888

Which means we are going to have to change the number of samples that we take.2899

In order to get higher probability and more accurate answer, we will need to take more samples.2907

Let us go ahead and take a look at example 5, and see how that works out.2912

In example 5, we are going to reuse this scenario from example 4.2916

If you have not just watched example 4, you really want to go back and watch example 4.2921

Make sure that that make sense to you, before you start working through example 5.2925

It is the same scenario as in example 4.2932

We have a normally distributed population with variance 0.67.2935

I want to ensure that our sample mean, our sample mean2940

means the average of the samples that we take, will be within 0.5 units of the population mean.2943

The population mean is the average of the entire population, that is what we have been calling μ.2952

We want the probability to come out to be 95%.2958

Since we are dictating the probability, we cannot also dictate the number of samples.2962

We are asking, how many samples should we take?2965

Let me work this out and show you how to think about this.2969

First of all, we want our probability to be 95%.2975

Let me think about that, in terms of the picture.2979

I will draw a picture there.2984

We want some cutoffs where we get 95% of the area in between those cutoffs.2985

We want those cutoffs to surround 95% of the area.2993

I work backwards, that means that the 2 tail areas collectively give me 5% of the area.2999

Those 2 tail areas are each (1 - 0.95)/2, which is 0.05/2, which is 0.025,3011

I'm going to want the probability in the 2 tail areas to be 0.025 each.3023

I want to figure out what cutoff value of Z would correspond to that.3031

I want to save myself space on this slide, I’m not going to show you the chart right away.3037

We will see that on the next slide.3041

You will see that that corresponds to Z = 1.96.3043

We will look that up on the next slide and you will see that that is the Z value we are looking for.3047

Let us put that on hold for now and let me go back and set up our standard normal variable.3052

We want Y ̅ and μ to be within 0.5 units of each other.3060

I’m going to set up my standard normal variable just like before, where I multiply both sides by √N/σ.3067

I get √N/σ here.3077

I will fill in what I can, the problem is I do not know N right now.3081

That is going to be a little tricky, my Z is going to be the standard normal variable on the left.3086

But, I do not know what N is.3092

I do know what I want my Z value to be, or at least I know that I want my Z value to be between -1.96 and 1.96.3094

I’m going to put 1.96 in for my Z value, my absolute value of Z.3104

And then, I'm going to solve for the other quantities in this picture.3109

0.5 × √N: I do not know N, that is what I'm going to have to solve for.3114

σ, I think I do know.3119

I'm given that the variance is 0.67; σ² is 0.67, so σ is √0.67.3122

What I’m going to do is solve this equation for N.3136

It is going to work out pretty well, it is a calculator exercise really.3143

If I multiply over to the other side, I get 1.96 × √0.67 divided by 0.5 is less than or equal to √N.3148

You know what, dividing by 0.5 is the same as multiplying by 2.3163

Let me go ahead and multiply 1.96 by 2, that will give me 3.92.3168

I still have √0.67, and that is supposed to be less than √N.3176

Let me square both sides now.3182

I think I’m going to flip the N over to the other side.3184

N is bigger than or equal to 3.92² × (√0.67)².3186

If I square the square root of 0.67, I will just get 0.67 again.3192

Now, that is just a matter of dropping the numbers into a calculator.3199

I did that, when I drop that into a calculator I get 10.296.3202

I just solved for N, and remember that N is the number of samples we are going to take.3216

You cannot take a fraction of a sample, you take a whole number samples.3223

This 10.296 does not make sense, I'm going to round it up to be on the safe side.3232

I will take N = 11 samples and that should be enough to get my probability where I want it to be.3240

That is my answer right there, N = 11 samples.3250
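
To double-check the Example 5 answer in software (a sketch of mine, assuming SciPy is available), you can pull the 1.96 cutoff from norm.isf, which inverts the tail area that the chart tabulates, and then round the resulting sample size up:

```python
# Sample size so that P(|Ybar - mu| <= 0.5) is at least 0.95, variance 0.67.
from math import ceil, sqrt
from scipy.stats import norm

sigma = sqrt(0.67)         # population standard deviation
d = 0.5                    # desired accuracy
z_star = norm.isf(0.025)   # cutoff leaving 0.025 in the upper tail, about 1.96

n_exact = (z_star * sigma / d) ** 2   # about 10.3
n = ceil(n_exact)                     # whole samples only: 11
print(z_star, n_exact, n)
```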

That is the end of the problem, except to recap it and to show you on a normal table where that 1.96 came from.3255

Let me recap the steps there.3264

First, I was thinking that I wanted to have 95% in between whatever boundaries I found,3266

which means on the outside of boundaries I’m going to have 5%, 0.05.3272

Since there are two tails, I will divide that by 2 and I get 0.025.3277

I’m looking for a cutoff that cuts off 0.025 of the area.3283

I will show you on the next slide that, when we look at the normal table, Z = 1.96 will give us that cut off.3289

That is the only part that I need to fill in on the next slide.3300

Meanwhile, over here I was setting up my standard normal variable.3303

I wanted Y ̅ and μ to be within 0.5 units of each other.3307

That is why I set their absolute value of their difference less than 0.5.3314

And then, to set up my standard normal variable, I multiplied both sides by √N/σ3319

and that gives me my absolute value of Z.3328

Now, I plugged in that Z = 1.96 here.3329

I do not have √N because I do not know what N is.3333

That is what I'm asking, how many samples should I take?3338

I have to leave that, but I can plug in σ, which I figured out here to be √0.67.3340

Now, it is an algebra problem, I have manipulated the algebra a little bit.3348

Dividing 1.96 by 0.5 is the same as multiplying it by 2.3352

That is where that 3.92 came from.3357

I’m solving for N, I square both sides and I get N bigger than 3.92² × 0.67.3360

I just threw those numbers in my calculator, I would not want to do something like that by hand.3368

What I got was 10.296, but we are talking about numbers of samples.3374

We have to take a whole number of samples.3379

I rounded that up, to be safe, to take N = 11 samples.3381

That pretty much wraps up example 5, except I have to show you where that 1.96 came from.3387

It really came from looking for 0.025 in the chart on the next slide.3393

What we are doing here is, I just want to justify to you,3401

we wanted the probability that Z is bigger than some little cutoff value of Z to be 0.025.3404

That is what we figured out on the previous slide.3415

I’m looking for 0.025 in the chart here.3418

It looks like these numbers are getting smaller and smaller.3421

I’m going to keep looking through these numbers.3429

Here, I’m getting close: .0287, .0281, .0274, .0268, .0262, .0256, .0250.3430

I found it, there is my answer right there.3441

I’m going to read off what row and column those came from.3444

It came from 1.9 and 0.06, that means that my Z value, my z is 1.96.3448

That is where that number came from.3460

I will just say, we used that number on the previous slide,3465

and we did some calculations with it to derive that we want N to be 11; that was the answer that we got.3474

N = 11 samples.3483

You can go back and watch the previous slide, if you do not remember where that came from.3486

I will not go over that again now; you can just watch it again if you like.3492

What we did on this slide was, we are looking for that cutoff that gave us a tail probability.3495

Remember, this tail probability is what we are looking for, that was supposed to be 0.025.3500

The real reason for that was, that would make the other tail probability 0.025.3508

When you take those 2 probabilities away from 1, you get 0.95 of the probability in the middle, which is what we are looking for.3514

That is where we got the 0.025 from.3523

But, we need to figure out which cutoff gave us that probability.3526

I found 0.025 in the table, read off its numbers 1.9 and 0.06.3529

I got Z = 1.96, and then I did some more calculations with that on the previous slide, to get down to N =11 samples.3535

That wraps up this lecture on sampling from a normal distribution.3545

The next lecture is going to look very similar to this, but we are going to be using the central limit theorem.3550

All the examples will have very similar flavor, we are sort of converting to3555

a standard normal variable then looking things up in the charts.3560

But, the difference is we are going to be using the central limit theorem3564

which means we would not have to start with a normal population anymore.3567

When we use the central limit theorem, you can start with any population in the world,3571

and then answer the same kinds of questions about whether your sample mean3575

is going to be close to your population mean.3580

I hope you will stick around and learn the central limit theorem.3583

It is probably one of the most important results in probability, that is in the next lecture.3586

That is also going to be our last lecture in the series, we are getting near the end.3591

I really appreciate you sticking around with me to enjoy these probability lectures.3594

This is the probability lecture series here on www.educator.com.3599

I am your host, my name is Will Murray, thank you for joining me today, bye.3604
