William Murray

Uniform Distribution

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48

Lecture Comments (7)

1 answer

Last reply by: Dr. William Murray
Mon Oct 24, 2016 11:34 AM

Post by YILEI GE on October 21, 2016

Hi professor, about example 2: there are four possible numbers that are bigger than or equal to 9; they are 9, 10, 11, 12.
And the total possible numbers are 5 to 12, including 5, so I have 4/8, equal to 1/2. Could you point out where I am wrong? Thanks

1 answer

Last reply by: Dr. William Murray
Mon Mar 9, 2015 9:42 PM

Post by Ash Niazi on March 7, 2015

Love your lectures - they're really helping me understand the material.
Question, for Ex 3: I did it at first a bit differently:

My Arrival Time: P = (10 - 0) / (15 - 0) = 10 / 15 = 2/3.
Friend Arrival Time: P = (10 - 0) / (10 - 0) = 1.
P[Friend's Time] - P[My Time] = 1 - 2/3 = 1/3.

Is that acceptable?

1 answer

Last reply by: Dr. William Murray
Thu Mar 5, 2015 5:47 PM

Post by Nick Nick on March 4, 2015

Thanks

Uniform Distribution

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Uniform Distribution 0:15
    • Uniform Distribution
    • Each Part of the Region is Equally Probable
  • Key Properties of the Uniform Distribution 2:45
    • Mean
    • Variance
    • Standard Deviation
  • Example I: Newspaper Delivery 5:25
  • Example II: Picking a Real Number from a Uniform Distribution 8:21
  • Example III: Dinner Date 11:02
  • Example IV: Proving that a Variable is Uniformly Distributed 18:50
  • Example V: Ice Cream Serving 27:22

Transcription: Uniform Distribution

Hi, welcome back to www.educator.com lectures on probability, my name is Will Murray.0000

Today, we are going to talk about the first of several continuous distributions.0004

This is probably the easiest one, it is called the uniform distribution.0010

We will see very soon why it is called that.0013

Let us jump right on in.0016

The idea of the uniform distribution is you have a finite range from θ1 to θ2.0017

Let me go ahead and draw a graph of this, as I'm talking about this.0023

You have 2 constant values, here is θ1 and then you have θ2, somewhere a bit bigger than θ1.0028

And then, you just divide your density evenly, you distribute it evenly over that range0037

which means you just take a completely horizontal line over that range.0044

What that means is, remember the total density always has to be 1,0051

that the total area under a density function always has to be 1.0055

In order to have that area be 1, the width is θ2 – θ1, the height has to be the constant 1/(θ2 – θ1).0060

By the way, this triple equal sign, that means always equal to.0070

It means that the density function is constant.0079

That is much different from all of the other density functions that we will be studying later.0082

That is what makes the uniform distribution a lot easier than some of the later ones.0087

It is that the density function is always equal to a constant.0091

That constant has to be 1/(θ2 – θ1), in order to give your total area 1.0094

Each part of the region is equally possible or equally probable.0101

It is very easy to calculate probabilities with the uniform distribution, if you have two values A and B here.0106

Let me go ahead and draw them in on my graph A and B.0114

If you have two values A and B, and we want to find the probability that your random variable0118

land somewhere between A and B.0124

It is very simple, you just have to calculate the distance between A and B.0126

Essentially, you are calculating this area right here.0131

And that black area is just going to be (B – A)/(θ2 – θ1), because it has width B – A and it has height of 1/(θ2 – θ1).0136

It is very easy to calculate probabilities using uniform distribution.0149

You just look at the two ranges that you are interested in, subtract them,0153

and then divide that by the appropriate constant which is always θ2 – θ1.0157

Let us see how that plays out.0164

For the key properties of the uniform distribution, the mean should be kind of intuitively obvious.0167

Let me draw again the graph of the uniform distribution.0173

There is θ1 and there is θ2.0178

Remember that, we are distributing the density completely evenly between θ1 and θ2.0181

You would expect the mean to be halfway between them.0187

In fact, that is where it turns out to be.0191

The mean is exactly the average of θ1 and θ2.0193

You just get (θ1 + θ2)/2 for the mean μ.0198

That is really intuitively clear, it should not be hard to remember because it should be obvious.0202

The variance is less obvious, the variance turns out to be (θ2 – θ1)²/12.0207

I think that is not something that you would guess.0215

You probably would guess the mean if you give a little bit of thought.0218

The variance is not something that you probably guessed.0221

You probably have to calculate it out or just memorize it.0225

The standard deviation, remember, it is always the square root of the variance.0229

If you take the square root of the variance here, you get (θ2 – θ1) ÷ √12, and √12 is 2 × √3.0233

That is the standard deviation of the uniform distribution.0243

It should still make a rough intuitive sense because it is really a measure of how spread out the interval is.0249

Remember that, variance and standard deviation measure how spread out your dataset is.0257

In this case, since we got a uniform distribution, if it is spread out over a wider area0264

then you should have a higher variance or a higher standard deviation.0270

If it is squished into a smaller area then you should have a smaller variance or a smaller standard deviation.0274

In this case, since we have the term θ2 – θ1, that is the width of the interval there, θ2 – θ1.0282

What we are saying here is that the standard deviation is proportional to the width of the interval.0291

If you have a wider interval, if you spread your data out more then you have a larger standard deviation.0297

If you compress your interval, then you will have a smaller standard deviation.0303

It should be intuitively plausible.0307

What is not so obvious I think is, the constant 12 for the variance, 2√ 3 for the standard deviation.0310

I think that part would not be obvious unless, you actually calculated them out.0317
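
As a quick numerical check of those three formulas, here is a short Python sketch using scipy.stats.uniform, which is parameterized by loc = θ1 and scale = θ2 − θ1; the values θ1 = 2 and θ2 = 10 are just an example I chose:

```python
from scipy.stats import uniform

theta1, theta2 = 2.0, 10.0
dist = uniform(loc=theta1, scale=theta2 - theta1)  # uniform on [2, 10]

print(dist.mean())  # 6.0, matching (theta1 + theta2) / 2
print(dist.var())   # about 5.333, matching (theta2 - theta1)**2 / 12
print(dist.std())   # about 2.309, matching (theta2 - theta1) / (2 * 3**0.5)
```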

Let us go ahead and look at some problems involving the uniform distribution.0322

They generally tend to be fairly easy to calculate.0328

The first example here is, you are sitting on your front doorstep, waiting for your morning newspaper to arrive.0330

It always arrives sometime between 7:00 AM and noon.0337

The time at which it arrives follows a uniform distribution.0342

We want to find the probability that it will arrive during an odd-numbered hour,0347

which means we want it to arrive between 7 and 8, not between 8 and 9 because that would be an even-numbered hour.0352

Between 9 and 10 would qualify, 10 and 11 would not qualify, and 11 to 12 would also qualify as an odd-numbered hour.0360

It looks like there is 3 hours here that would qualify.0371

Our total range here is 7 to noon, 7 to 12, that is θ1 is 7 and θ2 is 12.0374

θ2 – θ1 is 12 – 7, which is 5, that is our denominator here.0385

We want to talk about what range we are interested in.0401

We have all these odd-numbered hours, (8 – 7) + (10 – 9) + (12 – 11), which of course is just 3 separate hours on that 5 hour interval.0404

We have a total probability of 3/5.0423

If your newspaper is going to arrive sometime between 7:00 AM and 12, and it is uniformly distributed in that interval,0431

then there is a 60% chance that it will be an odd numbered hour.0440

Let me recap that.0447

We got this newspaper arriving in a 5 hour interval, that is where I got that denominator of 5 because it is a 5 hour interval.0449

If you want think about it in terms of θ2 – θ1, that is 12 -7, that is where that 5 comes from.0457

The 3 comes from the 3 hours that have odd numbers.0463

The 7:00 hour, the 9:00 hour, and 11:00 hour, gives you 3 different hours that have odd numbers.0469

Our total fraction is 3/5, the probability that it will arrive during an odd numbered hour is exactly 60%.0477

Remember, I told you that the uniform distribution is one of the easiest to deal with.0484

Problems involving uniform distribution are often very easy computationally.0489

This certainly qualifies as an easy one computationally but if you stick around, I have got a couple of harder ones coming up.0494

We will see something a little harder coming up.0500
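
If you would like to double-check that 3/5 numerically, here is a rough Monte Carlo sketch in Python (my own code, not part of the lecture); it draws arrival times uniformly on [7, 12] and counts how often the hour is odd:

```python
import random

trials = 100_000
odd_hours = sum(1 for _ in range(trials)
                if int(random.uniform(7, 12)) % 2 == 1)  # hour 7, 9, or 11
print(odd_hours / trials)  # roughly 0.6, matching 3/5
```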

Example 2 is also going to be an easy one, trust me, example 3 will be a little more tricky.0504

But example 2, we are going to pick a real number Y from the uniform distribution on the interval from 5 to 12.0511

Let me go ahead and graph this out.0519

There is 5, there is 6, 7, 8, 9, 10, 11, there is 12.0521

We are going to pick a real number Y from somewhere in this interval on the uniform distribution.0531

We want to find the probability that that value of Y will turn out to be bigger than 9.0537

The probability that Y would be bigger than or equal to 9, is equal to, on the denominator we have θ2 – θ1.0544

In the numerator, we have B - A being the interval that we are interested in.0555

Let me go ahead and draw that in.0561

θ1 is 5, θ2 is 12, the interval that we are interested in is from 9 to 12 because we want Y to be bigger than 9.0563

6, 7, 8, 9, there is 9 right there.0573

There is A is 9, and B is the same as θ2, B is 12.0577

B - A is 12 – 9, θ2 – θ1 is 12 -5.0583

What we have there is 12 – 9 is 3, 12 -5 is 7.0592

The probability that our number will be bigger than 9 is exactly 3/7.0598

Again, very easy computations for the uniform distribution, it is just a matter of saying how wide is your interval.0604

In this case, our interval is 7 units wide, the interval from 5 to 12, that is where we got that 7 in the denominator.0612

How wide is the region that you are interested in, the region that you might call success?0621

In this case, success is defined as Y is bigger than 9.0626

That region of success will be the interval from 9 to 12.0631

The width of that interval from 9 to 12 is just 12 - 9 is 3, that is where we got that 3.0637

Our total answer, our total probability there is just 3/7.0643

I guess you could convert that into a decimal, I think that would turn out to be about 42%.0649

But that does not come out neatly, I’m going to leave it as a fraction as 3/7 there.0655
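
Here is the same Example II calculation done in Python with scipy (a sketch; scipy.stats.uniform is parameterized here as loc = 5, scale = 7 for the interval from 5 to 12):

```python
from scipy.stats import uniform

dist = uniform(loc=5, scale=12 - 5)  # uniform on [5, 12]
print(1 - dist.cdf(9))               # 3/7, about 0.4286
```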

Example 3 is a bit trickier, what is happening here is that you have a dinner date with your best friend.0663

You are planning to meet at 6 pm at the restaurant but the problem is that, both of you tend to run a little late.0670

In fact, even though you are planning to meet at 6:00 pm, you might be a little bit late, your friend might be a little bit late.0678

One of you is probably going to end up waiting a little bit for the other one.0687

The way it works is you tend to arrive between 0 and 15 minutes late.0690

You are never later than 15 minutes and you are never early.0696

You are always maybe 7 minutes, 8 minutes, 11 minutes late, somewhere between 0 and 15.0700

Your friend is always between 0 and 10 minutes late.0706

Your friend is a little bit more prompt than you are.0709

The question we are asking here is the probability that you will arrive before your friend.0713

In other words, will you be the one who gets there first and has to find the table at dinner0718

and will be sitting around waiting for your friend, or will it be the other way around?0724

Your friend arrives first and has to deal with the waiter, and your friend would be waiting for you.0728

Let me show you how to solve this one.0733

We are going to set up two variables here because there are two independent things going on.0736

There is you arriving to the restaurant and there is your friend arriving to the restaurant.0740

I set up the variable for X which is your arrival time which could be anywhere from 0 to 15 minutes late.0746

Y is going to be your friends arrival time which could be anywhere from 0 to 10 minutes late.0759

A really useful way to think about this problem is to graph it.0770

Let me go ahead and draw a graph of the possibilities here.0774

I will put X on the X axis and Y on the Y axis.0780

Your arrival time can be anywhere from 0 to, there is you being 5 minutes late, here you are being 10 minutes late,0785

and here you are coming in 15 minutes late.0794

We know you are not going to be later than that.0797

There is your friend arriving 5 minutes late and here is your friend arriving 10 minutes late.0800

We know that your friend would not be later than 10 minutes.0806

What that means is your combined arrival time, the combination of arrival times0809

is going to be somewhere in this rectangle, depending on when you arrive and depending on when your friend arrives.0817

Somewhere, you will arrive a certain number of minutes late and your friend will arrive a certain number of minutes late.0825

That will give us a point somewhere in this rectangle.0830

Then, we will look at that and say did you arrive first or did your friend arrive first.0834

The way we want to think about that is we want to calculate the probability that you will arrive before your friend.0841

In other words, we want the probability that X is less than or equal to Y, that is you arriving before your friend.0848

Let me just turn those variables around because I think it will be easier to graph that way.0859

The probability that Y is greater than or equal to X, that is saying the same thing.0864

Let me graph the region in which Y is greater than X.0869

A little bit of algebra review here.0873

Maybe, I will do this in black.0877

The line Y equals X is that line right there.0881

That is the line Y is equal to X.0885

Y greater than X means you go above the line.0888

It is all this triangular region above the line here.0894

That is the region where you arrive before your friend.0900

Anywhere below the line, that means your friend arrives first and you arrived afterwards.0907

Let us try to calculate the probability of being in that black shaded region.0915

It is the total shaded area ÷ the total area in the rectangle.0921

Let us try to figure out what those areas are.0932

The shaded area, I see I have a triangle with 10 units on a side here.0934

That is base × height/2, that is 10 × 10/2, 100/2 is 50.0942

50 square units in your shaded triangle.0950

The total area is a rectangle, it is 10 by 15 on the side, that is 10 × 15 is 150.0955

Here it is very nice, it simplifies nicely to 1/3.0963

The probability that you will arrive before your friend is exactly 1/3.0970

If you make lots and lots of dates with the same friend, and you guys both follow the same habits over the years,0977

what will happen is 1/3 of the time you will be sitting around waiting for your friend.0985

2/3 of the time your friend will be sitting around waiting for you.0990

That is really the result of the fact that your friend is a little bit more prompt than you.0995

Most of the time your friend will end up waiting for you, but some of the time, 1/3 of the time,1003

you will wait for your friend.1007

Let me recap that.1009

The way we approach this is, we noticed first that we really had two independent uniform distributions.1011

There is one for your arrival time and there is one for your friend’s arrival time.1017

That is the first thing I did was to set up variables to indicate your arrival time and your friend’s arrival time.1023

Your arrival time, I put on the X axis that goes from 0 to 15 because you can be anywhere from 0 to 15 minutes late.1030

Your friend is anywhere from 0 to 10 minutes late.1040

I got those from the stem of the problem here.1043

I graph those here and I got this nice rectangle of possible combinations of arrival times.1048

Once I have this rectangle, I know that while you arrive at a particular time, your friend will arrive at a particular time,1057

that means essentially we are choosing a point at random in this rectangle.1063

And then, we have to ask whether you will arrive before your friend?1069

You arriving before your friend means your arrival time is less than or equal to your friend’s arrival time,1074

which can be rewritten as Y greater than X.1081

We graphed the shaded region, which represents the region where Y is greater than or equal to X.1084

And then, it was just a matter of calculating the areas which turned into1091

a little old-fashioned geometry of calculating the area of a triangle.1094

The triangle was ½ base × height, ½ × 10 × 10.1099

The rectangle was base × height, that is 15 × 10.1105

We get 50/150 simplifies down to a probability of 1/3.1113

That represents the chance that you have arrived at the restaurant before your friend.1119

You will be the one who has to sit around and wait.1125
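
A quick way to sanity-check that 1/3 is a Monte Carlo simulation. This Python sketch (my own, with an arbitrary trial count) draws the two independent arrival times and counts how often you arrive first:

```python
import random

trials = 200_000
you_first = sum(1 for _ in range(trials)
                if random.uniform(0, 15) <= random.uniform(0, 10))
print(you_first / trials)  # roughly 0.333, matching 1/3
```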

In example 4, we have a problem that is a great interest to computer programmers.1132

The reason is that most computer languages have a random number function.1139

It uses something rand or random, or something like that.1146

If you type rand into a computer program and the appropriate language, it will give you a number between 0 and 1.1151

We usually try to arrange it so that the random numbers are uniformly distributed between 0 and 1,1162

which is very good if you need a number between 0 and 1.1168

In lots of cases, when you need a random number in a computer program,1172

you need a random number on another range which might be from θ1 to θ2.1177

What this problem really does is, it shows you how to convert a uniform distribution on the interval from 0 to 1,1184

into a uniform distribution on the interval from θ1 to θ2.1193

That is the point of this problem, it is very useful for computer programmers1197

but that is not actually what we are doing here.1203

What we are doing is we are making a little transformation and we are starting with Y1205

being uniformly distributed on the interval from 0 to 1.1210

We are looking at another variable X which is defined to be, by the way that colon means define to be.1215

X is defined to be (θ2 – θ1)Y + θ1.1230

We want to show that that is a uniform distribution on the interval from θ1 to θ2.1236

To show that, let me first find the range of values for X.1243

Notice that, the range for Y is from 0 to 1, I have Y =0 to Y =1, if I plug those values into X,1250

if I plug Y = 0 into X, I get X = (θ2 – θ1) × 0, that drops out.1260

I just get X = θ1.1268

If I plug Y = 1 into X, I get X = (θ2 – θ1) × 1 + θ1.1270

That simplifies down to θ2.1282

That tells me the range for X, X goes from θ1 to θ2.1287

That is hopeful, at least I know that X is distributed somehow on the range from θ1 to θ2.1294

But I want to really show that X is a uniform distribution.1302

I want to calculate the probability of it landing between any 2 values.1305

Let me find that probability here.1313

The probability that X is between any two values A and B, I can calculate that as the probability that,1317

X, which by definition is (θ2 – θ1) × Y + θ1, should be between A and B.1330

What I want to do is to solve this into a set of probabilities for Y.1343

First, subtract off θ1 from all 3 sides there.1350

The probability of A – θ1 being less than or equal to (θ2 – θ1) × Y, less than or equal to B – θ1.1354

I’m trying to solve for Y, I'm trying to get Y by itself.1368

Next, I’m going to divide by θ2 – θ1.1372

This is the probability of (A – θ1)/(θ2 – θ1),1377

less than or equal to just Y by itself now, less than or equal to (B – θ1)/(θ2 – θ1).1385

I'm remembering that Y itself is a uniformly distributed random variable.1397

The probability that Y would be between any two bounds is just the difference between those two bounds.1404

You just subtract those two bounds.1411

I will just do (B – θ1)/(θ2 – θ1) – (A – θ1)/(θ2 – θ1).1413

I see now that I have a common denominator, θ2- θ1.1425

I got –θ1 in the first term and, since a minus times a minus is a plus, +θ1 in the second term.1434

Those θ1 cancel with each other and I just get down to B – A.1441

If you look at that, what I did was I started out with the probability that X is going to be between A and B.1448

What I came up with is, the probability is equal to exactly (B – A) ÷ (θ2 – θ1).1458

That is exactly the formula for a uniform distribution.1468

X has a uniform distribution; X is uniformly distributed on the interval from θ1 to θ2.1475

I should start with θ1 and go to θ2.1501

X has a uniform distribution because the probability of X falling in any interval is exactly equal to the width of that interval, B – A, divided by θ2 – θ1.1507

To recap what I did there, first, I looked at the range for Y, Y goes from 0 to 1.1519

Based on that, I calculated the range for X.1526

I plugged in those values of Y 0 and 1 into the formula for X here, and calculated the bounds for X being θ1 to θ2.1529

I know that X takes on the right range of values.1541

And then, I found the probability of any particular sub interval from A to B by converting X into terms of Y.1545

Solving out to isolate the variable Y, and then I use the fact that Y is uniformly distributed.1556

The probability of Y being between any two limits is just the width of those limits, the difference between those two limits.1563

(B – θ1)/(θ2 – θ1) – (A – θ1)/(θ2 – θ1).1572

That simplifies down to (B – A)/(θ2 – θ1), which is exactly the formula for probability with a uniform distribution.1580

That tells me that X has a uniform distribution on the interval from θ1 to θ2.1594

That is very useful if you are a computer programmer because that means1601

you can use the random number generator given by most computer programming languages.1605

And then, you can use this formula to convert it into a uniform distribution on whatever range you want.1613

If you want, for example, a random number between 80 and 100, then you just plug in θ1 = 80 and θ2 = 100.1620

You can use this formula to generate a random number between 80 and 100,1631

that will be uniformly distributed between 80 and 100.1638
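
In Python, for example, the built-in random.random() returns a value uniformly distributed on [0, 1), so the transformation from Example IV looks like the sketch below (the function name uniform_on and the 80-to-100 example are my own illustration):

```python
import random

def uniform_on(theta1, theta2):
    # Scale and shift a uniform [0, 1) value onto [theta1, theta2),
    # the X = (theta2 - theta1) * Y + theta1 transformation from Example IV.
    return (theta2 - theta1) * random.random() + theta1

print(uniform_on(80, 100))  # a uniformly distributed number between 80 and 100
```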

In example 5, we are going leave the world of the random numbers behind.1644

We are going to look at the rough and tumble world of ice cream dispensary.1649

We have an ice cream machine which gives you servings of ice cream.1653

The servings vary a little bit, if you are unlucky the machine will be stingy with you, and it will give you just 206 ml of ice cream.1659

If it is a good day for you, if the machine is feeling generous, it will give you up to 230 ml.1668

Essentially, it picks a random amount in between 206 and 230, and they are uniformly distributed.1676

The question we are trying to answer is, if you go up there with your bowl and1684

you want to predict how much ice cream you will get, you want to describe what the expected amount of ice cream is in a serving,1691

and also the standard deviation in that quantity.1698

This is really asking, the expected value and mean are the same thing.1705

We are trying to calculate the expected value or the mean of this uniform distribution, and also the standard deviation.1710

I gave you formulas for those a few slides back, in a slide called key properties of the uniform distribution.1717

You can go back and look those up, I will remind you what they are here.1728

The mean which is always the same as the expected value, by definition those are the same thing.1732

For the uniform distribution, it is (θ1 + θ2) ÷ 2.1738

The θ1 and θ2 are the endpoints of the interval.1746

In this case that is (206 + 230) ÷ 2, that is 436 ÷ 2 which is 218.1752

Our units here are ml, the average amount you expect to get when you fill up your bowl1767

at this ice cream machine is 218 ml of ice cream.1775

Of course that is not at all surprising, if you are going to get a random amount between 206 and 230,1780

it is not surprising that in the long run, you will get about halfway between 206 and 230 which is 218.1789

That is really not surprising at all.1798

The standard deviation, I also gave you on that slide, several slides ago.1801

It is (θ2 – θ1) ÷ (2√3).1805

In this case, θ2, the big one, is 230, θ1 is 206, and we want to divide that by 2√3.1812

230 – 206 is 24 (the 200s cancel), and 24 ÷ (2√3) simplifies down to 12/√3.1823

Since 12 is 4 × 3, this just gives us 4 × √3, which, if I put that into a calculator,1837

it works out to be just a little bit less than 7 ml.1849

It is about 6.9 ml.1854

If you fill up your bowl at this ice cream machine, you expect on average to get about 218 ml.1859

The standard deviation on that will be 6.9 ml, about 7 ml plus or minus from 218.1867

To recap where these numbers came from.1876

The formulas for the mean and standard deviation, I gave these to you on an early slide in this talk.1879

It was called key properties of the uniform distribution.1885

They are fairly straightforward formulas.1888

In particular, the mean is what you would guess.1889

It is just the average of the upper and lower bounds, (206 + 230)/2 gives you an average of 218 ml of ice cream per serving.1893

The standard deviation is probably not something you would guess but if you have a formula handy, it is (θ2 – θ1)/(2√3).1905

I just plugged those values in for θ2 and θ1 there.1914

And I simplified it down to 4 √3 and I have this decimal approximation that is about 6.9 or about 7 ml.1920

That is your standard deviation in an ice cream serving.1928
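
For completeness, here is a two-line Python check of those numbers (a sketch, using the formulas from the key-properties slide with θ1 = 206 and θ2 = 230):

```python
import math

theta1, theta2 = 206, 230
print((theta1 + theta2) / 2)                   # 218.0 ml, the mean
print((theta2 - theta1) / (2 * math.sqrt(3)))  # about 6.93 ml, the standard deviation
```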

That is the last example and that wraps up this lecture on the uniform distribution.1932

The uniform distribution is just the first, it is the easiest of several continuous distributions.1938

We will be moving on from here and looking at the famous normal distribution, not the same as the uniform distribution,1944

and also the gamma distribution which includes the exponential distribution and chi square distribution.1951

Those are all coming up in the next few lectures here in the probabilities series on www.educator.com.1958

You are watching Will Murray with www.educator.com, and thank you very much for joining us, bye.1965
