
William Murray

Mean & Variance for Continuous Distributions


Section 1: Probability by Counting
Experiments, Outcomes, Sample Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48


Post by Joseph Szmulewicz on January 1, 2015 (1 answer; last reply by Dr. William Murray, Sun Jan 4, 2015 7:21 PM):
Sorry, I just figured out what happened to the 2 in the second integral. The integral of y is y²/2, and the 2 from the 2/3 gets cancelled by the 2 in the denominator. Silly of me to miss this.

Post by Joseph Szmulewicz on January 1, 2015 (1 answer; last reply by Dr. William Murray, Sun Jan 4, 2015 7:21 PM):
Also, on Example 1, it says above that the variance is also calculated, but only the mean got calculated. Misprint?

Post by Joseph Szmulewicz on January 1, 2015 (1 answer; last reply by Dr. William Murray, Sun Jan 4, 2015 7:21 PM):
I found a mistake on Example 1, I think. Correct me if I am wrong, but you pulled out the 1/3 from the first integral, which is right, but the second integral has a 2/3 constant, not 1/3. When you integrate the second part, shouldn't it be 2 times y², not just y²?

### Mean & Variance for Continuous Distributions

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Mean 0:32
• Mean for a Continuous Random Variable
• Expectation is Linear
• Variance 2:55
• Variance for Continuous random Variable
• Easier to Calculate Via the Mean
• Standard Deviation 5:03
• Standard Deviation
• Example I: Mean & Variance for Continuous Distributions 5:43
• Example II: Mean & Variance for Continuous Distributions 10:09
• Example III: Mean & Variance for Continuous Distributions 16:05
• Example IV: Mean & Variance for Continuous Distributions 26:40
• Example V: Mean & Variance for Continuous Distributions 30:12

### Transcription: Mean & Variance for Continuous Distributions

Hi, welcome back to the probability lectures here on www.educator.com.0000

My name is Will Murray, we are learning about continuous probability right now.0004

Today, we are going to study the mean and variance for continuous distributions.0010

I will teach you how to calculate the mean and variance.0015

I also mentioned standard deviation.0018

Usually, we do not look at standard deviation so often for the continuous distributions,0023

but I will go ahead and show you how to calculate it, if you need to.0025

I'm going to spend most of the time though on the mean and variance.0029

Let me get started here.0034

I will remind you how we calculated the mean for discrete distributions which was as follows.0036

For discrete distributions, we calculated the mean, the expected value of our random variable0044

was the sum of all possible values of Y of Y × P of Y.0054

The mean for continuous distributions is essentially the same thing.0061

You still have a Y, instead of P of Y, you have the density function F of Y.0066

By the way, remember, there is a density function which we use f to denote the density function.0071

There is also a cumulative distribution function for which we use F.0076

This is definitely the density function not the cumulative distribution function.0083

It is essentially the same formula that we had for discrete distributions.0090

The Y is the same, the P of Y is turned into F of Y, and the summation has turned into an integral.0094

Really, you do not need to memorize a new formula.0102

If you can remember one, it is just kind of translating it into the language of continuous distributions0106

to find the expected value for a continuous distribution.0112

Remember that, mean and expected value are the same thing, those are completely synonymous.0116

If someone asks for the mean, they are asking you for the expected value, and vice versa.0122
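To make the integral formula concrete, here is a quick numeric sketch in Python (my own illustration, not from the lecture; the midpoint-sum helper and the uniform example density are assumptions for demonstration):

```python
# Approximate E(Y) = ∫ y·f(y) dy for a continuous density f on [a, b],
# using a simple midpoint Riemann sum (standard library only).

def expected_value(f, a, b, n=100_000):
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(n)) * h

# Illustration: the uniform density f(y) = 1 on [0, 1] should have mean 1/2.
mean = expected_value(lambda y: 1.0, 0.0, 1.0)
print(round(mean, 4))  # 0.5
```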

It is very useful to know that just like with discrete distributions, expectation is linear.0128

The expected value of a constant is just a constant.0137

If you have two things added together, you can split them up and calculate their expected values separately.0141

If you have a constant × a random variable then you can pull a constant outside.0146

It is a very useful property of expectation.0153

We will be using it again and again, the linearity of expectation.0156

By the way, that does not work for variance.0161

Variance is not linear, you have to be very careful not to assume that variance is linear.0164

You can get yourself in a lot of trouble that way.0170

I will go ahead and show you the definition of variance.0173

The definition of variance is the same definition that we had for discrete distributions.0177

It is the expected value of (Y - μ)², where that μ is the mean of the distribution,0183

same as the expected value of the original variable.0191

If you want to calculate that, you would multiply (Y - μ)² × the density function F of Y, and calculate an integral.0195

Now, that is not usually how you are going to calculate the variance of a continuous distribution.0207

This version of the formula is not usually very useful.0214

What is usually easier is to calculate this little formula, the expected value of Y² - the expected value of (Y)².0220

That is just like we had with the discrete distributions.0231

It works the exact same way and it is usually easier to calculate that way.0235

The way you calculate it, remember the expected value of Y is just the mean,0242

The expected value of Y², what you do is you multiply Y² × the density function, and then you have to integrate that.0253

You do have to do an integral but it is usually a simpler one than you would have done,0261

if you had calculated this (Y - μ)² × F of Y.0266

This is usually the better way to find the variance of a continuous distribution is, to use this formula down here.0271

We will practice that and you will see in the examples.0281

We will probably be using this version of the variance formula.0284
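That shortcut formula is easy to try numerically. Here is a minimal Python sketch (my own illustration; the midpoint-sum helper and the uniform example density are assumptions, not from the lecture):

```python
# Var(Y) = E(Y²) − (E(Y))², computed with midpoint Riemann sums.

def moment(f, a, b, k, n=100_000):
    """Approximate E(Y^k) = ∫ y^k f(y) dy on [a, b]."""
    h = (b - a) / n
    return sum(((a + (i + 0.5) * h) ** k) * f(a + (i + 0.5) * h)
               for i in range(n)) * h

f = lambda y: 1.0                      # uniform density on [0, 1]
variance = moment(f, 0.0, 1.0, 2) - moment(f, 0.0, 1.0, 1) ** 2
print(round(variance, 4))              # 1/12 ≈ 0.0833
```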

Usually, it seems to me that with continuous problems,0291

you are asked to calculate the variance and not often the standard deviation.0296

Let me go ahead and give you the formula for standard deviation, because it is very easy.0300

It is exactly like the discrete case, if you know the variance, the standard deviation by definition,0304

is just the square root of the variance.0310

Basically, you just calculate the variance and at the end, if you want standard deviation,0312

you calculate the square root of that.0317

There is really nothing more and nothing less than that.0319

That is possibly why all the problems that you encountered in probability classes just say,0323

calculate the variance, and do not even bother to ask you to calculate the standard deviation.0329

But if we do ask you to calculate the standard deviation, just calculate the variance first and then take its square root.0334
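In Python, that last step is one line (the variance value 2/9 below is the one worked out in Example 5 later in this lecture; using it here is my own choice of illustration):

```python
import math

variance = 2 / 9             # variance from Example 5 in this lecture
sd = math.sqrt(variance)     # standard deviation is the square root of the variance
print(round(sd, 4))          # ≈ 0.4714
```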

Let us try some examples.0342

First example here, it is the same density function that we had in example 3 of the previous video.0343

I do not think you really need to understand example 3 of the previous video.0351

Just in case this looks familiar, this is the density function that you saw before.0355

Let Y have the density function 1/3 between 0 and 1, and 2/3 between 1 and 2.0361

That is not a good vertical line at all, I will do a little graph here.0368

There is 1, there is 2, and what we have is a density function like that.0375

There is our density function and we want to find the expected value of Y.0384

We are just going to use the definition of this.0390

The expected value of Y is equal to, for any continuous distribution,0392

it is equal to the integral from -infinity to infinity of Y × the density function F of Y DY.0400

Now, I set up the integral from -infinity to infinity but here, there is really no density anywhere outside of the range between 0 and 2.0408

I’m just going to integrate from 0 to 2.0418

And then, I'm going to break it up from 0 to 1 and 1 to 2 because0420

we have two different density functions on those two ranges.0426

It is Y × 1/3 DY from 0 to 1 and Y × 2/3 DY from 1 to 2.0429

Those are both easy integrals; in fact, I'm going to pull out 1/3 from everything because0440

that just makes me have to write fewer fractions.0446

The integral of Y DY is Y²/2.0452

I have to write with that from 0 to 1.0457

The integral of 2Y DY is just Y².0461

I have to evaluate that from 1 to 2.0465

I will keep going here: 1/3 × [Y²/2, which at Y = 1 is ½ and at Y = 0 is nothing, + Y² from 1 to 2, which is 2² is 4 - 1² is 1, so 3].0472

Adding those, ½ + 3 is 3½, which is 7/2.0492

So I get 1/3 × 7/2.0504

I get my expected value of that distribution is 7/6.0511

By the way, we can check this, you will get at least an approximate check, if we look at the graph.0519

If you look at this graph up here, you will notice that most of the density is concentrated over there between 1 and 2.0525

What we found here is that the mean or the expected value is just a little bit bigger than 1, by about 1/6.0535

That is kind of the balancing point of that function.0545

It is not surprising that it is a little bit bigger than 1 because there is a little more area to the right of 1 than to the left of 1.0549

Our average there or expected value μ = 7/6.0556

To recap the steps there, I just used my definition of expected value.0562

The integral of Y × the density function F of Y.0567

I have to split that up into two different ranges, the range from 0 to 1 and the range from 1 to 2.0572

I dropped in my two different values for the density function.0578

Then I factored out a 1/3, did a couple of easy integrals, plugged in the limits, and simplified it down to 7/6.0587

When I got that number, it really was no surprise looking at the graph, because0596

the function seems to balance a bit bigger than 1, since more of the area is concentrated to the right of 1.0600
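Those steps can be confirmed numerically. Here is a short Python sketch (the midpoint-sum approach is my own illustration, not part of the lecture):

```python
# Example 1's density: f(y) = 1/3 on [0, 1) and 2/3 on [1, 2].
def f(y):
    if 0 <= y < 1:
        return 1 / 3
    if 1 <= y <= 2:
        return 2 / 3
    return 0.0

# Midpoint Riemann sum for E(Y) = ∫ y·f(y) dy over [0, 2].
n = 200_000
h = 2.0 / n
mean = sum((i + 0.5) * h * f((i + 0.5) * h) for i in range(n)) * h
print(round(mean, 4))   # 7/6 ≈ 1.1667
```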

Let us move on to example 2; we are given that θ1 and θ2 are constants.0608

I guess θ2 is the bigger one, I will go ahead and graph this, as we introduce the problem.0614

There is θ1 and there is θ2.0621

And then, we are going to consider the uniform density function F of Y is just the constant 1/(θ2 – θ1).0625

It was just constant because there is no Y in there.0633

From Y going from θ1 to θ2.0636

There is our density function.0641

By assumption, that means that it is 0 everywhere else.0643

That is our density function, our density is uniformly distributed.0648

We will talk more about the uniform distribution, later on.0654

We have us some more videos that you will see, if you scroll down about to the different continuous distributions,0657

the continuous probability distributions.0665

Uniform will be one of them, this is kind of a warm up to that video later on.0667

The value here is the constant value 1/(θ2 – θ1).0673

By the way, it has to be that constant value, in order to make the total area equal to 1.0679

Because the width is θ2 – θ1, that means the height must be 1/(θ2 – θ1).0684

You have to know that, that is the constant value there, if it is going to be constant.0690

Let us find E of Y.0697

μ is our expected value, or mean, of our distribution.0701

E of Y, now I'm going to use again, the generic formula for expected value of -infinity to infinity of Y F of Y DY.0707

In this case, the only place where we have positive density is from θ1 to θ2.0719

I will fill those in as limits and not worry about everything else out to infinity.0727

F of Y is just the constant, it is 1/(θ2 – θ1) DY.0734

I will pull our constant out.0742

I will just go ahead and do the integral because it is an easy integral.0745

Y²/(2(θ2 – θ1)).0749

I have to evaluate that from θ1 to θ2.0755

The algebra works out nicely with this, stick with me on this.0760

I get (θ2² – θ1²)/(2(θ2 – θ1)).0764

The numerator factors as (θ2 + θ1) × (θ2 – θ1); that is a difference of squares.0775

On the bottom, we just get θ2 – θ1.0787

Those factors cancel, and I'm going to write this in a different order.0791

I will just write it as (θ1 + θ2)/2, that is my expected value of the uniform distribution.0796

It is nice to work that out arithmetically, but it is also nice to realize that that is a completely intuitive result,0809

if you look at the graph.0817

Because if you look at the graph, exactly half the area is to the left and exactly half of the area is to the right.0819

It is a uniform distribution.0826

You are really not surprised at all, to find that the mean is just halfway between θ1 and θ2.0827

It is the average of the two values.0835

Let me write that a little bigger so you can actually see it.0839

(θ1 + θ2)/2; it is not surprising at all to get that mean to be halfway between the two endpoints.0842

But that is because the density is uniformly distributed; if it were not uniformly distributed,0852

we might not expect the mean to fall exactly on the halfway mark.0857

But since, it is uniformly distributed, not at all surprising that0862

we end up with our mean being just halfway down the middle, (θ1 + θ2)/2.0866

Let me show you how we calculated that quickly.0873

Use the definition of the mean or expected value, same thing by the way, of a continuous distribution0875

which is the integral from -infinity to infinity of Y × the density function DY.0882

Since our only positive density was between θ1 and θ2, I threw away those infinities and just looked at the range from θ1 to θ2.0889

I filled in my F of Y is 1/(θ2 – θ1).0902

That is a constant, it easily comes through the integral.0906

The integral of Y is Y²/2.0909

Plug in θ2 and then θ1 for Y.0912

We get θ2² – θ1².0918

It is a nice algebra here, we factored a difference of squares formula.0921

We get θ2 – θ1 canceling, we end up with (θ1 + θ2)/2, which completely confirms the suspicion that I hope you had when I first graphed this,0925

which is that the mean would come out to be right in the middle, because the density is uniformly distributed.0935

In the next example, we are going to calculate the variance of the uniform distribution.0942

Part of that calculation, we will be using the mean.0947

I want you to remember this mean.0951

We will just plug it right in, when we get to the right point in the formula for the variance on the next problem.0956
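As a numeric sanity check of that result (my own sketch; the endpoint values θ1 = 2, θ2 = 5 are arbitrary choices for illustration):

```python
# Uniform density on [θ1, θ2]; with θ1 = 2 and θ2 = 5 (arbitrary sample values),
# the mean should be (2 + 5)/2 = 3.5.
theta1, theta2 = 2.0, 5.0
f = lambda y: 1.0 / (theta2 - theta1)

# Midpoint Riemann sum for E(Y) = ∫ y·f(y) dy over [θ1, θ2].
n = 100_000
h = (theta2 - theta1) / n
mean = sum((theta1 + (i + 0.5) * h) * f(theta1 + (i + 0.5) * h)
           for i in range(n)) * h
print(round(mean, 4))   # 3.5
```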

In example 3 here, it is kind of a continuation of example 2.0967

We are still looking at 2 constant values, θ1 and θ2.0972

We have the uniform density function between θ1 and θ2.0976

We are finding the variance.0982

In example 2, we found the expected value E of Y.0984

Now, we are finding V of Y.0988

We are going to be using the answer from example 2 in example 3.0989

If you did not just watch example 2, if you are just joining us for today for example 3,0995

you really want to go back and watch example 2, and make sure you understand the answer to example 2.1001

We will be using it at a key step here in example 3.1006

For example 3, we want to find the variance.1011

The first step of that is to find the expected value of Y².1014

By definition, that means the integral from -infinity to infinity of Y² F of Y DY.1021

Just like in example 2, the only range we have here for Y is θ1 to θ2.1032

Everywhere else, we can assume that the density function is 0.1039

I'm going to cut off those infinities and just integrate from θ1 to θ2.1043

My Y² is there, and my F of Y is 1/(θ2 – θ1).1054

That is a constant, 1/(θ2 – θ1) DY.1060

That is a pretty easy integral, I get Y³/3 is the integral of Y².1064

I still have that constant 1/(θ2 – θ1).1074

I'm integrating this from Y = θ1 to Y = θ2.1079

Let me plug those in.1086

(θ2³ – θ1³)/(3(θ2 – θ1)).1088

Just like in example 2, there is a really nice algebra that works out here, if you remember your difference of cubes formula.1097

In example 2 was, the difference of squares formula which everybody remembers.1106

In example 3, it is the difference of cubes formula, which is not quite as well known; the numerator has a factor of θ2 – θ1.1110

I will remind you, A³ – B³ is A – B times1118

A² + AB, not 2AB, + B².1121

That is the difference of cubes formula.1134

I will write that up here because you might not be so familiar, people who are a little rusty on algebra.1137

(A – B) × (A² + AB + B²).1143

That is what I use right here to factor this.1150

That is really nice because in the denominator, we also have θ2 – θ1.1153

Those cancel, so E of Y² here is equal to (θ2² + θ1 θ2 + θ1²)/3.1160

Let me switch the roles of θ1 and θ2.1175

I will write (θ1² + θ1 θ2 + θ2²)/3.1178

That will be a little easier to keep track of in the next step.1184

I did say there is a next step, we have not yet found the variance.1187

Let me remind you that there is a 2 step procedure for finding the variance.1191

The variance of Y which is the same as σ² is always the expected value of Y² - the expected value of (Y)².1195

We figured out the expected value of Y² right here, it is (θ1² + θ1 θ2 + θ2²) ÷ 3.1215

The expected value of Y, we figured this out in example 2.1230

That is why I said go back and watch example 2.1234

Let me remind you of the answer from example 2, I will not work it out again.1237

But example 2, we figured out that E of Y is (θ1 + θ2)/2.1241

That is not surprising for the uniform distribution because you just get the average of the 2 endpoints.1250

Let me plug that in, (θ1 + θ2)/2.1258

We want to square that whole thing.1263

I accidentally changed my - to +.1266

Let me quickly correct that before anybody notices.1272

Let me expand this (θ1 + θ2)² because we are going to do some algebra and combine this.1277

This is going to work out really nicely, I practiced this and it worked out just great.1285

This is (θ1² + 2θ1 θ2 + θ2²)/4 because we have to square both top and bottom here.1289

We want to combine these two, it looks like it is going to be really messy but it will simplify nicely, I promise.1304

I have a common denominator, I got a 3 and 4.1310

My common denominator is going to be a 12 and that means I have to multiply the first set of terms by 4.1312

I will go ahead and do that.1320

4 θ1² + 4 θ1 θ2 + 4 θ2².1321

The next sets of terms are all negative and I need to multiply them all by 3.1330

-3 θ1² – 3 × 2 θ1 θ2, which is – 6 θ1 θ2, – 3 θ2².1335

Careful there, that last term gets multiplied by 3 as well: it is 3 θ2², not 2 θ2².1348

I can simplify this down.1355

4 θ1² – 3 θ1² gives θ1².1358

4 θ1 θ2 – 6 θ1 θ2 gives – 2 θ1 θ2, and 4 θ2² – 3 θ2² gives θ2², all over 12.1364

What you will notice is that numerator is exactly the perfect square of either θ1 - θ2 or θ2 – θ1.1380

I’m going to say θ2 – θ1.1391

It does not matter but I like that, because I have been told that θ1 is less than θ2.1396

θ2 – θ1 is positive, let us keep it positive.1402

(θ2 – θ1)²/12 is my variance, that is quite a nice formula.1404

I like the fact that it simplifies that way.1423

Unlike the mean, the answer from example 2, I do not think the variance is totally obvious.1425

I could not have just look at that and told you off the top of my head that that was going to be the variance.1431

It is not totally surprising because it is dependent on the distance from θ1 to θ2.1438

Remember, the variance kind of measures how spread out your distribution is.1447

If θ1 and θ2 are far apart, then, we get a bigger variance here.1451

That makes intuitive sense but I do not think I could have looked at that and just say,1455

it is definitely going to be (θ2 – θ1)²/12.1460

That would not have been so obvious.1463

Whereas with the expected value, if you are really on top of your game,1465

you could probably eyeball that distribution and say, I know it is going to be (θ1 + θ2)/2.1469

Let me show you the steps that we use for that.1475

We started out finding E of Y².1478

The reason I did that was because I was looking forward to this formula for the variance, E of Y² – E of (Y)².1482

I am going to find E of Y² first.1491

By definition, that is Y² × the density function and we will integrate that.1494

The range we want to integrate on is θ1 and θ2.1500

The density function is just this constant 1/(θ2 – θ1).1504

That integrates easily: Y² integrates to Y³/3; then drop in the values of Y at the endpoints, θ1 and θ2.1507

Using this nice difference of cubes formula, it expands out and cancels.1519

We get θ2 – θ1 canceling from the top and bottom.1526

We get a fairly nice little expression for the expected value of Y².1530

But that is not the variance yet, you have to subtract off the expected value of (Y)².1536

And the expected value of Y is what we figured out in example 2, that is (θ1 + θ2)/2.1542

That is where I’m getting this from.1549

Because I want to put everything over a common denominator, I expanded this out.1553

It is quite a mess here, especially when we put it over a common denominator of 12,1559

and in particular, it simplifies down to this very nice formula, (θ2 – θ1)²/12.1563

I could have said (θ1 – θ2)² as well, but I used (θ2 – θ1)² because θ2 – θ1 is a positive number,1572

that is because θ2 is the bigger one.1579

That is the variance of the uniform distribution.1582

We are going to do a whole video on the uniform distribution later on,1585

if you just scroll down you should be able to see that.1588

You will see that we will be using this mean and variance, that we worked out here in examples 2 and 3.1591
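The variance formula from this example can also be checked numerically (my own sketch; the endpoint values θ1 = 2, θ2 = 5 are arbitrary choices for illustration):

```python
# Uniform density on [θ1, θ2] with sample values θ1 = 2, θ2 = 5:
# the variance should be (5 − 2)²/12 = 0.75.
theta1, theta2 = 2.0, 5.0
f = lambda y: 1.0 / (theta2 - theta1)

def moment(k, n=100_000):
    """Midpoint Riemann sum for E(Y^k) = ∫ y^k f(y) dy on [θ1, θ2]."""
    h = (theta2 - theta1) / n
    return sum(((theta1 + (i + 0.5) * h) ** k) * f(theta1 + (i + 0.5) * h)
               for i in range(n)) * h

variance = moment(2) - moment(1) ** 2   # Var(Y) = E(Y²) − (E(Y))²
print(round(variance, 4))   # 0.75
```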

Let us look at example 4 here.1602

We have Y has density function ½ × 2 - Y on the range 0 to 2, and 0 elsewhere.1604

We want to find the expected value of Y.1613

In example 5, we are going to keep going with the same random variable.1615

We will find the variance of Y.1621

You want to make sure that you watch these two examples in tandem because1624

we are going to be using the results from one in the next one.1629

Let us find the expected value of Y.1632

The expected value of Y by definition, remember, it is always the integral from -infinity to infinity of Y ×1635

the density function f of Y DY.1644

In this case, I do not need to integrate from -infinity to infinity because the only place where the density is positive is from 0 to 2.1647

I’m just going to integrate from 0 to 2 of Y.1656

I will fill in the density function ½ × 2 – Y DY.1660

I think I will distribute that to make it a little easier to integrate.1666

Y × ½ × 2 is just Y – Y × ½ × Y.1670

That is ½ Y², just distributing those.1680

I cannot put off the calculus any longer; integrating those, I get Y²/2 minus Y³/6, since the integral of Y² is Y³/3 and the ½ makes it Y³/6.1685

I need to evaluate that from Y = 0 to Y = 2.1702

I will plug those values in: Y²/2 at Y = 2 is 2²/2 = 4/2, which is 2, minus Y³/6, which is 8/6.1708

That is 2 - 4/3; 2 is 6/3, and 6/3 - 4/3 is 2/3.1719

That is my answer for the expected value.1740

That was fairly straightforward, it came straight out of the definition here.1742

The expected value of a random variable is, you integrate Y × the density function from -infinity to infinity.1746

But in practice, you end up integrating only where your density function is defined to be positive.1758

That is on this range right here 0 to 2 because everywhere else, it is 0.1764

Let me integrate Y × F of Y; I just plug in the density function from here.1769

I do not need to worry about the region on which it is 0.1774

Plug that in, distribute the terms to make it easier to integrate, do a little integral,1779

drop in the numbers and I get my expected value of 2/3.1786

You want to hang onto this value for the next example.1790

In example 5, we are going to come back to the same density function and we are going to calculate the variance.1794

Remember, a key step in calculating the variance is using the expected value.1801

Remember this value of 2/3, we are going to use it again in the next example.1806
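Here is a numeric check of that 2/3 (my own Python sketch; the midpoint-sum approach is an illustration, not part of the lecture):

```python
# Example 4's density: f(y) = ½(2 − y) on [0, 2]; the mean should be 2/3.
f = lambda y: 0.5 * (2 - y)

# Midpoint Riemann sum for E(Y) = ∫ y·f(y) dy over [0, 2].
n = 100_000
h = 2.0 / n
mean = sum((i + 0.5) * h * f((i + 0.5) * h) for i in range(n)) * h
print(round(mean, 4))   # 2/3 ≈ 0.6667
```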

In example 5, we are looking at the same density function that we had in example 4,1813

F of Y is ½ × 2 – Y on the range from 0 to 2, and at 0 everywhere else.1818

We want to find the variance.1825

Let me remind you of the useful way to calculate the variance.1827

The variance is E of Y² – E of (Y)².1832

What we will do first is we will calculate E of Y², and then we will come back and plug it into this formula.1842

E of Y², by itself here, I’m not calculating the whole variance yet.1850

I’m just finding E of Y².1855

By definition, that is the integral from -infinity to infinity of Y² × whatever density function we have for this variable.1857

In this case, our density function is 0 everywhere outside the range is 0 to 2.1868

I just need to integrate from 0 to 2 Y².1874

We look at that density function is ½ × 2 – Y DY, that is the integral from 0 to 2.1879

If I distribute that Y² and ½, Y² × ½ × 2 is just Y².1888

And then, Y² × ½ × Y is ½ Y³.1896

I have to integrate all that with respect to Y.1903

My integral is, Y² integrates to Y³/3.1907

Y³ integrates to Y⁴/4, but I also got a ½ here.1913

I have to make that Y⁴/8.1919

I want to integrate all of that from 0 to 2, Y = 2.1923

I get Y³/3, that is 8/3, minus Y⁴/8, that is 16/8.1934

If I plug in 0 nothing happens there, I do not need to worry about that.1944

That is 8/3 - 2; 2 is 6/3, so the expected value of Y² is just 2/3.1948

That is not the answer to the problem yet, we are supposed to find the variance.1960

The expected value of Y² is just one step in finding the variance.1965

Here is the rest of it.1968

E of Y² is 2/3, we just calculated that.1970

The expected value of Y, I would have to do another whole calculation to find that.1975

But I did that calculation in example 4, I hope you did it with me.1981

If you have not watched example 4, now is the time when we are using the answer.1985

It came out to be the same thing, it was 2/3 by example 4.1993

That is really coincidence, the fact that it came out to be the same as E of Y².2001

I would not put too much stock in the fact that those numbers came out to be the same.2006

It is not going to happen usually, there is no sort of property that made those the same.2012

It is just the way that the integrals worked out.2018

This is 2/3 - E of (Y)², which is 2/3 - (2/3)².2020

This is 2/3 – 4/9, 2/3 is 6/9.2029

Multiplying top and bottom by 3 gives 6/9 - 4/9, and I finally get my answer here: the variance is 2/9,2037

the variance of this random variable.2047

In case you are a little fuzzy on any of the steps, let us check those out again.2055

My generic formula for the variance is always E of Y² – E of (Y)².2061

The E of Y by itself, I figured out in example 4.2069

You can go back and see the video in example 4, where that came from.2073

We figure out that it was 2/3, that is where I dropped that in for E of Y.2076

This E of Y² takes more work.2081

E of Y², we actually have to calculate that using an integral of Y² × the density function.2084

I put in Y², I put in the density function that came from right here.2090

These bounds from 0 to 2 came from these limits right here, because everywhere else the function is 0.2096

I really only need to do my integral from 0 to 2.2105

Once I got in that form, it is relatively easy algebra to distribute the terms to do the calculus there.2110

It is just the power rule, drop in the limits 0 and 2, and simplify it down to 2/3.2118

That 2/3 gives me that 2/3 for E of Y².2125

The other 2/3 came from E of Y, that was from example 4.2131

It is really a coincidence that those numbers were both 2/3.2135

That is not any property manifesting itself here, do not read too much into that.2138

Those could easily have been different numbers.2144

Now, I just simplify the fractions 2/3 - 4/9 simplifies down to 2/9.2147

That is my variance for that random variable.2150
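Both numbers from this example, including the coincidental matching 2/3 values, can be checked numerically (my own Python sketch, not part of the lecture):

```python
# Example 5: f(y) = ½(2 − y) on [0, 2]; E(Y) = E(Y²) = 2/3, so Var(Y) = 2/9.
f = lambda y: 0.5 * (2 - y)

# Midpoint Riemann sums for E(Y) and E(Y²) over [0, 2].
n = 100_000
h = 2.0 / n
mids = [(i + 0.5) * h for i in range(n)]
ey = sum(y * f(y) for y in mids) * h
ey2 = sum(y * y * f(y) for y in mids) * h
variance = ey2 - ey ** 2                   # Var(Y) = E(Y²) − (E(Y))²
print(round(ey2, 4), round(variance, 4))   # 0.6667 0.2222
```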

That wraps up this lecture on mean and variance for continuous distributions.2156

This is part of the chapter on continuous probability which is part of our larger lecture series on probability.2162

You are watching the probability lectures here on www.educator.com, with your host Will Murray.2171

Thank you very much for joining me today, bye.2177
