  William Murray

Conditional Probability & Conditional Expectation

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
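The two-choice form of Bayes' rule covered in this lecture can be sketched in a few lines of Python. The diabetes-test numbers below are hypothetical stand-ins for illustration, not the figures used in Example I.

```python
# Hypothetical inputs (not the lecture's numbers):
# 8% of patients are diabetic; the test has a 95% true-positive rate
# and a 10% false-positive rate.
p_diabetic = 0.08
p_pos_given_diabetic = 0.95
p_pos_given_healthy = 0.10

# Total probability of a positive test (disjoint union of the two cases).
p_pos = (p_pos_given_diabetic * p_diabetic
         + p_pos_given_healthy * (1 - p_diabetic))

# Bayes' rule for two choices: P(diabetic | positive test).
p_diabetic_given_pos = p_pos_given_diabetic * p_diabetic / p_pos
```

Even with a fairly accurate test, the posterior probability here is under one half, because the condition itself is rare in the population.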
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
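The "most useful way to calculate variance" mentioned in this lecture, V(Y) = E(Y²) − μ², is easy to check numerically. The three-point distribution below is a made-up example.

```python
from math import sqrt

# Hypothetical discrete distribution: value -> probability.
dist = {1: 0.2, 2: 0.5, 3: 0.3}

mu = sum(y * p for y, p in dist.items())              # mean E(Y)
var = sum(y**2 * p for y, p in dist.items()) - mu**2  # V(Y) = E(Y^2) - mu^2
sd = sqrt(var)                                        # standard deviation
```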
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
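Markov's inequality, P(Y ≥ a) ≤ E(Y)/a for a nonnegative random variable, can be checked by simulation. The two-dice setup below is illustrative and is not one of the lecture's examples.

```python
import random

random.seed(0)  # reproducible illustration

# Nonnegative random variable: the sum of two fair dice, with E(Y) = 7.
samples = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000)]
mean = sum(samples) / len(samples)

a = 10
empirical = sum(s >= a for s in samples) / len(samples)
markov_bound = mean / a  # Markov: P(Y >= a) <= E(Y) / a
# The bound is loose here: the true P(Y >= 10) is 6/36, while the bound is 0.7.
```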
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
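As a quick numerical check on the formulas in this lecture, here is a minimal binomial PMF in Python, confirming that the probabilities sum to 1 and that the mean and variance come out to np and np(1 − p). The coin-flipping parameters are an arbitrary choice.

```python
from math import comb

def binomial_pmf(n, p, y):
    """P(Y = y) for a binomial random variable: C(n, y) p^y (1-p)^(n-y)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 10, 0.5  # e.g. flipping a fair coin 10 times
total = sum(binomial_pmf(n, p, y) for y in range(n + 1))
mean = sum(y * binomial_pmf(n, p, y) for y in range(n + 1))
var = sum((y - mean)**2 * binomial_pmf(n, p, y) for y in range(n + 1))
```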
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
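The geometric-series facts this lecture recalls from Calculus II can be verified numerically: the probabilities (1 − p)^(y−1) p total 1, and the mean is 1/p. The die-rolling parameter p = 1/6 below is just an illustration.

```python
p = 1 / 6  # e.g. rolling a die until the first six

def geometric_pmf(y, p):
    """P(Y = y) = (1-p)^(y-1) * p, for y = 1, 2, 3, ..."""
    return (1 - p) ** (y - 1) * p

# Truncating the infinite series at N = 2000 loses a negligible tail.
N = 2000
total = sum(geometric_pmf(y, p) for y in range(1, N + 1))
mean = sum(y * geometric_pmf(y, p) for y in range(1, N + 1))
```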
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
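A minimal Poisson PMF makes the fact confirmed in Example III, that the mean of the distribution is λ, easy to check numerically. The value λ = 3 here is an arbitrary choice.

```python
from math import exp, factorial

lam = 3.0  # arbitrary rate parameter for illustration

def poisson_pmf(y, lam):
    """P(Y = y) = lam^y * e^(-lam) / y!"""
    return lam**y * exp(-lam) / factorial(y)

# Truncating the infinite sum at y = 60 loses a negligible tail for lam = 3.
total = sum(poisson_pmf(y, lam) for y in range(60))
mean = sum(y * poisson_pmf(y, lam) for y in range(60))
```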
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
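In the discrete case outlined in this lecture, conditional probability and conditional expectation reduce to dividing by a marginal. The 2×2 joint probability table below is a made-up example.

```python
# Hypothetical joint probability function p(y1, y2) for two discrete variables.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def marginal_y2(y2):
    """Marginal p2(y2): sum the joint probabilities over y1."""
    return sum(p for (y1, v2), p in joint.items() if v2 == y2)

def cond_prob(y1, y2):
    """Conditional probability p(y1 | y2) = p(y1, y2) / p2(y2)."""
    return joint[(y1, y2)] / marginal_y2(y2)

def cond_exp(y2):
    """Conditional expectation E[Y1 | Y2 = y2] = sum of y1 * p(y1 | y2)."""
    return sum(y1 * cond_prob(y1, y2) for y1 in (0, 1))
```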
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
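The practical claim of the Central Limit Theorem, that standardized sample means look approximately standard normal even when the population is not normally distributed, can be seen in a short simulation. The uniform population and the sample size below are arbitrary choices, not taken from the lecture's examples.

```python
import random
from math import sqrt

random.seed(1)  # reproducible illustration

# Non-normal population: uniform on [0, 1], with mean 1/2 and variance 1/12.
mu, sigma = 0.5, sqrt(1 / 12)
n, trials = 100, 10_000

# Standardize each sample mean: Z = (Ybar - mu) / (sigma / sqrt(n)).
z_values = [
    (sum(random.random() for _ in range(n)) / n - mu) / (sigma / sqrt(n))
    for _ in range(trials)
]

# For a standard normal variable, about 95% of values fall within 1.96 of 0.
share_within_196 = sum(abs(z) < 1.96 for z in z_values) / trials
```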
• ## Transcription

### Start Learning Now

Our free lessons will get you started (Adobe Flash® required).

### Membership Overview

• *Ask questions and get answers from the community and our teachers!
• Practice questions with step-by-step solutions.
• Track your course viewing progress.
• Learn at your own pace... anytime, anywhere!

### Conditional Probability & Conditional Expectation

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Review of Marginal Probability 0:46
• Recall the Marginal Probability Functions & Marginal Density Functions
• Conditional Probability, Discrete Case 3:14
• Conditional Probability, Discrete Case
• Conditional Probability, Continuous Case 4:15
• Conditional Density of Y₁ given that Y₂ = y₂
• Interpret This as a Density on Y₁ & Calculate Conditional Probability
• Conditional Expectation 6:44
• Conditional Expectation: Continuous
• Conditional Expectation: Discrete
• Example I: Conditional Probability 8:29
• Example II: Conditional Probability 23:59
• Example III: Conditional Probability 34:28
• Example IV: Conditional Expectation 43:16
• Example V: Conditional Expectation 48:28

### Transcription: Conditional Probability & Conditional Expectation

Hi, welcome back to the probability videos here on www.educator.com.0000

Today, we are going to talk about conditional probability and conditional expectation.0006

We are going to be using some of the concepts of marginal probability which I talked about in the previous video.0011

If you are not familiar with marginal probability, I’m going to give you a really quick review of that,0018

at the beginning of this video.0022

If you really have not worked through the previous video, then it is probably better if you go back0025

and look at that previous video on marginal probability.0030

Get yourself really solid on that, and then come back and0033

we will talk about conditional probability and conditional expectation.0036

You will see how the marginal probability is used to calculate conditional probability and conditional expectation.0040

Let me just remind you what we learned in the last video about marginal probability.0048

Again, if you have not watched that video, you probably do want to watch that video from scratch0053

because there is a lot of information in there, and also, a lot of good practice in actually calculating marginal probability.0058

This is just a quick review so that you will see how these concepts go into conditional probability.0063

In all these ideas, we have an experiment with two random variables Y1 and Y2.0070

The marginal probability functions that we learned about in the last video, for discrete probability,0076

we have P1 of Y1 is the sum over Y2 of all the probabilities of all the combinations of Y1 and Y2.0084

Notice that, there is this variable change in the subscripts.0093

When you are finding the probability function for Y1, what you end up doing for Y1 is you sum over Y2.0098

And then, vice versa, when you find the marginal probability function for Y2, you sum over Y1.0108

Those are the marginal probability functions in the discrete case.0118

In the continuous case, you have a marginal density function for Y1.0122

And then, you integrate over all possible values of Y2.0127

It will not always be from -infinity to infinity, because sometimes the domain you are interested in is much smaller than that.0131

But, I just wrote the most general one to introduce it here.0139

The important thing here is, when you are finding the function for Y1, you look at all the possible values for Y2,0143

and then you integrate the joint density function over Y2.0151

Conversely, when you are finding the marginal density function for Y2, you look at all the values for Y1 and you integrate over Y1.0155

In the previous lecture, we practiced calculating some marginal probability functions and some marginal density functions.0168

If you do not remember how to calculate those, just jump back one video and check those out.0177

We are going to be using some of the answers to the examples in the previous lecture,0182

as part of the examples in this lecture.0187

You really want to be solid on that, before we move forward and learn some new stuff.0190

The big new idea for this video is conditional probability.0196

We will have a discrete case and a continuous case.0200

The probability of Y1 conditioned on Y2 really means the following:0204

given that Y2 has a particular value y2, what is the probability that Y1 comes out to be y1?0209

The way you calculate it is, you use the probability function, the joint probability function of Y1 and Y2.0224

And then, you divide by the marginal probability function.0231

Remember, as we figured out before, this is the sum over Y1 of the probability of Y1 and Y2.0236

Remember, you have that variable switch always.0245

That is the conditional probability formula in the discrete case.0248

The conditional probability in the continuous case is a little more complicated.0252

I can spell that out for you.0259

Given that we know the value for Y2, the way we find the conditional density function0262

is we write it as F of Y1 conditioned on Y2.0269

You take the joint density function and then you divide it by the marginal density function of Y2.0274

Just a reminder, since it is the marginal density function of Y2,0283

it is the integral on Y1 of F of Y1 Y2 DY1.0291

We have a lot of variables to keep track of here; it tends to get quite confusing.0301

What you want to do with this formula is, you want to interpret it as a density function on Y1.0305

And then, you can calculate conditional probability.0314

Suppose, you know you are given that Y2 has a particular value.0317

If you are given or you know somehow that Y2 has a particular value,0322

you want to ask what is the probability that Y1 will be in a particular range?0327

The idea here is that you have a known value of Y2.0333

The question we are asking is, what is the probability that Y1 will be in this particular range?0345

The way you answer that is, you use this conditional density formula.0355

You do F of Y1 conditioned on Y2 and then, you integrate that with respect to Y1 between the two values that you are interested in.0360

This A and this B give you the two limits on the integral.0370

Quite confusing actually, but we will practice this in the examples, and I hope it will start to make some sense.0375
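
Before moving on, here is a small Python sketch of this continuous conditional probability recipe, using a made-up joint density f(y1, y2) = y1 + y2 on the unit square (my own illustration, not one of the lecture's examples). Exact fractions stand in for the hand integration:

```python
from fractions import Fraction as F

# Hypothetical joint density f(y1, y2) = y1 + y2 on 0 <= y1, y2 <= 1
# (an assumed example, not one of the lecture's densities).
# Marginal: f2(y2) = integral over y1 from 0 to 1 of (y1 + y2) = 1/2 + y2.
def f2(y2):
    return F(1, 2) + y2

# Conditional density f(y1 | y2) = (y1 + y2) / f2(y2); its antiderivative
# in y1 is (y1**2/2 + y1*y2) / f2(y2), so probabilities are differences.
def cdf(y1, y2):
    return (y1**2 / 2 + y1 * y2) / f2(y2)

y2 = F(1, 2)
print(cdf(F(1), y2) - cdf(F(0), y2))        # total mass over y1 → 1
print(cdf(F(3, 4), y2) - cdf(F(1, 4), y2))  # P(1/4 <= Y1 <= 3/4 | Y2 = 1/2) → 1/2
```

The point of the sketch is just the shape of the recipe: joint density over marginal density, then integrate in y1 between the two limits A and B.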

There is one more thing that we need to learn which is conditional expectation.0382

You want to remember this formula when we jump to the next slide0388

because the conditional expectation is going to look very much like this formula.0391

It is going to have one extra factor and that extra factor is going to go in right there.0396

Keep an eye out for that, in the next slide.0403

The next topic that we are learning here is conditional expectation.0406

Let me go ahead and skip past the discrete case, and talk about the continuous case0413

because I think it is a little easier to follow.0418

The idea here is that you have a known value of Y2.0420

Someone has told you the value of Y2 and you are trying to predict the value of Y1.0424

What is your expected value of Y1, based on a particular known value of Y2?0431

What you do is, you integrate the conditional density function except here is that extra term,0439

that extra factor that I warned you about on the previous slide.0447

That was not there in the formula on the previous slide; everything else looks the same.0451

But you stick in that extra Y1 and then, you integrate it over Y1,0456

just as we did in the previous slide, when we are calculating continuous probability.0461

This is kind of reflecting the fact that we are finding the expected value of Y1.0467

That is why we put that extra term in there, it is all based on knowing a value of Y2.0472

I hope that is starting to make some sense; if it does, let us go up and look at the discrete formula.0480

It essentially looks exactly the same, except I just changed the continuous F to a discrete P0486

and the continuous integral to a discrete sum.0494

It is just the discrete analogue of that continuous formula.0497

I think a lot of these will make more sense, after we do some examples.0502

Come along with me and let us work out some examples together.0505
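
The conditional expectation formula can be sketched the same way, again with the made-up density f(y1, y2) = y1 + y2 on the unit square (an assumed example, not from the lecture); note the extra factor of y1 inside the integral:

```python
from fractions import Fraction as F

# Hypothetical density f(y1, y2) = y1 + y2 on 0 <= y1, y2 <= 1,
# with marginal f2(y2) = 1/2 + y2 (integrate y1 + y2 over y1 from 0 to 1).
# E[Y1 | Y2 = y2] = (integral over y1 of y1 * (y1 + y2)) / f2(y2)
#                 = (1/3 + y2/2) / (1/2 + y2)
def cond_exp(y2):
    return (F(1, 3) + y2 / 2) / (F(1, 2) + y2)

print(cond_exp(F(1, 2)))   # → 7/12
```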

In the first example here, we are given the joint density function F of Y1, Y2 = 6 × (1 - Y2),0511

on the triangle with corners (0,0), (0,1), and (1,1).0520

I think right away, what I am going to do is graph that, because that is the first challenge,0524

is just to visualize these things.0531

(0,0) is right here; think of the vertical axis as being Y2 and this as Y1; there is 1 and there is 1 on the Y2 axis.0534

So, (0,1) is right there and (1,1) is the point right there.0548

We want the triangle with those 3 corners, I will connect those up and make myself a nice triangle.0555

We want to find the probability that Y2 is less than or equal to ½,0565

given that Y1 is less than or equal to ¾.0571

This is kind of a trick question to be throwing into this lecture,0578

because this is not a problem for the new conditional and marginal probability formulas.0583

That is because there are less-than-or-equal-to signs in both of these.0591

Let me describe what is going on with this and hopefully I can sort it out for you.0596

We are going to use the original conditional probability formula from several lectures ago.0603

This goes way back to the beginning of the course.0610

You have to scroll way up to find this conditional probability formula.0612

The probability of A given B is the probability of A intersect B divided by the probability of B.0615

That was our original conditional probability formula.0625

What we want here is the probability of Y2 being less than or equal to ½ and0628

Y1 being less than or equal to ¾, divided by the probability of Y1 (that is the B) being less than or equal to ¾.0641

That is what we calculate here.0655

Both the numerator and denominator are describing regions inside this triangle.0657

Let me look at the numerator first.0665

Maybe, I will look at the denominator first.0667

I think that one is a little more challenging.0669

Let me fill in some values here.0671

There is ½, there is ½, and I know I'm going to look at Y1 = ¾, there is ¾ there.0673

Maybe, that is enough there.0682

Let us look at the denominator.0686

It is the probability that Y1 is less than or equal to ¾.0688

My Y1 looks a little bit messy here, let me see if I can write that a little cleaner.0695

Y1 being less than or equal to ¾ is, there is the Y1 = ¾.0702

The blue region right there, that is where Y1 is less than or equal to ¾.0713

That looks a little tricky to describe.0724

I think, what I want to do is describe that as Y1 goes from 0 to 3/4 and Y2,0728

if I can describe that in red, Y2 goes from Y1 up to 1.0738

Y2 goes from Y1 up to 1, this is really lots and lots of multivariable calculus.0747

If you have not watched these lectures on multivariable calculus recently,0756

it is a great time to review those, lots of review of that subject coming in.0759

This denominator region is the integral from Y1 = 0 to Y1 = ¾.0765

The integral from Y2 = Y1, Y2 = 1.0777

And then, we have our joint density function 6 × (1 - Y2) DY2 DY1.0784

That is going to be a double integral, we can calculate it out.0797

It would not be a whole lot of fun but it is nothing too difficult either.0800

The numerator region, Y2 is less than ½ and Y1 is less than or equal to ¾.0805

Let me just graph Y2 less than ½, that is everything below that line.0812

Let me make that a dotted line and make that a little easier to separate out there.0821

Y2 = ½, there it is.0829

I want everything below that line and to the left of the other dotted line.0834

That really just means below the dotted line for ½.0839

If I describe that region, this was the denominator region here.0846

The numerator region is, I think it will be easier if I describe Y2 first.0850

And then, I can describe Y1 going that way.0857

That will give you 0 for Y1; it will be nicer there.0861

If I say Y2 goes from 0 to ½ and then Y1 goes from 0 to the diagonal line Y2.0864

What we will be integrating is Y2 = 0 to Y2 = ½.0878

And then, the integral of Y1 = 0 to Y1 = Y2.0887

The same density function 6 × (1 - Y2) DY1 DY2.0894

I have given myself a nice pile of double integrals to solve.0903

You would be forgiven, if you did not want to plow through the multivariable calculus with me,0908

because it is all just multivariable calculus from here.0912

There is really not too much more probability to be learned from this.0915

I do want to go ahead and solve them.0920

If you do not want to solve the integrals with me, feel free to skip to the end and just check out the numbers.0922

Maybe solve one on your own and see if you agree.0927

Or you can take these double integrals and plug them into your favorite online double integral solver.0929

This integral of 6 × (1 - Y2) in the numerator, we are integrating that with respect to Y1.0937

That is 6 × (1 - Y2) × Y1, evaluated from Y1 = 0 to Y1 = Y2.0945

That is going to give me, we are still working in the numerator.0958

6 × (1 - Y2) × Y2, which is 6Y2 - 6Y2².0963

And then, we are still integrating that from Y2 = 0 to Y2 = ½ DY2.0981

That was my numerator and let me go ahead and keep solving that.0991

6Y2 integrates to 3Y2², and -6Y2² integrates to -2Y2³.0997

We evaluate 3Y2² - 2Y2³ from Y2 = 0 to Y2 = ½.1010

We get 3 × (½)², which is ¾, minus 2 × (½)³; 2 × 1/8 is ¼; and ¾ - ¼ is ½ for my numerator.1021

Now, that was the easy one.1040

I think the denominator actually turned out to be a bit worse, unfortunately.1042

Let me go ahead and see what we get in the denominator here.1046

6 × (1 - Y2) integrated with respect to Y2: the 6 integrates to 6Y2, and -6Y2 integrates to -3Y2².1050

We want to integrate that from Y2 = Y1 to Y2 = 1.1067

That is going to give me (6 - 3) - 6Y1 + 3Y1², that is, 3 - 6Y1 + 3Y1².1077

We still have to integrate that.1096

That is in the numerator, that should have been a DY2.1100

We are still integrating the denominator DY1.1103

The integral of 3Y1² is Y1³, and -6Y1 integrates to -3Y1².1109

6 - 3 is 3, which integrates to 3Y1, and we want to evaluate that.1126

What are my bounds, Y1 goes from 0 to ¾, Y1 = 0 to Y1 = ¾.1134

I still have ½ in my numerator; it is waiting for the denominator.1145

If I plug in ¾ for the denominator, this is pretty nasty.1150

Y1³ will be 27/64, 3Y1² is 3 × 9/16, so 27/16, and 3Y1 is 9/4.1154

Let us see, my common denominator there is 64, so I keep 27/64, and 27/16 becomes 108/64.1173

For the 9/4, I have to multiply by 16 because I have 64 in the denominator.1190

That is 144/64, since 9 × 16 is 144, and this all simplifies down to ½ × (27 - 108 + 144)/64; 27 - 108 + 144 is 63.1196

That gives 63/64 in the denominator; do the flip there, and it is ½ × 64/63.1211

And, I finally come up with 32/63.1221

What an unpleasant slog of integration.1226

Let me show you the key steps in setting up those integrals because that is really what I want you to understand.1231

Solving the integrals is just kind of a tedious exercise in multivariable calculus.1238

Setting up the integrals is where it is really vital here.1242

The first thing I did was, to graph the triangle with these corners 0-0, 0-1, and 1-1.1247

And then, I wanted to find a conditional probability; these are both describing regions and not constants.1256

If they were describing constants, I would jump to the conditional probability formulas1264

that I learned in this lecture.1268

But since they are describing regions, I am just going to use my old conditional probability formula, right here.1270

The probability that they are both true divided by the probability of B.1277

Let me show you that, that gives us that formula right here.1285

And then, I had to translate each one of these regions into regions on the graph, that I can use to set up integrals.1293

Y2 less than ½ and Y1 less than ¾: Y2 less than ½ is that region right there,1302

everything south of that line; Y1 less than ¾ is everything to the left of that line, right there.1316

My numerator is that region right there.1329

Y2 goes from 0 to ½, Y1 goes from 0 to Y2.1335

That is where I got my bounds for the integral here.1339

My denominator is just Y1 less than ¾, that is everything to the left of this line.1343

That is the line Y1 = ¾.1351

I got to describe that region that kind of looks like a backwards state of Nevada.1354

That is my denominator region, I described as Y1 goes from 0 to 3/4 and Y2 goes from Y1 up to 1.1361

That is where I got these limits for this denominator integral.1372

That is all kind of messy but from there, it just turns into a multivariable calculus exercise.1379

The key thing here is that, you have to keep straight which variable you are integrating with respect to.1386

In this numerator integral, we are integrating with respect to Y11395

which means that 6 × (1 - Y2) was a constant; that is why it integrated to 6 × (1 - Y2) × Y1.1399

In the denominator, we are integrating first with respect to Y2, which is why, when we integrate 6 × (1 - Y2),1407

we get something very different: 6Y2 - 3Y2².1414

I do not think I’m going to continue to pursue all the details of the integration.1420

I just kind of worked through all these integrals and plugged in all the fractions; it got fairly hairy here,1427

but then it simplified down to 32/63.1433
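
As a sanity check on all that fraction arithmetic, the two double integrals can be redone with exact fractions in Python, reusing the inner antiderivatives worked out above:

```python
from fractions import Fraction as F

# Numerator: integral over y2 in [0, 1/2] of integral over y1 in [0, y2]
# of 6(1 - y2). The inner integral gives 6y2 - 6y2^2, whose
# antiderivative is 3y2^2 - 2y2^3.
half = F(1, 2)
num = 3 * half**2 - 2 * half**3        # evaluated from 0 to 1/2

# Denominator: integral over y1 in [0, 3/4] of integral over y2 in [y1, 1]
# of 6(1 - y2). The inner integral gives 3 - 6y1 + 3y1^2, whose
# antiderivative is 3y1 - 3y1^2 + y1^3.
q = F(3, 4)
den = 3 * q - 3 * q**2 + q**3          # evaluated from 0 to 3/4

print(num, den, num / den)             # → 1/2 63/64 32/63
```

If the hand integration is ever in doubt, the same numbers should also come out of any numerical double integrator over the two regions.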

In example 2, we are starting out with the same joint density function on the same region.1441

We are also finding a conditional probability but now,1448

we are finding a conditional probability conditioned on a constant value of Y1.1452

The big difference here is that, in example 1, we had a less-than-or-equal-to here.1457

In example 2, we have an equals here, which means we are going to use our new conditional density formulas.1463

I'm going to go ahead and draw the region that we are talking about.1472

But, we are going to be using our conditional density formulas that we learned in this lecture.1476

I’m trying to make my axes a little bit straighter.1483

This is the same region we had before.1486

I know the general shape of the region: it is this triangle here with corners (0,0), (0,1), and (1,1); there is my region.1488

What we are given now is that, Y1 is equal to ½.1500

There is Y1 is the horizontal axis, Y2 is the vertical axis.1509

We are given that Y1 is equal to ½.1513

We are given that we are sort of living on this red vertical line, right here.1519

We want to find the probability that Y2 is bigger than ¾.1528

Let me draw that, there is ½ and there is ¾.1534

We want to find the probability that we are on this part of the line, that upper half of that line.1539

What we are going to do is solve this using our conditional density formula.1549

Let me remind you what that was, this is coming straight from one of the early slides in this lecture.1555

I think it was the third slide in this lecture.1562

Remember, what we want to use there is our conditional probability formula.1566

We are going to use the integral from Y2 = ¾.1573

The biggest value Y2 could be is 1.1579

to Y2 = 1 of the conditional density formula F of Y2 conditioned on Y1.1583

And then, we are going to integrate that with respect to Y2.1595

Let me remind you what this conditional density formula was.1600

The conditional density formula is F of Y2 conditioned on Y1.1605

I think this is the opposite way, from the way I had it in the first slide or the third slide of the lecture.1612

I think I had Y1 conditioned on Y2; you have to be very careful here.1620

This is F of Y1, Y2 divided by F1 of Y1.1626

We also have to figure out F of Y1,Y2, that is just the joint density function here.1638

But, we also need to know this marginal density function F1 of Y1.1646

F1 of Y1, I will remind you, is the integral on Y2 of F of Y1 Y2 DY2.1652

Now, we actually worked this one out in one of the examples in the previous lecture.1667

It was example 3, in the previous lecture.1673

That was in the previous lecture; that lecture was called Marginal Probability.1685

If you just look back, you will see this one worked out, at least the marginal probability function.1689

I'm not going to work that out again, but I'm just going to quote the answer there and where was it.1697

I have it written down here: it is 3Y1² - 6Y1 + 3; we worked that out in example 3.1703

If you do not remember how that went, you can try to redo the integral yourself right now and1717

just check that you get the right answer.1722

If that is still not making sense, just go back and look in example 3, in the marginal probability lecture,1724

you will see it all worked out.1729

Remember that, we are given that Y1 is equal to ½.1733

Let me go ahead and plug in: F1 of ½ is 3 × (½)², that is 3 × ¼; minus 6 × ½, which is 3; plus 3.1737

That is just ¾ - 3 + 3, which is ¾.1754

I’m going to use that in here, as my denominator over on the left.1759

F1 of Y1 is ¾ and F of Y1, Y2 is 6 × (1 - Y2); that is the joint density function that is given to us.1764

6 divided by ¾: do the flip on the denominator and you get 6 × 4/3, which is 8; so we have 8 × (1 - Y2).1777

That is our conditional density function where Y1 is ½.1789

What we want to do is we want to integrate that, the probability that we are looking for1796

is the integral from Y2 = ¾ to Y2 = 1 of 8 × (1 - Y2) DY2.1802

Notice that there are no Y1’s left in here anywhere, because I plugged in Y1 = ½.1815

That is a pretty easy integral now.1820

The 8 integrates to 8Y2, and the -8Y2 integrates to -4Y2².1822

We want to integrate that from Y2 = ¾ to Y2 = 1.1834

My 3 kind of disappeared on me there.1842

Let me write that a little bigger, there we go.1845

This is 8 - 4, minus 8 × Y2 at ¾; 8 × ¾ is 6, so minus 6.1849

Plus 4 × Y2²: 4 × (¾)² is 4 × 9/16, which is, let me go ahead and write this down,1861

4 × 9/16, which is 9/4.1876

8 - 4 - 6 is 8 - 10, which is -2; and -2 + 9/4 (that is 2 and ¼) gives me ¼.1884

That is my answer, that is my probability.1896

If Y1 is ½ then the probability that Y2 is bigger than ¾ is exactly ¼.1899

A lot of steps involved there and some of it may be rather confusing.1910

Let me go back and go over those again, just quickly, so that everybody is on board here.1915

The first thing I did was, I graphed the triangle.1921

There is my triangle right there, we are describing this triangle.1923

We are given that Y1 is ½ which means we are kind of stuck at that red line where Y1 is equal to ½.1930

We are trying to find the probability that Y2 is bigger than ¾ .1940

What I'm going to use: since that is not an inequality there, that is an equality, this is different from example 1.1946

In example 1, we had an inequality there, which meant that we were looking at these1953

two-dimensional regions and setting up double integrals.1957

This is really quite different, even though it looks very similar to example 1.1959

Because it is an equality, we want to use our conditional density formula F of Y2 given a value of Y1.1966

And then, we want to find the probability that Y2 is bigger than ¾.1982

And that is why I integrated from 3/4 to 1.1986

Now, I need to figure out this formula F of Y2 given Y1.1989

By definition, I gave you this earlier on in this lecture, it is the joint density formula divided by the marginal density formula.1995

The marginal density formula is something we learn about in the previous lecture.2004

We actually worked out this example in the previous lecture, the marginal probability lecture.2009

Just check back and look at example 3 in that lecture.2015

You will see that we worked it out to be this expression in terms of Y1.2019

Or you can just work out the integral yourself right now and make sure that checks.2025

Given that Y1 is ½, that is why I plugged in that value of ½, worked it through, and got ¾.2029

That is my ¾ there, and the joint density formula comes from the stem of the problem.2039

That is where that comes from; the ¾ comes in there, and that simplifies down to 8 × (1 - Y2).2046

I just plug that into my integral, plug that into this formula right here.2053

Solve out the integral, simplify the fractions, and I get my probability of ¼.2061
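
That chain of fractions is easy to mistype, so here is the same computation carried out with exact fractions in Python:

```python
from fractions import Fraction as F

y1 = F(1, 2)
# Marginal density from example 3 of the previous lecture:
# f1(y1) = 3*y1^2 - 6*y1 + 3.
f1 = 3 * y1**2 - 6 * y1 + 3            # = 3/4 at y1 = 1/2

# Conditional density f(y2 | y1 = 1/2) = 6(1 - y2) / (3/4) = 8(1 - y2).
c = 6 / f1                             # the constant 8
# P(Y2 > 3/4 | Y1 = 1/2) = integral over y2 in [3/4, 1] of c*(1 - y2),
# with antiderivative c*y2 - (c/2)*y2^2.
def G(y2):
    return c * y2 - (c / 2) * y2**2

prob = G(F(1)) - G(F(3, 4))
print(f1, c, prob)                     # → 3/4 8 1/4
```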

In example 3, we are given a joint density function.2070

It is on a square, a little bit easier than what we had to deal with in examples 1 and 2.2074

We got to graph that out.2082

There is Y2, here is Y1, and we have a square.2084

Y1 and Y2 are both trapped between 0 and 1, there is my region.2091

What we want to do is, find the probability that Y1 is greater than ¾ given that Y2 is equal to ½.2102

Let me graph what we are really calculating here.2112

We are given that Y2 is equal to ½.2115

There is Y2 equal to ½, let me draw a nice, thick red line.2120

We are just looking at that red line region, right there.2125

We want to find the probability that Y1 is bigger than ¾.2131

There is ½, there is ¾, and we want to find the probability of being in that black dotted region, if we are on the red line.2137

That is what we are really calculating.2150

If we know we are on the red line, that is, somebody tells us that Y2 is ½, what is the probability that Y1 is bigger than ¾?2152

I mislabeled my axes, my mistake there.2161

That should have been a Y1 and that should have been a Y2.2165

I always put Y1 on the horizontal axis, but for some reason, I wrote them the other way this time.2170

Since we have an equality here and not an inequality,2177

we are going to use the marginal density function here and the conditional density function.2183

We want the integral from Y1 = ¾ to Y1 = 1 of the conditional density function F of Y1 given Y2, given a value of Y2.2193

We want to integrate that DY1.2214

I got to figure out what that conditional density function is.2218

F of Y1 conditioned on Y2 is equal to the joint density function F of Y1, Y2 divided by F2 of Y2.2223

A lot of ingredients that I have to put in here.2238

I need to figure out what F2 of Y2 is.2241

That is the marginal density function, that we learn about in the previous lecture.2244

One thing you learned in the previous lecture is that, you always switch the variables.2251

F2 of Y2 is the integral on Y1 of F of Y1, Y2.2255

And then, you integrate that with respect to Y1.2264

In this case, the range on Y1 is from 0 to 1, Y1 = 0 to Y1 = 1.2268

The joint density function that we are given is 4Y1 Y2 DY1.2277

Now, that is an easy integral.2286

Remember, we are integrating with respect to Y1.2288

Y2 just comes along for the ride as a constant.2292

The integral of 4Y1 is 2Y1².2296

And then, we still have that Y2, integrate from Y1 = 0 to Y1 = 1.2303

We just get 2Y2 as the marginal density function.2311

Remember, that should always be a function of Y2, if we are looking for the marginal density function of Y2.2318

2Y2 is a function of Y2, that is reassuring.2326

F of Y1 conditioned on Y2 is our joint density function 2Y1 Y2.2332

I’m sorry, 4Y1 Y2; that was our joint density function, divided by 2Y2, and that simplifies down to 2Y1.2346

By the way, notice that the Y2 was canceled out.2358

That is kind of a freak of nature for this problem.2361

If the Y2’s had not canceled out, if we still had a Y2 in there,2364

then we would have plugged in the value of Y2 that we were given, Y2 = ½.2368

Let me say: we would have plugged in Y2 = ½, if necessary.2381

We did not have to do that because they just canceled each other out, in this particular problem.2391

But that would not always happen.2395

We really want to get a function of Y1 here.2397

The reason is, because we are going to take that and plug that back into our original integral, and integrate over Y1.2401

Y1 = ¾ to Y1 = 1 of 2Y1 DY1.2409

Now, that is an easy integral, it is just Y1².2419

We want to integrate that from Y1 = ¾ to 1.2424

That is just 1 - (3/4)², and (3/4)² is 9/16.2431

What we will get here is 7/16, that is our probability.2437

What that really means is that, if we are choosing a value according to this joint density function,2443

and somebody tells us that Y2 is definitely equal to ½, then our probability of Y1 being bigger than ¾ is exactly 7/16.2450

Let me recap the steps involved there.2464

First, I graphed the region Y1 goes from 0 to 1, Y2 goes from 0 to 1, that just gives me the square right here.2467

And then, I tried to look at the region we are interested in which is where Y2 is equal to ½.2475

That is where I got this horizontal line, at Y2 is ½.2480

And then, in particular, we are wondering whether Y1 is bigger than ¾.2485

That is why I graphed this, it is a little hard to see here but this dotted, black line at ¾.2490

We are asking, if we are on the red horizontal line, what is our chance of being in the black part of the red line?2499

In order to calculate that, we set up the integral on Y1 of the conditional density function.2509

I had to figure out what the conditional density function was.2519

I started out with the joint density function divided by the marginal density function,2522

which means I had to figure out what the marginal density function was.2527

F2 of Y2, remember the variables switch, that is the integral over Y1 of the joint density function.2530

I integrate the joint density function over my range of Y1.2537

It turns out to be 2Y2, which is reassuring that it is a function of Y2.2543

I take that and I plug it back in for F2 of Y2.2553

I plug in the joint density function for my numerator and it simplifies down to 2Y1, that is already a function of Y1.2562

But, if there had been a Y2, I would have plugged in the given value of Y2 in there, if I needed to.2571

I take that 2Y1 and I plug it back in here, because I have now figured out the conditional density function.2580

Now, I just have an easy integral in terms of Y1 and I just calculated that integral, I got my 7/16.2588
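
The whole chain for this example fits in a few lines of exact-fraction Python:

```python
from fractions import Fraction as F

# Marginal: f2(y2) = integral over y1 in [0, 1] of 4*y1*y2 = 2*y2
# (antiderivative 2*y1^2*y2, evaluated from 0 to 1).
def f2(y2):
    return 2 * y2

# Conditional density: f(y1 | y2) = 4*y1*y2 / (2*y2) = 2*y1 (the y2 cancels).
# P(Y1 > 3/4 | Y2 = 1/2) = integral over y1 in [3/4, 1] of 2*y1
#                        = y1^2 evaluated from 3/4 to 1.
prob = F(1)**2 - F(3, 4)**2
print(f2(F(1, 2)), prob)               # → 1 7/16
```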

In example 4, we are going to keep looking at that same setup from example 3.2598

Let me go ahead and graph that, as I talk about it.2603

We have Y1 and Y2, they are both between 0 and 1.2606

There is my Y1 on the horizontal axis, as always.2614

Y2 on the vertical axis, there is 1.2617

We are in this square and we want to find the conditional expectation of Y1 given that Y2 is equal to ½.2622

We know that Y2 is equal to ½ and we know that we are on this red line, right here, that is where Y2 is equal to ½.2631

I should have labeled that as ½, right there.2643

Y2 is ½, I know I’m on that red line.2648

I want to find the conditional expectation of Y1.2650

I’m going to use my formula for conditional expectation.2655

I gave you this formula back on the third slide of this lecture.2659

Just scroll back a few slides in the video and you will see the formula for conditional expectation.2663

It is the integral on Y1 of Y1 × the conditional density formula F of Y1 conditioned on Y2.2670

And then, we integrate that with respect to Y1.2686

This looks just like the formula for calculating conditional probability, except, the difference is this extra factor of Y1,2689

because we are trying to find an expected value right there.2699

It is just like back in the single variable case, the expected value of Y was the integral of Y × F of Y DY.2702

We have this extra factor of Y in there.2713

Here, we are trying to find the expected value of Y1, so we put in that extra factor of Y1.2719

This is the same setup that we had for example 3.2728

I do not want to use a marker there, I want to use a thin pen.2731

We already calculated the conditional density function, which was 2Y1, back in example 3.2741

If you did not just watch example 3 in this lecture, just scroll back one slide and take a peek at example 3,2753

where we went through some work to calculate the conditional density formula.2767

We figured out that it was 2Y1.2772

We can now integrate it, we are supposed to integrate it on the whole range of Y1 which is from 0 to 1.2777

Y1 = 0 to Y1 = 1, Y1 × 2Y1 is 2Y1² DY1.2783

The integral of 2Y1² is 2 × 1/3 Y1³, 2/3 Y1³.2796

We want to integrate that from Y1 = 0 to Y1 = 1.2804

Not integrate that but evaluate that from Y1 = 0 to Y1 = 1, and that is just 2/3.2811
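As a quick sanity check on that arithmetic, here is a small numeric sketch (the helper name and midpoint-rule approach are my own, not from the lecture) that integrates Y1 × 2Y1 over [0, 1]:

```python
# Numeric check of E[Y1 | Y2 = 1/2] = 2/3, using the
# conditional density 2*y1 from example 3 on the range [0, 1].
def conditional_expectation(density, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of y * density(y) over [a, b]."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * density(a + (i + 0.5) * h) for i in range(n)) * h

result = conditional_expectation(lambda y1: 2 * y1, 0.0, 1.0)
print(round(result, 6))  # 0.666667, i.e. 2/3
```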

What that means is that, if you are living with this joint density function and you have been told that2820

Y2 is equal to ½, you are given that Y2 is equal to ½.2829

Your expected value for Y1 then is 2/3.2835

Let me recap the steps here.2841

We are working on a square region because that is what is given in the stem of the problem,2844

Y1 and Y2 are both between 0 and 1.2850

We are going to use the conditional expectations.2855

Since we know that Y2 is ½, the formula for conditional expectation that I gave you in the third slide of the same video is,2858

the integral of Y1, which is kind of the new factor there because we are looking for an expected value,2868

× the conditional density formula.2872

The conditional density formula is what we worked out back in example 3.2880

Example 3 was the same setup and we did work out the conditional density formula2884

sort of en route to finding the probability.2888

That part, the 2Y1, was the same as in example 3.2892

This Y1 was new and we combine them, we get 2Y1².2896

And then, we get a very easy integral that just solves out to 2/3.2900

In example 5, we are given the joint density function F of Y1 Y2 is E ⁻Y2.2911

We are given that on a particular region, I think I better start by graphing that region2924

because otherwise, there will be some confusion.2928

There is Y1 on my horizontal axis, always, there is Y2.2933

I think that might run slightly off to the side, let us move that a little bit.2939

We are told that they both go from 0 to infinity but Y2 was always bigger than Y1.2944

Let me graph the line Y2 = Y1.2953

We are looking at this region sort of above that line.2958

This is the same setup that we had in one of the examples on the previous lecture.2965

Let me give you a reference for that.2974

This was in the lecture on marginal probability and it was example 5, in the previous lecture.2976

You might want to go back and look at our solution to example 5, in the lecture on marginal probability, the previous video.2993

If you just scroll up here, you will see it.3002

It is the same example but we are calculating something different.3005

What we did there was we calculated the marginal density function.3008

We calculated F1 of Y1.3016

The way we calculated it was by doing the integral on Y2 of the joint density function F Y1 Y2 DY2.3020

The answer we got there was E ^- Y1.3030

It was a little bit of work to get it.3034

I'm skipping over some of those details, when I talk about it now.3036

If you want to go back and check that out, or if you want to redo the integral then you will see where that comes from.3043

In today's example, what we are going to figure out is the expected value of Y2 given that Y1 is equal to 5.3049

Let me show you, how we are going to calculate that.3062

I’m going to use our formula for conditional expectation.3064

Let me see if I can graph this quickly.3069

Y1 is equal to 5 and that means we are on the line Y1 is equal to 5.3073

There is the line Y1 is equal to 5 and that is the line that I'm looking at.3085

I want to figure out what the expected value of Y2 will be, if I know that I'm fixed on that red line.3096

I’m going to use the formula for conditional expectation.3103

That is the integral on Y2 of, here is the new part, the new element is Y2 × F of Y2 given Y1, Y2 condition on Y1 DY2.3107

The new element there is that Y2, in order to calculate the conditional expectation.3127

That means, I have to figure out what F of Y2 condition on Y1 is.3136

Let me calculate that over on the side here.3143

F of Y2 condition on Y1, by definition, it is F of Y1, Y2 divided by the marginal density function F1 of Y1.3146

In turn, F of Y1 Y2, that is E ⁻Y2, that was given in the stem the problem.3162

F1 of Y1, that was what we figured out in example 5 of the previous lecture,3172

of the marginal probability lecture, that is E ⁻Y1.3179

If we put those together, E ^- Y1, if we flip it up to the numerator, that would be E ⁺Y1.3185

E ⁺Y1 - Y2, that is what I'm going to plug in there.3193
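Before plugging that in, here is a small numeric sketch (the helper name and truncation bound are my own) checking that this conditional density E^(Y1 - Y2) really integrates to 1 over its support, Y2 from Y1 up to infinity, with Y1 fixed at 5:

```python
import math

# The conditional density f(y2 | y1) = e^(y1 - y2) should integrate to 1
# over y2 in [y1, infinity); we truncate where the e^(-y2) tail is negligible.
def integrate(f, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

y1 = 5.0
total = integrate(lambda y2: math.exp(y1 - y2), y1, y1 + 40.0)
print(round(total, 6))  # 1.0
```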

What is my range on Y2?3203

If you look at the range on Y2, it is all the range of this red line.3206

That red line starts at Y2 is equal to 5 and it goes on up to infinity.3212

My range on Y2 is going to be from Y2 = 5 up to Y2 going to infinity; I will take the limit as it goes to infinity.3219

I'm integrating Y2 × E ⁺Y1 - Y2 DY2.3230

I do not want to see a Y1 in here, I want to be integrating with respect to Y2 because I need to end up with3242

a numerical answer.3249

I’m integrating only with respect to Y2.3252

I'm not comfortable with that Y1 in there.3255

But, I'm given that Y1 is equal to 5.3258

I’m going to plug in Y1 is equal to 5.3261

I get the integral, I will write the bounds right now, Y2 × E⁵ - Y2 DY2.3267

That is much more reassuring now because I have only Y2 in there.3277

I know that, if I integrate this with respect to Y2, I will get a numerical answer.3283

Something I can do here is pull out the E⁵.3289

That is E⁵ × Y2 × E ^- Y2 DY2.3293

That is a slightly unpleasant integral, I’m going to have to use integration by parts on that.3302

Let me do a quick integration by parts, I’m going to use tabular integration.3307

It is the cheater’s way of doing integration by parts quickly.3311

Y2 E ⁻Y2, if I take derivatives on the left, the derivative of Y2 is 1.3315

The derivative of 1 is 0, and the integral of E ⁻Y2 is –E ⁻Y2.3323

The integral of that is E ⁻Y2.3330

Draw my little diagonal lines and put a + and – there.3335

This is E⁵ × - Y2, E ⁻Y2 multiplying down the diagonal lines, - E ^- Y2.3340

That was integration by parts, borrowing some techniques from calculus 2.3354

If you are a little rusty on your integration by parts, guess what,3360

we have a video lecture series on college calculus level 2.3363

It is hosted by none other than Will Murray, it is right here on www.educator.com.3369

You can figure out, you can review your integration by parts, if you are a little rusty on that.3374

In the meantime, we are going to plough forward with the probability.3381

We are trying to integrate this from Y2 = 5 to the limit as Y2 goes to infinity.3385

This is E⁵, I see I had E ⁻Y2 here.3396

If I plug in infinity, that is E ⁻infinity.3402

Even though that is multiplied by infinity, if I did a little L’Hopital’s rule on that, it would still go to 0.3406

E ⁻Y2 still goes to 0 when I have Y2 going to infinity, so both of those terms drop out.3413

I have a term for Y2 is equal to 5, + 5 E⁻⁵ + ,3422

I’m putting + in here because I’m subtracting -.3431

E⁻⁵, and I see I got E⁵ × E⁻⁵, those cancel out.3434

This is E⁵ × 6E⁻⁵; the E⁵ and E⁻⁵ cancel out, which gives me a very nice answer here.3443

It is just 6 is my answer, that is my expected value of Y2 given that Y1 is equal to 5.3451

That is what I just calculated, the expected value of Y2 given that Y1 is equal to 5.3461

If somebody tells me that Y1 is equal to 5, that is going to be my guess for what Y2 is.3469
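The whole computation can also be checked numerically; here is a sketch (the helper name and truncation bound are my own, not from the lecture) of E⁵ times the integral of Y2 × E⁻Y2 from 5 to infinity:

```python
import math

# Numeric check of the lecture's answer:
# E[Y2 | Y1 = 5] = e^5 * integral from 5 to infinity of y2 * e^(-y2) dy2 = 6.
def integrate(f, a, b, n=400_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

expectation = math.exp(5) * integrate(lambda y2: y2 * math.exp(-y2), 5.0, 60.0)
print(round(expectation, 4))  # 6.0
```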

That is really the end of the problem but let me recap the steps here.3479

First, I looked at the region here, Y1 and Y2 go from 0 to infinity, and I just graph it.3483

I graphed that up here, it is this blue triangular region here.3490

The important thing to notice there is that, Y2 is bigger than Y1 which is why I have the region north of the line Y = X.3494

And then, I also noticed that we are told that Y1 is equal to 5.3504

I went ahead and graphed Y1 is equal to 5, it is this vertical red line here.3509

Since, we are calculating conditional expectation, I'm going to use the conditional expectation formula3514

which we learned about on the third slide of this lecture, the preamble to this lecture.3522

You integrate the conditional density formula with this extra factor of Y2 put in there, because it is a conditional expectation.3530

That comes from this term, right here, that is where we get that Y2 from there.3538

That is something we have to calculate as the joint density function divided by the marginal density function.3548

The marginal density function was something that we figured out back in the lecture on marginal probability.3557

We did this problem, or at least that part of this problem, back in example 5 of the previous video.3564

Just go back, scroll back, and look in example 5, if you do not know where that comes from.3571

The answer for the marginal density function was F1 of Y1 is E ⁻Y1.3577

Drop that in right here, for F1 of Y1, the numerator here F of Y1 Y2, that comes from this given function here.3586

That is where that comes from.3598

And then, I simplified that to E ⁺Y1 - Y2.3599

And then, I plugged that whole thing back in here, back into this integral.3604

I did not like the fact that there was a Y1 in there.3611

The reason I did not like it was because I’m integrating with respect to Y2.3614

I want to get a number at the end.3620

I do not have any way to get rid of that Y1, except to remember that I was told that Y1 is equal to 5.3622

That is where I plugged in 5 for Y1, which is kind of nice because it made it into E⁵ × E ⁻Y2.3632

I can pull it right out of the integral and now I have a nice little integral, in terms of Y2.3644

Fairly nice integral, it is something that I have to use integration by parts for.3651

If you are rusty on integration by parts, here is the quick and dirty way to do integration by parts, for this kind of problem.3655

If you really want to practice integration by parts, check the calculus level 2 lectures3662

here on www.educator.com, you will see a whole lecture on integration by parts.3666

You can get up to speed on that.3676

Here is what the answer gives me from integration by parts.3681

When I plug in Y2 going to infinity, that kind of kills both terms; if you do a little L’Hopital’s rule3684

you will see why that term gives you 0.3691

And then, I plugged in Y2 = 5 to both terms; that simplified down to 6E⁻⁵, which is very nice.3694

It cancels with the E⁵ and gave me just 6 as my expected value.3702

That wraps up this lecture on conditional probability and conditional expectation.3711

I want to keep using some of the same concepts in the next lecture, which is on independent random variables.3716

We will see how that is connected to some of this.3722

This is part of the chapter on bivariate density functions and bivariate distributions, functions of two variables.3724

That in turn, is part of a larger lecture series on probability here on www.educator.com.3732

Your host, all the way, is Will Murray, thank you very much for joining me today.3739

We will see you next time, bye.3743
