William Murray


Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championship Wins & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48


1 answer · Last reply by: Dr. William Murray, Mon Dec 14, 2015 10:12 AM
Post by Alexander Karakosta on December 12, 2015: Using Y₁ and Y₂ makes it more difficult than it already is to follow some of these notes. I don't understand why you would want to use this method instead of a traditional X and Y.

### Marginal Probability


• Intro 0:00
• Discrete Case 0:48
• Marginal Probability Functions
• Continuous Case 3:07
• Marginal Density Functions
• Example I: Compute the Marginal Probability Function 5:58
• Example II: Compute the Marginal Probability Function 14:07
• Example III: Marginal Density Function 24:01
• Example IV: Marginal Density Function 30:47
• Example V: Marginal Density Function 36:05

### Transcription: Marginal Probability

Hi there, these are the probability videos here on www.educator.com, my name is Will Murray.0000

We are working through a series of videos on experiments involving two variables.0005

Right now, we will always have a Y1 and a Y2 in all of our experiments.0011

We are talking about joint density functions and things like that.0016

Today, we will talk about marginal probability.0020

Marginal probability is kind of a tool that you use in the service0024

of calculating conditional probability and conditional expectation.0028

We do have another video that comes after this one where you will see this being used0033

in the service of conditional probability and conditional expectation.0038

In this video, we are just going to learn what marginal probability is.0042

We will learn how to calculate it, we will practice it with some examples.0046

Marginal probability is something you can talk about with discrete probability or with continuous probability.0052

As I said, we have an experiment with two random variables Y1 and Y2.0058

We are going to talk about the marginal probability function.0064

There is a marginal probability function for Y1 and then there is a separate marginal probability function for Y2.0068

What they mean is, the marginal probability function of Y1, we call it P1 of Y sub 1.0075

By definition, the := symbol means it is defined to be.0082

It is defined to be the probability that Y1 will be a particular value of y1.0087

The way you would find that is, you would look at all the possible values of Y2.0094

Add up all the possible values of Y2 and then find the probability of each combination0100

of that particular fixed value of Y1 with all the different possible values of Y2.0106

Let me emphasize here that, that Y1 there is fixed and we are adding up over all the possible values of Y2.0113

And then, you can also talk about the marginal probability function of Y2 which means you have a fixed value of Y2.0123

You are trying to find the probability of getting that particular value of Y2.0130

The way you find that, is you add up all the probabilities of combinations with all the different possible y1s.0135

This is really quite confusing for students because there is a sort of a subscript change here.0143

We are finding the marginal probability function for Y1, notice that there are 1’s in the subscripts there.0148

What we will do is we add up over all the possible values of Y2, and then, vice versa.0155

When we are finding the marginal probability function for Y2, we add up over all the possible values of Y1.0161

That gets a little confusing, we will do some practice with this.0168

You will see in the examples how it works out.0170
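The sums in these definitions are easy to sketch in code. Here is a minimal Python sketch (my own, not from the lecture) that computes both marginal probability functions from a joint probability function stored as a dictionary keyed by (y1, y2) pairs:

```python
# Sketch (not from the lecture): compute both marginal probability
# functions from a discrete joint probability function p(y1, y2).
from collections import defaultdict

def marginals(joint):
    """joint: dict mapping (y1, y2) -> probability.
    Returns (p1, p2), the marginal probability functions as dicts."""
    p1 = defaultdict(float)
    p2 = defaultdict(float)
    for (y1, y2), p in joint.items():
        p1[y1] += p   # y1 held fixed, summing over all values of y2
        p2[y2] += p   # y2 held fixed, summing over all values of y1
    return dict(p1), dict(p2)

# Tiny example: flip two fair coins, y1 = first flip (0 or 1),
# y2 = total number of heads.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 1): 0.25, (1, 2): 0.25}
p1, p2 = marginals(joint)
print(p1)  # {0: 0.5, 1: 0.5}
print(p2)  # {0: 0.25, 1: 0.5, 2: 0.25}
```

Notice the subscript switch from the definitions: to fill in p1 we accumulate over the second coordinate, and vice versa.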

That is just the discrete case, the continuous case is very much analogous.0173

We are going to take a look at that with the continuous case but you will see it is kind of the same thing.0179

Except that, we are going to change the summation signs to integral signs.0183

The continuous case, we will talk about the marginal density function of Y1 and Y2.0189

For Y1, we talk about F sub 1 of Y1.0195

Again, we add up, instead of P of Y1 Y2, it is F of Y1 Y2, the joint density function.0200

We take the integral over all possible values of Y2.0207

To make it most general here, I have written from Y2 goes from -infinity to infinity.0213

For the marginal density function of Y2, we add up or we take the integral over all possible values of Y1.0221

I wrote Y1 goes from -infinity to infinity.0231

It is true in many of our experiments and many of our problems0237

that we do not actually have distributions covering an infinite range,0242

so you would not actually have to integrate from -infinity to infinity.0246

The point here is, you just integrate over whatever the range is for that particular variable.0250

Here, you just integrate over the full range for Y1.0257

Here, you just integrate over the full range for Y2.0263

Whatever those full ranges are, for whatever that variable is, that is what you integrate over.0272

Again, we had that same subscript change that I mentioned with the discrete case,0277

which is that, when you are finding the marginal density function for Y1,0281

what you end up doing is you integrate over Y2.0286

Be very careful about that, when we are finding marginal density function for Y1, your variable of integration is Y2.0290

And of course, vice versa , when you are finding the marginal density function for Y2, your variable of integration is Y1.0298

That does get to be a little confusing, we will try to keep it straight.0307

As I said, the marginal density functions are, there are something that are used in computing conditional probability.0311

We will come back in the next video, the next day lecture, and we will see how these are used.0322

The purpose of this video is just to practice calculating the marginal probability functions.0329

It may not be that enlightening, why we are doing this at this point.0333

We are going to work through some examples, calculate the marginal probability functions, and just see what we get.0337

In the next video, we will come back and we will calculate conditional probability,0345

conditional expectation, using these marginal probability functions.0349

And you will see how we can actually apply these to real settings.0353

In example 1, we are going to roll two dice, a red dice and a blue dice.0359

The variables are not going to be the traditional ones you might think of.0363

Often, you would say Y1 is going to be the value of the red dice, Y2 is the value of the blue dice.0366

I changed that around a little bit to make it a little more interesting.0371

Y1 is going to be the value showing on the red dice.0374

When you roll a dice, you can get anywhere from 1 up to 6, so Y1 could be anywhere from 1 to 6.0380

Y2 is the total showing on both dice, that can be anywhere from 2 up to 12.0388

They are not exactly symmetric, these two variables here.0397

We are going to calculate the marginal probability function of Y1.0399

By the way, we are going to do the same example again for example 2,0404

except that we will calculate the marginal probability function of Y2.0407

We will get to see both of them, in some kind of different behavior there.0411

Let us figure out the marginal probability function of Y1.0416

We showed two ways to calculate this.0421

Let me remind you the definition of the marginal probability function.0423

P1 of Y1, by definition, is the probability that Y1 is going to be equal to that particular value of Y1.0427

Then, the other way to think about it is, you add up over all the possible values of Y2,0437

of the probability of each combination Y1 and Y2.0443

I showed two ways to think about that.0449

I will calculate at least a couple of these using both ways and I will try to figure out which one is easier.0453

I’m going to look at all the possible values for Y1, and that is all the values from 1 to 6.0459

When Y1 is 1, P1 of 1, one way to think about it is, what is the probability that we are going to get that particular value of 1.0466

That is just the probability that the red dice comes up to be 1.0477

That is definitely P of Y1 = 1, that is definitely 1/6.0482

Another way to think about that is to say, it is the sum over all the possible values of Y2, which go from 2 up to 12, of P of 1, Y2.0489

Then, we would add up P of 1, 2 + P of 1, 3, all the way up to the probability of 1, 12.0504

That should not be 1/12, that should be 1, 12.0519

We asked ourselves, what the probability of each one of those combinations is?0523

Let us think about what the probability of 1, 2 is.0528

That means you got a 1 on the red dice and a total of 2, which means the blue dice would have to be a 1 as well.0531

The probability of getting a 1,1 in the two dice is 1/36.0539

The probability of getting a 1 in the red dice and 3 total, means you have to get 1 red and 2 on the blue dice, that is also 1/36.0546

All the way up to here to the probability of getting a 1 on the red dice and a 7 total, which is still 1/36.0558

The probability of 1 on the red dice and getting 8 total.0569

What is the probability of getting a 1 on the red dice and 8 on the total?0575

In order to get that, you have to get a 7 on the blue dice.0579

You cannot get a 7 on a single roll of a dice, that probability is 0.0582

In turn, the probability of getting a 1 on the red dice and a 9 total is still 0.0590

All the way up to getting 1 on the red dice and 12 total, that probability is 0.0597

What we got here is adding up some fractions, they are all 1/36.0602

Counting the totals 2, 3, 4, 5, 6, 7, that is 6 terms in all.0607

It is 6/36, that is 1/6.0612

That agrees with what we had earlier, when we figure out the probability of getting 1 on the red dice.0618

We showed two different ways to calculate this.0625

In this case, if you look around, clearly one of them is much easier.0628

It was much easier to calculate just looking at the probability that Y1 is 1,0632

then by breaking it down over all the possible combinations of Y2.0637

I have to calculate the same kinds of things for all the other possible values of Y1.0643

But, I am definitely going to use the first technique because it seems much easier.0648

P1 of 2 is the probability that Y1 is equal to 2.0652

Again, that is the probability that I'm getting a 2 on the red dice and that is also 1/6.0660

I could break it up using this long method that I did before, but I think it is clear now that,0666

that is not the most efficient method.0675

I could write it as probability of 2,2 + the probability of 2,3 all the way up to the probability of 2,12.0678

I could figure out each one of those probabilities but it is clear that that would take much longer.0689

I’m going to ignore that.0694

I will go ahead and calculate the rest of my probability function.0696

P1 of 3 it is the probability that Y1 is 3.0700

Again, it is getting 3 on the red dice, your probability is 1/6.0706

P1 of 4 is 1/6, P1 of 5 is 1/6, and P1 of 6 is 1/6.0712

All of those come out to be 1/6.0728

My marginal probability function P1 of Y1 is just given by P1 of 1 is 1/6, P1 of 2 is 1/6, P1 of 3 is 1/6, and so on,0732

for all the other possible values of Y1.0746

That gives us the probability function, that finishes example 1.0750

Let me recap that.0754

There are sort of 2 ways we could have calculated that.0755

We are going to look at each possible value of Y1.0758

And then, for each one, we can either calculate the probability that Y1 is that value,0762

or we can expand it out over all the possible values of Y2 and add up all the individual combinations.0767

What we discovered is that although it would be possible to do it by expanding it out,0774

we actually did it for the case of Y1 = 1 here.0778

It is much easier just to find the probability that Y1 = y1 directly.0782

For each one of the values, 1, 2, 3, 4, 5, and 6, we got probabilities of 1/6.0788

That would really answer the question.0799

If we want to do it the long way, we would have to look at all the possible values of Y2 going from 2 to 12.0801

Add up the probability of 1 and those values for each one.0808

For some of them we get 1/36, because those are the probabilities of rolling a 1 on the red dice and0812

a 1 on the blue dice,0820

or a 1 on the red dice and a 2 on the blue dice, and so on.0822

Each of those gives you 1/36.0825

If we look at the combinations that have 1 on the red dice and a total of 8 or more, those cannot happen.0827

All those probabilities were 0, we will end up with 6/36 which gives us that same 1/60835

that we got before, but it took much more work to find it that way.0840

Let us avoid that way, if we can.0844
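The counting in example 1 can be confirmed by brute force. This is my own sketch, not part of the lecture: enumerate all 36 equally likely rolls, build the joint probability function for Y1 (the red dice) and Y2 (the total), and sum over Y2 to get P1.

```python
# Sketch (not from the lecture): verify P1(y1) = 1/6 for example 1.
from fractions import Fraction
from collections import defaultdict

joint = defaultdict(Fraction)
for red in range(1, 7):
    for blue in range(1, 7):
        # Y1 = value on the red dice, Y2 = total on both dice
        joint[(red, red + blue)] += Fraction(1, 36)

# Marginal of Y1: fix y1, sum the joint probabilities over every
# possible value of Y2 (exactly six of them are nonzero, each 1/36).
p1 = {y1: sum(p for (a, b), p in joint.items() if a == y1)
      for y1 in range(1, 7)}
print(p1)  # every value comes out to 1/6
```

Each fixed value of the red dice pairs with exactly six reachable totals, so the six terms of 1/36 add to 1/6, matching the direct calculation.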

In example 2, we are following up on example 1.0849

We are rolling two dice, there is a red dice and a blue dice, same two variables as before.0852

Y1 is what shows on the red dice and Y2 is the total.0858

We are going to calculate the marginal probability function P2 of Y2.0864

Let me remind you that Y1 can take values from 1 to 6 here, because the red dice can show anything from 1 to 6.0870

Y2 is the total, that could be anywhere from 2 to 12.0877

We want to find the probabilities of each one of those individual values of Y2.0883

P2 of Y2, one way to think about it is the probability that Y2 is going to be that particular value of y2.0895

The way I want to think about it is, to add up the probabilities P of Y1, Y2 over all the possible values of Y1.0903

We will try a couple of those out and we will see which one is easier in each case.0916

We want to look at this for each possible value of Y2.0924

The first possible value of Y2 is 2, P2 of 2.0928

I will do it the long way first, I will add up over all the Y1.0935

That could be 1, 2, 3, all the way up to 6.0941

Let us figure out what the probabilities are for each one of those.0955

The probability of 1,2, that means you are getting a 1 on the red dice and a 2 total, which mean the blue dice would have to be a 1.0957

That probability of getting 1 on both dice is 1/36.0967

The probability of 2,2 means you get a 2 on the red dice and we would have to get a 0 on the blue dice, to add up to 2.0973

That cannot happen, there is a 0 there.0981

The probability of getting a 3 on the red dice and a 2 total.0984

Certainly, that is not going to happen, and neither are any of the other possibilities, so the whole sum is just 1/36.0987

Instead, let us try calculating that directly, because I think it might be a little faster.0999

P2 of 2 is also equal to the probability that Y2 is equal to 2, which means: what is the probability of getting a total of 2?1008

In order to get a total of 2, you have to get a 1 on the red dice and 1 on the blue dice, that is exactly 1/36.1021

Look, that is really much faster to calculate it that way.1030

Why don't we calculate it that way from now on?1034

P2 of 3, I will just show a long way for one more but I also show the short way, and we will see how it goes.1039

That is the probability of 1, 3 + the probability of 2, 3 + the probability of 3, 3, up to the probability of 6, 3.1048

The probability of 1, 3 means 1 on the red dice and a 3 total,1062

which means 2 on the blue dice.1067

That would mean that you have to get to 1 on the red dice, 2 on the blue dice, there is a 1/36 chance of that.1072

2 on the red dice and 1 on the blue dice, it would also be 1/36.1080

And then, 3 on the red dice would have to be 0 on the blue dice, that cannot happen.1086

All these others are 0, I will just write it as 2/36.1092

I'm not going to simplify that, of course, you could simplify it to 1/18.1097

I think it will be easier to spot a pattern, if I leave it unsimplified.1101

I will just leave it as 2/36.1105

Another way to calculate that would be, the probability that Y2 is equal to 3.1108

What is the probability of getting a total of 3, when you roll two dice?1114

The way you can get a total of 3 is red dice 1 blue dice 2 or blue dice 1 red dice 2.1122

There are two ways to get that, out of 36 possible rolls, that is 2 out of 36.1132

I think that is really much shorter.1139

I’m just going to calculate the others using the short way because otherwise, I will run out of space here.1140

P2 of 4, how many different ways are there to get a 4, when you roll two dice?1148

We could do 1-3, 2-2, or 3-1, that is 3 out of 36.1154

Again, I think I’m not going to simplify that, even though it is obvious that it could simplify.1160

But, it is going to be easy to spot a pattern.1166

In fact, maybe you already can spot a pattern because we got 1/36, 2/36, 3/36.1168

P2 of 5, how many ways are there to get a 5?1176

It could go 1-4, 2-3, 3-2, or 4-1, that is 4 out of 36.1180

P2 of 6, how many ways are there to get a 6?1189

There are 5 different ways because you can go 5-1, 4-2, 3-3, 2-4, or 1-5.1194

P2 of 7 is 6 out of 36 because there are 6 different ways that you can get a total of 7.1205

You can go 6-1, 5-2, 4-3, 3-4, 2-5, 1-6.1215

P2 of 8, the pattern changes course here.1228

We had 1 out of 36, 2 out of 36, 3 out of 36, 4 out of 36, 5 out of 36, 6 out of 36.1237

You might think logically that we are going to go 7 out of 36.1243

It actually peaks at P2 of 7, it trails back to 5 out of 36.1246

The reason for that is that, there are only five ways to get an 8, when you roll two dice.1252

You can go 6-2, 5-3, 4-4, 3-5, or 2-6.1259

The pattern changes course and starts to drop off again.1268

P2 of 9, there are four ways to get a 9 because it can go 6-3, 5-4, 4-5, 3-6, that is 4 out of 36.1273

P2 of 10 is 3 out of 36, there are three ways to get a 10.1283

P2 of 11 is 2 out of 36, and finally, P2 of 12, how many ways are there to get a 12?1291

There is just one way you have to get double 6, 1 out of 36.1301

All of these, you could calculate using the expanded form.1306

But the expanded form was obviously taking much too long.1311

I got my answer more easily, just by thinking directly about Y2, about the total of the dice.1315

That gives my answers, I found the marginal probably function P2 of Y2.1323

1/36 for 2, 2/36 for 3, and so on.1329

Going down for all the possible values of Y2, I gave you a fraction representing the probability.1334

To recap there, there are two ways you can calculate this.1343

You can calculate directly the probability that Y2 will be equal to any given value y2.1346

Or you can sum it up over all the possible Y1.1353

Summing it up over all the possible Y1, we did that for Y2 = 2.1357

We took Y1 = 1, 2, 3, 4, 5, 6.1362

We discovered that only one of those gives us any probability, because for all the other possible values of Y1,1368

there is no way to make the given total.1373

We just got 1/36.1376

But then, a much easier way we discover to calculate it was that, just to find the probability that the total will be 2.1378

The only way to get that is by getting a 1-1, when you roll, it is 1/36.1385

P2 of 3 means we sum up over all the possible values of Y1, 1, 2, 3, 4, 5, 6.1391

Two of them gives us positive probabilities, the rest are all 0.1399

It gives us 2 out of 36.1403

But then, the probability directly that Y2 is 3 just means you are looking at two different possible rolls,1406

a 1 and 2 or 2 and 1.1414

You will get 2 out of 36 that way.1416

It really looks like that is the shorter way.1418

For all of these other values, I just thought about all the possible ways to get the given total and then counted them up.1420

That is how I got these different numbers, that seems to be a better way to solve that.1427

At least, in the discrete case.1432
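The triangular pattern for P2 in example 2 can also be checked by counting. In this sketch of mine (not part of the lecture), we count, for each total, how many of the 36 equally likely rolls produce it:

```python
# Sketch (not from the lecture): verify P2(y2) for the total of two dice.
from fractions import Fraction
from collections import Counter

# Count how many of the 36 rolls give each total from 2 to 12.
counts = Counter(red + blue
                 for red in range(1, 7)
                 for blue in range(1, 7))
p2 = {total: Fraction(counts[total], 36) for total in range(2, 13)}

# The counts climb 1, 2, 3, 4, 5, peak at 6 ways for a total of 7,
# then fall back down symmetrically.
print([counts[t] for t in range(2, 13)])  # [1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]
```

The peak at 7 and the drop back to 5 ways for a total of 8 match the pattern change noted in the example.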

We will see in the next few examples, how we are going to use an integral to solve the same kind of thing in the continuous case.1434

In example 3, we have a continuous problem with marginal probability.1443

We are given the joint density function F of Y1 Y2 is 6 × (1 - Y2).1450

It is important to pay attention to the domain that we are given here which is that Y1 and Y2 are both between 0 and 1.1458

Let me go ahead and draw this out, as I talk about it.1467

Y1 and Y2, there is Y1, goes from 0 to 1 and Y2 also goes from 0 to 1.1470

But we are told that Y2 has to be bigger than or equal to Y1.1479

If you think about that, in terms of X and Y, we got the same values bigger than X.1486

Let me go ahead and draw the line Y = X.1491

There it is, right there.1496

We want the region that is above that line because we want Y2 to be bigger than or equal to Y1.1498

It is that triangular region above that diagonal line there.1506

Let me color that in blue.1510

That is the region that we are looking at.1513

We have been asked now to find the marginal density function F1 of Y1.1517

Let me remind you of the definition of marginal density function.1522

F1 of Y1, remember, there is this variable change where, if you are trying to find F1 of Y1,1527

what you do is you integrate over Y2.1534

And then, you find the density function F of Y1 Y2, the joint density function.1538

You integrate that with respect to Y2.1543

That means we have to describe this region, in terms of Y2.1546

Let me draw what the region looks like from the perspective of Y2.1552

Y2 runs up from that diagonal line up to, let me draw that a little bit longer, up to the line Y2 = 1.1559

If we describe that region, in terms of Y2, we would say, the diagonal line was Y1 = Y2 or Y2 = Y1.1572

We would say Y2, the limits on Y2, if you want to describe that region, the Y2 goes from Y1, the diagonal line, up to 1.1583

That is what we use as the limits on our integral.1596

This is the integral from Y2 = Y1 to Y2 = 1.1600

The density function, I’m just going fill it in, I will expand it out.1606

6 -6 Y2, and we are going to integrate this with respect to Y2.1611

That is a pretty easy integral, the integral of 6 is just 6Y2.1618

The integral of 6Y2 is 3Y2².1624

I’m supposed to evaluate that from Y2 = Y1 to Y2 = 1.1629

I said Y2 = 1, I should have said Y2 = Y1 on the lower limit there and Y2 = 1 on the upper limit.1637

If we plug those in, we get 6.1646

I said 3Y2, it should have been 3Y2² because we are integrating 6Y2 there.1649

If we plug in Y2 = 1, then we get 6 -3.1657

Plugging in Y2 = Y1 for the lower limit, I get -6Y1 + 3Y1².1662

I can simplify that a little bit and rearrange a bit.1676

F1 of Y1 is 3Y1² - 6Y1 + 3, since 6 - 3 is +3; that is my marginal density function for Y1.1680

Let me recap the steps there.1696

I want to find the marginal density function F1 of Y1.1699

There is this variable switch where you integrate the joint density function over Y2.1702

You have to describe your region, in terms of Y2.1711

Our region, in terms of Y2, goes from the lower line, the diagonal line Y2 = Y1 to that upper bound, is Y2 = 1.1715

That is where I got these limits from, it comes from that graph right there.1726

That is where I got those limits.1731

And then, I plug those limits into the integral.1733

I also plugged in the density function that we are given.1736

I integrate that with respect to Y2 and that is a pretty easy integral.1739

I plug in my limits and what I end up with is a function, in terms of Y1 which is what is supposed to happen.1743

You always want your marginal density function of Y1 to be a function of Y1.1751

In the next example, in example 4 of this lecture, what we are going to do is1756

we are going to come back and look at this same scenario.1760

We are going to find the marginal density function of Y2.1764

If we do it right then we will get a function of Y2.1767

This is important to notice that we get a function of Y1 at the end here.1772

By the way, this example is kind of a setup from example that we are going to use in the next lecture1784

on conditional probability and conditional expectation.1792

We would use this answer in the next lecture.1796

You want to make sure that you understand this.1800

We are going to use it for, I think it is going to be example 3 in the next lecture1802

which is on conditional probability and conditional expectation.1816

If you are wondering, what did we just calculate here?1824

What is it good for, what can we use this for? If you want to take a sneak peek ahead,1830

just skip forward to the next lecture and look at example 3.1836

You will see the same setup and you will see the same function.1839

We are actually going to use it to calculate some probability.1841
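As a sanity check on example 3 (my own sketch, not part of the lecture), we can integrate the joint density f(y1, y2) = 6(1 - y2) over y2 numerically, from y1 up to 1, and compare against the closed form F1(y1) = 3y1² - 6y1 + 3:

```python
# Sketch (not from the lecture): numerically confirm the marginal
# density F1(y1) = 3*y1**2 - 6*y1 + 3 from example 3.

def f(y1, y2):
    return 6 * (1 - y2)          # joint density on 0 <= y1 <= y2 <= 1

def marginal_f1(y1, n=1000):
    # Midpoint-rule integral over y2 from y1 to 1; the integrand is
    # linear in y2, so the midpoint rule is exact up to rounding.
    a, b = y1, 1.0
    h = (b - a) / n
    return sum(f(y1, a + (k + 0.5) * h) for k in range(n)) * h

for y1 in (0.0, 0.25, 0.5, 0.9):
    exact = 3 * y1**2 - 6 * y1 + 3
    assert abs(marginal_f1(y1) - exact) < 1e-9
print("marginal matches 3*y1**2 - 6*y1 + 3")
```

The helper names here are my own; the point is just that the numeric integral over y2 reproduces the function of y1 derived in the example.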

In example 4, we are looking at the same setup that we had in example 3.1848

But, instead of calculating the marginal density function in terms of Y1,1852

that is what we did back in example 3, we are going to calculate the marginal density function in terms of Y2.1856

Let me just redraw that graph, it is the same graph that we had in example 3.1864

We have Y1 and Y2 here, they both go from 0 to 1 but we are given that Y2 is bigger than Y1,1870

which means we are only going to stay above the line, the Y =X line, the Y2 = Y1 line.1879

We are looking at this triangular region above the line Y=X.1888

That is the domain of definition for our joint density function.1893

Now, we are trying to find the marginal density function F2 of Y2.1900

Remember, there is always this variable switch.1904

When you try to find the marginal density function for Y2, you integrate over Y1 and then1906

you integrate the joint density function F of Y1 Y2, your variable is DY1.1912

That means, we have to describe this region in terms of Y1.1919

Let us think about what Y1 does here.1925

Y1 grows up from the left hand side is the vertical line Y1 = 0.1928

The right hand side is the line Y1 = Y2, the diagonal line there.1936

My limits here on Y1 are from 0 to Y2, that is what we are going to use to set up the integral.1942

We had the integral from Y1 = 0 to Y1 = Y2.1953

Our joint density function is still 6 × 1 - Y2.1959

We have DY1.1965

There is a very seductive mistake to be made here and my students often make it.1969

It is very easy to make.1975

You see a Y2 in your integral and you want to integrate it with respect to Y2.1976

You see that Y2 and think that the integral is Y2²/2.1985

Not so, because we are integrating with respect to Y1.1991

Y1 is our variable of integration, when we see the Y2 that is just a big old constant.1996

This integral 6 × 1 - Y2, integrate that just as a constant, we just get all that × Y1.2002

We are going to evaluate that from Y1 = 0 to Y1 = Y2.2014

We get 6 × (1 - Y2) × Y2; when Y1 is 0 we get nothing, just - 0.2024

If I simplify this down a little bit, I will get 6Y2.2034

I’m just going to distribute everything across.2039

6Y2 - 6Y2², that is my marginal density function in terms of Y2.2041

Notice, as I mentioned before in example 3, we do want to get a function in terms of Y2.2051

That is a function of Y2 which is a good thing, that kind of confirms that I have been doing at least some of my work correctly.2060

Because, I do not want to get a function that involves Y1 in any way, at the end of this2070

because the marginal density function of Y2 is always a function of Y2 alone.2074

That finishes off example 4, but let me recap the steps here.2082

This is the same graph that we drew for example 3.2086

It comes from looking at this definition of the region here.2090

We draw the graph, the same graph we have for example 3.2094

But, the difference from example 3 now is that, we are finding the marginal density function of Y2, instead of Y1.2097

There is this variable switch that means we are integrating over Y1.2104

I need to describe my region, in terms of bounds for Y1.2112

That is why I drew a horizontal line here, instead of the vertical line that we had before.2117

I describe this region as going from Y1 = 0 to the diagonal line Y1 = Y2.2122

That gave me my limits that I use on the integral.2128

It is relatively easy integral, as long as you do not fall into the trap of thinking, look I have a Y2, I will just integrate that Y2.2133

Remember, we are integrating with respect to Y1.2145

The 6 × 1 - Y2 is a constant.2148

You just get that constant × Y1, plug in our values for Y1, and we end up with a function of Y2.2151

That represents our marginal density function, in terms of Y2.2158
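The same kind of numeric check works for example 4 (again my own sketch, not part of the lecture): integrate over y1 from 0 to y2, remembering that 6(1 - y2) is a constant with respect to y1, and compare with F2(y2) = 6y2 - 6y2²:

```python
# Sketch (not from the lecture): numerically confirm the marginal
# density F2(y2) = 6*y2 - 6*y2**2 from example 4.

def f(y1, y2):
    return 6 * (1 - y2)          # joint density on 0 <= y1 <= y2 <= 1

def marginal_f2(y2, n=1000):
    # Midpoint-rule integral over y1 from 0 to y2; the integrand is
    # constant in y1, so this is really just 6*(1 - y2) * y2.
    h = y2 / n
    return sum(f((k + 0.5) * h, y2) for k in range(n)) * h

for y2 in (0.1, 0.5, 0.8):
    exact = 6 * y2 - 6 * y2**2
    assert abs(marginal_f2(y2) - exact) < 1e-9
print("marginal matches 6*y2 - 6*y2**2")
```

Because the variable of integration is y1, the y2 in the integrand really does behave as "a big old constant," which is the seductive mistake the lecture warns about.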

In our last example here, example 5, we have a joint density function F of Y1 Y2 = E ⁻Y2.2167

Our region here, both Y1 and Y2 go from 0 to infinity.2176

Let me set up my axis and we will try to draw that.2183

There is Y1, always on the horizontal axis and Y2 is always on the vertical axis.2185

They both start at 0.2191

The catch here is that Y2 was always bigger than Y1.2193

Let me draw the line Y1 = Y2 or Y = X.2198

We want Y2 bigger than Y1 which means we want the region above this line.2203

Let me color in that region above this line.2209

Of course, it goes on to infinity.2212

I’m not going to draw all of it but that is the general shape.2215

It is this triangular region.2218

What we want to do is, find the marginal density function F1 of Y1.2220

Let me remind you of the definition of marginal density.2225

F1 of Y1, I gave you the formula for this back on the third slide to this lecture.2228

There is this variable switch where you always integrate over the other variable.2236

It is the integral over Y2 of the joint density function F of Y1 Y2 DY2.2240

I need to describe that region, in terms of Y2 so that I can set up my limits on the integral.2251

I want to think about what the limits would be on that region, in terms of Y2.2262

Y2, that means take a vertical arrow here.2268

Y2 grows up from that line; it goes on to infinity, doesn't it?2272

There is infinity and there is the line Y2 = Y1.2278

My region can be described as Y2 goes from the diagonal line Y1 up to infinity.2284

Those are the bounds on my integral, Y2 = Y1 to Y2 goes to infinity.2296

My joint density function is just E ^- Y2.2306

I’m integrating with respect to Y2.2310

The integral of E ⁻Y2 with respect to Y2 is - E ^- Y2.2315

I did a u substitution in my head to get that integral.2322

I want to evaluate that from Y2 = Y1 up to the limit as Y2 goes to infinity.2328

If I plug in infinity, that is E ^- infinity.2339

Which means, 1 divided by E ⁺infinity.2346

1/ infinity is just 0 - - E ⁻Y1 because I plugged in Y2 = Y1.2350

That simplifies down to E ⁻Y1.2361

Let me remind you what we are calculating here.2367

F1 of Y1, the marginal density function Y1.2370

It is kind of reassuring here that, when we calculate that, we get a function of Y1.2374

That is what we should get, when we calculate a marginal density function.2380

That is our answer, let me recap the steps on this.2386

First, I tried to graph this region.2392

Y1 and Y2 both go from 0 to infinity but we are looking at the region where Y2 is bigger than Y1.2395

That is why we have this upper triangular region here, all the region above the line Y=X there.2402

Then, to find the marginal density function for Y1, we have to integrate over Y2.2410

There is always this variable switch, that is always very confusing.2416

In order to find the region of integration, I tried to describe that region in terms of Y2.2422

The lower bound for Y2 is this diagonal line, that is where I got this Y2 = Y1.2429

Upper bound, it goes on forever, that is why there is an infinity there.2434

I plugged those in and I get this region of integration.2438

The density function just comes from the stem of the problem E ⁻Y2.2443

Integrate that with respect to Y2, did a u substitution in my head, u = -Y2.2448

It came up with –E ⁻Y2, and then I plugged in Y2.2454

The limit is, as it goes to infinity that just takes you to 0 and Y2 = Y1 gives me my Y1 in the exponent.2459

It simplifies down to this nice function of Y1 which is appropriate, because we should get a function of Y1,2467

when we are looking for the marginal density function of Y1.2475
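Example 5 can be confirmed numerically too. In this sketch of mine (not part of the lecture), the improper upper limit is truncated at a large cutoff, since the tail of e^(-y2) beyond it is negligible:

```python
# Sketch (not from the lecture): numerically confirm F1(y1) = e**(-y1)
# from example 5, truncating the infinite upper limit at a large cutoff.
import math

def marginal_f1(y1, upper=40.0, n=200_000):
    # Midpoint-rule integral of e^(-y2) over y2 from y1 to `upper`;
    # the tail beyond `upper` is on the order of e^(-40), negligible.
    h = (upper - y1) / n
    return sum(math.exp(-(y1 + (k + 0.5) * h)) for k in range(n)) * h

for y1 in (0.0, 0.5, 2.0):
    assert abs(marginal_f1(y1) - math.exp(-y1)) < 1e-6
print("marginal matches e**(-y1)")
```

This matches the u-substitution result: integrating e^(-y2) from y1 to infinity leaves exactly e^(-y1), a function of y1 alone.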

I really want you to understand this example, because we are going to use this example again2479

for calculating some conditional and expected probability, in the next lecture.2485

I think this example is set to be example 5, in the next lecture.2492

We are going to use the answer to this example in the next lecture which is on conditional probability and conditional expectation.2499

You will see this coming back, I want to make sure you understand this now so that when we drop the answer in,2514

you will not be confused by it in the next lecture.2520

Make sure that everything is good with this example.2528

In the meantime, that wraps up this lecture on marginal probability.2532

As I mentioned in the beginning, marginal probability is really a tool in the service of conditional probability.2536

You will see a lot of this used in the next lecture on conditional probability and conditional expectation.2541

I hope you will stick around for this.2547

These are a part of a larger series of lectures on probability, here on www.educator.com.2549

My name is Will Murray, thank you for joining us today, bye.2556
