William Murray

Independent Random Variables

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championship Wins & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Student Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for a Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment-Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-Generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-Generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-Generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48

Independent Random Variables

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Intuition 0:55
    • Experiment with Two Random Variables
    • Intuition Formula
  • Definition and Formulas 4:43
    • Definition
    • Short Version: Discrete
    • Short Version: Continuous
  • Theorem 9:33
    • For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
    • For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
  • Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent 12:49
  • Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent 21:33
  • Example III: Are Y₁ and Y₂ Independent? 27:01
  • Example IV: Are Y₁ and Y₂ Independent? 34:51
  • Example V: Are Y₁ and Y₂ Independent? 43:44

Transcription: Independent Random Variables

Hi there, these are the probability videos here on www.educator.com, my name is Will Murray.0000

We are working through a chapter on bivariate densities and bivariate distributions0006

which means we will have two variables, a Y1 and Y2.0012

We have just been looking at some videos on marginal probabilities and also, on conditional probability.0016

We are going to be using some of those ideas in this video.0023

Today's video is on independent random variables.0027

I will be using the notion of marginal density function.0031

If you do not remember anything about marginal probability or marginal density functions,0036

what you might want to do is just go back and just briefly review the idea of marginal density functions.0041

Because, we are going to use that, we will use the definition of those,0047

in this video today on independent random variables.0050

Having said that, let us jump in.0055

The intuition of independent random variables is sort of one thing.0057

And then, there is a definition and then there is a theorem about independent random variables.0062

There are three different ways to think about it.0067

There is intuition, there is a definition, and then there is a theorem which is also very useful.0069

I will spell out each one, I got a slide on the intuition, and then the next slide will be the formal definition,0075

and then the next slide will be the theorem that you sometimes want to use.0082

The idea here is that, we have an experiment with two random variables, Y1 and Y2.0086

Intuitively, independence means that if I tell you the value of Y2, I tell you the value of one of the variables,0092

you really have no new information about the distribution of the other variable Y1.0101

Maybe, you can make a prediction about Y1 and then I would tell you the value of Y2,0109

and say do you want to change your prediction about Y1?0115

If they are independent then no, you would not change your prediction of Y10117

because the new information about the value of Y2 does not tell you anything new about the value of Y1.0122

That is the intuitive idea of independence.0130

If we spell that out, in terms of equations, what we have here, F1 of Y1 is the marginal density function of Y1.0134

What we have on the right here, F of Y1 condition on Y2 is the conditional density function of Y1.0158

The idea is that on the left, this marginal density function of Y1, this is how you would describe Y1,0176

if you have no information at all about Y2.0186

If you knew nothing about Y2, this is how you think the density of Y1 behaves.0190

On the right, we have the conditional density function.0199

This is, if I told you a particular value of Y2, how would you describe the density of Y1 with that extra information.0201

On the right is, how you would make predictions with the extra information about Y2.0212

On the left is, how you would make predictions with no information at all about Y2.0218

The idea of independence is that, those should be the same.0223

The extra information about Y2, does not change what you know about Y1.0226

That should be kind of intuitively why this formula makes sense.0233
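
In symbols (using the marginal and conditional density notation from the earlier videos), the intuitive condition just described is:

```latex
f_1(y_1) \;=\; f(y_1 \mid y_2) \qquad \text{for every value of } y_2
```

The left side is the marginal density of Y1, with no information about Y2; the right side is the conditional density of Y1 given a particular value of Y2.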

This is not actually the formula that we will use to check whether variables are independent.0238

I’m going to give you a different definition on the next slide but I think this is sort of the more intuitive formula.0245

After I gave you the new definition on the next slide, I will try to connect it back to this formula0251

so that you see how the two ideas are related.0257

I know that is a bit confusing to have two different ways of approaching something.0261

I'm going to try to persuade you that they do both make sense, and that,0266

you can get back and forth from one to the other.0269

In the next slide, we are going to look at the formal definition of independence,0274

which I think is a little less obvious but I will connect it back to this intuitive idea.0279

The formal definition of independence for random variables, in terms of probability,0285

is the probability of both of them taking a particular value is the same as, if you evaluate them separately0292

and find the probability that each one of them takes that value separately and then multiply those probabilities.0301

That is actually the discrete version of independence.0307

The short version of that same formula is the probability of Y1, Y2 is the probability of Y1,0310

the marginal probability of Y1 × the marginal probability of Y2.0320

Let me stress here that this P1P2, those are marginal probabilities.0326

On the left, we have the joint probability function.0340

That was the discrete case and the continuous case is the analogue of that.0346

On the left, we have the joint probability function.0353

On the right, we have the two marginal probability functions, marginal density functions.0365

The idea of independence is that the joint probability density function,0375

maybe it would be better if I said joint density function instead of joint probability.0386

Let me write that down.0391

The idea of independence is that the joint density function is equal to the product of the two marginal density functions.0393

Let me write that a little more clearly, densities.0404

The joint density function factors into a product of the two marginal density functions.0412

They sort of split apart and they are independent there.0418

Let me try to connect this, this is the formal definition of independence.0422

This is the one that we are going to use for most of the problems.0429

Let me try to connect this up with the intuitive formula that I gave you back on the previous slide.0432

The way you can make those match is, you have to remember that F, the joint density function F of Y1 Y2,0441

one way to think about that is to sort of first evaluate the marginal density function of Y2.0450

And then, once you know what Y2 is, evaluate the conditional density function of Y1 condition on Y2.0457

This is that old conditional probability formula.0466

If you remember this and then, you kind of plug this into the formula for independence,0472

if you plug that in right there, F of Y1 Y2 is equal to F2 of Y2 × F of Y1 condition on Y2.0480

What you notice is that from both sides, you can cancel out an F2 of Y2.0493

We could cancel and if we canceled F2 of Y2 from both sides, we get on the left F of Y1 condition on Y2.0503

We cancel the F2 of Y2.0519

On the right, we would get just F1 of Y1.0521

That is exactly the intuitive formula that I showed you on the previous slide.0527

That is how this formula, this definition connects up to the intuitive formula from the previous slide.0532

This is intuition from the previous slide.0539

That is where you can derive the intuitive formula, if you like to have a formal justification0550

and how it connects up to the formal definition here.0562
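
Written out, the cancellation argument is a short chain of equations:

```latex
\begin{align*}
f(y_1, y_2) &= f_2(y_2)\, f(y_1 \mid y_2) && \text{(conditional density formula)} \\
f(y_1, y_2) &= f_1(y_1)\, f_2(y_2)        && \text{(definition of independence)} \\
f(y_1 \mid y_2) &= f_1(y_1)               && \text{(equate and cancel } f_2(y_2)\text{)}
\end{align*}
```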

There is one last way to think about independence and that comes from a theorem.0567

Let me go ahead and show that to you.0572

The theorem says that, for continuous random variables Y1 and Y2, they are independent if and only if,0574

the domain where the joint density function is defined a non 0, is a rectangle.0582

And, the joint density function can be factored into a product of a function of Y1 only and a function of Y2 only.0592

Let me expand that a little bit.0601

Condition one here means that, when you are graphing the domain, you would have some kind of square or rectangle.0603

It could be infinite, you can have something like this, you could have something that goes on forever.0614

A rectangle but it goes on forever, maybe something like this where it goes on forever,0621

in terms of one variable or it could also go on forever, in terms of the other variable.0627

These would all be considered rectangles, even though they extend infinitely far.0638

Or even, you can have something that goes on forever in both directions,0643

that is still considered to be a rectangle, for the purposes of this theorem.0650

What you could not have is some of these triangular domains that we have been looking at, in some of these examples.0654

I think we had one example where there was a triangular domain like that.0661

That was the triangular domain and that was automatically not independent.0666

All these others, at least as far as condition one is concerned, would qualify as being independent.0675

The second condition means that, you can factor F of the joint density function F of Y1, Y2.0683

You can factor that into a function of Y1 × a function of Y2.0691

It is okay for either of these functions to be constants meaning, you do not have to see the variables in these functions.0699

Sometimes, you might just have a function of Y1 and you would say that other function is just 1,0704

and that is still okay to be constant.0712

It is okay, if either one of these functions are constants.0722

You would have to be able to factor it and separate it into a function of Y1 and a function of Y2 separately.0725

If the variables are inextricably mixed in the density function, so that you cannot factor it, then it is not independent.0734

We will work through the examples.0741

I’m going to try to solve most of the examples using the definition but then,0742

in a lot of them we will come back and apply this theorem.0745

We will see that, if we can use this theorem, we could have gotten the answer a lot more quickly,0750

just by kind of glancing at the region of definition, or just trying to factor the density function.0755

We will try to do the examples both ways, that you can get a feel for both of them.0763
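
Condition two of the theorem can also be spot-checked numerically. The sketch below (Python, not part of the lecture; the helper name and the test points are arbitrary choices for illustration) uses the fact that if f(y1, y2) = g(y1)·h(y2), then f(a, b)·f(c, d) = f(a, d)·f(c, b) for any two points, so a single failing pair of points rules out factoring:

```python
# Numerical spot check of condition two of the theorem: can the joint density be
# factored as f(y1, y2) = g(y1) * h(y2)?  If it can, then for any two points
# f(a, b) * f(c, d) == f(a, d) * f(c, b), since both sides equal g(a)g(c)h(b)h(d).
# A single failing pair of points therefore proves that f does NOT factor.
import math

def looks_factorable(f, pts, tol=1e-9):
    """False if any pair of test points violates the cross-product identity."""
    for a, b in pts:
        for c, d in pts:
            if abs(f(a, b) * f(c, d) - f(a, d) * f(c, b)) > tol:
                return False
    return True  # consistent with factoring at these points (not a full proof)

pts = [(0.1, 0.2), (0.3, 0.7), (0.5, 0.5), (0.9, 0.4)]

print(looks_factorable(lambda y1, y2: math.exp(-(y1 + y2)), pts))  # True: e^-y1 * e^-y2
print(looks_factorable(lambda y1, y2: y1 + y2, pts))               # False: does not factor
```

A True result here only says the points tested are consistent with factoring; a False result is conclusive, because the identity must hold everywhere when f factors.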

Let us jump into those.0767

Example 1, this is an example we have seen before in some of the previous videos.0770

But, we have not looked at it in quite this light before, in terms of independence.0776

We have F of Y1 Y2 is 6 × 1- Y2.0784

The region there is a Y1 there, Y2 there, and our region is from 0 to 1 on both variables.0790

But, we are only looking at the region where Y2 is bigger than Y1.0799

That is this triangular region and we are looking at that blue colored region.0803

The question is, whether Y1 and Y2 are independent.0811

If you are on top of your game right now, you already know the answer because there is a shortcut to the answer.0815

I’m going to go ahead and use the definition because that is what the example asks me to do.0821

I will use the definition, I will be very honest, I will work it out.0826

But kind of secretly, the people who already know have already glanced at that region and0829

there is a shortcut to the answer that hopefully you have already figured out.0834

Let us go ahead and work it out.0838

What we are trying to figure out to test our definition of independence is whether F of Y1 Y2,0841

the joint density function separates into the two marginal density functions F1 of Y1, F2 of Y2.0847

It is a question there, whether those two were equal.0858

We will work it out, we will see if they are equal, and then we use that to determine if they are independent.0862

That is the definition of independence.0866

You got to remember the marginal density function F1 of Y1.0869

We did calculate this, this is one of the examples in the previous lecture.0872

I will go ahead and calculate it again.0878

We always have this variable switch, you always integrate over the other variables.0879

This is Y2, and Y2 in this case, goes from the line Y1 = Y2 or Y2 = Y1 to Y2 = 1.0884

In this case, we are integrating from Y2 = Y1 to Y2 = 1.0899

My joint density function is given in the problem, 6 × 1- Y2.0905

We are integrating with respect to Y2.0914

I forgot my D in there, that is very important, DY2.0919

I will just go ahead and integrate that.0924

The integral of 6 is 6Y2 -, the integral of 6Y2 is 3Y2²,0927

Y2² integrate that from Y2 = Y1 to Y2 = 1, which is 6 -3, -6Y1.0936

That is a Y2, it looks like it did not show up there.0950

I forgot my 3Y2², that is very important there.0954

-6Y1, - -, + 3Y1² and that simplifies a bit to 3Y1² - 6Y1 + 3.0959

Fair enough, that is my marginal density function for Y1.0974

F2 of Y2 is, we will switch the roles of the variables there.0978

We are integrating over Y1, Y1 goes from 0 up to Y1 = Y2.0985

Y1 = 0 to Y1 = Y2.0994

I’m doing this a little bit faster than I did in the previous videos.0998

We did figure out both of these marginal density functions, as examples in the previous video.1002

You can go back and check them out, if you want to see this work out a little more slowly.1007

6 × 1- Y2 DY1, I’m integrating with respect to Y1.1011

Be very careful here, not with respect to Y2.1020

6 × 1- Y2 × Y1, that is because Y2 is just a constant.1023

When we integrate with respect to Y1, evaluate that from Y1 = 0 to Y1 = Y2.1030

I get 6 × 1- Y2 × Y2.1038

Let me look at my condition that I'm trying to check here.1044

That is whether the joint density function splits apart into the two marginal density functions.1048

That is 6 × 1- Y2 is that equal to F1 of Y1 was this, 3Y1² - 6Y1.1055

It looks like I forgot a Y1 up there, + 3.1068

That is multiplied by 6 × 1- Y2 × Y2 F2 of Y2.1073

Clearly, if we expand out all that mess on the right, we are not going to get the equivalent expression on the left.1080

This does not work.1087

We can say that the Y1 and Y2 here, these variables Y1 and Y2 are not independent.1091

That is a formal check of how to determine whether or not these variables are independent.1108

Let me show you the secret shortcut that hopefully you had in mind, even before we started.1116

Without doing any calculus at all, I knew that as soon as I graphed this region that this was not independent.1123

That is by the theorem, Y1 and Y2 are not independent.1134

That is because the theorem said that the variables are independent if and only if the region is a rectangle1152

and another condition holds, which I do not even have to check because I already know the region is not a rectangle.1163

The region is not a rectangle; it is a triangle.1168

That is another and much quicker way of solving this problem, is to invoke that theorem there.1181

That is the two different ways you could solve this problem.1189

The problem did ask you to use the definitions, that is why I worked it out from scratch.1193

I started with the joint density function and I want to see if it could be,1198

if it was really the product of the two marginal density functions.1202

I calculated the marginal density function F1 of Y1 and F2 of Y2.1206

Each one, you have to switch the variable that you are integrating with respect to.1211

F1 of Y1, we integrate with respect to Y2.1215

F2 of Y2, we integrate with respect to Y1.1218

And then, I describe this region separately, in terms of Y2 or in terms of Y1.1222

I ran it through these integral, did a little multivariable calculus.1233

And then, I multiply those two marginal density functions together to see whether1237

I would get back the joint density function that I started with.1242

Actually, I did not even bother to work out the multiplication because I could see that,1245

there is no way this is going to come out to be 6 × 1- Y2.1249

It is definitely, if you multiply out all this mess on the right, it is not going to work.1254

Therefore, by the definition of independence, Y1 and Y2 are not independent.1258

A quicker way that we could figure that out is, to use the theorem that I gave you on the third slide of this lecture.1266

It just says, first of all, look at the region and see if you got a rectangle.1272

If you have not got a rectangle then immediately, you know they are not independent.1276

If you have got a rectangle, there is another condition you need to check.1281

But, we could have stopped as soon as we saw that region was a triangle; we know that they are not independent.1284
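
The definition check from Example 1 can also be confirmed numerically. Here is a minimal Python sketch (the sample point (0.2, 0.5) and the grid size n are arbitrary choices) that approximates both marginals with Riemann sums and compares their product to the joint density:

```python
# Numeric confirmation of Example 1: f(y1, y2) = 6(1 - y2) on the triangle
# 0 <= y1 <= y2 <= 1.  Approximate both marginals with midpoint Riemann sums,
# then compare their product to the joint density at one interior point.

def f(y1, y2):
    return 6 * (1 - y2)

def midpoint(g, lo, hi, n=50_000):
    """Midpoint Riemann sum of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * h) for k in range(n)) * h

def f1(y1):  # marginal of Y1: integrate over y2 from y1 to 1 -> 3(1 - y1)^2
    return midpoint(lambda y2: f(y1, y2), y1, 1)

def f2(y2):  # marginal of Y2: integrate over y1 from 0 to y2 -> 6(1 - y2) * y2
    return midpoint(lambda y1: f(y1, y2), 0, y2)

y1, y2 = 0.2, 0.5
print(f(y1, y2))        # 3.0
print(f1(y1) * f2(y2))  # about 2.88 -- not equal to the joint, so not independent
```

The mismatch at even one point is enough to rule out independence, since the definition requires equality everywhere.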

Let us move on and we are going to look at example 2 now.1292

F of Y1 Y2 is defined to be Y1 + Y2 and our region is a square.1297

Let me go ahead and graph that out.1305

We do not have the easy shortcut that we had on the previous example,1307

where we knew that they were not independent because the region was not a rectangle.1313

Here, a square counts as being a rectangle.1318

Here is Y2, here is Y1, there is 0 for both of them.1321

There is 1, there is 1, and so our region is just this very nice square1325

which means maybe they are independent because, at least the region is a square.1330

But, we are going to use the definition to calculate it out.1335

That means, we are going to need to find the two marginal density functions.1338

F1 of Y1 means you integrate over Y2.1342

It looks like I just integrate from Y2 = 0 to Y2 = 1 of Y1 + Y2 DY2.1348

Be careful when you integrate, because you have to integrate keeping in mind that the variable is Y2.1359

Y1 is a constant, very common mistake that my students make when they are doing their probability homework,1366

is that they cannot keep the variables straight, which one you are integrating.1373

We integrate Y1, that is a constant, the integral is just Y1 Y2.1377

Y2 is the variable, the integral is Y2²/2.1384

We want to evaluate all that from Y2 = 0 to Y2 = 1.1389

Let us see, when Y2 is 1, we will get Y1 + ½.1397

When Y2 is 0, it looks like both the terms dropout.1407

I found the marginal density function F1 of Y1.1410

F2 of Y2, if you look at the function that we started with, Y1 + Y2 is totally symmetric between Y1 and Y2.1414

The region is symmetric too, it is going to be the exact same calculation.1423

Just switch the roles of Y1 and Y2, you will end up with Y2 + ½ that is because everything is symmetric in this problem.1428

Let me make that a little more clear here, clear that I'm skipping a few steps because1439

I can tell that they are going to be the same as the previous one.1442

You are just switching the roles of Y1 and Y2.1446

I want to check if Y1 and Y2 are independent.1450

I want to check if F of Y1 Y2, the joint density function, is equal to the product1453

of the two marginal density functions F1 of Y1 × F2 of Y2.1462

In this case, Y1 + Y2 is that equal to Y1 + ½ × Y2 + ½.1470

Now, if you multiply those out, no way that is going to be equal.1480

It is definitely not going to be equal.1484

Y1 and Y2 are not independent, that is the conclusion we have to draw from this.1491

Y1 and Y2 are not independent, by the original definition of independence.1500

The way I calculated that was, I really wanted to check, here is the definition of independence right here.1523

It says that the joint density function is equal to the product of the marginal density functions.1528

But I worked out the marginal density functions, F1 of Y1 means you integrate with respect to Y2.1533

I integrated the joint density function, I had to be careful there that Y2 was the variable and Y1 was just a constant.1540

That is why I got Y1 × Y2 here and Y2².1548

Worked out to Y1 + ½, it is a function of Y1.1552

Y2 works the exact same way, it gives you Y2 + ½.1557

When I plug those in, Y1 + ½ × Y2 + ½, if you multiply those together,1562

will you get the original joint density function that we start out with, no you do not get that.1567

They are not independent.1573

By the way, it is less obvious than it was in example 1.1575

In example 1, we had our region that was a triangle.1579

Immediately, the theorem told you that it was not independent.1582

In this case, our region was a square and that condition did not make it obvious anymore.1587

Maybe, you could have looked at Y1 + Y2 and said, can I factor that into a function of Y1 × a function of Y2.1592

And said, you cannot factor that, then you would have known that they are not independent.1600

The safest way is actually to check this definition and to calculate the marginal density functions,1606

and see if they multiply to the joint density function.1612

They do not, in this case, the variables are not independent.1616

In example 3, we have a discreet situation.1623

We are going to roll two dice, a red dice and a blue dice.1627

We are going to define the variables Y1 is what shows on the red dice and Y2 is the total.1632

You might expect Y1 to be what shows on the red dice and Y2 what shows on the blue dice.1639

We mixed them up a little bit to make it a little more interesting.1643

The question is, whether Y1 and Y2 are independent.1648

Let me just mention that there is an intuitive answer to this, which should make sense to you.1652

Intuition here is that, remember the intuition of independence is that if I tell you the value of one of the variables,1659

you will have some new information about the other.1667

In particular, this Y1, we know it is going to be somewhere between 1 and 6.1672

Y2 is going to be somewhere between 2 and 12 because it is the total showing on both dice.1677

The intuition is, if I tell you what is showing on one of the dice, or if I tell you what one of the variables is,1684

does it change what you might expect about the others, about the other one?1697

In this case, yes, it does.1701

Intuition is, it does change your prediction, that means these variables are dependent on each other,1703

they are not independent.1713

Let me write that down to make it clear.1716

No, they are not independent because, let me just give an example value here.1718

Let me say, suppose you roll these two dice and you are wondering what kind of roll you are going to get.1729

Suppose you peeked at the red die: if you get a 6, if Y1 = 6, that means you peeked at the red die and1735

saw that it came out to be a 6.1746

My prediction is, I’m more likely to have a high total.1751

That is going to change what I expect about Y2.1755

Then, Y2 is more likely to be large.1759

If I know that one die rolled very high, then, it is more likely that I got a large total.1775

In particular, if I just say I'm rolling two dice, I could get anything from 2 to 12.1780

But if you tell me that Y1 is 6, I know I'm not going to get 2 as a total, not anymore.1786

I know that I’m going to get at least 6 as the total.1792

That is a very strong intuitive hint that these variables do depend on each other, they are not independent.1794

Let me check it using the formulas as well.1805

I'm going to check P of Y1 Y2, this is the definition of independent.1808

It should be equal to P1 of Y1 × P2 of Y2.1814

I do not know whether that is true, if they are independent then they should be true.1826

I'm going to take some values of Y1 and Y2, I will go ahead and take those values that I mentioned.1831

Y1 is equal to 6, I’m going to pick those.1837

This formula should be true for all values.1841

If it is not going to be true, I can pick whatever values I want to illustrate that it is not true.1844

Y2, I will pick 12 just because I think that, if I know the red dice is 6,1850

I think that is going to change my probability of getting a 12.1857

Just think about whether the probability of 6/12 is equal to P1 of 6 × P2 of 12.1860

The probability of 6/12 means that I got a 6 on the red dice and a 12 total.1873

In order to get that, I have to get a 6 on the red and a 6 on the blue.1880

This is really the probability of 6-6.1885

The probability of rolling double 6 is 1/36, 1/6 × 1/6.1890

P1 of 6, what is my probability that the red dice is equal to 6, that is 1/6.1898

P2 of 12, what is my probability that my total is 12?1907

Again, to get 12, I have to get 6 on both dice, that is 1/36.1912

Now, is 1/36 equal to 1/6 × 1/36? It sure is not.1920

That does not work out.1928

Since, we found some values for which that equality did not hold, we can say for sure,1931

that Y1 and Y2 are not independent.1938

That agrees with the intuitive answer that we already gave.1944

That does agree with the intuition, that is quite reassuring that our intuition is not completely off base1951

and the formulas do back it up.1957

Let me recap that.1960

We are rolling two dice, we have, what shows on the red dice and the total.1962

The question is, whether those are independent.1967

I do not think they are going to be independent because I think, if you to tell me what is going to come up on the red dice,1969

then I can probably say a little more about what the total is likely to be.1976

I will not be able to say exactly, but if you tell me that I get a 6 on the red dice,1980

then I know the total is somewhere between 7 and 12.1984

If you tell me that I get a 1 on the red dice, then I know the total is somewhere between 2 and 7.1988

It is really going to change, what I expect the total to be.1993

Similarly, if you tell me what the total is, maybe, I know a little more about what the red dice might be showing.1997

Like, if you tell me that the total is 12, I know the red dice is a 6 now.2003

That is the intuition there, which is that knowing one variable will influence what you predict for the other variable.2007

That means they are dependent on each other, which means they are not independent.2016

That is why I made that intuitive prediction.2022

In order to back it up, I checked it with the formula.2025

I just grabbed two values of Y1 and Y2, if they are independent then this formula should hold for all values of Y1 and Y2.2027

That is my definition of independence.2038

I'm going to check it out just with these two values, the Y1 and Y2, 6 and 12.2042

On the left, I’m finding the probability that the red dice is 6 and the total is 12 which means,2048

we must have rolled double 6.2055

The chance of getting a double 6 is 1/36.2058

On the right, P1 of 6 means what is the probability that the red dice is a 6, it is 1/6.2061

What is the probability that the total is 12?2068

Again, you would take double 6, that is 1/36.2070

I just check if the arithmetic works out and it does not.2075

1/36 is not equal to 1/6 × 1/36.2078

Y1 and Y2 are not independent and that confirms the intuitive prediction that I made at the beginning of the example.2082
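
The discrete check from Example 3 can be reproduced exactly by enumerating all 36 rolls with exact fractions (a short Python sketch; the variable names are my own):

```python
# Exact confirmation of Example 3: roll a red die and a blue die, with
# Y1 = the value on the red die and Y2 = the total of both dice.
# Enumerate all 36 equally likely rolls and compare the joint probability
# P(Y1 = 6, Y2 = 12) to the product P1(6) * P2(12).
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))  # (red, blue): 36 outcomes
p = Fraction(1, len(rolls))                   # each roll has probability 1/36

joint = sum(p for red, blue in rolls if red == 6 and red + blue == 12)
p1 = sum(p for red, blue in rolls if red == 6)          # P1(6) = 1/6
p2 = sum(p for red, blue in rolls if red + blue == 12)  # P2(12) = 1/36

print(joint, p1 * p2)  # 1/36 vs 1/216 -- not equal, so not independent
```

Using Fraction instead of floats keeps the arithmetic exact, so the inequality 1/36 ≠ 1/216 is verified with no rounding error.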

In example 4, we have the joint density function F of Y1 Y2 is E ⁻Y1 + Y2.2093

My region here is the region on Y1 and Y2 both going from 0 to infinity.2103

There is Y1 and there is Y2, both regions go from 0 to infinity.2116

The question is, are Y1 and Y2 independent?2123

Again, there is sort of two ways I can check this.2127

One, is by using the original definition of independence.2130

And one, is by using the theorem that we got.2136

The faster way will actually be the theorem.2139

I'm going to check it from the definition first, just so that you understand that method.2142

And then, we will see how the theorem would actually be much faster.2147

We will check it out using the definition.2151

Remember, the definition of independence was that F of Y1 Y2 should be equal to,2153

should factor into the marginal density functions F1 of Y1 and F2 of Y2.2161

Let me work out what those are, those marginal density functions.2170

F1 of Y1, by definition, that means you switch the variable.2174

F1 of Y1, I was already thinking ahead to the integral that I’m about to solve.2179

I have to integrate over Y2, Y2 = 0 to infinity, Y2 goes to infinity.2184

I will take a limit for that, of the joint density function E ^ (-Y1 + Y2) DY2.2192

When I solve that out, I’m integrating with respect to Y2.2204

If I think about that, I can factor out an E ^- Y2 and an E ^- Y1.2210

E ⁻Y1 will just be a constant, factor that right on out.2219

That is E ⁻Y1, I got the integral of E ⁻Y2 DY2.2225

And that is not such a bad integral, it is E ⁻Y1 × –E ⁻Y2.2233

I want to evaluate that from Y2 = 0, and then take the limit as Y2 goes to infinity, that is E ⁻Y1.2241

Y2 going to infinity means, we are talking about E ⁻infinity, that is 1/E ⁺infinity or 1/infinity which is just 0.2252

-E⁰, -E⁰ which is 1, those two negatives cancel and I just get a +1.2265

I get E ⁻Y1, notice that I get a function of Y1 which is what I'm supposed to get,2277

when I take the marginal density function.2283

This is actually symmetric, F2 of Y2 was going to behave the exact same way.2289

I’m not going to belabor the details there, it is going to be E ⁻Y2.2294

I will check out my definition of independence, F of Y1 Y2.2300

Y2 is equal to, or possibly equal to F1 of Y1 × F2 of Y2.2306

E ⁻Y1 + Y2, that is my joint density function that I have been given.2316

F1 of Y1, I worked out was E ⁻Y1.2323

F2 of Y2, I worked out was E ⁻Y2.2327

Those could combine, we get E ^- Y1 + Y2.2334

In fact, it does work, that equality really holds true.2338

By the definition, yes, they are independent.2346

Y1 and Y2 are independent, checking from the definitions.2360

That is really very reassuring there.2364

Let me show you another way you could have done this problem, which is to use the theorem, two parts of the theorem.2368

By the way, I gave you this theorem in the third slide of these lectures.2379

Just check back and you will see the third slide.2384

The domain is a rectangle, it is an infinite rectangle.2386

But, that does check out here.2396

Remember, it is okay for it to be infinite, according to the theorem, it is okay if the rectangle is infinite.2399

I'm looking at this domain here, it is an infinite rectangle.2406

That is condition one of the theorem, that is satisfied.2410

Condition two of the theorem was that, the joint density function F of Y1 Y22415

had to factor into a function of Y1 only × the function of Y2 only.2423

Let us check that out here.2428

E ⁻Y1 + Y2, yes, I can factor that into E ⁻Y1 × E ⁻Y2 which is a function of, let me just write that as G of Y1.2431

Y1 × H of Y2 because I do have Y1 only in the first function and Y2 only in the second function.2453

That second condition of the theorem is satisfied.2464

Once, I have checked both of those conditions, I can go to that same conclusion and say yes, they are independent.2468

I could have saved myself doing a lot of integration there, if I had used the theorem.2475

I want to make sure that you are comfortable using the definition.2481

But also, using the theorem which can save you lots of time, if you know how to use it.2484

Let me recap the steps here.

The first way that I wanted to check this problem was to look at the definition: does the joint density function factor into the two marginal density functions?

I had to calculate the two marginal density functions; for F1 of Y1, you integrate over the other variable, Y2, with respect to Y2.

I looked at my range on Y2; that goes from 0 to infinity.

That is where I got these limits right here.

I integrated the joint density function that I was given, e^-(Y1 + Y2).

That is where that came from.

It is really nice that it factors, because we are integrating with respect to Y2; that means the term with Y1 pulls right out of the integral, and then I am just doing an integral on Y2, a pretty easy one.

Plug in my limits, infinity and 0, and it simplifies down to e^-Y1.

That was the marginal density function on Y1.

The exact same arithmetic occurs with Y2, except you are switching the two variables.

Then I checked the definition: does the joint density function separate into the product of the two marginal density functions?

When I plugged everything in there, it looked like I had a true equation.

Just by the definition, I get that the two variables are independent.

A quicker way to do that would have been to use the theorem from the third slide of this lecture.

We look at the domain; that is this domain right here.

It is an infinite rectangle; it is not a triangular region or anything like that, so that condition is satisfied.

If you look at the joint density function, we can factor it into a function of Y1 × a function of Y2.

The variables separate there.

And that, right there, would have been enough to confirm that these variables really are independent.

In our last example here on independent random variables, we are given a joint density function of 4Y1Y2, and I will go ahead and graph our region.

Y1 and Y2 both go from 0 to 1; here is Y2 and here is Y1.

We want to figure out... I seem to have switched my variables for some reason; I certainly would not want to do that.

There is Y1 and there is Y2; here is 0 and 1 on both axes.

There is my region, a very nice square.

By the way, if you are really on your game right now, if you have been paying close attention to everything in this lecture, you already know the answer to this problem.

If you really know what is going on.

I am going to take it a little slowly and we will work it through.

We will find an answer, and then I will come back at the end and show you how you could have done it very quickly, if you knew what you were doing.

We are going to check the definition of independence, which asks us whether the joint density function factors into the product of the marginal densities.

Let me find the marginal densities. F1 of Y1 is equal to... we integrate on Y2 here.

Y2 goes from 0 to 1, my joint density function is 4Y1Y2, and we are integrating dY2.

If I integrate that, the Y1 is a constant, and the integral of 4Y2 is 2Y2².

2Y2², and we still have the constant Y1.

We evaluate that from Y2 = 0 to Y2 = 1.

If I plug in Y2 = 1, I just get 2Y1.

Y2 = 0 contributes nothing.

I have figured out my marginal density function F1 of Y1.

For F2 of Y2, I do this exact same arithmetic; everything is symmetric here.

We will just swap the variables.

It is going to work out to be 2Y2; that is the marginal density function for Y2.

I want to check, using the definition of independence: is F of Y1, Y2 equal to F1 of Y1, the thing that I just calculated, × F2 of Y2?

Is that going to work out?

I will plug in everything and I will see if it works out.

Is 4Y1Y2 equal to 2Y1, that is what I just calculated, × 2Y2?

That was still a question, but when I look at them, it really is: 4Y1Y2 is 2Y1 × 2Y2.

Lo and behold, it does check.

I really confirmed by the definition that, yes, Y1 and Y2 are independent.

That is very reassuring, that they do come out to be independent.
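The same integral and product check can be sketched numerically. This is a minimal illustration of the steps above, not from the lecture itself; the helper name `marginal_y1` is my own. Note the midpoint sum is exact here up to rounding, since the integrand is linear in Y2.

```python
# Second example: joint density f(y1, y2) = 4*y1*y2 on the unit square.
def joint(y1, y2):
    return 4.0 * y1 * y2

# Hypothetical helper: f1(y1) = integral of 4*y1*y2 dy2 from 0 to 1,
# which the lecture evaluates as 2*y1*y2^2 from 0 to 1, giving 2*y1.
def marginal_y1(y1, steps=10000):
    dy = 1.0 / steps
    return sum(joint(y1, (k + 0.5) * dy) for k in range(steps)) * dy

print(abs(marginal_y1(0.4) - 2 * 0.4) < 1e-6)  # True

# Definition check: 4*y1*y2 = (2*y1) * (2*y2) at every point of the square.
y1, y2 = 0.3, 0.8
print(abs(joint(y1, y2) - (2 * y1) * (2 * y2)) < 1e-12)  # True
```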

However, I hope that some of you watching the video were kind of chuckling to yourselves all along, because you knew this answer in advance.

Here is how you knew the answer in advance.

You remembered that theorem that I gave you on the third slide of the lecture.

You can go back and check that out.

There are two conditions you had to check.

The domain is a rectangle, and a square, in this case, does qualify as a rectangle.

You checked right away that the domain is a rectangle; that is confirmed.

The second condition that you have to check is that the joint density function F of Y1, Y2 factors into a function of Y1 × a function of Y2.

Let me go ahead and say it is 4Y1Y2.

If you wanted, you could write that as 4Y1 × Y2.

Or you could put a 2 on each factor; it really does not matter.

The important thing here is that it is a function of Y1 only × a function of Y2 only.

It does indeed factor as it is supposed to, in order to be independent.

Both of those are probably things you could have checked in your head, if you really knew what was going on.

In that case, you knew right away, from the beginning of the problem, by the theorem, that Y1 and Y2 are independent.

You could save yourself this integration and the tedious checking, and jump to the answer right away.

Let me recap that problem.

I wanted to check this definition of independence: that the joint density function occurs as the product of the two marginal density functions.

That meant I had to calculate the two marginal density functions.

F1 of Y1: to calculate the marginal density function, you integrate over the other variable.

I put my Y2 there, and my region is Y2 goes from 0 to 1.

That is how I got those limits right there: Y2 goes from 0 to 1.

My joint density function is 4Y1Y2; I got that from the stem of the problem there.

Integrate that with respect to Y2.

The Y1 just comes along as a constant; the integral of 4Y2 is 2Y2².

Plug in the values for Y2, and I get 2Y1.

The F2 of Y2 is completely symmetric.

The function and the region are both symmetric.

It is going to work out to be 2Y2.

And then, if I plug those both in, the 2Y1 and the 2Y2, I multiply them together and look, I really do get the original joint density function that we started with.

It really did work out, in this example.

This is kind of a special example; if you check back to some of the previous examples, Examples 1 and 2, it did not work out when we multiplied those.

In this case, it did work out, and we can say that the variables are independent.

The quicker way to do that is to use the theorem that we had in the third slide of this lecture: just look at this domain right here and say that is a square.

A square counts as being a rectangle.

And then, you look quickly at the joint density function and say, can I factor it somehow, with all the Y1 in one factor and all the Y2 in the other factor?

Yes I can; I can factor it just like this, all the Y1 in one part and all the Y2 in the other part.

I have successfully separated it into a function of Y1 × a function of Y2.

And that was the second condition that you had to check with the theorem.

The theorem says that if both of those conditions are met, then Y1 and Y2 really are independent.

That is, by far, a faster way to know quickly whether your variables are independent.
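Condition two of the theorem can even be probed mechanically. This is a sketch under my own assumptions, not from the lecture: for a positive density on a rectangle, f factors as g(Y1) × h(Y2) exactly when f(a,c) × f(b,d) = f(a,d) × f(b,c) for all sample points, so the hypothetical helper `factors` just tests that identity on a grid.

```python
import math

# Numerical probe for condition two of the theorem: on a rectangular domain,
# a positive density f factors as g(y1)*h(y2) exactly when
# f(a, c) * f(b, d) == f(a, d) * f(b, c) for all choices of sample points.
def factors(f, pts1, pts2, tol=1e-9):
    return all(
        abs(f(a, c) * f(b, d) - f(a, d) * f(b, c)) < tol
        for a in pts1 for b in pts1 for c in pts2 for d in pts2
    )

pts = [0.1, 0.5, 1.0, 2.0]
# Both densities from this lecture pass the probe, matching the theorem.
print(factors(lambda y1, y2: math.exp(-(y1 + y2)), pts, pts))  # True
print(factors(lambda y1, y2: 4 * y1 * y2, pts, pts))           # True
# A density that does not separate, such as y1 + y2, fails it.
print(factors(lambda y1, y2: y1 + y2, pts, pts))               # False
```

The probe checks factorization at sample points only, so it can rule factorization out but, like any finite test, cannot fully prove it; for that you still factor the formula by hand, as the lecture does.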

That wraps up this lecture on independent random variables.

This is part of the larger chapter on bivariate distributions and bivariate density functions.

In turn, that is part of the probability videos here on www.educator.com.

My name is Will Murray, and I thank you very much for joining me today; bye now.
