William Murray

Expected Value (Mean)

Table of Contents

I. Probability by Counting
Experiments, Outcomes, Sample Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the Chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the Chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
II. Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
III. Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championship Wins & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Student Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E(Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
IV. Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for a Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution Can Be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment-Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for the Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-Generating Function for the Uniform Distribution
44:47
V. Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E(Y₁ + Y₂)
4:39
Example II: Calculate E(Y₁Y₂)
14:47
Example III: Calculate E(U₁) and E(U₂)
19:33
Example IV: Calculate E(Y₁) and E(Y₂)
22:50
Example V: Calculate E(2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E(Y₁), E(Y₂), and E(Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V(U₁) and V(U₂)
36:14
Example IV: Calculate the Covariance & Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
VI. Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-Generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 3: Find the Distribution Function
51:47
Step 4: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-Generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study: The Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent & Identically Distributed
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
Expected Value (Mean)

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Definition of Expected Value 0:20
    • Expected Value of a (Discrete) Random Variable or Mean
  • Indicator Variables 3:03
    • Indicator Variable
  • Linearity of Expectation 4:36
    • Linearity of Expectation for Random Variables
  • Expected Value of a Function 6:03
    • Expected Value of a Function
  • Example I: Expected Value 7:30
  • Example II: Expected Value 14:14
  • Example III: Expected Value of Flipping a Coin 21:42
    • Example III: Part A
    • Example III: Part B
  • Example IV: Semester Average 36:39
  • Example V: Expected Value of a Function of a Random Variable 41:28

Transcription: Expected Value (Mean)

Hi and welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are going to talk about the expected value of a random variable which is also known as the mean of a random variable.0006

Those two words or phrases mean exactly the same thing.0013

Mean and expected value are used interchangeably.0016

Let us learn what those mean.0020

The way you calculate the expected value of a random variable is you find all the possible values of the random variable,0022

you multiply each one by the probability that that value will come up and then you add those up.0032

The notation that we use for that is actually a little confusing because0044

there are 2 different notations that are used interchangeably there.0048

E of Y means expected value, and μ is the Greek letter mu, which stands for mean.0052

Those mean exactly the same thing, we use those interchangeably.0061

Remember from now on, whether we say the word mean or the phrase expected value, those are exactly the same.0066

They are both defined the same way, it means you look at all the possible values and0074

the probability of each one, multiply those together and add them up.0079

That is the definition, but that is not hugely illuminating.0083

The intuition behind expected value is that you want to think of it0087

as the average value of that random variable over the long run, if you repeat an experiment many times.0092

If you do the experiment many times, on average, the random variable Y should be that value, the value of the mean.0101

It does not mean it will ever be exactly equal to that value but in the long run, it will average out to that value.0110

Another way to think about this is if you think of Y as being a payoff for a fair game.0120

We talked about this a little bit in the previous lecture, when we first learn about random variables.0126

We talked about how you sometimes want to think of the random variable as being a payoff on a game.0131

We play a game, we do some kind of experiment, and then based on that experiment0136

I pay you a certain amount of money and that amount of money is the value of Y.0141

If you think of it that way and then if you want to make this game fair then the expected value is the amount that you should pay me, in order to play this game once.0146

If you play this game many times, on the average, the amount you pay me to play and0157

the amount I pay you will balance out and will be even.0164

It is the amount that a game should cost, in order to make it a fair game.0169

That is the intuitive idea of expected value or mean.0174
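As a quick illustration (my own sketch, not from the lecture), here is how that definition looks in Python for a hypothetical distribution, a fair six-sided die:

```python
from fractions import Fraction

# E(Y) = sum over all possible values y of  y * P(Y = y).
# Hypothetical example: Y is the roll of a fair six-sided die.
die = {y: Fraction(1, 6) for y in range(1, 7)}

def expected_value(dist):
    """Multiply each value by its probability and add them up."""
    return sum(y * p for y, p in dist.items())

print(expected_value(die))  # 7/2, i.e. 3.5
```

Note that 7/2 is not a value the die can ever show; it is the long-run average, exactly as described above.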

We got several different notions we need to explore with this.0180

The first one is indicator of random variables.0183

Indicator random variables are very simple random variables.0187

What an indicator random variable does is it just looks at a particular event and0191

it takes on the value 1 if that event is true and the value of 0 if that event is false.0197

It is an indicator for whether that event happened.0205

A very common example of an indicator random variable would be if you are flipping a coin a number of times0208

and then you would set up an indicator random variable to indicate whether or not a particular flip was heads.0215

That is a kind of an example of an indicator random variable.0222

We will see in some of the problems later on in this lecture how you use indicator variables.0226

We will use it for that example of flipping a coin many times.0231

The important thing about an indicator random variable is the expected value of that0234

indicator random variable is just the probability that that event is true.0240

It is very easy to calculate the expected value of an indicator random variable.0246

You just calculate the probability of that event being true.0251

For example, if you are flipping a coin many times and you have an indicator random variable for the first flip being heads,0255

if it is a fair coin, there is a 50-50 chance that the first flip is going to be a head.0262

The expected value of that indicator random variable is just ½.0267
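A minimal sketch of that fact (the fair-coin event here is my own illustration):

```python
from fractions import Fraction

# An indicator variable takes the value 1 if the event happens, 0 otherwise,
# so E(indicator) = 1 * P(event) + 0 * P(not event) = P(event).
# Hypothetical event: "the first flip of a fair coin is heads".
p_event = Fraction(1, 2)
indicator = {1: p_event, 0: 1 - p_event}
e_indicator = sum(y * p for y, p in indicator.items())
print(e_indicator)  # 1/2, exactly the probability of the event
```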

Let me show you a little more about how we are going to use these ideas.0273

A very key notion with expected values that is very useful in probability is this notion of linearity of expectation.0278

The idea there is that you can combine random variables and you can break them apart.0288

The expected value is preserved across those kinds of combinations.0293

If you have 2 random variables Y1 and Y2, and constants A and B,0298

the expected value of this combination just breaks apart linearly into A × the expected value of Y1 + B × the expected value of Y2.0307

A very key formula there, we usually use it to break apart complicated random variables into simple random variables.0319

The most common use of that is to break apart a variable into indicator variables0328

because it is very easy to calculate the expected value of indicator variables.0336

This is all a bit theoretical right now.0341

We are going to get some nice examples in just a couple of moments.0343

You will get to see how a complicated variable splits up into indicator variables and0346

it makes the expected value much easier to calculate.0352
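To see linearity numerically, here is a hypothetical check (the two independent fair dice and the constants A = 2, B = 3 are my own choices, not from the lecture):

```python
from fractions import Fraction
from itertools import product

# Check that E(A*Y1 + B*Y2) = A*E(Y1) + B*E(Y2) for two independent fair dice.
die = {v: Fraction(1, 6) for v in range(1, 7)}
a, b = 2, 3

# Left side: enumerate the full joint distribution of (Y1, Y2).
lhs = sum(p1 * p2 * (a * y1 + b * y2)
          for (y1, p1), (y2, p2) in product(die.items(), die.items()))

# Right side: combine the individual means.
e_die = sum(y * p for y, p in die.items())  # 7/2
rhs = a * e_die + b * e_die

print(lhs, rhs)  # both 35/2
```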

There is one more notion that we need to explore before we jump into the examples0358

which is the expected value of a function of a random variable.0363

Instead of just looking at the random variable by itself, we will look at some function G of Y.0368

Probably the most common example for G of Y will be Y²; that is a very common example there.0373

The expected value of a function of a random variable is defined very similarly to the expected value of the original variable.0379

Let me just remind you how we define the expected value of the original variable.0389

E of Y was the sum, over all possible values of Y, of P of Y × Y.0394

That was the expected value of Y.0406

That Y on the right there came from that Y in there.0409

That is what we are finding the expected value.0414

Here we are finding the expected value of a function G of Y.0417

We replace that Y on the right with G of Y.0421

Instead of just having P of Y × Y,0428

we get P of Y × G of Y, and then we calculate that sum.0433

Often, it will be Y², we will see some examples of that.0438

I think this is a bit abstract but after we do some examples then it will make a lot more sense.0441

Let us work through some of these examples together.0448

First example here, this is actually the same experiment that we had back in the previous lecture0451

but we are going to calculate the expected value now.0457

The example is that you are going to draw a card from a standard 52 card deck.0463

If it is ace through 9, I pay you that amount.0468

If you draw an ace, I will pay you $1.00.0471

If you draw a 2, I will pay $2.00.0473

If you draw a 9, I will pay you $9.00.0475

If it is a 10 though, you have to pay me $10.00.0478

If it is any face card, a jack, queen, or king, then you have to pay me $10.00 for that privilege.0481

I want to figure out what is the expected value for you for this random variable.0489

Let us calculate that out together.0495

Remember, the expected value of a random variable just by definition is the sum, over all possible values of Y, of P of Y,0497

the probability of that particular value × the value itself Y.0509

Let us think about the different values that this game could take for you.0514

It could end up winning $1.00.0518

If you win $1.00 from this game that is if you get an ace.0520

There are 4 ways you can get an ace out of 52 cards.0527

You have a 1/13, that is 4/52, 1/13 chance of winning exactly $1.00.0530

By the way, we calculated this probability distribution as an example in the previous lecture.0537

If this is completely mysterious, you may want to go back and0542

watch that example in the previous lecture then this one will make a little more sense.0545

What else could you make, you could make $2.00 out of this experiment.0554

There is a 1/13 chance that you are going to make $2.00 because there are 4 cards out of the possible 52 that will give you $2.00.0558

1/13 chance you are going to make $3.00 because there are 4 cards, there are 4 3’s in the deck out of 52 cards.0569

All the way on up to you can make $9.00 and there is a 1/13 chance that you can make $9.00.0578

You could also end up losing money on this experiment.0588

If you get a 10, jack, queen, or king, you pay me $10.00.0591

For you, that is a -$10.00 difference.0595

The odds of getting that were 16 out of 52, we calculated that last time.0603

There are 16 cards out of 52 possible that count as 10, jacks, queens, or kings.0608

16 out of 52 reduce to 4 out of 13.0617

That is the probability that you will end up paying me $10.00.0621

Let us calculate this together.0625

You see you got a 1/13 everywhere; if we factor out that 1/13, then I will just have 1 + 2 + 30629

up to 9 − 4 × 10, because I factored out the denominator of 13, so it is 4 × 10 there.0638

1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 turns out to be 45, and 4 × 10 is 40, so 45 − 40 = 5, which gives 5/13.0651

What that means is that is the expected value of a random variable, it is also the mean.0669

Remember, the mean and expected value are exactly the same thing, so use those interchangeably.0676

μ and E of Y, you want to get used to using those interchangeably.0681

What that means, I guess 5/13 is a little bit less than ½.0686

In the long run, if we play this game many times, on average you are going to make about 5/13 of a dollar per game.0692

Obviously, in each game, you will never make exactly 5/13 of a dollar because we are only trading whole dollars back and forth here.0703

But it means that in the long run, every time we play 13 games, you should expect to make about $5.00 on average.0712

If we are going to make this a fair game, you should pay me for the privilege of playing.0726

Because on the average, I’m going to end up paying you a little bit per game.0732

You should pay 5/13 of a dollar to make it a fair game.0736

If I'm going to open a casino then I will probably round that up to 50¢.0749

I will make you pay 50¢ to play this game because I want to make sure that I will make a profit in the long run.0756

If I’m in a casino, I have to charge a little bit more than 5/13 of a dollar,0763

in order to guarantee that I will make a profit in the long run and cover my other expenses.0767

Maybe I will charge you 50¢, maybe I will charge you an even dollar to play this game but0773

I have to charge you more than 5/13 of a dollar or else I’m going to lose in the long run and0778

you will take money away from me by playing this game multiple times.0782

Just to remind you where everything came from here.0787

We are calculating the expected value of a random variable, that means we look at all the possible values and the probability of each one.0789

Those possible values, you can make $1.00, $2.00, $3.00, up to $9.00, or you could end up having to pay me $10.00.0798

And then, we calculated the probabilities of each one.0806

We did this example in the previous lecture but those probabilities just depend on how many cards give you those particular payoffs.0809

We figured out those probabilities, and then this was a fairly easy set of fractions to simplify down to 5/13 of a dollar.0818

That is the expected value which is the same as the mean of this game.0829

That means that if we are going to make it a fair game, a game that in the long run0833

there is no winning or losing between us, you should pay me 5/13 of a dollar.0838

If I want to make a profit off this, if I'm running a casino, of course, I want to make a profit.0843

I’m going to charge you a little bit more than 5/13 of a dollar, maybe 50¢, maybe a whole dollar to play this game.0848
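The whole calculation for this card game can be sketched in a few lines (exact fractions; the distribution is the one worked out in the lecture):

```python
from fractions import Fraction

# The lecture's card game: ace through 9 pays $1..$9 (four of each rank),
# while the 16 cards that are 10s or face cards cost you $10.
dist = {y: Fraction(4, 52) for y in range(1, 10)}
dist[-10] = Fraction(16, 52)

# E(Y) = sum of y * P(y) over every payoff y.
e_y = sum(y * p for y, p in dist.items())
print(e_y)  # 5/13, a bit under half a dollar per game
```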

Example 2 here, it is the same example as above, at least in the beginning.0857

You are going to draw a card from a standard 52 card deck.0861

If it is an ace through 9, I pay you that amount.0865

If it is a 10 or a face card, you pay me $10.00.0867

That part is the same, let Y be the amount that I pay you.0870

What is the expected value for Y²?0874

That is the new part, we are looking at Y² instead of Y.0877

The difference between this example and the previous one.0882

If this were a casino game and the casino promised to pay you Y², how much would the casino charge you to play?0885

What we are really doing here is we are looking at the expected value of a function of a random variable.0894

We are trying to find the expected value of Y².0900

Remember, the expected value of a function of a random variable,0906

let me show you up here the formula that we learned earlier.0910

It is the sum over all the possible values in the variable of the probability of each one × that function applied to the value.0915

The difference there is instead of having Y by itself, you put G of Y in here.0928

I will add up all the probabilities and all the Y values, except instead of putting in the Y values, I’m going to put in Y² each time now.0933

Our probabilities are 1/13, the Y is going to be 1.0943

Let me actually write the sum generically first, just to make it a little more clear here.0953

We are going to find the sum, over all Y, of P of Y × G of Y.0958

G of Y, that is this function right here, it is Y².0971

Our probabilities were 1/13 and our first value was 1, but we are going to square it: 1² × 1/13.0976

The next value was 2² × 1/13, then 3² × 1/13, up to 9² × 1/13.0984

Finally, the last value we could get was our probability 4/13 and it was -10 then we have to square that.0997

Now, we have to do some arithmetic to simplify this.1007

Again, I still see that I have a 1/13 everywhere.1010

1/13 × 1 + 2² is 4, 3² is 9, and so on, up to 9² is 81.1014

We have this last value of 4 × -10².1026

Squaring it makes it positive, + 4 × 100.1030

It is just some arithmetic to simplify that.1038

There is a formula, by the way, for adding up the sum of cubes.1040

It is a clever formula, I did not bring that in here because it is not really relevant to what I'm trying to teach you today.1044

I will just to add up those numbers quickly.1052

1 + 4 + 9 up to 81 + 400 turns out to be 685.1055

Our real answer there is 685/13, that is our exact answer.1064

If we are in a casino, you do not want to deal in thirteenths of a dollar.1075

That is around $53.00; I rounded it there to $53.00.1079

That means if you play this game, on the average, on the long run, you can make about $53.00.1091

Of course, you will not make exactly $53.00, you will make one of these values.1097

You either make $1.00 or $4.00, or $9.00.1101

You might make $81.00, you might make $100.1106

You are going to make one of those values but in the long run, on the average, you are going to make about $53.00.1110

If I'm the casino running this game and 100,000 people play it on a given night, then1116

I'm going to expect to pay out about $53.00 per customer on average.1125

That means, I will payout about $5,300,000 to people playing this game on any given night.1131

That is a lot of money.1139

If I'm the casino then I want to make sure I get a profit on this game.1143

A casino would have to charge a little more than $53.00, in order to make sure that1148

they make a profit in the long run on this game.1158

If we are going to run this casino, we might charge maybe $60.00 to play.1163

That means on every customer that plays this game, on average,1173

we expect to make $7.00 off this customer and in the long run we will make a nice, healthy profit.1178

In the short run, we might lose money because certain customers are going to win $80.00 or $81.00.1183

Certain customers are going to win $100 from their $60.00 play.1191

In the long run, we will make about $7.00 from each game.1195

We will be happy as a casino even if we lose a few games in the meantime.1199

Let me recap what we are doing here.1205

We are really finding the expected value of the function of a random variable here.1207

I’m using this formula that was given on one of the early slides of this lecture, the third or fourth slide.1212

I think it is the 4th slide of this lecture, the expected value of a function of a random variable.1219

What you do is you look at all the possible values in the random variable,1225

the probability of each one, and you multiply it by that function of that value.1229

In this case, the function is Y².1235

Instead of having P of Y × Y, the way we had in the previous example, we have P of Y × Y².1237

We do the same calculations before except the new element here is I’m squaring each of the Y before I run through the calculation.1245

The most notable change there is that makes that -10, makes it positive.1255

We now get 4 × 100, instead of 4 × -10.1260

The arithmetic there comes out to be about $53.00 which means if the casinos can offer you Y², then on average,1265

you are going to make about $53.00 per game.1274

If the casino wants to make sure that they make a profit, they will charge a little more than $53.00,1277

maybe $60.00 would be enough to guarantee them that they are going to make about $7.00 per game.1283

You are going to make more money on some games and lose some money on some games but make about $7.00 per game.1291

It is a nice, healthy profit for the casino.1297

The next example here: I want to flip a coin 3 times, and we are going to calculate the expected number of heads.1303

In the second part of this problem, we are going to flip a coin 100 times and calculate the expected number of heads.1312

We are going to use two different strategies to think about this problem and1319

I'm sort of trying to illustrate the principle of linearity of expectations.1322

We are getting to that in a few minutes but let me start out just by calculating the answer to the first problem directly.1327

We are going to set up our random variable first.1336

We are going to define Y to be the total number of heads.1339

That is the random variable that I'm going to keep track of here, the total number of heads.1351

I want to think about all the possible outcomes of this experiment and then what the value of Y would be for each one.1362

Let us think about all the possible things that can happen, when you flip a coin 3 times.1371

You could get head-head-head, you could get head-head-tail, you could get head-tail-head.1377

I’m sort of running them through in a binary fashion here, head-tail-tail.1385

I’m imagining myself counting in binary, tail-head-head, tail-head-tail, tail-tail-head, and tail-tail-tail.1392

There should be 8 outcomes total.1405

Let me make sure I have got them all, 1, 2, 3, 4, 5, 6, 7, 8 outcomes total.1407

Let us think about what the value of Y would be for each one of those.1414

The total number of heads, in the first one we got 3 heads, then 2 heads, 2 heads, 1 head, 2 heads, 1 head, 1 head, and 0 heads.1418

Those where the values of Y for each one there.1429

Let me calculate the expectation of that random variable, the mean of that random variable.1434

I’m just going to use the basic formula, which was the sum over all possible values of Y of P of Y × Y.1441

I’m just using the basic definition of expectation of the mean of the random variable.1451

Remember, expected value and mean are the same thing.1457

Those are just absolutely synonymous expressions.1460

You use them interchangeably that you do not get any different information from one as from the other.1463

The probability of each one of these outcomes is 1/8.1470

It is 1/8 × 3 + 1/8 × 2 + ...; there is going to be a 1/8 on all of these.1477

Let me just factor out the 1/8 and add up all the values of Y there.1487

It is a 3 + 2 + 2 + 1 + 2 + 1 + 1 + 0.1493

Let us see, 3 + 2 is 5 + 2 is 7 + 1 is 8 + 2 is 10 + 1 is 11 + 1 is 12, this is 12/8.1504

Of course, that simplifies down to 3/2, that is the expected number of heads.1517

It is a little curious there: if you flip a coin 3 times, there is no way you can get 3/2 heads.1524

You are either going to get 0 heads, or 1 head, or 2 heads, or 3 heads.1529

It is not saying we expect to get 3/2 of a head because we cannot get 3/2 of a head.1534

What it is really saying is if you do this experiment many times, on the average, you will get 3/2 heads per experiment.1540

If you do this experiment, for example, 8 times, you would probably expect to see about 12 heads in total because 12/8 is 3/2.1551

That is what this number is saying.1561
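The brute-force method above can be sketched by enumerating all 8 equally likely outcomes (my own illustration of the lecture's list):

```python
from fractions import Fraction
from itertools import product

# List every outcome of 3 fair coin flips and average the head counts,
# each outcome weighted by its probability of 1/8.
outcomes = list(product("HT", repeat=3))
e_heads = sum(Fraction(1, 8) * flips.count("H") for flips in outcomes)
print(len(outcomes), e_heads)  # 8 outcomes, expected value 3/2
```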

That was a bit of computation in order to find that.1564

Let me show you another way to calculate this and I think it is a better way.1568

We are going to use linearity of expectation and we are going to use indicator variables.1572

Here is a better way to calculate this expected value.1579

We are going to define the indicator variables.1589

Remember, indicator variables are variables that just take the value of 1 or 01600

depending on whether some event is true or false.1605

I have talked about indicator variables on one of the early slides back in the beginning of the same lecture.1607

You can just scroll back and check that out, if you do not remember what the indicator variable is.1613

In this case, I would define Y1 to be the indicator variable for the first flip being heads.1619

Y1 is going to be 1 if the first flip is a head and it is going to be 0 if the first flip is a tail.1626

Y2 is very much the same thing.1642

It is going to be 1 if the second flip is a head and 0 if it is a tail.1646

Y3 is the same thing for the third flip.1658

If the third flip is a head and 0 if it is a tail.1663

The point of that is that Y1 is just an indicator variable for the first flip being a head.1669

Y2 is an indicator variable for the second flip being a head.1674

Y3 is an indicator for the third flip being a head.1678

Let us think about Y1 by itself, the expected value of Y1.1684

The expected value of Y1 is just the probability that the first flip is a head.1694

The first flip is a head: that is what we learned about indicator variables in an early slide for this lecture.1703

It is very easy to calculate the expected value of an indicator variable.1715

It is just the probability that that event is true.1721

What is the probability that the first flip is a head, that of course is ½.1724

And then, we can find the expected value of Y2 the same way is also ½ and the expected value of Y3 is also ½.1730

Here is the beauty of this system.1742

Y is our total number of heads.1744

Our total number of heads is going to be the number of heads you get on the first flip +1752

the number of heads you get on the second flip + the number of heads you get on the third flip.1757

Y breaks down into a sum of these 3 indicator variables.1763

What we can use now is the beautiful linearity of expectation.1769

Linearity of expectation says the expected value of Y is equal to the expected value of Y1 plus the others; let me put them all together here.1784

Y is the same as Y1 + Y2 + Y3; the total number of heads is equal to the sum of the number of heads you get on each flip.1797

Now we are using linearity: the expected value of Y1 + the expected value of Y2 + the expected value of Y3.1805

But we figure all those out that is just ½ + ½ + ½ which is 3/2.1816

That is the same answer we got before but the advantage of that is,1824

we did not have to scroll down through all these 8 different possible outcomes.1828

That is a good way of calculating our answer to part A.1836

All of that was just in the service of calculating our answer to part A.1840

Let us think about part B now, I have not left myself much space here.1844

If we want to answer part B by listing all the outcomes, we would need to list 2^100 outcomes because we are flipping a coin 100 times.1848

Each time we flip it, there are 2 possible things that can happen.1866

There is a string of 100 heads and tails for each possible outcome, and there are 2^100,1869

that is, 2 × 2 × ... × 2 a hundred times, possible outcomes.1876

There is no way we can list that many outcomes.1881

I certainly do not have time on this video and you do not want to watch all of that.1884

There is really no way we can do this problem using the first method that I taught you for part A here.1888

We are pretty much stuck until we learn about linearity of expectation.1893

Instead, use linearity.1901

If we break it up the exact same way we broke up the first one, E of Y is equal to E of Y1 + ... + E of Y100.1907

We have broken it up into 100 little indicator variables and1926

the expected value of each one of those little indicator variables is just the same as before: ½ + ½ + ..., 100 times over.1930

We are adding up a hundred ½'s here, and the expected number of heads,1952

the average number of heads, is, not too surprisingly, 50 in a hundred flips.1958

We are really depending heavily on that linearity of expectation,1964

in order to simplify a problem from having to write out 2^100 possible outcomes down into just adding up a bunch of ½'s.1969

That is really showing the power of linearity of expectation for larger experiments.1980

Let me recap here.1989

We did the first part of this problem 2 different ways.1990

The first way, we just listed all the outcomes here, all the possible things that can happen when you flip a coin 3 times.1993

All the possible strings that can happen, head-head-head, head-head-tail, and so on.2002

For each one of those, we wrote down how many different heads were listed in the string2007

and then we added up the probability of each one which is 1/8 × the number of heads we saw each time.2013

This set of numbers comes from this set right here.2021

If you just do the arithmetic there, it all simplifies down to 3/2.2026

In the long run, we expect to see 3/2 heads when you do this experiment, on average.2030

You will never see exactly 3/2 heads because there is no way to have half of a head.2037

What this means is that in the long run, on average, you will see 3/2 heads per experiment.2042

That was sort of the first way to do it but then I said there is a much better way2052

which is to set up these indicator variables which keep track of each flip, the first flip, the second flip, the third flip.2056

It is just 1 if that flip is a head and 0 otherwise.2064

It is the number of heads you see on the first flip.2067

You either see 1 or 0.2071

The expected value for each one of these indicator variables, the expected value of any indicator variable is the probability that2074

that event is true, which in this case asks: what is the probability that the first flip is a head? It is ½.2082

For each one of the other flips, the probability is ½.2088

Y is the total number of heads which breaks down into the number of heads on the first flip +2092

the number of heads on the second flip + the number of heads on the third flip.2098

Here is where we invoke linearity of expectation.2102

The expected value of Y turns out to be the expected value of Y1 + Y2 + Y3, and that breaks up.2105

Here is the linearity right here, that breaks up into the expected value of the individual random variables2112

which we already calculated to be ½ each so we get 3/2.2121

That may not seem much better on this small example A here but when we get to part B, we have a hundred flips.2126

The number of outcomes we would have to list would be 2^100 because it is 2 × 2 × ... × 2 a hundred times.2135

There is no way we can list that.2142

There is no way we can use that first method to solve that one.2144

But instead, if we use linearity, it breaks up in a very easy way into 100 little indicator variables.2148

Each one has expected value ½, and if you just add up ½ a hundred times, we get 50, which is a very easy answer.2157

It is not surprising that if you flip a coin 100 times, on average, you expect to see 50 heads.2165

It does not mean that it is very likely that you will see exactly 50 heads.2172

It means that in the long run, if you do this experiment many times, you should see an average of 50 heads per experiment.2177

This one was all about the power of linearity of expectation and indicator variables.2188

That is the point that I was trying to drive home with this example.2195
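The linearity argument for part B reduces to adding a hundred halves, which is trivial to sketch:

```python
from fractions import Fraction

# By linearity, E(Y) for 100 flips is E(Y1) + ... + E(Y100),
# and each indicator variable has expected value 1/2.
n = 100
e_heads = sum(Fraction(1, 2) for _ in range(n))
print(e_heads)  # 50
```

Compare this with enumerating 2^100 outcomes: linearity turns an impossible listing problem into a one-line sum.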

In example 4, we are going to calculate your average grade in your probability class.2202

In your class, you are going to take 2 midterm exams and they are 25% each of your semester grade.2208

The final exam counts for 30%, that is a little bigger there and the homework counts for 20%.2215

That is what your grade is based on.2220

Your scores are 60 and 80 on the two midterms, you score an 80 on the final, and then you score 100 on the homework.2223

The question is, what is your semester average?2233

This might not seem like an expected value problem.2235

Remember, another word for expected value is mean, and mean really does mean average.2239

I will show you how this breaks down exactly into an expected value problem.2245

The μ, here I’m going to use the Greek letter μ for expected value.2251

Let me remind you what the formula is for expected value.2257

It is the sum over all possible values of the variable of the probability of each value × that value.2261

Let us think about all the possible scores you can get.2272

You scored 60 on one midterm, you scored 80 on a couple of things, and you scored 100 on something.2276

Those are the possible Y values that we can see and we want to figure out how much weighting,2287

how much probability is attached to each one of those Y values.2291

Let us figure out the probability of each one of those Y values.2296

Think of that as the weight on each of those Y values.2299

We are going to find the probability for each one of those or weight of each of those.2306

How much weight goes on the score of 60?2312

60 was on the first midterm, that was 25%.2314

I will put a 25/100 here, 25% there.2318

Where did you score 80 on?2325

You scored 80 on the second midterm and then 80 on the final.2326

The second midterm was 25/100 + the final is 30/100.2333

That is the weighting on the 80, and you scored 100 on the homework, which was 20%, so 20/100.2346

You scored 100% on that part of the class.2359

I’m really using the formula for expected value to calculate your mean for the class or your average for the class.2363

This is a fairly easy thing to simplify here.2372

In fact, I did not bother to write down the intermediate steps.2376

I’m just going to calculate this all together.2381

If you calculate this all together, it turns out to simplify down to exactly 79.2385

That is your mean for the class, that is your average for the semester.2401

If you have a particular grading scale, you would use that to determine your grade for the semester.2408

Let me show you how what seemed like a problem of calculating your semester average in class turned out to be an expected value problem.2418

We are calculating your μ.2427

Remember, μ is always the same as E of Y, it means mean or expected value.2428

Those are the same concept.2433

The way you calculate it is you look at all the possible things you could have scored and then you multiply each one by,2435

we said the probability but it is really the weight attached to each one of those items.2442

When I have looked at all the scores, I saw 60, 80, and 100.2448

I put a weighting on each one, 25% for the 60, 20% for the 100, and the 80 was a little complicated2454

because there were 2 different items that gave me an 80.2460

I had to put a weighting, sort of an extra weighting 25 out of 100 for the second midterm2463

and then 30 out of 100 for the final exam.2471

When I added all those things together and I simplify the arithmetic, simplify it down to exactly 79,2475

that is your semester average in your probability class.2483
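Here is a short Python sketch of that weighted-average calculation (my own illustration; the scores and weights are the ones from the example), using exact fractions so the arithmetic comes out to exactly 79.

```python
from fractions import Fraction

# Semester average as an expected value: each score is weighted by the
# fraction of the grade it carries, and the weights sum to 1.
scores_and_weights = [
    (60, Fraction(25, 100)),                      # midterm 1: 25%
    (80, Fraction(25, 100) + Fraction(30, 100)),  # midterm 2 (25%) plus final (30%), both scored 80
    (100, Fraction(20, 100)),                     # homework: 20%
]
average = sum(score * weight for score, weight in scores_and_weights)
print(average)  # 79
```

The weights play exactly the role of the probabilities in the expected value formula.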

In example 5 here, we are going to roll one die and let Y be the number showing on the die.2491

We are going to calculate E not of Y but E of Y².2498

This is another example of finding the expected value of a function of a random variable.2503

Let me remind you of the formula for that.2512

The expected value of the function of a random variable,2514

you calculate it quite similarly to expected value for the random variable itself.2518

It is the sum of P of Y × G of Y; for the variable itself, you would just have a Y here.2524

Since, there is a function of the random variable, you run it through G of Y.2529

In this case, our function is exactly Y² and we are going to add up, over all the possible values2534

that could be showing, the probability of each one × Y² for each one.2543

If we are just finding the mean or the expected value of the variable itself,2553

that would just be a Y but now it is Y² because of what the problem is asking us.2557

Let us go ahead and think about all the possible values you can have.2563

The possible values when you roll a die are 1, 2, 3, 4, 5, and 6.2567

If we run each one through the function, that means we square each one.2574

Let me go ahead and square each one here.2578

Let me put a probability on each one.2581

1/6 × 1² + 1/6 × 2² + 1/6 × 3² + 1/6 × 4² + 1/6 × 5², and I did not give myself enough space here, + 1/6 × 6².2583

Of course, it is easier there to factor out the 1/6.2605

I will put a 1/6 on the outside here.2608

1² is 1 + 2² is 4, 3² is 9, 4² is 16, 5² is 25, and 6² is 36.2611

If you add those up, 36 + 25 is 61, 16 is 77, 9 is 86, 4 is 90, 1 is 91.2622

I get 91/6 as my expected value of Y² or my μ for Y².2635

That is the end of that problem but let me recap the steps here.2649

We are using the formula for the expected value of a function of a random variable.2652

I gave you that formula back on one of the early slides in this lecture.2657

You can go back and check that out.2662

The difference between calculating the expected value of a function and2664

the expected value of the original random variable means instead of Y here, you just change it to G of Y.2668

You are no longer talking about Y, you are talking about G of Y, which means that you are adding up the probabilities × G of Y;2677

in this case, our function is Y².2684

That is the most common function we are going to be looking at.2687

In this case, we have Y² here.2692

We take all of our values and we square each one, 1², 2², 3².2695

These are all the values you can get by rolling one die, 5², 6².2700

The probability of each one of those is 1/6, we will multiply by 1/6.2704

Now, we just simplify the arithmetic.2709

Square all those numbers, multiply each by 1/6, and we get 91/6.2711
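The same computation in a short Python sketch (my own illustration): apply the formula E[G(Y)] = Σ p(y) · G(y) with G(y) = y² over the six faces of the die.

```python
from fractions import Fraction

# E[g(Y)] = sum over all values y of P(Y = y) * g(y).
def expected_value_of(g, values, prob):
    return sum(prob(y) * g(y) for y in values)

die_values = range(1, 7)            # faces of one fair die
uniform = lambda y: Fraction(1, 6)  # each face has probability 1/6

print(expected_value_of(lambda y: y**2, die_values, uniform))  # 91/6
print(expected_value_of(lambda y: y, die_values, uniform))     # 7/2, the plain mean E(Y)
```

Passing the identity function instead of y² recovers the ordinary mean, which shows how the two formulas relate.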

This is a kind of an early preview of something we are going to be doing in the next lecture.2716

We are going to be looking at variance and standard deviation.2721

In order to calculate that, it is going to be very useful to calculate the expected value of Y² for our random variable.2725

That is why I'm giving you several examples this time of calculating the expected value of Y²2733

because we are going to be using that very often in the next lecture.2738

I hope this made sense to you, if it did not, you might want to work it through a couple more times,2743

before you go on to the next lecture and learn about variance and standard deviation2748

because you really want to understand how to find the expected value of Y² first.2753

That is the last example for this lecture on expected values and means.2759

Of course, expected values and means are the same thing.2765

You are all watching the probability lectures here on www.educator.com.2768

My name is Will Murray, thanks for watching, bye.2772
