  William Murray

Markov's Inequality

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48

### Markov's Inequality

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

• Intro 0:00
• Markov's Inequality 0:25
• Markov's Inequality: Definition & Condition
• Markov's Inequality: Equation
• Markov's Inequality: Reverse Equation
• Example I: Money 4:11
• Example II: Rental Car 9:23
• Example III: Probability of an Earthquake 12:22
• Example IV: Defective Laptops 16:52
• Example V: Cans of Tuna 21:06

### Transcription: Markov's Inequality

Hi and welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today's lecture is on Markov’s inequality.0005

Markov’s inequality is one of two inequalities that you can use to estimate probabilities quickly,0008

the other one was Tchebysheff's inequality.0015

The next lecture will be on Tchebysheff's inequality.0017

If you are looking for that one, just skip ahead on the next video.0020

If you are looking for Markov’s inequality, here we go.0023

Markov’s inequality is a quick way of estimating probabilities based only on the mean of a random variable.0027

All you have to know is the mean of the random variable, or the expected value.0033

Remember that mean and expected value mean exactly the same thing.0038

One important condition that you need in order to use Markov’s inequality is that your random variable takes only positive values.0042

You have to be estimating things that can only be counted in positive numbers, like the number of customers at a business,0052

or the number of miles a car is driven, or the amount of money you have assuming you are not allowed to go into debt,0059

things like that, that can only be measured using positive values.0065

If it is a random variable that takes on negative values then Markov’s inequality is not necessarily true.0069

The way it works is you have some constant number, that is this value A here.0076

What we are going to do is estimate the probability that the variable will be bigger than that value A.0082

Markov’s inequality gives you an answer for that.0089

It says that the probability is less than the expected value of Y, which is the same as the mean, divided by A.0092

Another form in which you might have seen this inequality is μ/A,0101

because remember that people use μ as a shorthand for the expected value.0106

There is one thing I want to emphasize about Markov’s inequality which is that it is really a one sided estimation.0114

This is a one sided bound, it gives you an upper bound.0123

It does not tell you that the probability is equal to that, it just gives you an upper bound, one sided bound on the probability.0132

Whenever you answer a question using Markov's inequality,0146

your answer will always be something like the probability is less than something or the probability is greater than something.0151

You can never say that the probability is exactly equal to something based on Markov’s inequality.0157

It just gives you a one sided upper bound.0163

You can reverse this; we said we are calculating the probability that Y is greater than or equal to A in the basic form of Markov’s inequality.0167

You can also switch that around and ask: what is the probability of Y being less than A?0179

That is exactly the opposite of Y being greater than or equal to A, we get the complement of that.0185

The probability is now greater than or equal to 1 - the expected value of Y/A.0192

Remember, we can also write the expected value of Y as μ.0200

You can say it is greater than or equal to 1 - μ/A.0205

Again, this is a one sided thing.0210

It can tell you that the probability that Y is greater than A is less than something.0213

It can tell you that the probability that Y is less than A is greater than something, but you can never reverse those.0218

You have to be very careful about how you use Markov’s inequality, which way the inequality goes;0225

we will get more practice as we go through some of the examples.0231

You also have to be careful never to say this probability is equal to something0234

because Markov’s inequality will never tell you that.0239

It will always just say this probability is less than something or this probability is greater than something.0241
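To make the two directions concrete, here is a minimal sketch in Python (not part of the original lecture; the function names are chosen purely for illustration) that computes both one sided bounds:

```python
def markov_upper_bound(mean, a):
    """Markov's inequality: if Y >= 0 and E(Y) = mean, then P(Y >= a) <= mean / a."""
    return mean / a

def markov_lower_bound(mean, a):
    """Reversed form (the complement): P(Y < a) >= 1 - mean / a."""
    return 1 - mean / a

# With a mean of $20, as in the cash example in this lecture:
print(markov_upper_bound(20, 100))  # 0.2: P(Y >= 100) is at most 1/5
print(markov_lower_bound(20, 80))   # 0.75: P(Y < 80) is at least 3/4
```

Remember that these are one sided bounds, not exact probabilities.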

Let us check this out with some examples.0248

First example: we have done a survey on a particular college campus and apparently,0252

the students on this campus are all going to have some cash in their wallets.0259

It turns out that the average amount of cash that these students are carrying is \$20.00.0263

The question is, if we meet a student at random and ask her how much cash she is carrying,0270

what is the chance that she is carrying more than \$100?0275

Let us estimate the chance that she is carrying less than \$80.00.0278

Let me first emphasize that this is a situation in which Markov’s inequality does apply0282

because the amount of cash students are carrying is always going to be a positive amount.0291

The smallest amount you can be carrying is 0.0296

You could be carrying any amount of cash.0299

You could be carrying thousands of dollars in cash, but you cannot be carrying a negative amount of cash.0305

The amount of cash that you are carrying is always positive.0311

It is a situation in which we can apply Markov’s inequality.0318

Let me go ahead and write down Markov’s inequality and we will see how to apply it to the situation.0323

Markov’s inequality, remember, we said the probability that Y is greater than0328

or equal to a certain value A is less than or equal to the expected value of Y, the mean of Y, divided by A.0334

In this case, the first question here we want to estimate the chance that a student is carrying more than \$100.0343

We want to find the probability that a student will have more than 100 and I’m going to fill in the expected value of Y,0350

that is the average value of Y which we have been given as \$20.00.0358

It is 20 for the expected value and then the value of A that we are using is 100, 20/100 simplifies down to 1/5.0362

The answer to our first question there is the probability is less than or equal to 1/5.0372

Notice that I'm being very careful here not to say that it is equal to 1/5.0379

It might be considerably less than 1/5.0384

I do not know if it is strictly less than or equal to.0387

To be safe, I’m just going to say the probability is less than or equal 1/5.0390

That is really all we can tell using Markov’s inequality.0394

We also want to estimate the chance that a student is carrying less than \$80.00.0399

That is the other direction of Markov’s inequality, the probability that Y is less than the value A.0405

Reversing Markov’s inequality, that probability is greater than or equal to 1 – E of Y, the expected value /A.0414

That is the equation that we learned back there on the first slide.0423

In this case, our A is 80.0426

The probability that Y is less than 80 is greater than or equal to 1 -, the expected value is still 20, that was given in the problem.0430

20/80 is ¼, this is equal to 1 – 1/4 which is ¾.0440

The probability is greater than or equal to ¾.0455

What we can say there, if we meet a random student, is that the probability that she will have less than \$80.00 is at least ¾.0462

There is at least a 75% chance that she has less than \$80.00 in cash on her.0471

That answers both of our questions here.0477

Note that I could not give you exact probabilities.0480

In either case, I have to give you just an inequality, because that is all Markov’s inequality gives you.0482

Let me remind you how we did that.0489

I start off with the basic formula of Markov’s inequality.0490

This is just the same equation we got on the first slide here.0493

Since we are asked about carrying more than \$100, I filled in A = 100 here.0497

A equals 100 here, and the expected value is the average of \$20.00.0502

I plugged that in right here and then I just simplify that down 20/100 is 1/5.0509

It is important to get the inequality the right way.0514

What we can say here is that it is unlikely that a student will have more than \$100, and0518

how unlikely? It is less than 1/5, or less than a 20% chance.0524

On the other side, we are asked about the chance that she is carrying less than \$80.00.0529

I’m using the less than form of Markov’s inequality, that was the second version that I gave you.0536

We were told that A equals 80, we plug in the expected value is 20.0543

Simplify that down to ¾ and what I can tell you is that if I meet a student,0548

there is at least a 75% chance that she will have less than \$80.00.0554

Let us keep that going with the next example here.0561

Here we have a rental car agency; they are doing some statistical analysis of their cars.0566

They require that customers return cars after a week's rental; the customers put an average of 210 miles on the cars.0574

We just had a new customer check out a car for a week and we want to estimate the probability0582

that the customer will put more than 350 miles on the car.0589

This is a classic Markov’s inequality problem, let me write down Markov’s inequality to get us started.0594

The probability that Y is greater than or equal to A is less than or equal to the expected value or the mean value of Y/A.0600

In this case, we want to estimate the probability that he will put more than 350 miles.0610

The 350 is the A there; the probability that Y is greater than or equal to 350 is less than or equal to0617

The expected value is the average number of miles that these customers are putting on the cars, that is 210/350.0623

If I divide top and bottom by 70 there, since both have a factor of 70, that simplifies down to 3/5.0634

That is as simple as it is going to get.0644

The probability is less than or equal to 3/5, which is 60%.0648

I will write that as a percentage.0653

What that tells me, what we can tell our associates at this rental car company, is that for this particular customer,0657

there is at most a 60% chance that this customer is going to put more than 350 miles on the car.0664

That is the best we can say, we can never give an exact answer with Markov’s inequality.0675

We can just put a bound on it, above or below.0680

Here, we put an upper bound of 60% on the chance that the customer is going to put that many miles on the car.0682

To show you how I got that, start out with the basic version of the Markov’s inequality.0689

I figured out that the A I was looking for was 350, that came from the stem of the problem here.0694

I plugged that in, 350 in both places.0701

210 is the average number of miles the customers put on the car, that is the expected value of the random variable.0704

That simplifies down to 3/5 and the important thing here is that you give your answer as an inequality.0713

You do not want to just say 60%.0719

When I taught probability, a lot of times my students would just try to give me a number as an answer.0721

They give 60% as the answer, and it does not tell me what I want to know, because that is saying it is equal to 60%.0727

And we do not know that, all we know is that it is less than or equal to 60%.0736

That is all Markov’s inequality tells us.0740

In example 3 here, we have done some tracking of the history of earthquakes in California.0745

Apparently, there is a major earthquake in California on average, once every 10 years.0752

We want to estimate the probability that there will be an earthquake in the next 30 years.0757

Maybe, we are planning a major investment in California and we are wondering0763

how likely it will be that there will be an earthquake in the next 30 years.0766

Let me describe carefully here what the random variable is,0774

because I think it is a little less obvious in this one than in some of the previous ones.0778

Y here is going to be the waiting time until the next major earthquake.0782

What they have really told us, when they say that it occurs on average once every 10 years,0802

is that the average waiting time from one earthquake to the next is 10 years.0807

E of Y is equal to 10, that is what they have given us.0813

We want to find the probability that there will be an earthquake in the next 30 years.0817

The probability that our waiting time for the next earthquake is less than 30, that is what we are trying to calculate.0823

Within the next 30 years is what we are trying to find.0836

We are going to use Markov’s inequality, but since we are trying to estimate the probability that Y is less than a cutoff,0842

we are going to use the reversed version of Markov’s inequality,0850

P of Y less than A is greater than or equal to 1 – E of Y/A.0858

In this case, our A is 30.0866

This is 1 minus E of Y, which is 10, over A, which is 30.0869

I wrote that that was equal to; I have committed the sin with Markov’s inequality0877

that I have been telling you not to commit, which is saying the probability was equal to something.0882

We never know that for sure, not for Markov’s inequality.0887

We always get a one sided bound so the probability is greater than or equal to 1 – 10/300890

and that simplifies down to 1 -1/3 which is 2/3.0900

What we can say here is that the probability that Y is less than 30, remember,0906

that is the probability that we will have an earthquake in the next 30 years is greater than or equal to 2/3.0914

That is the conclusion we can make from the information we are given and from Markov’s inequality.0921

What that is saying is that it is pretty likely that there will be an earthquake in California sometime in the next 30 years: there is at least a 2/3,0928

or 67% chance that there will be an earthquake in California in the next 30 years.0938

To show you how I figure that out, the important thing here was setting up the random variable.0945

We said Y is going to be the waiting time, how long we wait until we see the next major earthquake.0949

They occur once every 10 years on average; that does not mean they occur with clockwork regularity every 10 years.0956

It just means they occur on average, once every 10 years.0964

The expected value of that variable is 10.0966

We want to find the probability that it is less than 30, because if it is less than 30,0971

that means we will have an earthquake sometime in the next 30 years.0976

According to the reversed formula for Markov’s inequality, that is greater than or equal to 1 - the expected value/30.0980

Remember, I used equals and that was a mistake; it is really greater than or equal to, and that simplifies down to 2/3.0990

Our final conclusion here is that the probability is greater than 2/3, greater than or equal to 2/3.0998

I think there is at least a 67% chance that we will have an earthquake in the next 30 years here in California.1006
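A quick simulation can illustrate why this bound is safe. The lecture never specifies the distribution of waiting times, so this sketch simply assumes exponential waiting times with mean 10 years, purely for illustration; Markov's inequality must hold for any nonnegative distribution with that mean:

```python
import random

random.seed(0)
mean_wait = 10.0   # average years between major earthquakes (given)
a = 30.0           # time horizon in years

# Hypothetical model: exponential waiting times with mean 10
# (expovariate takes the rate, i.e. 1/mean).
samples = [random.expovariate(1.0 / mean_wait) for _ in range(100_000)]
empirical = sum(y < a for y in samples) / len(samples)

markov_lower = 1 - mean_wait / a   # = 2/3, the bound from the lecture
assert empirical >= markov_lower   # the true probability respects the bound
print(round(empirical, 2))         # about 0.95 under this model: valid but loose
```

The empirical probability is well above 2/3, which shows the one sided nature of the result: Markov's inequality only promises a floor, not the actual value.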

For example 4, we have a factory that produces batches of 1000 laptops.1015

I guess each day they run off a batch of 1000 laptops and send them out for distribution.1020

They do some testing and find that, on average, 2 laptops per batch are defective.1026

They have some kind of a serious defect in them.1033

We want to estimate the probability that in the next batch, fewer than 5 laptops will be defective.1036

Again, this is a Markov’s inequality problem.1044

Let me go ahead and set up the generic inequality for Markov.1047

That is the probability that Y is greater than or equal to A is less than or equal to the expected value of Y divided by A.1054

In this case, we want to reverse that, because we want to estimate the probability that fewer than 5 laptops will be defective.1064

Let me go ahead and do the reverse version of that.1075

The probability that y is less than A is greater than or equal to 1 – E of Y/A.1078

We are just taking Markov’s inequality and then taking the complement of it.1087

That should not be something you really have to memorize, it should be something you can figure out from the original Markov’s inequality.1090

In this case our A is 5, we want the probability that fewer than 5 laptops will be defective.1097

It is greater than or equal to 1 minus E of Y over A; E of Y is the expected value, the mean, the average number of defective laptops per batch.1104

We are given that that is 2, this is 1 - 2/5 here.1114

I'm filling in 5 for the value of A because that is what we had on the left hand side.1121

1 - 2/5 is 3/5 and we could simplify it, we can convert that into a percentage.1125

The probability here is greater than or equal to 3/5, which is 60%.1134

If you are the company manager, and you have some quality control specifications1141

that say you cannot have any more than 5 laptops per batch be defective,1148

what you can say is that in the next batch, there is at least a 60% chance that we will not have 5 or more laptops defective.1154

That is the best you can say with Markov’s inequality.1164

You cannot put a precise value on the probability; you can just give a lower bound and say that1167

at least 60% of the time, we will have fewer than 5 laptops be defective.1172

Where that came from was, I started with the original version of Markov’s inequality and then1179

I realized that I needed to turn this around, because the original version has Y being bigger than the cutoff A.1186

In this case, I want to estimate the probability that fewer than 5 are defective; that is Y less than A.1194

That is the reverse of Markov’s inequality.1201

The probability that Y is less than A is greater than or equal to 1 – E of Y/A.1204

Plug in A equals 5 because that is coming from the stem of the problem.1210

I plugged in the expected value, that is the average number of defective laptops per batch; where does that come from? That comes from here.1215

That is where that 2 come from.1223

Let me simplify the numbers down to 3/5 which is 60%.1225

What I can say from this is that the probability is at least 60% that fewer than 5 laptops are defective in the next batch.1229

If you are the factory manager, you can decide whether that is acceptable.1250

Are you willing to accept the 60% chance of having fewer than 5 defectives1254

or do you need to tighten your quality control procedures based on that probability?1258

Let us keep moving onto the next example here.1265

In our final example, we have a grocery store that sells an average of 30 cans of tuna per day.1269

We want to estimate the probability that it will sell more than 80 cans tomorrow.1275

You are the manager of this store, and you are worried about whether you are going to run out of your stock of cans of tuna tomorrow.1278

Should you order some more or can you hold out for a couple more days?1286

Again, this is kind of a classic Markov’s inequality problem.1291

Something that makes it a Markov’s inequality problem, and I did not mention this on some of the previous examples,1294

is that we are only calculating values that are going to be positive here.1300

The number of cans of tuna that a grocery store is going to sell in any given day, that is going to be positive.1306

It could be 0, it could be significantly higher than 0 but it is almost certainly not going to be negative.1312

We are not going to be having many cans of tuna returned.1318

This is a positive quantity, it is okay to use Markov’s inequality.1323

Our Y here is the number of cans of tuna sold each day.1336

Let us write down our Markov’s inequality.1348

The probability that Y is less than or equal to A is greater than or equal to E of Y divided by A.1351

That is just our generic formula for Markov’s inequality, we learned that back in the first slide of this lecture.1360

In this case, our A is our cutoff value.1365

The probability in this case, we are trying to estimate the probability that it will sell more than 80 cans.1369

I wrote down my Markov’s inequality, but when I wrote the inequalities, I got them switched here.1380

The probability that Y is greater than or equal to A is less than or equal to E of Y/A; that is the original version of Markov’s inequality.1386

The one that I was kind of channeling there was the opposite one,1394

the probability that Y is less than A is greater than or equal to 1 - E of Y/A.1398

We have to look at our problem and figure out which one of those is going to be relevant.1407

In this case, we want the probability that it will sell more than 80 cans tomorrow so that is a greater than or equal to.1410

The probability that Y is greater than or equal to 80 is less than or equal to1418

the expected value, that is the average number of cans it sells on a normal day, 30 cans.1425

Let me fill in 80 here for my value of A.1433

If I simplify 30/80 that this reduces to 3/8.1439

That is very easy to convert into a percentage.1443

It is halfway between 1/4 and ½.1448

It is halfway between 25% and 50%, that is 37.5%.1452

The probability is less than or equal to 37.5%.1461

If you are the store manager and you are wondering: maybe you have 80 cans of tuna in stock.1471

You are worried about whether you are going to sell out tomorrow; maybe you are going to need to order some more.1476

The probability that you are going to sell all 80 of those cans is at most 37.5%.1481

You do not know exactly what the probability is, because Markov’s inequality never tells you1488

an exact value but it tells you that it is less than or equal to 37.5%.1492

Let me recap how we did that.1499

We started with the basic version of Markov’s inequality: P of Y greater than or equal to A is less than or equal to E of Y/A.1501

I went ahead and wrote down the reverse version, because when I wrote down the basic version,1509

I accidentally switched the inequalities.1513

I wrote down the reverse version just to make sure that we are keeping everything straight there.1515

Since we are asking about selling more than 80 cans, that means we want the original version of Markov’s inequality, the more-than version.1522

I filled that in with A equals 80; A = 80 also goes in the denominator here.1532

The E of Y that is the average number of cans sold, that is the 30 from the problem stem.1537

That simplifies down to 3/8 or 37.5%.1545

What Markov’s inequality tells us is that the probability is less than or equal to 37.5%.1548

That less than or equal to is really an important part of your answer.1556

You are giving an upper bound, you are not saying it is equal to 37.5%.1561

You are just saying that is the most it could possibly be.1565
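To see how loose that upper bound can be, suppose, purely as an assumption (the lecture never specifies a distribution), that daily tuna sales were exponentially distributed with a mean of 30 cans. A short Python check compares the hypothetical exact tail probability with the Markov bound:

```python
import math

mean_sales = 30.0   # average cans sold per day (given)
a = 80.0            # cutoff from the example

markov_bound = mean_sales / a                     # 0.375, as in the lecture
exact_if_exponential = math.exp(-a / mean_sales)  # hypothetical exact P(Y > 80)

assert exact_if_exponential <= markov_bound       # the bound is never violated
print(round(markov_bound, 3))                     # 0.375
print(round(exact_if_exponential, 3))             # about 0.07 under this assumption
```

Under this hypothetical model, the true probability would be far below 37.5%; Markov's inequality only guarantees the probability cannot exceed the bound.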

That wraps up our lecture on Markov’s inequality.1569

Next, we are going to be talking about Tchebysheff's inequality which is a little bit stronger than Markov’s inequality.1572

You usually get a stronger bound with Tchebysheff's than with Markov's,1578

but it takes a little more work; for Tchebysheff's inequality, you have to know the standard deviation.1582

For Markov’s inequality, we just had to know the expected value or the mean of the random variable.1588

I hope you stick around and we will learn about Tchebysheff's inequality in the next video.1594

In the meantime, you have been watching the probability videos here on www.educator.com.1597

My name is Will Murray, thank you for watching today, bye.1603
