William Murray

Tchebysheff's Inequality

Table of Contents

Section 1: Probability by Counting
Experiments, Outcomes, Samples, Spaces, Events

59m 30s

Intro
0:00
Terminology
0:19
Experiment
0:26
Outcome
0:56
Sample Space
1:16
Event
1:55
Key Formula
2:47
Formula for Finding the Probability of an Event
2:48
Example: Drawing a Card
3:36
Example I
5:01
Experiment
5:38
Outcomes
5:54
Probability of the Event
8:11
Example II
12:00
Experiment
12:17
Outcomes
12:34
Probability of the Event
13:49
Example III
16:33
Experiment
17:09
Outcomes
17:33
Probability of the Event
18:25
Example IV
21:20
Experiment
21:21
Outcomes
22:00
Probability of the Event
23:22
Example V
31:41
Experiment
32:14
Outcomes
32:35
Probability of the Event
33:27
Alternate Solution
40:16
Example VI
43:33
Experiment
44:08
Outcomes
44:24
Probability of the Event
53:35
Combining Events: Multiplication & Addition

1h 2m 47s

Intro
0:00
Unions of Events
0:40
Unions of Events
0:41
Disjoint Events
3:42
Intersections of Events
4:18
Intersections of Events
4:19
Conditional Probability
5:47
Conditional Probability
5:48
Independence
8:20
Independence
8:21
Warning: Independent Does Not Mean Disjoint
9:53
If A and B are Independent
11:20
Example I: Choosing a Number at Random
12:41
Solving by Counting
12:52
Solving by Probability
17:26
Example II: Combination
22:07
Combination Deal at a Restaurant
22:08
Example III: Rolling Two Dice
24:18
Define the Events
24:20
Solving by Counting
27:35
Solving by Probability
29:32
Example IV: Flipping a Coin
35:07
Flipping a Coin Four Times
35:08
Example V: Conditional Probabilities
41:22
Define the Events
42:23
Calculate the Conditional Probabilities
46:21
Example VI: Independent Events
53:42
Define the Events
53:43
Are Events Independent?
55:21
Choices: Combinations & Permutations

56m 3s

Intro
0:00
Choices: With or Without Replacement?
0:12
Choices: With or Without Replacement?
0:13
Example: With Replacement
2:17
Example: Without Replacement
2:55
Choices: Ordered or Unordered?
4:10
Choices: Ordered or Unordered?
4:11
Example: Unordered
4:52
Example: Ordered
6:08
Combinations
9:23
Definition & Equation: Combinations
9:24
Example: Combinations
12:12
Permutations
13:56
Definition & Equation: Permutations
13:57
Example: Permutations
15:00
Key Formulas
17:19
Number of Ways to Pick r Things from n Possibilities
17:20
Example I: Five Different Candy Bars
18:31
Example II: Five Identical Candy Bars
24:53
Example III: Five Identical Candy Bars
31:56
Example IV: Five Different Candy Bars
39:21
Example V: Pizza & Toppings
45:03
Inclusion & Exclusion

43m 40s

Intro
0:00
Inclusion/Exclusion: Two Events
0:09
Inclusion/Exclusion: Two Events
0:10
Inclusion/Exclusion: Three Events
2:30
Inclusion/Exclusion: Three Events
2:31
Example I: Inclusion & Exclusion
6:24
Example II: Inclusion & Exclusion
11:01
Example III: Inclusion & Exclusion
18:41
Example IV: Inclusion & Exclusion
28:24
Example V: Inclusion & Exclusion
39:33
Independence

46m 9s

Intro
0:00
Formula and Intuition
0:12
Definition of Independence
0:19
Intuition
0:49
Common Misinterpretations
1:37
Myth & Truth 1
1:38
Myth & Truth 2
2:23
Combining Independent Events
3:56
Recall: Formula for Conditional Probability
3:58
Combining Independent Events
4:10
Example I: Independence
5:36
Example II: Independence
14:14
Example III: Independence
21:10
Example IV: Independence
32:45
Example V: Independence
41:13
Bayes' Rule

1h 2m 10s

Intro
0:00
When to Use Bayes' Rule
0:08
When to Use Bayes' Rule: Disjoint Union of Events
0:09
Bayes' Rule for Two Choices
2:50
Bayes' Rule for Two Choices
2:51
Bayes' Rule for Multiple Choices
5:03
Bayes' Rule for Multiple Choices
5:04
Example I: What is the Chance that She is Diabetic?
6:55
Example I: Setting up the Events
6:56
Example I: Solution
11:33
Example II: What is the chance that It Belongs to a Woman?
19:28
Example II: Setting up the Events
19:29
Example II: Solution
21:45
Example III: What is the Probability that She is a Democrat?
27:31
Example III: Setting up the Events
27:32
Example III: Solution
32:08
Example IV: What is the chance that the Fruit is an Apple?
39:11
Example IV: Setting up the Events
39:12
Example IV: Solution
43:50
Example V: What is the Probability that the Oldest Child is a Girl?
51:16
Example V: Setting up the Events
51:17
Example V: Solution
53:07
Section 2: Random Variables
Random Variables & Probability Distribution

38m 21s

Intro
0:00
Intuition
0:15
Intuition for Random Variable
0:16
Example: Random Variable
0:44
Intuition, Cont.
2:52
Example: Random Variable as Payoff
2:57
Definition
5:11
Definition of a Random Variable
5:13
Example: Random Variable in Baseball
6:02
Probability Distributions
7:18
Probability Distributions
7:19
Example I: Probability Distribution for the Random Variable
9:29
Example II: Probability Distribution for the Random Variable
14:52
Example III: Probability Distribution for the Random Variable
21:52
Example IV: Probability Distribution for the Random Variable
27:25
Example V: Probability Distribution for the Random Variable
34:12
Expected Value (Mean)

46m 14s

Intro
0:00
Definition of Expected Value
0:20
Expected Value of a (Discrete) Random Variable or Mean
0:21
Indicator Variables
3:03
Indicator Variable
3:04
Linearity of Expectation
4:36
Linearity of Expectation for Random Variables
4:37
Expected Value of a Function
6:03
Expected Value of a Function
6:04
Example I: Expected Value
7:30
Example II: Expected Value
14:14
Example III: Expected Value of Flipping a Coin
21:42
Example III: Part A
21:43
Example III: Part B
30:43
Example IV: Semester Average
36:39
Example V: Expected Value of a Function of a Random Variable
41:28
Variance & Standard Deviation

47m 23s

Intro
0:00
Definition of Variance
0:11
Variance of a Random Variable
0:12
Variance is a Measure of the Variability, or Volatility
1:06
Most Useful Way to Calculate Variance
2:46
Definition of Standard Deviation
3:44
Standard Deviation of a Random Variable
3:45
Example I: Which of the Following Sets of Data Has the Largest Variance?
5:34
Example II: Which of the Following Would be the Least Useful in Understanding a Set of Data?
9:02
Example III: Calculate the Mean, Variance, & Standard Deviation
11:48
Example III: Mean
12:56
Example III: Variance
14:06
Example III: Standard Deviation
15:42
Example IV: Calculate the Mean, Variance, & Standard Deviation
17:54
Example IV: Mean
18:47
Example IV: Variance
20:36
Example IV: Standard Deviation
25:34
Example V: Calculate the Mean, Variance, & Standard Deviation
29:56
Example V: Mean
30:13
Example V: Variance
33:28
Example V: Standard Deviation
34:48
Example VI: Calculate the Mean, Variance, & Standard Deviation
37:29
Example VI: Possible Outcomes
38:09
Example VI: Mean
39:29
Example VI: Variance
41:22
Example VI: Standard Deviation
43:28
Markov's Inequality

26m 45s

Intro
0:00
Markov's Inequality
0:25
Markov's Inequality: Definition & Condition
0:26
Markov's Inequality: Equation
1:15
Markov's Inequality: Reverse Equation
2:48
Example I: Money
4:11
Example II: Rental Car
9:23
Example III: Probability of an Earthquake
12:22
Example IV: Defective Laptops
16:52
Example V: Cans of Tuna
21:06
Tchebysheff's Inequality

42m 11s

Intro
0:00
Tchebysheff's Inequality (Also Known as Chebyshev's Inequality)
0:52
Tchebysheff's Inequality: Definition
0:53
Tchebysheff's Inequality: Equation
1:19
Tchebysheff's Inequality: Intuition
3:21
Tchebysheff's Inequality in Reverse
4:09
Tchebysheff's Inequality in Reverse
4:10
Intuition
5:13
Example I: Money
5:55
Example II: College Units
13:20
Example III: Using Tchebysheff's Inequality to Estimate Proportion
16:40
Example IV: Probability of an Earthquake
25:21
Example V: Using Tchebysheff's Inequality to Estimate Proportion
32:57
Section 3: Discrete Distributions
Binomial Distribution (Bernoulli Trials)

52m 36s

Intro
0:00
Binomial Distribution
0:29
Binomial Distribution (Bernoulli Trials) Overview
0:30
Prototypical Examples: Flipping a Coin n Times
1:36
Process with Two Outcomes: Games Between Teams
2:12
Process with Two Outcomes: Rolling a Die to Get a 6
2:42
Formula for the Binomial Distribution
3:45
Fixed Parameters
3:46
Formula for the Binomial Distribution
6:27
Key Properties of the Binomial Distribution
9:54
Mean
9:55
Variance
10:56
Standard Deviation
11:13
Example I: Games Between Teams
11:36
Example II: Exam Score
17:01
Example III: Expected Grade & Standard Deviation
25:59
Example IV: Pogo-sticking Championship, Part A
33:25
Example IV: Pogo-sticking Championship, Part B
38:24
Example V: Expected Championships Winning & Standard Deviation
45:22
Geometric Distribution

52m 50s

Intro
0:00
Geometric Distribution
0:22
Geometric Distribution: Definition
0:23
Prototypical Example: Flipping a Coin Until We Get a Head
1:08
Geometric Distribution vs. Binomial Distribution
1:31
Formula for the Geometric Distribution
2:13
Fixed Parameters
2:14
Random Variable
2:49
Formula for the Geometric Distribution
3:16
Key Properties of the Geometric Distribution
6:47
Mean
6:48
Variance
7:10
Standard Deviation
7:25
Geometric Series
7:46
Recall from Calculus II: Sum of Infinite Series
7:47
Application to Geometric Distribution
10:10
Example I: Drawing Cards from a Deck (With Replacement) Until You Get an Ace
13:02
Example I: Question & Solution
13:03
Example II: Mean & Standard Deviation of Winning Pin the Tail on the Donkey
16:32
Example II: Mean
16:33
Example II: Standard Deviation
18:37
Example III: Rolling a Die
22:09
Example III: Setting Up
22:10
Example III: Part A
24:18
Example III: Part B
26:01
Example III: Part C
27:38
Example III: Summary
32:02
Example IV: Job Interview
35:16
Example IV: Setting Up
35:15
Example IV: Part A
37:26
Example IV: Part B
38:33
Example IV: Summary
39:37
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
41:13
Example V: Setting Up
42:50
Example V: Mean
46:05
Example V: Variance
47:37
Example V: Standard Deviation
48:22
Example V: Summary
49:36
Negative Binomial Distribution

51m 39s

Intro
0:00
Negative Binomial Distribution
0:11
Negative Binomial Distribution: Definition
0:12
Prototypical Example: Flipping a Coin Until We Get r Successes
0:46
Negative Binomial Distribution vs. Binomial Distribution
1:04
Negative Binomial Distribution vs. Geometric Distribution
1:33
Formula for Negative Binomial Distribution
3:39
Fixed Parameters
3:40
Random Variable
4:57
Formula for Negative Binomial Distribution
5:18
Key Properties of Negative Binomial
7:44
Mean
7:47
Variance
8:03
Standard Deviation
8:09
Example I: Drawing Cards from a Deck (With Replacement) Until You Get Four Aces
8:32
Example I: Question & Solution
8:33
Example II: Chinchilla Grooming
12:37
Example II: Mean
12:38
Example II: Variance
15:09
Example II: Standard Deviation
15:51
Example II: Summary
17:10
Example III: Rolling a Die Until You Get Four Sixes
18:27
Example III: Setting Up
19:38
Example III: Mean
19:38
Example III: Variance
20:31
Example III: Standard Deviation
21:21
Example IV: Job Applicants
24:00
Example IV: Setting Up
24:01
Example IV: Part A
26:16
Example IV: Part B
29:53
Example V: Mean & Standard Deviation of Time to Conduct All the Interviews
40:10
Example V: Setting Up
40:11
Example V: Mean
45:24
Example V: Variance
46:22
Example V: Standard Deviation
47:01
Example V: Summary
48:16
Hypergeometric Distribution

36m 27s

Intro
0:00
Hypergeometric Distribution
0:11
Hypergeometric Distribution: Definition
0:12
Random Variable
1:38
Formula for the Hypergeometric Distribution
1:50
Fixed Parameters
1:51
Formula for the Hypergeometric Distribution
2:53
Key Properties of Hypergeometric
6:14
Mean
6:15
Variance
6:42
Standard Deviation
7:16
Example I: Students Committee
7:30
Example II: Expected Number of Women on the Committee in Example I
11:08
Example III: Pairs of Shoes
13:49
Example IV: What is the Expected Number of Left Shoes in Example III?
20:46
Example V: Using Indicator Variables & Linearity of Expectation
25:40
Poisson Distribution

52m 19s

Intro
0:00
Poisson Distribution
0:18
Poisson Distribution: Definition
0:19
Formula for the Poisson Distribution
2:16
Fixed Parameter
2:17
Formula for the Poisson Distribution
2:59
Key Properties of the Poisson Distribution
5:30
Mean
5:34
Variance
6:07
Standard Deviation
6:27
Example I: Forest Fires
6:41
Example II: Call Center, Part A
15:56
Example II: Call Center, Part B
20:50
Example III: Confirming that the Mean of the Poisson Distribution is λ
26:53
Example IV: Find E (Y²) for the Poisson Distribution
35:24
Example V: Earthquakes, Part A
37:57
Example V: Earthquakes, Part B
44:02
Section 4: Continuous Distributions
Density & Cumulative Distribution Functions

57m 17s

Intro
0:00
Density Functions
0:43
Density Functions
0:44
Density Function to Calculate Probabilities
2:41
Cumulative Distribution Functions
4:28
Cumulative Distribution Functions
4:29
Using F to Calculate Probabilities
5:58
Properties of the CDF (Density & Cumulative Distribution Functions)
7:27
F(-∞) = 0
7:34
F(∞) = 1
8:30
F is Increasing
9:14
F'(y) = f(y)
9:21
Example I: Density & Cumulative Distribution Functions, Part A
9:43
Example I: Density & Cumulative Distribution Functions, Part B
14:16
Example II: Density & Cumulative Distribution Functions, Part A
21:41
Example II: Density & Cumulative Distribution Functions, Part B
26:16
Example III: Density & Cumulative Distribution Functions, Part A
32:17
Example III: Density & Cumulative Distribution Functions, Part B
37:08
Example IV: Density & Cumulative Distribution Functions
43:34
Example V: Density & Cumulative Distribution Functions, Part A
51:53
Example V: Density & Cumulative Distribution Functions, Part B
54:19
Mean & Variance for Continuous Distributions

36m 18s

Intro
0:00
Mean
0:32
Mean for a Continuous Random Variable
0:33
Expectation is Linear
2:07
Variance
2:55
Variance for Continuous Random Variable
2:56
Easier to Calculate Via the Mean
3:26
Standard Deviation
5:03
Standard Deviation
5:04
Example I: Mean & Variance for Continuous Distributions
5:43
Example II: Mean & Variance for Continuous Distributions
10:09
Example III: Mean & Variance for Continuous Distributions
16:05
Example IV: Mean & Variance for Continuous Distributions
26:40
Example V: Mean & Variance for Continuous Distributions
30:12
Uniform Distribution

32m 49s

Intro
0:00
Uniform Distribution
0:15
Uniform Distribution
0:16
Each Part of the Region is Equally Probable
1:39
Key Properties of the Uniform Distribution
2:45
Mean
2:46
Variance
3:27
Standard Deviation
3:48
Example I: Newspaper Delivery
5:25
Example II: Picking a Real Number from a Uniform Distribution
8:21
Example III: Dinner Date
11:02
Example IV: Proving that a Variable is Uniformly Distributed
18:50
Example V: Ice Cream Serving
27:22
Normal (Gaussian) Distribution

1h 3m 54s

Intro
0:00
Normal (Gaussian) Distribution
0:35
Normal (Gaussian) Distribution & The Bell Curve
0:36
Fixed Parameters
0:55
Formula for the Normal Distribution
1:32
Formula for the Normal Distribution
1:33
Calculating on the Normal Distribution can be Tricky
3:32
Standard Normal Distribution
5:12
Standard Normal Distribution
5:13
Graphing the Standard Normal Distribution
6:13
Standard Normal Distribution, Cont.
8:30
Standard Normal Distribution Chart
8:31
Nonstandard Normal Distribution
14:44
Nonstandard Normal Variable & Associated Standard Normal
14:45
Finding Probabilities for Z
15:39
Example I: Chance that Standard Normal Variable Will Land Between 1 and 2?
16:46
Example I: Setting Up the Equation & Graph
16:47
Example I: Solving for z Using the Standard Normal Chart
19:05
Example II: What Proportion of the Data Lies within Two Standard Deviations of the Mean?
20:41
Example II: Setting Up the Equation & Graph
20:42
Example II: Solving for z Using the Standard Normal Chart
24:38
Example III: Scores on an Exam
27:34
Example III: Setting Up the Equation & Graph, Part A
27:35
Example III: Setting Up the Equation & Graph, Part B
33:48
Example III: Solving for z Using the Standard Normal Chart, Part A
38:23
Example III: Solving for z Using the Standard Normal Chart, Part B
40:49
Example IV: Temperatures
42:54
Example IV: Setting Up the Equation & Graph
42:55
Example IV: Solving for z Using the Standard Normal Chart
47:03
Example V: Scores on an Exam
48:41
Example V: Setting Up the Equation & Graph, Part A
48:42
Example V: Setting Up the Equation & Graph, Part B
53:20
Example V: Solving for z Using the Standard Normal Chart, Part A
57:45
Example V: Solving for z Using the Standard Normal Chart, Part B
59:17
Gamma Distribution (with Exponential & Chi-square)

1h 8m 27s

Intro
0:00
Gamma Function
0:49
The Gamma Function
0:50
Properties of the Gamma Function
2:07
Formula for the Gamma Distribution
3:50
Fixed Parameters
3:51
Density Function for Gamma Distribution
4:07
Key Properties of the Gamma Distribution
7:13
Mean
7:14
Variance
7:25
Standard Deviation
7:30
Exponential Distribution
8:03
Definition of Exponential Distribution
8:04
Density
11:23
Mean
13:26
Variance
13:48
Standard Deviation
13:55
Chi-square Distribution
14:34
Chi-square Distribution: Overview
14:35
Chi-square Distribution: Mean
16:27
Chi-square Distribution: Variance
16:37
Chi-square Distribution: Standard Deviation
16:55
Example I: Graphing Gamma Distribution
17:30
Example I: Graphing Gamma Distribution
17:31
Example I: Describe the Effects of Changing α and β on the Shape of the Graph
23:33
Example II: Exponential Distribution
27:11
Example II: Using the Exponential Distribution
27:12
Example II: Summary
35:34
Example III: Earthquake
37:05
Example III: Estimate Using Markov's Inequality
37:06
Example III: Estimate Using Tchebysheff's Inequality
40:13
Example III: Summary
44:13
Example IV: Finding Exact Probability of Earthquakes
46:45
Example IV: Finding Exact Probability of Earthquakes
46:46
Example IV: Summary
51:44
Example V: Prove and Interpret Why the Exponential Distribution is Called 'Memoryless'
52:51
Example V: Prove
52:52
Example V: Interpretation
57:44
Example V: Summary
1:03:54
Beta Distribution

52m 45s

Intro
0:00
Beta Function
0:29
Fixed Parameters
0:30
Defining the Beta Function
1:19
Relationship between the Gamma & Beta Functions
2:02
Beta Distribution
3:31
Density Function for the Beta Distribution
3:32
Key Properties of the Beta Distribution
6:56
Mean
6:57
Variance
7:16
Standard Deviation
7:37
Example I: Calculate B(3,4)
8:10
Example II: Graphing the Density Functions for the Beta Distribution
12:25
Example III: Show that the Uniform Distribution is a Special Case of the Beta Distribution
24:57
Example IV: Show that this Triangular Distribution is a Special Case of the Beta Distribution
31:20
Example V: Morning Commute
37:39
Example V: Identify the Density Function
38:45
Example V: Morning Commute, Part A
42:22
Example V: Morning Commute, Part B
44:19
Example V: Summary
49:13
Moment-Generating Functions

51m 58s

Intro
0:00
Moments
0:30
Definition of Moments
0:31
Moment-Generating Functions (MGFs)
3:53
Moment-Generating Functions
3:54
Using the MGF to Calculate the Moments
5:21
Moment-Generating Functions for the Discrete Distributions
8:22
Moment-Generating Functions for Binomial Distribution
8:36
Moment-Generating Functions for Geometric Distribution
9:06
Moment-Generating Functions for Negative Binomial Distribution
9:28
Moment-Generating Functions for Hypergeometric Distribution
9:43
Moment-Generating Functions for Poisson Distribution
9:57
Moment-Generating Functions for the Continuous Distributions
11:34
Moment-Generating Functions for the Uniform Distributions
11:43
Moment-Generating Functions for the Normal Distributions
12:24
Moment-Generating Functions for the Gamma Distributions
12:36
Moment-Generating Functions for the Exponential Distributions
12:44
Moment-Generating Functions for the Chi-square Distributions
13:11
Moment-Generating Functions for the Beta Distributions
13:48
Useful Formulas with Moment-Generating Functions
15:02
Useful Formulas with Moment-Generating Functions 1
15:03
Useful Formulas with Moment-Generating Functions 2
16:21
Example I: Moment-Generating Function for the Binomial Distribution
17:33
Example II: Use the MGF for the Binomial Distribution to Find the Mean of the Distribution
24:40
Example III: Find the Moment Generating Function for the Poisson Distribution
29:28
Example IV: Use the MGF for Poisson Distribution to Find the Mean and Variance of the Distribution
36:27
Example V: Find the Moment-generating Function for the Uniform Distribution
44:47
Section 5: Multivariate Distributions
Bivariate Density & Distribution Functions

50m 52s

Intro
0:00
Bivariate Density Functions
0:21
Two Variables
0:23
Bivariate Density Function
0:52
Properties of the Density Function
1:57
Properties of the Density Function 1
1:59
Properties of the Density Function 2
2:20
We Can Calculate Probabilities
2:53
If You Have a Discrete Distribution
4:36
Bivariate Distribution Functions
5:25
Bivariate Distribution Functions
5:26
Properties of the Bivariate Distribution Functions 1
7:19
Properties of the Bivariate Distribution Functions 2
7:36
Example I: Bivariate Density & Distribution Functions
8:08
Example II: Bivariate Density & Distribution Functions
14:40
Example III: Bivariate Density & Distribution Functions
24:33
Example IV: Bivariate Density & Distribution Functions
32:04
Example V: Bivariate Density & Distribution Functions
40:26
Marginal Probability

42m 38s

Intro
0:00
Discrete Case
0:48
Marginal Probability Functions
0:49
Continuous Case
3:07
Marginal Density Functions
3:08
Example I: Compute the Marginal Probability Function
5:58
Example II: Compute the Marginal Probability Function
14:07
Example III: Marginal Density Function
24:01
Example IV: Marginal Density Function
30:47
Example V: Marginal Density Function
36:05
Conditional Probability & Conditional Expectation

1h 2m 24s

Intro
0:00
Review of Marginal Probability
0:46
Recall the Marginal Probability Functions & Marginal Density Functions
0:47
Conditional Probability, Discrete Case
3:14
Conditional Probability, Discrete Case
3:15
Conditional Probability, Continuous Case
4:15
Conditional Density of Y₁ given that Y₂ = y₂
4:16
Interpret This as a Density on Y₁ & Calculate Conditional Probability
5:03
Conditional Expectation
6:44
Conditional Expectation: Continuous
6:45
Conditional Expectation: Discrete
8:03
Example I: Conditional Probability
8:29
Example II: Conditional Probability
23:59
Example III: Conditional Probability
34:28
Example IV: Conditional Expectation
43:16
Example V: Conditional Expectation
48:28
Independent Random Variables

51m 39s

Intro
0:00
Intuition
0:55
Experiment with Two Random Variables
0:56
Intuition Formula
2:17
Definition and Formulas
4:43
Definition
4:44
Short Version: Discrete
5:10
Short Version: Continuous
5:48
Theorem
9:33
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 1
9:34
For Continuous Random Variables, Y₁ & Y₂ are Independent If & Only If: Condition 2
11:22
Example I: Use the Definition to Determine if Y₁ and Y₂ are Independent
12:49
Example II: Use the Definition to Determine if Y₁ and Y₂ are Independent
21:33
Example III: Are Y₁ and Y₂ Independent?
27:01
Example IV: Are Y₁ and Y₂ Independent?
34:51
Example V: Are Y₁ and Y₂ Independent?
43:44
Expected Value of a Function of Random Variables

37m 7s

Intro
0:00
Review of Single Variable Case
0:29
Expected Value of a Single Variable
0:30
Expected Value of a Function g(Y)
1:12
Bivariate Case
2:11
Expected Value of a Function g(Y₁, Y₂)
2:12
Linearity of Expectation
3:24
Linearity of Expectation 1
3:25
Linearity of Expectation 2
3:38
Linearity of Expectation 3: Additivity
4:03
Example I: Calculate E (Y₁ + Y₂)
4:39
Example II: Calculate E (Y₁Y₂)
14:47
Example III: Calculate E (U₁) and E(U₂)
19:33
Example IV: Calculate E (Y₁) and E(Y₂)
22:50
Example V: Calculate E (2Y₁ + 3Y₂)
33:05
Covariance, Correlation & Linear Functions

59m 50s

Intro
0:00
Definition and Formulas for Covariance
0:38
Definition of Covariance
0:39
Formulas to Calculate Covariance
1:36
Intuition for Covariance
3:54
Covariance is a Measure of Dependence
3:55
Dependence Doesn't Necessarily Mean that the Variables Do the Same Thing
4:12
If Variables Move Together
4:47
If Variables Move Against Each Other
5:04
Both Cases Show Dependence!
5:30
Independence Theorem
8:10
Independence Theorem
8:11
The Converse is Not True
8:32
Correlation Coefficient
9:33
Correlation Coefficient
9:34
Linear Functions of Random Variables
11:57
Linear Functions of Random Variables: Expected Value
11:58
Linear Functions of Random Variables: Variance
12:58
Linear Functions of Random Variables, Cont.
14:30
Linear Functions of Random Variables: Covariance
14:35
Example I: Calculate E (Y₁), E (Y₂), and E (Y₁Y₂)
15:31
Example II: Are Y₁ and Y₂ Independent?
29:16
Example III: Calculate V (U₁) and V (U₂)
36:14
Example IV: Calculate the Covariance Correlation Coefficient
42:12
Example V: Find the Mean and Variance of the Average
52:19
Section 6: Distributions of Functions of Random Variables
Distribution Functions

1h 7m 35s

Intro
0:00
Premise
0:44
Premise
0:45
Goal
1:38
Goal Number 1: Find the Full Distribution Function
1:39
Goal Number 2: Find the Density Function
1:55
Goal Number 3: Calculate Probabilities
2:17
Three Methods
3:05
Method 1: Distribution Functions
3:06
Method 2: Transformations
3:38
Method 3: Moment-generating Functions
3:47
Distribution Functions
4:03
Distribution Functions
4:04
Example I: Find the Density Function
6:41
Step 1: Find the Distribution Function
6:42
Step 2: Find the Density Function
10:20
Summary
11:51
Example II: Find the Density Function
14:36
Step 1: Find the Distribution Function
14:37
Step 2: Find the Density Function
18:19
Summary
19:22
Example III: Find the Cumulative Distribution & Density Functions
20:39
Step 1: Find the Cumulative Distribution
20:40
Step 2: Find the Density Function
28:58
Summary
30:20
Example IV: Find the Density Function
33:01
Step 1: Setting Up the Equation & Graph
33:02
Step 2: If u ≤ 1
38:32
Step 3: If u ≥ 1
41:02
Step 4: Find the Distribution Function
42:40
Step 5: Find the Density Function
43:11
Summary
45:03
Example V: Find the Density Function
48:32
Step 1: Exponential
48:33
Step 2: Independence
50:48
Step 2: Find the Distribution Function
51:47
Step 3: Find the Density Function
1:00:17
Summary
1:02:05
Transformations

1h 16s

Intro
0:00
Premise
0:32
Premise
0:33
Goal
1:37
Goal Number 1: Find the Full Distribution Function
1:38
Goal Number 2: Find the Density Function
1:49
Goal Number 3: Calculate Probabilities
2:04
Three Methods
2:34
Method 1: Distribution Functions
2:35
Method 2: Transformations
2:57
Method 3: Moment-generating Functions
3:05
Requirements for Transformation Method
3:22
The Transformation Method Only Works for Single-variable Situations
3:23
Must be a Strictly Monotonic Function
3:50
Example: Strictly Monotonic Function
4:50
If the Function is Monotonic, Then It is Invertible
5:30
Formula for Transformations
7:09
Formula for Transformations
7:11
Example I: Determine whether the Function is Monotonic, and if so, Find Its Inverse
8:26
Example II: Find the Density Function
12:07
Example III: Determine whether the Function is Monotonic, and if so, Find Its Inverse
17:12
Example IV: Find the Density Function for the Magnitude of the Next Earthquake
21:30
Example V: Find the Expected Magnitude of the Next Earthquake
33:20
Example VI: Find the Density Function, Including the Range of Possible Values for u
47:42
Moment-Generating Functions

1h 18m 52s

Intro
0:00
Premise
0:30
Premise
0:31
Goal
1:40
Goal Number 1: Find the Full Distribution Function
1:41
Goal Number 2: Find the Density Function
1:51
Goal Number 3: Calculate Probabilities
2:01
Three Methods
2:39
Method 1: Distribution Functions
2:40
Method 2: Transformations
2:50
Method 3: Moment-Generating Functions
2:55
Review of Moment-Generating Functions
3:04
Recall: The Moment-Generating Function for a Random Variable Y
3:05
The Moment-Generating Function is a Function of t (Not y)
3:45
Moment-Generating Functions for the Discrete Distributions
4:31
Binomial
4:50
Geometric
5:12
Negative Binomial
5:24
Hypergeometric
5:33
Poisson
5:42
Moment-Generating Functions for the Continuous Distributions
6:08
Uniform
6:09
Normal
6:17
Gamma
6:29
Exponential
6:34
Chi-square
7:05
Beta
7:48
Useful Formulas with the Moment-Generating Functions
8:48
Useful Formula 1
8:49
Useful Formula 2
9:51
How to Use Moment-Generating Functions
10:41
How to Use Moment-Generating Functions
10:42
Example I: Find the Density Function
12:22
Example II: Find the Density Function
30:58
Example III: Find the Probability Function
43:29
Example IV: Find the Probability Function
51:43
Example V: Find the Distribution
1:00:14
Example VI: Find the Density Function
1:12:10
Order Statistics

1h 4m 56s

Intro
0:00
Premise
0:11
Example Question: How Tall Will the Tallest Student in My Next Semester's Probability Class Be?
0:12
Setting
0:56
Definition 1
1:49
Definition 2
2:01
Question: What are the Distributions & Densities?
4:08
Formulas
4:47
Distribution of Max
5:11
Density of Max
6:00
Distribution of Min
7:08
Density of Min
7:18
Example I: Distribution & Density Functions
8:29
Example I: Distribution
8:30
Example I: Density
11:07
Example I: Summary
12:33
Example II: Distribution & Density Functions
14:25
Example II: Distribution
14:26
Example II: Density
17:21
Example II: Summary
19:00
Example III: Mean & Variance
20:32
Example III: Mean
20:33
Example III: Variance
25:48
Example III: Summary
30:57
Example IV: Distribution & Density Functions
35:43
Example IV: Distribution
35:44
Example IV: Density
43:03
Example IV: Summary
46:11
Example V: Find the Expected Time Until the Team's First Injury
51:14
Example V: Solution
51:15
Example V: Summary
1:01:11
Sampling from a Normal Distribution

1h 7s

Intro
0:00
Setting
0:36
Setting
0:37
Assumptions and Notation
2:18
Assumption Forever
2:19
Assumption for this Lecture Only
3:21
Notation
3:49
The Sample Mean
4:15
Statistic We'll Study the Sample Mean
4:16
Theorem
5:40
Standard Normal Distribution
7:03
Standard Normal Distribution
7:04
Converting to Standard Normal
10:11
Recall
10:12
Corollary to Theorem
10:41
Example I: Heights of Students
13:18
Example II: What Happens to This Probability as n → ∞
22:36
Example III: Units at a University
32:24
Example IV: Probability of Sample Mean
40:53
Example V: How Many Samples Should We Take?
48:34
The Central Limit Theorem

1h 9m 55s

Intro
0:00
Setting
0:52
Setting
0:53
Assumptions and Notation
2:53
Our Samples are Independent (Independent Identically Distributed)
2:54
No Longer Assume that the Population is Normally Distributed
3:30
The Central Limit Theorem
4:36
The Central Limit Theorem Overview
4:38
The Central Limit Theorem in Practice
6:24
Standard Normal Distribution
8:09
Standard Normal Distribution
8:13
Converting to Standard Normal
10:13
Recall: If Y is Normal, Then …
10:14
Corollary to Theorem
11:09
Example I: Probability of Finishing Your Homework
12:56
Example I: Solution
12:57
Example I: Summary
18:20
Example I: Confirming with the Standard Normal Distribution Chart
20:18
Example II: Probability of Selling Muffins
21:26
Example II: Solution
21:27
Example II: Summary
29:09
Example II: Confirming with the Standard Normal Distribution Chart
31:09
Example III: Probability that a Soda Dispenser Gives the Correct Amount of Soda
32:41
Example III: Solution
32:42
Example III: Summary
38:03
Example III: Confirming with the Standard Normal Distribution Chart
40:58
Example IV: How Many Samples Should She Take?
42:06
Example IV: Solution
42:07
Example IV: Summary
49:18
Example IV: Confirming with the Standard Normal Distribution Chart
51:57
Example V: Restaurant Revenue
54:41
Example V: Solution
54:42
Example V: Summary
1:04:21
Example V: Confirming with the Standard Normal Distribution Chart
1:06:48
Lecture Comments (4)

1 answer

Last reply by: Dr. William Murray
Mon Oct 3, 2016 2:30 PM

Post by Thuy Nguyen on September 30, 2016

Hi, in my class I learned that Chebyshev's Inequality is:

P(|T-mean| >= a) <= variance / a^2.

I believe a = k * standard deviation.

Because variance / (k * standard deviation)^2 = k^2.

Is that right?

Also, does it matter if we write P(|T-mean| > a) vs. P(|T-mean| >= a)?

1 answer

Last reply by: Dr. William Murray
Mon Oct 3, 2016 2:30 PM

Post by Thuy Nguyen on September 30, 2016

Hello, for the college credit example, P(credit > 95) <= 1/9.  Isn't 1/9 the combination of both tail ends?  Meaning, P(credit < 5) + P(credit >95)?

If I were to sketch the distribution, then the probability of being 3 standard deviation away from the mean on BOTH sides is 1/9.  

So  why didn't we have to split the 1/9 for the left and right tail ends?

Thanks.

Tchebysheff's Inequality

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Tchebysheff's Inequality (Also Known as Chebyshev's Inequality) 0:52
    • Tchebysheff's Inequality: Definition
    • Tchebysheff's Inequality: Equation
    • Tchebysheff's Inequality: Intuition
  • Tchebysheff's Inequality in Reverse 4:09
    • Tchebysheff's Inequality in Reverse
    • Intuition
  • Example I: Money 5:55
  • Example II: College Units 13:20
  • Example III: Using Tchebysheff's Inequality to Estimate Proportion 16:40
  • Example IV: Probability of an Earthquake 25:21
  • Example V: Using Tchebysheff's Inequality to Estimate Proportion 32:57

Transcription: Tchebysheff's Inequality

Hi and welcome back to the probability lectures here on www.educator.com, my name is Will Murray.0000

Today, we are going to talk about Chebyshev’s inequality.0005

This is the second lecture on using inequalities to estimate probability.0009

The first one we had was Markov’s inequality.0014

If you are looking for Markov’s inequality, there is another video covering that one.0017

It is the one video before this one.0021

This one is on Chebyshev’s inequality.0023

You get some similar kinds of answers with Chebyshev’s inequality.0026

The difference is that we use a little more information because now,0031

we are going to use the standard deviation of the random variable, as well as the expected value or mean.0035

In return for using a little more information and doing a little more calculation,0044

we get stronger results using Chebyshev’s inequality.0048

Let us dive into that.0052

Chebyshev’s inequality is a quick way of estimating probabilities.0055

It never tells you the exact probability, by the way.0058

That gives you an upper and lower bound for the probability but it never tells you it is exactly equal to something.0061

It is based on the mean and standard deviation of our random variable.0068

We are going to use the Greek letter μ for the mean and we are going to use the greek letter σ, for the standard deviation.0073

Let me go ahead and tell you what Chebyshev’s inequality says.0079

Suppose Y is a random variable and K is a constant.0084

K is often a whole number like 2, 3, or 4.0088

Chebyshev’s inequality says the probability that the absolute value of Y - μ is greater than K σ,0093

is less than or equal to 1/K².0101
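
To restate that in symbols: if Y is a random variable with mean μ and standard deviation σ, and K is a positive constant, then Chebyshev’s inequality says

    P\left( |Y - \mu| > K\sigma \right) \;\le\; \frac{1}{K^2}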

That is quite a mouthful; even the name Chebyshev itself is a little tough to deal with.0104

And the inequality itself is a little bit complicated.0111

I want to think about the intuition of that first.0113

What we are really saying there, first of all, is that σ is the standard deviation.0117

K is a measure of how many standard deviations you are willing to go away from the mean.0122

That is what they are really measuring right here.0130

The absolute value of Y - μ: μ is the mean.0132

The absolute value of Y - μ is a distance; just as the absolute value of A - B is the distance between A and B,0137

this is the distance from Y to μ.0145

What this is really saying is, how far are you willing to go from your expected value, from your mean, μ?0156

The question is, what is the probability of you deviating more than K standard deviations from your mean?0165

Maybe, I can try to graph that.0175

If we have a certain amount of data here and it is grouped like that, that is kind of a common distribution of data.0178

You have a mean right there in the middle.0184

What is the chance, the probability, of being more than K standard deviations from your mean?0187

In other words, how much area could there be out there in the tail of the distribution?0195

What Chebyshev’s inequality says, is that there is not that much.0204

It is unlikely that the variable will be far from its mean; 1/K² is your bound.0208

That is usually a fairly small number, for example if K is 3 then 1/K² would be 1/9.0216

The probability would be less than 1/9 that you would be more than 3 standard deviations from your mean.0224

That is what Chebyshev’s inequality is saying.0230

It is that, your probability of being many standard deviations from your mean is quite low.0233

The more standard deviations you go, the smaller the probability gets.0240
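
Here is a minimal numerical sketch of that intuition in Python (assuming NumPy is available; the exponential distribution is an arbitrary test case chosen only for illustration), comparing an observed tail probability to the 1/K² bound:

    import numpy as np

    # Sketch: check that P(|Y - mu| > K*sigma) <= 1/K^2 on simulated data.
    rng = np.random.default_rng(0)
    y = rng.exponential(scale=1.0, size=1_000_000)

    mu, sigma = y.mean(), y.std()
    for k in (2, 3, 4):
        tail = np.mean(np.abs(y - mu) > k * sigma)  # empirical tail probability
        print(f"K={k}: empirical tail {tail:.4f} <= Chebyshev bound {1/k**2:.4f}")

For each K, the observed tail probability should come out under 1/K², which is exactly what the inequality guarantees.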

There is also a reverse version of Chebyshev’s inequality.0245

You just turn around the inequalities there.0248

This is the same inequality that I just showed you.0253

If you turn that around, if you change this greater than to a less than or equal to, that is the change here.0256

We are asking the opposite question, what is the chance that you were close to your mean?0264

What is the chance that you will be within K standard deviations of your mean?0270

Since the probability of being far away from your mean is very low,0275

that means the probability of being close to your mean is very high.0279

The answer we get is that the probability of being close to your mean is greater than or equal to 1 - 1/K²; it is the exact opposite.0284

It was 1/K² before, now it is 1 - 1/K², and where we had less than or equal to before,0294

meaning the probability of being very far away from the mean is quite low,0305

now we have greater than or equal to, meaning the probability of being very close to the mean is very high.0310

What we are saying here is that it is likely that the variable will be close to the mean.0314

We say close, we mean within a few standard deviations.0322

Remember, σ is the standard deviation and K is the number of standard deviations away from your mean that you are willing to accept.0326

Your chance of being within a few standard deviations of your mean is quite high;0339

that is what Chebyshev’s inequality is saying, if you turn it around.0348
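
In symbols, the reverse version reads

    P\left( |Y - \mu| \le K\sigma \right) \;\ge\; 1 - \frac{1}{K^2}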

Let us check that out in the context of some examples and see how it plays out.0352

This first example is quite similar to an example we had back when we were studying Markov's inequality.0357

You will recognize the numbers.0365

The difference is that we are now going to incorporate the standard deviation, before we just incorporated the mean or the expected value.0367

Now, we incorporate the standard deviation and we are going to use Chebyshev’s inequality,0375

instead of Markov’s inequality, and we get much stronger results this time.0379

In this case, we have done a survey of students on a campus, and we discovered that on average,0384

they are carrying about $20.00 in cash with them.0391

We also know that their standard deviation is $10.00.0393

If we meet a student at random, we want to estimate the chance that that student is carrying more than $100.0398

We also want to estimate the chance that she is carrying less than $80.00.0405

Let me start out by writing down Chebyshev’s inequality, just reminding you what the formula was.0410

The probability that Y - μ is greater than K σ: this is Chebyshev’s inequality, and that probability is less than 1/K².0417

Let us fill in what we can use here.0429

We know that the μ, that is the average value of the variable or the mean or the expected value, which we are given here is 20.0432

And we are also given the standard deviation is 10, that is the σ there.0442

We want to find the chance that a student is carrying more than $100.0449

Let us figure out how many standard deviations away from the mean, we would have to be.0456

The mean is 20; if I want to have more than $100, then that is $80 more than the mean, which is 8 standard deviations away from the mean.0460

We would have to be 8 standard deviations away from the mean.0474

The probability that Y -20 is greater than 8 × 10.0478

That is, you have to be $80.00 away from the mean.0492

According to this, that 8 is the K there.0496

That is less than or equal to 1/K², so 1/8², which is 1/64.0499

What that is telling us is that, if we meet a student on campus, we say what is the likelihood that that student has more than $100?0509

It is very unlikely, the probability is less than 1/64.0518

If you interview all of your students, then on average, for every 64 students you interview,0525

at most 1 of them will actually be carrying more than $100, according to these numbers here.0539
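
Written out, the computation for this first part is

    \mu = 20, \quad \sigma = 10, \quad 100 - 20 = 80 = 8\sigma \;\Rightarrow\; K = 8

    P(Y > 100) \;\le\; P\left( |Y - 20| > 8 \cdot 10 \right) \;\le\; \frac{1}{8^2} \;=\; \frac{1}{64}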

There is a second part to this problem, we also want to estimate the chance that she is carrying less than $80.00.0545

That is going to use the reverse incarnation of Chebyshev’s inequality.0553

Let me go ahead and write that down.0558

The probability of Y - μ is less than or equal to K σ.0561

According to Chebyshev’s inequality, that probability is greater than or equal to 1 - 1/K²; that is the reverse version of Chebyshev’s inequality.0567

In this case, our μ is still 20, our σ is still 10.0578

In this case, we are interested in $80.00.0584

80 - 20 is 60, so we want to be within $60.00 of the mean.0586

60 is 6 standard deviations, 6 × 10.0596

We will use K = 6 there in our own Chebyshev’s inequality.0601

We get the probability that Y -20 is less than or equal to 6 × 10, is greater than or equal to,0606

that is what Chebyshev’s inequality tells us here, greater than or equal to 1 -1/ the K was 6, 6² there.0620

That is 1 - 1/36, and that simplifies down to saying the probability is at least 35/36, because that is 1 - 1/36.0629

If we meet a student and we want to say, what is the chance that she is carrying less than $80.00 in cash?0646

The chance is at least 35/36.0653
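
Written out, the computation for this second part is

    80 - 20 = 60 = 6\sigma \;\Rightarrow\; K = 6

    P(Y \le 80) \;\ge\; P\left( |Y - 20| \le 6 \cdot 10 \right) \;\ge\; 1 - \frac{1}{6^2} \;=\; \frac{35}{36}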

Remember, just like with Markov’s inequality, you never want to just give a bare number as your answer0657

when you are asked a question using Chebyshev’s inequality,0662

because it never gives you a specific numerical answer; it always gives you an inequality.0665

It will never tell you what the probability is, it will just tell you that the probability is less than this or greater than that.0671

In all of these cases, it is very important to include the inequality signs in our answers.0678

All we are doing is we are giving upper and lower bounds for the probability.0684

We are not saying that we know what the probability is.0687

Let me show you where I got each of those steps there.0692

We start out with the basic version of Chebyshev’s inequality.0695

The probability that Y – μ is greater than K σ, is less than 1/K².0698

In this case, our standard deviation is 10, that is the σ, the mean is 20.0704

We want to have more than $100, that means we really want to be 8 standard deviations away from the mean.0712

A hundred is $80.00 away from 20.0719

That is 8 standard deviations away from the mean.0722

That is where we got the value of K = 8 there.0725

Chebyshev’s tells us that the probability is less than 1/K², that is where we got 1/64 for our probability.0729

Actually, our probability is less than or equal to 1/64 there.0738

For the second part of the problem, we want to estimate the chance that she is carrying less than $80.00.0744

I use the less than version of Chebyshev’s inequality and it tells me0749

that the probability is greater than or equal to 1 - 1/K².0755

You are very likely to be within many standard deviations of your mean.0759

How many standard deviations are we talking about?0764

The mean is 20 and we want to have less than $80.00.0766

80 -20 is 60, that is 6 standard deviations.0770

I put that value in for K, and with that K we get 1 - 1/6².0774

That simplifies down to 35/36, that tells me that the probability that a student will have less than $80.00 is at least 35/36.0782

It does not say equal to 35/36 but at least 35/36.0795

In example 2 here, we have students on a college campus who have completed an average of 50 units, and their standard deviation is 15 units.0803

We meet a randomly selected student, we want to estimate the chance that this person has completed more than 95 units.0814

Let me write down Chebyshev’s inequality for that.0823

The probability of Y - μ is greater than K σ is less than or equal to 1 /K², that is Chebyshev’s basic inequality.0826

We are going to use the basic version because we are trying to estimate the chance that something is more than 95.0841

Let us try to figure out what the relevant numbers here are.0850

The average of 50 units, that is the μ, that is going to be 50.0853

The σ is the standard deviation, that is 15 units there.0858

We have to figure out what K is here.0868

We have to figure out how many standard deviations away from the mean are we expected to be here.0870

In this case, we want to have more than 95 units.0876

95 -50 is 45 which is 3 standard deviations, three × 15.0880

That means that our K value is going to be three here.0892

The probability that Y - 50 is greater than 3 × 15 is, according to Chebyshev’s inequality, less than 1/3², which is 1/9.0896

What we conclude here is that the probability is less than 1/9.0916

That means fewer than 1 in 9 students will have more than 95 units.0921

That is what we can conclude from that.0929

We cannot say it is equal to 1/9, we cannot just give 1/9 as an answer.0932

We have to say the probability is less than 1/9.0936

To recap how we derived that, we started with the basic version of Chebyshev’s inequality.0941

The probability that Y - μ is greater than K σ is less than 1/K².0947

And then I filled in what I knew here, the 50 is the average number of units, that is the μ there.0952

That is where that 50 comes from, 15 also came from the problem because that is the standard deviation.0961

I want to figure out what K should be.0966

In order to calculate that, I figured out how many standard deviations away from the mean are we interested in?0968

95 -50 is 45 and that is three standard deviations because the standard deviation is 15.0976

Our K there is three.0984

We pop that right there into Chebyshev’s inequality, we get 1/9.0986

1/9 is not our answer by itself, we have to say the probability is less than or equal to 1/9.0991

That is how we answer that.0997
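
In symbols, the whole computation for Example II is

    \mu = 50, \quad \sigma = 15, \quad 95 - 50 = 45 = 3\sigma \;\Rightarrow\; K = 3

    P(Y > 95) \;\le\; P\left( |Y - 50| > 3 \cdot 15 \right) \;\le\; \frac{1}{3^2} \;=\; \frac{1}{9}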

In example three here, we have the scores on a national exam, which are symmetrically distributed.1002

I see we have a small typo there in 'distributed'; let me fix that.1008

They are symmetrically distributed around a mean of 76 with a variance of 64.1015

The minimum passing score is 60.1021

We want to use Chebyshev’s inequality to estimate the proportion of students that will pass this exam.1024

Let me draw a little graph of what is going on here.1031

We are given that the scores are symmetrically distributed.1034

It will look something like this.1040

The mean is 76, let me fill that in.1043

The mean here is 76, that is our μ =76, right there in the middle of the data.1048

The minimum passing score is 60.1058

We want to try to find out how many students are going to be scoring above 60.1061

Let me put cutoff down there at 60.1066

There is 60, that is the minimum passing score.1070

We want to see how many students will be above that cutoff.1073

In order to do that, we are going to use Chebyshev’s inequality.1077

Let me go ahead and set up Chebyshev’s inequality.1080

It says the probability that Y - μ is greater than K σ.1083

That probability is less than or equal to 1 /K².1092

Let us figure out what we know here.1096

We know that μ is 76, we are given that.1099

σ is the standard deviation.1104

We were not given the standard deviation.1106

The variance is 64; the variance of Y is 64, but that is not the same as the standard deviation.1108

You have to be careful here.1116

The standard deviation is the square root of the variance.1117

That standard deviation would just be √ 64, that would be 8.1123

If I plug that in, σ is equal to 8.1128

We want to figure out what K is going to be, in order to use Chebyshev’s inequality.1131

That means I want to figure out how many standard deviations away from the mean do I need to be.1136

In this case, we are interested in a cutoff of 60, that is the value we are interested in.1142

60 - 76: the absolute value of that is 16, which is 2 × the standard deviation of 8.1150

That tells me that the K value that I’m interested in is 2.1160

The probability that Y - μ is greater than 2 × 8, according to Chebyshev’s inequality, is less than or equal to 1/2², that is just ¼.1165

Here is a twist that we have not yet seen before with Chebyshev’s inequality.1184

You got to follow me closely on this.1189

That is telling me the probability that Y - μ will be bigger than 16 in either direction.1192

What that is really doing is giving you a bound on the probability of being less than 60 or bigger than,1199

let us see, 76 + 16, which is 92.1209

That is really going 2 standard deviations down below the mean or 2 standard deviations up above the mean.1214

Remember, our σ was 8.1222

What we have really found is that the probability of being in either one of those regions, combined, is less than or equal to ¼.1224

We are also given that it was symmetrically distributed.1243

That means the probability of each one of those is less than ½ of 1/4 which is 1/8.1247

In each one of those, the probability of being in that region is less than ½ of ¼ which is 1/8.1255

The probability that Y is less than 60 is less than 1/8.1266

What we are interested in, is the probability that Y is bigger than 60 because those are the students that are passing the exam.1276

In that case, you turn it around and you do 1 -1/8.1290

Notice, I'm not worrying about all those scores bigger than 92.1298

I’m not worrying about the students that score bigger than 92, because all of those students passed anyway.1303

I’m just interested in the students below 60; those students fail the exam.1309

That is less than 1/8 of the population, which means more than 7/8 of the population will pass the exam,1315

that is, will score higher than 60 on the exam.1325

Let me write this in words.1329

At least 7/8 of the students will pass the exam.1332

That is what Chebyshev’s inequality lets us conclude.1359

It does not tell us that exactly 7/8 will pass.1363

It tells us that at least 7/8 of the students will pass.1365

It could be higher but Chebyshev’s inequality would not give us a specific value.1370

To recap what happened here, we start out with Chebyshev’s inequality.1376

The probability of Y - μ being greater than K σ is less than 1 /K².1381

In this case, our μ was 76, our σ was 8.1390

We got that from the variance of 64.1395

But remember, standard deviation is the square root of variance.1400

The standard deviation is √64, which is just 8.1403

I’m wondering, how many standard deviations away from the mean are we interested in going here?1408

We are interested in a cutoff of 60, because that is the passing score for the exam.1414

How far is that from 76?1419

That is 16 units away, that is 2 × the standard deviation of 8.1422

That means we use K = 2 in Chebyshev’s inequality.1427

The probability that you are more than 2 standard deviations away from the mean is less than 1/2² or ¼.1431

Here is the subtlety, that ¼ of the population could be 2 standard deviations below the mean1440

or that could be 2 standard deviations above the mean.1448

Since, we are given that the scores are symmetric, we know that half of them are below the mean and half of them are above the mean.1451

We divide that 1/4 into two parts and we find that the probability of being less than 60,1459

being 2 standard deviations on the low side is less than 1/8.1466

That really came from doing (1/4)/2; that is how we got that 1/8.1473

The probability of being bigger than 60, in other words,1478

the probability of scoring higher than 60 and passing the exam, is at least 1 - 1/8, which is 7/8.1482

In the end, we know that at least 7/8 of the students will pass.1490

It would not be accurate, if you are given this problem, to just give 7/8 as your answer.1495

My probability students sometimes do that.1501

I will give a complicated problem and they will just say 7/8.1503

That does not really tell us exactly what is going on.1505

You really have to say, it is at least 7/8, the proportion of students that will pass is greater than or equal to 7/8.1509

That is what Chebyshev’s inequality tells you.1518
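
In symbols, the whole computation for Example III is

    \mu = 76, \quad \sigma = \sqrt{64} = 8, \quad |60 - 76| = 16 = 2\sigma \;\Rightarrow\; K = 2

    P\left( |Y - 76| > 2 \cdot 8 \right) \;\le\; \frac{1}{2^2} \;=\; \frac{1}{4}

By symmetry, the lower tail gets half of that bound, so P(Y < 60) ≤ 1/8, and therefore the proportion passing satisfies P(Y ≥ 60) ≥ 1 - 1/8 = 7/8.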

In example 4, and this is one that is very similar to one we had back in the lecture on Markov’s inequality1523

but there is a little more information in it now.1529

We are going to get a different answer.1531

Be very careful that you do not get these two problems mixed up.1533

In example 4, we have seismic data telling us that California has a major earthquake on average, once every 10 years.1536

Up to there, it is the exact same as a problem we had back in the video on Markov’s inequality.1543

Now, here is the new stuff.1549

We have a standard deviation of 10 years, what can we say about the probability that1551

there will be an earthquake in the next 30 years?1555

This is all exactly the same as the problem we have for Markov’s inequality, except for this one key phrase here,1558

with a standard deviation of 10 years, that is the new information.1566

That is going to let us use Chebyshev’s inequality, instead of Markov’s inequality.1571

Because Chebyshev’s inequality depends on the standard deviation and the mean,1578

Markov’s inequality does not use the standard deviation.1583

Let me set up Chebyshev’s inequality.1586

The probability that Y - μ is greater than or equal to K σ.1589

I think when we originally stated Chebyshev’s inequality, it was a strict greater than in that place.1602

This says that the probability of being that far away from your mean is less than ¼.1610

We have to be careful what Y is here.1617

Here, what I mean by Y is, Y is the waiting time until the next major earthquake.1621

Starting today, how long do we expect to wait in years before there is a next major earthquake in California?1627

What we are given here is that, the average waiting time is 10 years.1643

That means that my μ is 10 years.1649

That also says that the standard deviation is 10 years too, σ is also 10.1651

We want to figure out what K is.1657

I wrote down 1/4 there, that was kind of looking ahead to starting to solve the problem.1660

Chebyshev’s inequality just says that it is less than or equal to 1 / K².1670

We need to figure out what K is and I spoiled the ending here but we will go ahead and figure it.1674

The question here is how many standard deviations away from the mean are we expected to be here?1684

We want to talk about the probability that there will be an earthquake in the next 30 years.1693

That means we want to find the probability that Y will be less than 30.1702

30 -10 is 20 which is 2 × our σ of 10 here.1706

That means that our K is 2, as I inadvertently let slip earlier, K is 2 here.1717

I will plug that value of 2 in.1723

The probability that Y - 10 is greater than 2 × 10, according to Chebyshev’s inequality1725

is less than or equal to 1/2² which of course is ¼.1737

That is what I had written down before.1742

That is the probability that Y -10 is greater than 20.1746

What we really found there is, the probability that Y is greater than 30 is less than or equal to ¼.1751

That is not exactly what the problem is asking for, because this is saying, if Y is greater than 30 that means1762

we are waiting longer than 30 years for an earthquake, which means we do not have an earthquake in the next 30 years.1768

The problem is asking, what is the probability that there will be an earthquake in the next 30 years?1775

That means that our waiting time will be less than 30, which is the opposite of it being greater than 30.1782

We are going to switch this around: the probability is greater than or equal to 1 - 1/4, which is ¾.1791

Our conclusion here is, there is a greater than or equal to 3/4 chance that there will be an earthquake in the next 30 years.1799

Notice that, I'm being very careful to include the inequality in there.1836

I'm not saying that there is exactly ¾ chance, I do not know that.1840

I cannot figure that out from the information that we are given.1844

What Chebyshev’s inequality does is, it gives me a bound, it allows me to say that there is at least a ¾ chance,1847

at least a 75% chance that there will be an earthquake in the next 30 years.1855

To recap that, I started out with the generic form of Chebyshev’s inequality.

The probability of being K standard deviations away from your mean is less than or equal to 1/K².

I filled in the mean, which is 10; that 10 right here is where we get the μ from.

The standard deviation is 10 as well.

That is where we get the σ from there.

I just had to figure out what the value of K was.

To do that, remember, I was interested in the probability of Y being less than 30.

That is where I got that 30, from the problem here.

I filled in 30 for Y and 10 for μ, and that came out to be 20.

20 is 2 × 10; that 10 is the σ, and the earlier 10 was the μ.

It is a little confusing because there are a lot of 10s going around here.

That 20 is 2 × σ; that is where we get our K being 2, which tells us that the probability is less than or equal to 1/2², which is ¼.

Our probability of Y being greater than 30 is less than or equal to ¼.

That is not what the problem asks; the problem asks what the probability is that we will have an earthquake in the next 30 years, which means our waiting time is less than 30 years.

We will see an earthquake within 30 years.

It will be less than 30 years until we see an earthquake.

We have to turn that around: the probability that Y is less than 30 is greater than or equal to 1 - ¼, which is ¾.

I do not just say there is a ¾ chance; I do not just give ¾ as an answer.

I'm giving a complete sentence, which is that there is a greater than or equal to ¾ chance, at least a 75% chance, that we will see a quake in the next 30 years.
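As a quick check on the arithmetic in this example, here is a minimal Python sketch of the same Chebyshev calculation. The function name chebyshev_upper_bound is just an illustrative label of mine, not anything from the lecture.

```python
# Chebyshev's inequality: P(|Y - mu| >= k*sigma) <= 1/k**2.
# Example 4: waiting time for a major earthquake, mu = 10 years, sigma = 10 years.

def chebyshev_upper_bound(mu, sigma, cutoff):
    """Upper bound on P(Y >= cutoff) from Chebyshev's inequality.

    Valid whenever cutoff > mu, since the one-sided event {Y >= cutoff}
    is contained in the two-sided event {|Y - mu| >= k*sigma}.
    """
    k = (cutoff - mu) / sigma          # number of standard deviations from the mean
    return 1.0 / k**2                  # P(|Y - mu| >= k*sigma) <= 1/k^2

mu, sigma, cutoff = 10, 10, 30
upper = chebyshev_upper_bound(mu, sigma, cutoff)   # k = 2, bound = 1/4
print(f"P(Y >= {cutoff}) <= {upper}")              # at most 0.25
print(f"P(Y <  {cutoff}) >= {1 - upper}")          # at least 0.75
```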

In example 5, we are looking at housing prices in a small town in the USA.

Apparently, they are symmetrically distributed with a mean of $50,000 and a standard deviation of $20,000.

We are going to use Chebyshev’s inequality to estimate what proportion of the houses cost less than $90,000.

Let me do a little graph here because, again, we are dealing with a symmetric distribution.

We will actually find out that this problem is quite similar to an earlier one that we did.

I think it was example 3 in this lecture.

If you remember how to do that one, you might want to try doing this yourself before you watch me give the answer here.

It works out pretty similarly.

Let me make a little graph of the housing prices in this small town.

They are symmetrically distributed, so I'm going to draw something nice and symmetric here, something that looks like a bell curve.

We will learn later that this is actually the normal distribution, but we have not gotten to that point in the videos yet.

They are distributed around a mean of $50,000.

μ here is 50; I will not bother with the thousands.

We want to estimate the proportion of houses that cost less than $90,000.

Let me put in the 90,000 here, somewhere out beyond 50; that is 90.

We are told that we have a standard deviation of 20,000.

I guess that means the distance from the mean to the cutoff we are interested in is 40.

That is 2σ, 2 standard deviations, because that is 2 × 20.

Because we are going to be using that, let me go ahead and put in 2σ in the other direction.

2σ in the other direction will get you down to 50 - 40, which is 10, on the low side there.

That is just setting up a picture; we still need to bring in Chebyshev’s inequality.

Chebyshev’s inequality says the probability that Y will be more than K standard deviations away from its mean is less than or equal to 1/K².

In this case, we are interested in being 2 standard deviations away from the mean.

Where I got that was: 90 - 50 is 40, which is 2 × 20; that is 2 standard deviations there.

The probability that Y - 50 (50 is the mean here) is greater than or equal to Kσ, where K is 2 and σ, the standard deviation, is 20, so Kσ is 2 × 20, which is 40.

According to Chebyshev’s inequality, that is less than or equal to 1/K², which is 1/2², which is ¼.

The probability of being that far away from the mean is less than or equal to ¼.

Let me go ahead and fill that in here.

That is this probability, but it is also the probability on the lower end, because we are told that we have a symmetric distribution here.

What we know is that all that shaded region there has a combined probability less than or equal to ¼.

Since we know it is symmetric, we know that each one of those tails must be less than or equal to 1/8, 1/8 being ½ of ¼ there; the probability of each tail is less than or equal to 1/8.

Let me go ahead and fill that in.

In particular, the probability that Y is bigger than 90, according to Chebyshev’s inequality, since we are allowed to split it up between the high end and the low end (we said that the distribution was symmetric), is less than or equal to ½ × ¼, which is 1/8.

That was the probability that a house costs more than 90, but we want to estimate the proportion of houses that cost less than $90,000.

The probability that Y is less than 90: let me turn that around.

It is greater than or equal to 1 - 1/8, which is 7/8.

If we convert that into a percentage, 7/8 is halfway between ¾ and 1; it is halfway between 75% and 100%, which is 87.5%.

The proportion of houses that cost less than $90,000 in this town is, I cannot say it is equal to 87.5%, but it is at least 87.5%.

I can say that at least 87.5% of the houses in this town must cost less than $90,000; that is the interpretation that I can put on that.

At least 87.5% of these houses cost less than $90,000.

Let me recap where that is coming from.

The probability that |Y - μ| is greater than or equal to Kσ is less than or equal to 1/K².

That is just the original version of Chebyshev’s inequality.

In this case, my μ was 50; that came from the mean housing price there.

The standard deviation, the σ, is 20; that was also given to us in the problem.

I still had to figure out what K would be.

In order to figure out what K would be, I wanted to know what I was being asked about.

I was being asked about houses costing less than $90,000.

We are going to use 90 as our cutoff: 90 - 50.

50 is the mean, 90 is what we are interested in.

The difference there is 40, which is 2 standard deviations, 2 × 20.

That is where I get my K equal to 2 there, K = 2.

I plug that into Chebyshev’s inequality and I get that the probability is less than or equal to ¼.

That is the probability of being 2 standard deviations away from the mean, in either direction.

That includes both of these regions here, both the high region and the low region.

But I'm really only interested in how many of the houses are costing too much on the high side.

Let me cut that region in half and get a probability of less than or equal to 1/8.

That is why that is less than or equal to 1/8.

That was describing the proportion of houses that cost more than $90,000.

I want to find the proportion of houses that cost less than $90,000.

Let us switch that around.

Instead of talking about 1/8, I will talk about 1 - 1/8.

Instead of talking about less than or equal to, I have a greater than or equal to.

1 - 1/8 simplifies down to 87.5%.

My answer here is, I do not just say 87.5% is my answer.

My answer is that at least 87.5% of the houses in this town cost less than $90,000.

The "at least" part is a very important part of the answer there.
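Here is the same kind of Python sketch for this example, assuming, as the problem states, that the price distribution is symmetric about its mean, so the two-sided Chebyshev bound can be split evenly between the two tails. The variable names are just illustrative.

```python
# Example 5: house prices, mu = 50 (thousand dollars), sigma = 20, cutoff = 90.
# Chebyshev: P(|Y - mu| >= k*sigma) <= 1/k**2.
# Symmetry lets us assign half of that bound to each tail.

mu, sigma, cutoff = 50, 20, 90
k = (cutoff - mu) / sigma              # (90 - 50) / 20 = 2 standard deviations
two_sided = 1 / k**2                   # P(|Y - 50| >= 40) <= 1/4
upper_tail = two_sided / 2             # symmetric distribution: each tail <= 1/8
print(f"P(Y > {cutoff}) <= {upper_tail}")         # at most 0.125
print(f"P(Y < {cutoff}) >= {1 - upper_tail}")     # at least 0.875, i.e. 87.5%
```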

That wraps up our lecture on Chebyshev’s inequality.

You can think of this as a companion to the lecture on Markov’s inequality, which we had in a previous video; they go hand in hand.

For Markov’s inequality, you just need to know the mean.

For Chebyshev’s inequality, you need to know the mean and the standard deviation, because of that σ there.

You need to know both for Chebyshev’s inequality.

You do a little more computation for Chebyshev’s inequality, but the trade-off is that you usually get stronger results.

You usually get more information about the probabilities from Chebyshev’s inequality than you do from Markov’s.
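To make that comparison concrete, here is a small sketch putting the two bounds side by side on the earthquake numbers from example 4. It assumes the usual form of Markov's inequality for a nonnegative variable, P(Y >= a) <= E[Y]/a, as in the companion lecture; the variable names are mine.

```python
# Earthquake example: mean waiting time mu = 10 years, sigma = 10 years, cutoff a = 30 years.
mu, sigma, a = 10, 10, 30

markov_bound = mu / a                    # Markov: P(Y >= 30) <= 10/30, using only the mean
k = (a - mu) / sigma                     # k = 2 standard deviations
chebyshev_bound = 1 / k**2               # Chebyshev: P(Y >= 30) <= 1/4, using mean and sigma

print(f"Markov:    P(Y >= {a}) <= {markov_bound:.3f}")     # about 0.333
print(f"Chebyshev: P(Y >= {a}) <= {chebyshev_bound:.3f}")  # 0.250
# Chebyshev's bound is smaller here, so it says more about P(Y < 30).
```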

Remember, for either one of these inequalities, you have to give your answer as an inequality.

You never give just a numerical value, because those numerical values are only lower or upper bounds.

Your answer will always be that the probability is less than this or greater than that.

That does it for Chebyshev’s inequality, kind of wrapping up a chapter here.

We will jump in later on with the binomial distribution.

I hope you will stick around for that.

I hope you are enjoying the probability lecture series here on www.educator.com.

My name is Will Murray; thank you so much for watching, bye.
