Dr. Ji Son

Table of Contents

Section 1: Introduction
Descriptive Statistics vs. Inferential Statistics

25m 31s

Intro
0:00
Roadmap
0:10
Roadmap
0:11
Statistics
0:35
Statistics
0:36
Let's Think About High School Science
1:12
Measurement and Finding Patterns (Mathematical Formula)
1:13
Statistics = Math of Distributions
4:58
Distributions
4:59
Problematic… but also GREAT
5:58
Statistics
7:33
How is It Different from Other Specializations in Mathematics?
7:34
Statistics is Fundamental in Natural and Social Sciences
7:53
Two Skills of Statistics
8:20
Description (Exploration)
8:21
Inference
9:13
Descriptive Statistics vs. Inferential Statistics: Apply to Distributions
9:58
Descriptive Statistics
9:59
Inferential Statistics
11:05
Populations vs. Samples
12:19
Populations vs. Samples: Is it the Truth?
12:20
Populations vs. Samples: Pros & Cons
13:36
Populations vs. Samples: Descriptive Values
16:12
Putting Together Descriptive/Inferential Stats & Populations/Samples
17:10
Putting Together Descriptive/Inferential Stats & Populations/Samples
17:11
Example 1: Descriptive Statistics vs. Inferential Statistics
19:09
Example 2: Descriptive Statistics vs. Inferential Statistics
20:47
Example 3: Sample, Parameter, Population, and Statistic
21:40
Example 4: Sample, Parameter, Population, and Statistic
23:28
Section 2: About Samples: Cases, Variables, Measurements
About Samples: Cases, Variables, Measurements

32m 14s

Intro
0:00
Data
0:09
Data, Cases, Variables, and Values
0:10
Rows, Columns, and Cells
2:03
Example: Aircraft
3:52
How Do We Get Data?
5:38
Research: Question and Hypothesis
5:39
Research Design
7:11
Measurement
7:29
Research Analysis
8:33
Research Conclusion
9:30
Types of Variables
10:03
Discrete Variables
10:04
Continuous Variables
12:07
Types of Measurements
14:17
Types of Measurements
14:18
Types of Measurements (Scales)
17:22
Nominal
17:23
Ordinal
19:11
Interval
21:33
Ratio
24:24
Example 1: Cases, Variables, Measurements
25:20
Example 2: Which Scale of Measurement is Used?
26:55
Example 3: What Kind of a Scale of Measurement is This?
27:26
Example 4: Discrete vs. Continuous Variables
30:31
Section 3: Visualizing Distributions
Introduction to Excel

8m 9s

Intro
0:00
Before Visualizing Distribution
0:10
Excel
0:11
Excel: Organization
0:45
Workbook
0:46
Columns x Rows
1:50
Tools: Menu Bar, Standard Toolbar, and Formula Bar
3:00
Excel + Data
6:07
Excel and Data
6:08
Frequency Distributions in Excel

39m 10s

Intro
0:00
Roadmap
0:08
Data in Excel and Frequency Distributions
0:09
Raw Data to Frequency Tables
0:42
Raw Data to Frequency Tables
0:43
Frequency Tables: Using Formulas and Pivot Tables
1:28
Example 1: Number of Births
7:17
Example 2: Age Distribution
20:41
Example 3: Height Distribution
27:45
Example 4: Height Distribution of Males
32:19
Frequency Distributions and Features

25m 29s

Intro
0:00
Roadmap
0:10
Data in Excel, Frequency Distributions, and Features of Frequency Distributions
0:11
Example #1
1:35
Uniform
1:36
Example #2
2:58
Unimodal, Skewed Right, and Asymmetric
2:59
Example #3
6:29
Bimodal
6:30
Example #4a
8:29
Symmetric, Unimodal, and Normal
8:30
Point of Inflection and Standard Deviation
11:13
Example #4b
12:43
Normal Distribution
12:44
Summary
13:56
Uniform, Skewed, Bimodal, and Normal
13:57
Sketch Problem 1: Driver's License
17:34
Sketch Problem 2: Life Expectancy
20:01
Sketch Problem 3: Telephone Numbers
22:01
Sketch Problem 4: Length of Time Used to Complete a Final Exam
23:43
Dotplots and Histograms in Excel

42m 42s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Previously
1:02
Data, Frequency Table, and Visualization
1:03
Dotplots
1:22
Dotplots Excel Example
1:23
Dotplots: Pros and Cons
7:22
Pros and Cons of Dotplots
7:23
Dotplots Excel Example Cont.
9:07
Histograms
12:47
Histograms Overview
12:48
Example of Histograms
15:29
Histograms: Pros and Cons
31:39
Pros
31:40
Cons
32:31
Frequency vs. Relative Frequency
32:53
Frequency
32:54
Relative Frequency
33:36
Example 1: Dotplots vs. Histograms
34:36
Example 2: Age of Pennies Dotplot
36:21
Example 3: Histogram of Mammal Speeds
38:27
Example 4: Histogram of Life Expectancy
40:30
Stemplots

12m 23s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
What Sets Stemplots Apart?
0:46
Data Sets, Dotplots, Histograms, and Stemplots
0:47
Example 1: What Do Stemplots Look Like?
1:58
Example 2: Back-to-Back Stemplots
5:00
Example 3: Quiz Grade Stemplot
7:46
Example 4: Quiz Grade & Afterschool Tutoring Stemplot
9:56
Bar Graphs

22m 49s

Intro
0:00
Roadmap
0:05
Roadmap
0:08
Review of Frequency Distributions
0:44
Y-axis and X-axis
0:45
Types of Frequency Visualizations Covered so Far
2:16
Introduction to Bar Graphs
4:07
Example 1: Bar Graph
5:32
Example 1: Bar Graph
5:33
Do Shapes, Center, and Spread of Distributions Apply to Bar Graphs?
11:07
Do Shapes, Center, and Spread of Distributions Apply to Bar Graphs?
11:08
Example 2: Create a Frequency Visualization for Gender
14:02
Example 3: Cases, Variables, and Frequency Visualization
16:34
Example 4: What Kind of Graphs are Shown Below?
19:29
Section 4: Summarizing Distributions
Central Tendency: Mean, Median, Mode

38m 50s

Intro
0:00
Roadmap
0:07
Roadmap
0:08
Central Tendency 1
0:56
Way to Summarize a Distribution of Scores
0:57
Mode
1:32
Median
2:02
Mean
2:36
Central Tendency 2
3:47
Mode
3:48
Median
4:20
Mean
5:25
Summation Symbol
6:11
Summation Symbol
6:12
Population vs. Sample
10:46
Population vs. Sample
10:47
Excel Examples
15:08
Finding Mode, Median, and Mean in Excel
15:09
Median vs. Mean
21:45
Effect of Outliers
21:46
Relationship Between Parameter and Statistic
22:44
Type of Measurements
24:00
Which Distributions to Use With
24:55
Example 1: Mean
25:30
Example 2: Using Summation Symbol
29:50
Example 3: Average Calorie Count
32:50
Example 4: Creating an Example Set
35:46
Variability

42m 40s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Variability (or Spread)
0:45
Variability (or Spread)
0:46
Things to Think About
5:45
Things to Think About
5:46
Range, Quartiles and Interquartile Range
6:37
Range
6:38
Interquartile Range
8:42
Interquartile Range Example
10:58
Interquartile Range Example
10:59
Variance and Standard Deviation
12:27
Deviations
12:28
Sum of Squares
14:35
Variance
16:55
Standard Deviation
17:44
Sum of Squares (SS)
18:34
Sum of Squares (SS)
18:35
Population vs. Sample SD
22:00
Population vs. Sample SD
22:01
Population vs. Sample
23:20
Mean
23:21
SD
23:51
Example 1: Find the Mean and Standard Deviation of the Variable Friends in the Excel File
27:21
Example 2: Find the Mean and Standard Deviation of the Tagged Photos in the Excel File
35:25
Example 3: Sum of Squares
38:58
Example 4: Standard Deviation
41:48
Five Number Summary & Boxplots

57m 15s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Summarizing Distributions
0:37
Shape, Center, and Spread
0:38
5 Number Summary
1:14
Boxplot: Visualizing 5 Number Summary
3:37
Boxplot: Visualizing 5 Number Summary
3:38
Boxplots on Excel
9:01
Using 'Stocks' and Using Stacked Columns
9:02
Boxplots on Excel Example
10:14
When are Boxplots Useful?
32:14
Pros
32:15
Cons
32:59
How to Determine Outlier Status
33:24
Rule of Thumb: Upper Limit
33:25
Rule of Thumb: Lower Limit
34:16
Signal Outliers in an Excel Data File Using Conditional Formatting
34:52
Modified Boxplot
48:38
Modified Boxplot
48:39
Example 1: Percentage Values & Lower and Upper Whisker
49:10
Example 2: Boxplot
50:10
Example 3: Estimating IQR From Boxplot
53:46
Example 4: Boxplot and Missing Whisker
54:35
Shape: Calculating Skewness & Kurtosis

41m 51s

Intro
0:00
Roadmap
0:16
Roadmap
0:17
Skewness Concept
1:09
Skewness Concept
1:10
Calculating Skewness
3:26
Calculating Skewness
3:27
Interpreting Skewness
7:36
Interpreting Skewness
7:37
Excel Example
8:49
Kurtosis Concept
20:29
Kurtosis Concept
20:30
Calculating Kurtosis
24:17
Calculating Kurtosis
24:18
Interpreting Kurtosis
29:01
Leptokurtic
29:35
Mesokurtic
30:10
Platykurtic
31:06
Excel Example
32:04
Example 1: Shape of Distribution
38:28
Example 2: Shape of Distribution
39:29
Example 3: Shape of Distribution
40:14
Example 4: Kurtosis
41:10
Normal Distribution

34m 33s

Intro
0:00
Roadmap
0:13
Roadmap
0:14
What is a Normal Distribution
0:44
The Normal Distribution As a Theoretical Model
0:45
Possible Range of Probabilities
3:05
Possible Range of Probabilities
3:06
What is a Normal Distribution
5:07
Can Be Described By
5:08
Properties
5:49
'Same' Shape: Illusion of Different Shape!
7:35
'Same' Shape: Illusion of Different Shape!
7:36
Types of Problems
13:45
Example: Distribution of SAT Scores
13:46
Shape Analogy
19:48
Shape Analogy
19:49
Example 1: The Standard Normal Distribution and Z-Scores
22:34
Example 2: The Standard Normal Distribution and Z-Scores
25:54
Example 3: Sketching a Normal Distribution
28:55
Example 4: Sketching a Normal Distribution
32:32
Standard Normal Distributions & Z-Scores

41m 44s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
A Family of Distributions
0:28
Infinite Set of Distributions
0:29
Transforming Normal Distributions to 'Standard' Normal Distribution
1:04
Normal Distribution vs. Standard Normal Distribution
2:58
Normal Distribution vs. Standard Normal Distribution
2:59
Z-Score, Raw Score, Mean, & SD
4:08
Z-Score, Raw Score, Mean, & SD
4:09
Weird Z-Scores
9:40
Weird Z-Scores
9:41
Excel
16:45
For Normal Distributions
16:46
For Standard Normal Distributions
19:11
Excel Example
20:24
Types of Problems
25:18
Percentage Problem: P(x)
25:19
Raw Score and Z-Score Problems
26:28
Standard Deviation Problems
27:01
Shape Analogy
27:44
Shape Analogy
27:45
Example 1: Deaths Due to Heart Disease vs. Deaths Due to Cancer
28:24
Example 2: Heights of Male College Students
33:15
Example 3: Mean and Standard Deviation
37:14
Example 4: Finding Percentage of Values in a Standard Normal Distribution
37:49
Normal Distribution: PDF vs. CDF

55m 44s

Intro
0:00
Roadmap
0:15
Roadmap
0:16
Frequency vs. Cumulative Frequency
0:56
Frequency vs. Cumulative Frequency
0:57
Frequency vs. Cumulative Frequency
4:32
Frequency vs. Cumulative Frequency Cont.
4:33
Calculus in Brief
6:21
Derivative-Integral Continuum
6:22
PDF
10:08
PDF for Standard Normal Distribution
10:09
PDF for Normal Distribution
14:32
Integral of PDF = CDF
21:27
Integral of PDF = CDF
21:28
Example 1: Cumulative Frequency Graph
23:31
Example 2: Mean, Standard Deviation, and Probability
24:43
Example 3: Mean and Standard Deviation
35:50
Example 4: Age of Cars
49:32
Section 5: Linear Regression
Scatterplots

47m 19s

Intro
0:00
Roadmap
0:04
Roadmap
0:05
Previous Visualizations
0:30
Frequency Distributions
0:31
Compare & Contrast
2:26
Frequency Distributions Vs. Scatterplots
2:27
Summary Values
4:53
Shape
4:54
Center & Trend
6:41
Spread & Strength
8:22
Univariate & Bivariate
10:25
Example Scatterplot
10:48
Shape, Trend, and Strength
10:49
Positive and Negative Association
14:05
Positive and Negative Association
14:06
Linearity, Strength, and Consistency
18:30
Linearity
18:31
Strength
19:14
Consistency
20:40
Summarizing a Scatterplot
22:58
Summarizing a Scatterplot
22:59
Example 1: Gapminder.org, Income x Life Expectancy
26:32
Example 2: Gapminder.org, Income x Infant Mortality
36:12
Example 3: Trend and Strength of Variables
40:14
Example 4: Trend, Strength and Shape for Scatterplots
43:27
Regression

32m 2s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Linear Equations
0:34
Linear Equations: y = mx + b
0:35
Rough Line
5:16
Rough Line
5:17
Regression - A 'Center' Line
7:41
Reasons for Summarizing with a Regression Line
7:42
Predictor and Response Variable
10:04
Goal of Regression
12:29
Goal of Regression
12:30
Prediction
14:50
Example: Servings of Milk Per Year Shown By Age
14:51
Interpolation
17:06
Extrapolation
17:58
Error in Prediction
20:34
Prediction Error
20:35
Residual
21:40
Example 1: Residual
23:34
Example 2: Large and Negative Residual
26:30
Example 3: Positive Residual
28:13
Example 4: Interpret Regression Line & Extrapolate
29:40
Least Squares Regression

56m 36s

Intro
0:00
Roadmap
0:13
Roadmap
0:14
Best Fit
0:47
Best Fit
0:48
Sum of Squared Errors (SSE)
1:50
Sum of Squared Errors (SSE)
1:51
Why Squared?
3:38
Why Squared?
3:39
Quantitative Properties of Regression Line
4:51
Quantitative Properties of Regression Line
4:52
So How do we Find Such a Line?
6:49
SSEs of Different Line Equations & Lowest SSE
6:50
Carl Gauss' Method
8:01
How Do We Find Slope (b1)
11:00
How Do We Find Slope (b1)
11:01
How Do We Find Intercept
15:11
How Do We Find Intercept
15:12
Example 1: Which of These Equations Fit the Above Data Best?
17:18
Example 2: Find the Regression Line for These Data Points and Interpret It
26:31
Example 3: Summarize the Scatterplot and Find the Regression Line
34:31
Example 4: Examine the Mean of Residuals
43:52
Correlation

43m 58s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Summarizing a Scatterplot Quantitatively
0:47
Shape
0:48
Trend
1:11
Strength: Correlation (r)
1:45
Correlation Coefficient ( r )
2:30
Correlation Coefficient ( r )
2:31
Trees vs. Forest
11:59
Trees vs. Forest
12:00
Calculating r
15:07
Average Product of z-scores for x and y
15:08
Relationship between Correlation and Slope
21:10
Relationship between Correlation and Slope
21:11
Example 1: Find the Correlation between Grams of Fat and Cost
24:11
Example 2: Relationship between r and b1
30:24
Example 3: Find the Regression Line
33:35
Example 4: Find the Correlation Coefficient for this Set of Data
37:37
Correlation: r vs. r-squared

52m 52s

Intro
0:00
Roadmap
0:07
Roadmap
0:08
R-squared
0:44
What is the Meaning of It? Why Squared?
0:45
Parsing Sum of Squares (Parsing Variability)
2:25
SST = SSR + SSE
2:26
What are SST and SSE?
7:46
What are SST and SSE?
7:47
r-squared
18:33
Coefficient of Determination
18:34
If the Correlation is Strong…
20:25
If the Correlation is Strong…
20:26
If the Correlation is Weak…
22:36
If the Correlation is Weak…
22:37
Example 1: Find r-squared for this Set of Data
23:56
Example 2: What Does it Mean that the Simple Linear Regression is a 'Model' of Variance?
33:54
Example 3: Why Does r-squared Only Range from 0 to 1?
37:29
Example 4: Find the r-squared for This Set of Data
39:55
Transformations of Data

27m 8s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Why Transform?
0:26
Why Transform?
0:27
Shape-preserving vs. Shape-changing Transformations
5:14
Shape-preserving = Linear Transformations
5:15
Shape-changing Transformations = Non-linear Transformations
6:20
Common Shape-Preserving Transformations
7:08
Common Shape-Preserving Transformations
7:09
Common Shape-Changing Transformations
8:59
Powers
9:00
Logarithms
9:39
Change Just One Variable? Both?
10:38
Log-log Transformations
10:39
Log Transformations
14:38
Example 1: Create, Graph, and Transform the Data Set
15:19
Example 2: Create, Graph, and Transform the Data Set
20:08
Example 3: What Kind of Model would You Choose for this Data?
22:44
Example 4: Transformation of Data
25:46
Section 6: Collecting Data in an Experiment
Sampling & Bias

54m 44s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Descriptive vs. Inferential Statistics
1:04
Descriptive Statistics: Data Exploration
1:05
Example
2:03
To tackle Generalization…
4:31
Generalization
4:32
Sampling
6:06
'Good' Sample
6:40
Defining Samples and Populations
8:55
Population
8:56
Sample
11:16
Why Use Sampling?
13:09
Why Use Sampling?
13:10
Goal of Sampling: Avoiding Bias
15:04
What is Bias?
15:05
Where does Bias Come from: Sampling Bias
17:53
Where does Bias Come from: Response Bias
18:27
Sampling Bias: Bias from 'Bad' Sampling Methods
19:34
Size Bias
19:35
Voluntary Response Bias
21:13
Convenience Sample
22:22
Judgment Sample
23:58
Inadequate Sample Frame
25:40
Response Bias: Bias from 'Bad' Data Collection Methods
28:00
Nonresponse Bias
29:31
Questionnaire Bias
31:10
Incorrect Response or Measurement Bias
37:32
Example 1: What Kind of Biases?
40:29
Example 2: What Biases Might Arise?
44:46
Example 3: What Kind of Biases?
48:34
Example 4: What Kind of Biases?
51:43
Sampling Methods

14m 25s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Biased vs. Unbiased Sampling Methods
0:32
Biased Sampling
0:33
Unbiased Sampling
1:13
Probability Sampling Methods
2:31
Simple Random
2:54
Stratified Random Sampling
4:06
Cluster Sampling
5:24
Two-staged Sampling
6:22
Systematic Sampling
7:25
Example 1: Which Type(s) of Sampling was this?
8:33
Example 2: Describe How to Take a Two-Stage Sample from this Book
10:16
Example 3: Sampling Methods
11:58
Example 4: Cluster Sample Plan
12:48
Research Design

53m 54s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Descriptive vs. Inferential Statistics
0:51
Descriptive Statistics: Data Exploration
0:52
Inferential Statistics
1:02
Variables and Relationships
1:44
Variables
1:45
Relationships
2:49
Not Every Type of Study is an Experiment…
4:16
Category I - Descriptive Study
4:54
Category II - Correlational Study
5:50
Category III - Experimental, Quasi-experimental, Non-experimental
6:33
Category III
7:42
Experimental, Quasi-experimental, and Non-experimental
7:43
Why CAN'T the Other Strategies Determine Causation?
10:18
Third-variable Problem
10:19
Directionality Problem
15:49
What Makes Experiments Special?
17:54
Manipulation
17:55
Control (and Comparison)
21:58
Methods of Control
26:38
Holding Constant
26:39
Matching
29:11
Random Assignment
31:48
Experiment Terminology
34:09
'true' Experiment vs. Study
34:10
Independent Variable (IV)
35:16
Dependent Variable (DV)
35:45
Factors
36:07
Treatment Conditions
36:23
Levels
37:43
Confounds or Extraneous Variables
38:04
Blind
38:38
Blind Experiments
38:39
Double-blind Experiments
39:29
How Categories Relate to Statistics
41:35
Category I - Descriptive Study
41:36
Category II - Correlational Study
42:05
Category III - Experimental, Quasi-experimental, Non-experimental
42:43
Example 1: Research Design
43:50
Example 2: Research Design
47:37
Example 3: Research Design
50:12
Example 4: Research Design
52:00
Between and Within Treatment Variability

41m 31s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Experimental Designs
0:51
Experimental Designs: Manipulation & Control
0:52
Two Types of Variability
2:09
Between Treatment Variability
2:10
Within Treatment Variability
3:31
Updated Goal of Experimental Design
5:47
Updated Goal of Experimental Design
5:48
Example: Drugs and Driving
6:56
Example: Drugs and Driving
6:57
Different Types of Random Assignment
11:27
All Experiments
11:28
Completely Random Design
12:02
Randomized Block Design
13:19
Randomized Block Design
15:48
Matched Pairs Design
15:49
Repeated Measures Design
19:47
Between-subject Variable vs. Within-subject Variable
22:43
Completely Randomized Design
22:44
Repeated Measures Design
25:03
Example 1: Design a Completely Random, Matched Pair, and Repeated Measures Experiment
26:16
Example 2: Block Design
31:41
Example 3: Completely Randomized Designs
35:11
Example 4: Completely Random, Matched Pairs, or Repeated Measures Experiments?
39:01
Section 7: Review of Probability Axioms
Sample Spaces

37m 52s

Intro
0:00
Roadmap
0:07
Roadmap
0:08
Why is Probability Involved in Statistics
0:48
Probability
0:49
Can People Tell the Difference between Cheap and Gourmet Coffee?
2:08
Taste Test with Coffee Drinkers
3:37
If No One can Actually Taste the Difference
3:38
If Everyone can Actually Taste the Difference
5:36
Creating a Probability Model
7:09
Creating a Probability Model
7:10
D'Alembert vs. Necker
9:41
D'Alembert vs. Necker
9:42
Problem with D'Alembert's Model
13:29
Problem with D'Alembert's Model
13:30
Covering Entire Sample Space
15:08
Fundamental Principle of Counting
15:09
Where Do Probabilities Come From?
22:54
Observed Data, Symmetry, and Subjective Estimates
22:55
Checking whether Model Matches Real World
24:27
Law of Large Numbers
24:28
Example 1: Law of Large Numbers
27:46
Example 2: Possible Outcomes
30:43
Example 3: Brands of Coffee and Taste
33:25
Example 4: How Many Different Treatments are there?
35:33
Addition Rule for Disjoint Events

20m 29s

Intro
0:00
Roadmap
0:08
Roadmap
0:09
Disjoint Events
0:41
Disjoint Events
0:42
Meaning of 'or'
2:39
In Regular Life
2:40
In Math/Statistics/Computer Science
3:10
Addition Rule for Disjoint Events
3:55
If A and B are Disjoint: P (A and B)
3:56
If A and B are Disjoint: P (A or B)
5:15
General Addition Rule
5:41
General Addition Rule
5:42
Generalized Addition Rule
8:31
If A and B are not Disjoint: P (A or B)
8:32
Example 1: Which of These are Mutually Exclusive?
10:50
Example 2: What is the Probability that You will Have a Combination of One Heads and Two Tails?
12:57
Example 3: Engagement Party
15:17
Example 4: Home Owner's Insurance
18:30
Conditional Probability

57m 19s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
'or' vs. 'and' vs. Conditional Probability
1:07
'or' vs. 'and' vs. Conditional Probability
1:08
'and' vs. Conditional Probability
5:57
P (M or L)
5:58
P (M and L)
8:41
P (M|L)
11:04
P (L|M)
12:24
Tree Diagram
15:02
Tree Diagram
15:03
Defining Conditional Probability
22:42
Defining Conditional Probability
22:43
Common Contexts for Conditional Probability
30:56
Medical Testing: Positive Predictive Value
30:57
Medical Testing: Sensitivity
33:03
Statistical Tests
34:27
Example 1: Drug and Disease
36:41
Example 2: Marbles and Conditional Probability
40:04
Example 3: Cards and Conditional Probability
45:59
Example 4: Votes and Conditional Probability
50:21
Independent Events

24m 27s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Independent Events & Conditional Probability
0:26
Non-independent Events
0:27
Independent Events
2:00
Non-independent and Independent Events
3:08
Non-independent and Independent Events
3:09
Defining Independent Events
5:52
Defining Independent Events
5:53
Multiplication Rule
7:29
Previously…
7:30
But with Independent Events
8:53
Example 1: Which of These Pairs of Events are Independent?
11:12
Example 2: Health Insurance and Probability
15:12
Example 3: Independent Events
17:42
Example 4: Independent Events
20:03
Section 8: Probability Distributions
Introduction to Probability Distributions

56m 45s

Intro
0:00
Roadmap
0:08
Roadmap
0:09
Sampling vs. Probability
0:57
Sampling
0:58
Missing
1:30
What is Missing?
3:06
Insight: Probability Distributions
5:26
Insight: Probability Distributions
5:27
What is a Probability Distribution?
7:29
From Sample Spaces to Probability Distributions
8:44
Sample Space
8:45
Probability Distribution of the Sum of Two Dice
11:16
The Random Variable
17:43
The Random Variable
17:44
Expected Value
21:52
Expected Value
21:53
Example 1: Probability Distributions
28:45
Example 2: Probability Distributions
35:30
Example 3: Probability Distributions
43:37
Example 4: Probability Distributions
47:20
Expected Value & Variance of Probability Distributions

53m 41s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Discrete vs. Continuous Random Variables
1:04
Discrete vs. Continuous Random Variables
1:05
Mean and Variance Review
4:44
Mean: Sample, Population, and Probability Distribution
4:45
Variance: Sample, Population, and Probability Distribution
9:12
Example Situation
14:10
Example Situation
14:11
Some Special Cases…
16:13
Some Special Cases…
16:14
Linear Transformations
19:22
Linear Transformations
19:23
What Happens to Mean and Variance of the Probability Distribution?
20:12
n Independent Values of X
25:38
n Independent Values of X
25:39
Compare These Two Situations
30:56
Compare These Two Situations
30:57
Two Random Variables, X and Y
32:02
Two Random Variables, X and Y
32:03
Example 1: Expected Value & Variance of Probability Distributions
35:35
Example 2: Expected Values & Standard Deviation
44:17
Example 3: Expected Winnings and Standard Deviation
48:18
Binomial Distribution

55m 15s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Discrete Probability Distributions
1:42
Discrete Probability Distributions
1:43
Binomial Distribution
2:36
Binomial Distribution
2:37
Multiplicative Rule Review
6:54
Multiplicative Rule Review
6:55
How Many Outcomes with k 'Successes'
10:23
Adults and Bachelor's Degree: Manual List of Outcomes
10:24
P (X=k)
19:37
Putting Together # of Outcomes with the Multiplicative Rule
19:38
Expected Value and Standard Deviation in a Binomial Distribution
25:22
Expected Value and Standard Deviation in a Binomial Distribution
25:23
Example 1: Coin Toss
33:42
Example 2: College Graduates
38:03
Example 3: Types of Blood and Probability
45:39
Example 4: Expected Number and Standard Deviation
51:11
Section 9: Sampling Distributions of Statistics
Introduction to Sampling Distributions

48m 17s

Intro
0:00
Roadmap
0:08
Roadmap
0:09
Probability Distributions vs. Sampling Distributions
0:55
Probability Distributions vs. Sampling Distributions
0:56
Same Logic
3:55
Logic of Probability Distribution
3:56
Example: Rolling Two Dice
6:56
Simulating Samples
9:53
To Come Up with Probability Distributions
9:54
In Sampling Distributions
11:12
Connecting Sampling and Research Methods with Sampling Distributions
12:11
Connecting Sampling and Research Methods with Sampling Distributions
12:12
Simulating a Sampling Distribution
14:14
Experimental Design: Regular Sleep vs. Less Sleep
14:15
Logic of Sampling Distributions
23:08
Logic of Sampling Distributions
23:09
General Method of Simulating Sampling Distributions
25:38
General Method of Simulating Sampling Distributions
25:39
Questions that Remain
28:45
Questions that Remain
28:46
Example 1: Mean and Standard Error of Sampling Distribution
30:57
Example 2: What is the Best Way to Describe Sampling Distributions?
37:12
Example 3: Matching Sampling Distributions
38:21
Example 4: Mean and Standard Error of Sampling Distribution
41:51
Sampling Distribution of the Mean

1h 8m 48s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Special Case of General Method for Simulating a Sampling Distribution
1:53
Special Case of General Method for Simulating a Sampling Distribution
1:54
Computer Simulation
3:43
Using Simulations to See Principles behind Shape of SDoM
15:50
Using Simulations to See Principles behind Shape of SDoM
15:51
Conditions
17:38
Using Simulations to See Principles behind Center (Mean) of SDoM
20:15
Using Simulations to See Principles behind Center (Mean) of SDoM
20:16
Conditions: Does n Matter?
21:31
Conditions: Does Number of Simulations Matter?
24:37
Using Simulations to See Principles behind Standard Deviation of SDoM
27:13
Using Simulations to See Principles behind Standard Deviation of SDoM
27:14
Conditions: Does n Matter?
34:45
Conditions: Does Number of Simulations Matter?
36:24
Central Limit Theorem
37:13
SHAPE
38:08
CENTER
39:34
SPREAD
39:52
Comparing Population, Sample, and SDoM
43:10
Comparing Population, Sample, and SDoM
43:11
Answering the 'Questions that Remain'
48:24
What Happens When We Don't Know What the Population Looks Like?
48:25
Can We Have Sampling Distributions for Summary Statistics Other than the Mean?
49:42
How Do We Know whether a Sample is Sufficiently Unlikely?
53:36
Do We Always Have to Simulate a Large Number of Samples in Order to get a Sampling Distribution?
54:40
Example 1: Mean Batting Average
55:25
Example 2: Mean Sampling Distribution and Standard Error
59:07
Example 3: Sampling Distribution of the Mean
1:01:04
Sampling Distribution of Sample Proportions

54m 37s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Intro to Sampling Distribution of Sample Proportions (SDoSP)
0:51
Categorical Data (Examples)
0:52
Wish to Estimate Proportion of Population from Sample…
2:00
Notation
3:34
Population Proportion and Sample Proportion Notations
3:35
What's the Difference?
9:19
SDoM vs. SDoSP: Type of Data
9:20
SDoM vs. SDoSP: Shape
11:24
SDoM vs. SDoSP: Center
12:30
SDoM vs. SDoSP: Spread
15:34
Binomial Distribution vs. Sampling Distribution of Sample Proportions
19:14
Binomial Distribution vs. SDoSP: Type of Data
19:17
Binomial Distribution vs. SDoSP: Shape
21:07
Binomial Distribution vs. SDoSP: Center
21:43
Binomial Distribution vs. SDoSP: Spread
24:08
Example 1: Sampling Distribution of Sample Proportions
26:07
Example 2: Sampling Distribution of Sample Proportions
37:58
Example 3: Sampling Distribution of Sample Proportions
44:42
Example 4: Sampling Distribution of Sample Proportions
45:57
Section 10: Inferential Statistics
Introduction to Confidence Intervals

42m 53s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Inferential Statistics
0:50
Inferential Statistics
0:51
Two Problems with This Picture…
3:20
Two Problems with This Picture…
3:21
Solution: Confidence Intervals (CI)
4:59
Solution: Hypothesis Testing (HT)
5:49
Which Parameters are Known?
6:45
Which Parameters are Known?
6:46
Confidence Interval - Goal
7:56
When We Don't Know μ but Know σ
7:57
When We Don't Know μ or σ
18:27
When We Don't Know μ or σ
18:28
Example 1: Confidence Intervals
26:18
Example 2: Confidence Intervals
29:46
Example 3: Confidence Intervals
32:18
Example 4: Confidence Intervals
38:31
t Distributions

1h 2m 6s

Intro
0:00
Roadmap
0:04
Roadmap
0:05
When to Use z vs. t?
1:07
When to Use z vs. t?
1:08
What is z and t?
3:02
z-score and t-score: Commonality
3:03
z-score and t-score: Formulas
3:34
z-score and t-score: Difference
5:22
Why not z? (Why t?)
7:24
Why not z? (Why t?)
7:25
But Don't Worry!
15:13
Gosset and t-distributions
15:14
Rules of t Distributions
17:05
t-distributions are More Normal as n Gets Bigger
17:06
t-distributions are a Family of Distributions
18:55
Degrees of Freedom (df)
20:02
Degrees of Freedom (df)
20:03
t Family of Distributions
24:07
t Family of Distributions : df = 2 , 4, and 60
24:08
df = 60
29:16
df = 2
29:59
How to Find It?
31:01
'Student's t-distribution' or 't-distribution'
31:02
Excel Example
33:06
Example 1: Which Distribution Do You Use? Z or t?
45:26
Example 2: Friends on Facebook
47:41
Example 3: t Distributions
52:15
Example 4: t Distributions, Confidence Interval, and Mean
55:59
Introduction to Hypothesis Testing

1h 6m 33s

Intro
0:00
Roadmap
0:06
Roadmap
0:07
Issues to Overcome in Inferential Statistics
1:35
Issues to Overcome in Inferential Statistics
1:36
What Happens When We Don't Know What the Population Looks Like?
2:57
How Do We Know whether a Sample is Sufficiently Unlikely?
3:43
Hypothesizing a Population
6:44
Hypothesizing a Population
6:45
Null Hypothesis
8:07
Alternative Hypothesis
8:56
Hypotheses
11:58
Hypotheses
11:59
Errors in Hypothesis Testing
14:22
Errors in Hypothesis Testing
14:23
Steps of Hypothesis Testing
21:15
Steps of Hypothesis Testing
21:16
Single Sample HT (When Sigma Available)
26:08
Example: Average Facebook Friends
26:09
Step 1
27:08
Step 2
27:58
Step 3
28:17
Step 4
32:18
Single Sample HT (When Sigma Not Available)
36:33
Example: Average Facebook Friends
36:34
Step 1: Hypothesis Testing
36:58
Step 2: Significance Level
37:25
Step 3: Decision Stage
37:40
Step 4: Sample
41:36
Sigma and p-value
45:04
Sigma and p-value
45:05
One-tailed vs. Two-tailed Hypotheses
45:51
Example 1: Hypothesis Testing
48:37
Example 2: Heights of Women in the US
57:43
Example 3: Select the Best Way to Complete This Sentence
1:03:23
Confidence Intervals for the Difference of Two Independent Means

55m 14s

Intro
0:00
Roadmap
0:14
Roadmap
0:15
One Mean vs. Two Means
1:17
One Mean vs. Two Means
1:18
Notation
2:41
A Sample! A Set!
2:42
Mean of X, Mean of Y, and Difference of Two Means
3:56
SE of X
4:34
SE of Y
6:28
Sampling Distribution of the Difference between Two Means (SDoD)
7:48
Sampling Distribution of the Difference between Two Means (SDoD)
7:49
Rules of the SDoD (similar to CLT!)
15:00
Mean for the SDoD Null Hypothesis
15:01
Standard Error
17:39
When can We Construct a CI for the Difference between Two Means?
21:28
Three Conditions
21:29
Finding CI
23:56
One Mean CI
23:57
Two Means CI
25:45
Finding t
29:16
Finding t
29:17
Interpreting CI
30:25
Interpreting CI
30:26
Better Estimate of s (s pool)
34:15
Better Estimate of s (s pool)
34:16
Example 1: Confidence Intervals
42:32
Example 2: SE of the Difference
52:36
Hypothesis Testing for the Difference of Two Independent Means

50m

Intro
0:00
Roadmap
0:06
Roadmap
0:07
The Goal of Hypothesis Testing
0:56
One Sample and Two Samples
0:57
Sampling Distribution of the Difference between Two Means (SDoD)
3:42
Sampling Distribution of the Difference between Two Means (SDoD)
3:43
Rules of the SDoD (Similar to CLT!)
6:46
Shape
6:47
Mean for the Null Hypothesis
7:26
Standard Error for Independent Samples (When Variance is Homogenous)
8:18
Standard Error for Independent Samples (When Variance is not Homogenous)
9:25
Same Conditions for HT as for CI
10:08
Three Conditions
10:09
Steps of Hypothesis Testing
11:04
Steps of Hypothesis Testing
11:05
Formulas that Go with Steps of Hypothesis Testing
13:21
Step 1
13:25
Step 2
14:18
Step 3
15:00
Step 4
16:57
Example 1: Hypothesis Testing for the Difference of Two Independent Means
18:47
Example 2: Hypothesis Testing for the Difference of Two Independent Means
33:55
Example 3: Hypothesis Testing for the Difference of Two Independent Means
44:22
Confidence Intervals & Hypothesis Testing for the Difference of Two Paired Means

1h 14m 11s

Intro
0:00
Roadmap
0:09
Roadmap
0:10
The Goal of Hypothesis Testing
1:27
One Sample and Two Samples
1:28
Independent Samples vs. Paired Samples
3:16
Independent Samples vs. Paired Samples
3:17
Which is Which?
5:20
Independent SAMPLES vs. Independent VARIABLES
7:43
Independent SAMPLES vs. Independent VARIABLES
7:44
T-tests Always…
10:48
T-tests Always…
10:49
Notation for Paired Samples
12:59
Notation for Paired Samples
13:00
Steps of Hypothesis Testing for Paired Samples
16:13
Steps of Hypothesis Testing for Paired Samples
16:14
Rules of the SDoD (Adding on Paired Samples)
18:03
Shape
18:04
Mean for the Null Hypothesis
18:31
Standard Error for Independent Samples (When Variance is Homogenous)
19:25
Standard Error for Paired Samples
20:39
Formulas that go with Steps of Hypothesis Testing
22:59
Formulas that go with Steps of Hypothesis Testing
23:00
Confidence Intervals for Paired Samples
30:32
Confidence Intervals for Paired Samples
30:33
Example 1: Confidence Intervals & Hypothesis Testing for the Difference of Two Paired Means
32:28
Example 2: Confidence Intervals & Hypothesis Testing for the Difference of Two Paired Means
44:02
Example 3: Confidence Intervals & Hypothesis Testing for the Difference of Two Paired Means
52:23
Type I and Type II Errors

31m 27s

Intro
0:00
Roadmap
0:18
Roadmap
0:19
Errors and Relationship to HT and the Sample Statistic?
1:11
Errors and Relationship to HT and the Sample Statistic?
1:12
Instead of a Box…Distributions!
7:00
One Sample t-test: Friends on Facebook
7:01
Two Sample t-test: Friends on Facebook
13:46
Usually, Lots of Overlap between Null and Alternative Distributions
16:59
Overlap between Null and Alternative Distributions
17:00
How Distributions and 'Box' Fit Together
22:45
How Distributions and 'Box' Fit Together
22:46
Example 1: Types of Errors
25:54
Example 2: Types of Errors
27:30
Example 3: What is the Danger of the Type I Error?
29:38
Effect Size & Power

44m 41s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Distance between Distributions: Sample t
0:49
Distance between Distributions: Sample t
0:50
Problem with Distance in Terms of Standard Error
2:56
Problem with Distance in Terms of Standard Error
2:57
Test Statistic (t) vs. Effect Size (d or g)
4:38
Test Statistic (t) vs. Effect Size (d or g)
4:39
Rules of Effect Size
6:09
Rules of Effect Size
6:10
Why Do We Need Effect Size?
8:21
Tells You the Practical Significance
8:22
HT can be Deceiving…
10:25
Important Note
10:42
What is Power?
11:20
What is Power?
11:21
Why Do We Need Power?
14:19
Conditional Probability and Power
14:20
Power is:
16:27
Can We Calculate Power?
19:00
Can We Calculate Power?
19:01
How Does Alpha Affect Power?
20:36
How Does Alpha Affect Power?
20:37
How Does Effect Size Affect Power?
25:38
How Does Effect Size Affect Power?
25:39
How Does Variability and Sample Size Affect Power?
27:56
How Does Variability and Sample Size Affect Power?
27:57
How Do We Increase Power?
32:47
Increasing Power
32:48
Example 1: Effect Size & Power
35:40
Example 2: Effect Size & Power
37:38
Example 3: Effect Size & Power
40:55
Section 11: Analysis of Variance
F-distributions

24m 46s

Intro
0:00
Roadmap
0:04
Roadmap
0:05
Z- & T-statistic and Their Distribution
0:34
Z- & T-statistic and Their Distribution
0:35
F-statistic
4:55
The F Ratio (the Variance Ratio)
4:56
F-distribution
12:29
F-distribution
12:30
s and p-value
15:00
s and p-value
15:01
Example 1: Why Does F-distribution Stop At 0 But Go On Until Infinity?
18:33
Example 2: F-distributions
19:29
Example 3: F-distributions and Heights
21:29
ANOVA with Independent Samples

1h 9m 25s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
The Limitations of t-tests
1:12
The Limitations of t-tests
1:13
Two Major Limitations of Many t-tests
3:26
Two Major Limitations of Many t-tests
3:27
Ronald Fisher's Solution… F-test! New Null Hypothesis
4:43
Ronald Fisher's Solution… F-test! New Null Hypothesis (Omnibus Test - One Test to Rule Them All!)
4:44
Analysis of Variance (ANOVA) Notation
7:47
Analysis of Variance (ANOVA) Notation
7:48
Partitioning (Analyzing) Variance
9:58
Total Variance
9:59
Within-group Variation
14:00
Between-group Variation
16:22
Time out: Review Variance & SS
17:05
Time out: Review Variance & SS
17:06
F-statistic
19:22
The F Ratio (the Variance Ratio)
19:23
S²bet = SSbet / dfbet
22:13
What is This?
22:14
How Many Means?
23:20
So What is the dfbet?
23:38
So What is SSbet?
24:15
S²w = SSw / dfw
26:05
What is This?
26:06
How Many Means?
27:20
So What is the dfw?
27:36
So What is SSw?
28:18
Chart of Independent Samples ANOVA
29:25
Chart of Independent Samples ANOVA
29:26
Example 1: Who Uploads More Photos: Unknown Ethnicity, Latino, Asian, Black, or White Facebook Users?
35:52
Hypotheses
35:53
Significance Level
39:40
Decision Stage
40:05
Calculate Samples' Statistic and p-Value
44:10
Reject or Fail to Reject H0
55:54
Example 2: ANOVA with Independent Samples
58:21
Repeated Measures ANOVA

1h 15m 13s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
The Limitations of t-tests
0:36
Who Uploads more Pictures and Which Photo-Type is Most Frequently Used on Facebook?
0:37
ANOVA (F-test) to the Rescue!
5:49
Omnibus Hypothesis
5:50
Analyze Variance
7:27
Independent Samples vs. Repeated Measures
9:12
Same Start
9:13
Independent Samples ANOVA
10:43
Repeated Measures ANOVA
12:00
Independent Samples ANOVA
16:00
Same Start: All the Variance Around Grand Mean
16:01
Independent Samples
16:23
Repeated Measures ANOVA
18:18
Same Start: All the Variance Around Grand Mean
18:19
Repeated Measures
18:33
Repeated Measures F-statistic
21:22
The F Ratio (The Variance Ratio)
21:23
S²bet = SSbet / dfbet
23:07
What is This?
23:08
How Many Means?
23:39
So What is the dfbet?
23:54
So What is SSbet?
24:32
S² resid = SS resid / df resid
25:46
What is This?
25:47
So What is SS resid?
26:44
So What is the df resid?
27:36
SS subj and df subj
28:11
What is This?
28:12
How Many Subject Means?
29:43
So What is df subj?
30:01
So What is SS subj?
30:09
SS total and df total
31:42
What is This?
31:43
What is the Total Number of Data Points?
32:02
So What is df total?
32:34
So What is SS total?
32:47
Chart of Repeated Measures ANOVA
33:19
Chart of Repeated Measures ANOVA: F and Between-samples Variability
33:20
Chart of Repeated Measures ANOVA: Total Variability, Within-subject (case) Variability, Residual Variability
35:50
Example 1: Which is More Prevalent on Facebook: Tagged, Uploaded, Mobile, or Profile Photos?
40:25
Hypotheses
40:26
Significance Level
41:46
Decision Stage
42:09
Calculate Samples' Statistic and p-Value
46:18
Reject or Fail to Reject H0
57:55
Example 2: Repeated Measures ANOVA
58:57
Example 3: What's the Problem with a Bunch of Tiny t-tests?
1:13:59
Section 12: Chi-square Test
Chi-Square Goodness-of-Fit Test

58m 23s

Intro
0:00
Roadmap
0:05
Roadmap
0:06
Where Does the Chi-Square Test Belong?
0:50
Where Does the Chi-Square Test Belong?
0:51
A New Twist on HT: Goodness-of-Fit
7:23
HT in General
7:24
Goodness-of-Fit HT
8:26
Hypotheses about Proportions
12:17
Null Hypothesis
12:18
Alternative Hypothesis
13:23
Example
14:38
Chi-Square Statistic
17:52
Chi-Square Statistic
17:53
Chi-Square Distributions
24:31
Chi-Square Distributions
24:32
Conditions for Chi-Square
28:58
Condition 1
28:59
Condition 2
30:20
Condition 3
30:32
Condition 4
31:47
Example 1: Chi-Square Goodness-of-Fit Test
32:23
Example 2: Chi-Square Goodness-of-Fit Test
44:34
Example 3: Which of These Statements Describe Properties of the Chi-Square Goodness-of-Fit Test?
56:06
Chi-Square Test of Homogeneity

51m 36s

Intro
0:00
Roadmap
0:09
Roadmap
0:10
Goodness-of-Fit vs. Homogeneity
1:13
Goodness-of-Fit HT
1:14
Homogeneity
2:00
Analogy
2:38
Hypotheses About Proportions
5:00
Null Hypothesis
5:01
Alternative Hypothesis
6:11
Example
6:33
Chi-Square Statistic
10:12
Same as Goodness-of-Fit Test
10:13
Set Up Data
12:28
Setting Up Data Example
12:29
Expected Frequency
16:53
Expected Frequency
16:54
Chi-Square Distributions & df
19:26
Chi-Square Distributions & df
19:27
Conditions for Test of Homogeneity
20:54
Condition 1
20:55
Condition 2
21:39
Condition 3
22:05
Condition 4
22:23
Example 1: Chi-Square Test of Homogeneity
22:52
Example 2: Chi-Square Test of Homogeneity
32:10
Section 13: Overview of Statistics
Overview of Statistics

18m 11s

Intro
0:00
Roadmap
0:07
Roadmap
0:08
The Statistical Tests (HT) We've Covered
0:28
The Statistical Tests (HT) We've Covered
0:29
Organizing the Tests We've Covered…
1:08
One Sample: Continuous DV and Categorical DV
1:09
Two Samples: Continuous DV and Categorical DV
5:41
More Than Two Samples: Continuous DV and Categorical DV
8:21
The Following Data: OK Cupid
10:10
The Following Data: OK Cupid
10:11
Example 1: Weird-MySpace-Angle Profile Photo
10:38
Example 2: Geniuses
12:30
Example 3: Promiscuous iPhone Users
13:37
Example 4: Women, Aging, and Messaging
16:07


Correlation

Lecture Slides are screen-captured images of important points in the lecture. Students can download and print out these lecture slide images to do practice problems as well as take notes while watching the lecture.

  • Intro 0:00
  • Roadmap 0:05
    • Roadmap
  • Summarizing a Scatterplot Quantitatively 0:47
    • Shape
    • Trend
    • Strength: Correlation (r)
  • Correlation Coefficient ( r ) 2:30
    • Correlation Coefficient ( r )
  • Trees vs. Forest 11:59
    • Trees vs. Forest
  • Calculating r 15:07
    • Average Product of z-scores for x and y
  • Relationship between Correlation and Slope 21:10
    • Relationship between Correlation and Slope
  • Example 1: Find the Correlation between Grams of Fat and Cost 24:11
  • Example 2: Relationship between r and b1 30:24
  • Example 3: Find the Regression Line 33:35
  • Example 4: Find the Correlation Coefficient for this Set of Data 37:37

Transcription: Correlation

Hi, and welcome to www.educator.com.

Today we are going to talk about correlation.

First let us go back and briefly review summarizing scatter plots quantitatively, and everything else we have said about scatter plots.

Then we will talk about eyeballing the correlation coefficient, or what we call r, Pearson's r.

Actually, if you have a set of data that looks a particular way, often you can sort of ballpark where the correlation coefficient falls.

We are also going to talk about precisely calculating it.

Then we are going to go back and talk about the relationship between r and b1, the slope of our regression line.

First let us talk about summarizing a scatter plot quantitatively.

We did not deal with shape.

We just looked at it and said maybe that is pretty good.

We will talk about shape in the next couple of lessons, but for now we are going to leave shape alone in terms of quantitatively calculating it.

We did look at how to precisely calculate the trend, by finding the regression line, that middle line that summarizes the middle of all those points.

And that middle line really gives us the relationship between x and y, because it is the function that tells us: if we have x, we get y, and if we have y, we get x.

We can get the relationship between those two variables.

Finally, today we are going to talk about calculating strength: not just looking at it and saying it is pretty strong, but actually calculating the correlation coefficient, r.

That idea is simply: how packed around the regression line are our data points?

Are they tightly packed?

Is it a strong correlation, tightly packed around that regression line?

Or is it very loose?

Is it dispersed?

If it is not really sticking close to that line, then we have low strength, or low correlation.

For instance, here I am just eyeballing it, and there are a lot of data.

You might have no relationship between two variables, and in that case the spread looks something like this, where there is no real line in there.

It is just sort of this cloud of dots.

Remember, each of these points is a case.

Each of those cases has two variables.

One variable, x, is on the x-axis, and the other variable, which we will call y, is represented on the y-axis.

That point represents x here and y here.

In this case there is no relationship between x and y, even if you know what x is.

Let us say we know x is here.

Do you have any certainty as to where y might be?

There is some y down here and some y up here.

Even more so, what if x was here? Do we have any reason to say y is in a particular place?

No, not really.

Because of that, a line would not help us here.

A regression line does not actually summarize this very well, and that is because the correlation coefficient is very low.

There is very low strength.

There is very low adherence to the line.

Moving out a little bit further, you see that this one is starting to have more of an elongated shape.

This is still a fairly low correlation, but you can see that there is starting to be a linear relationship between x and y, namely as x goes up, y also goes up.

This is what we call a positive correlation.

There is a relationship between x and y that is linear and positive.

Notice that on the other side of this is the exact opposite: it is the same shape, but it has been flipped around, as if we put a mirror here and looked at the mirror reflection.

In this case it is the same cloud shape, but now as x goes down, y goes up.

Here we see the opposite relationship between x and y, and we call it a negative correlation.

Because of that, the signs act accordingly.

Here, for this trend that slopes negatively, the correlation is negative: -.4.

Here, for a trend that is positive (as x goes up y goes up, as x goes down y goes down), the correlation is a positive number.

Just by looking at the correlation coefficient, we can immediately know roughly what kind of relationship x and y have.

Notice that as we go out even further, not only do the numbers get bigger and bigger out from 0, but the numbers correspond to how line-like the data are.

How much do they correspond to a line?

It is not really about having more dots; it is about how well all those dots fit a line.

That is what we often call fitting the data to a regression line.

We want to see: is it a good fit? Is it a bad fit?

The correlation coefficient gives us the strength of that: is it a really strong fit, or is it very weak and loose?

The actual maximum for a correlation coefficient is 1, and the minimum is -1.

That is as far as it will go.

You cannot have a correlation of, like, 1.1.

We will talk a little bit about why.

Here what you see is that it might have the same number of points as all of these; it is just that there is very, very little variation from the line.

There is not a lot of variation out from the line, whereas at something like .8 you can see it is better than .4 but not quite as line-like as 1.0.

That is one way to very quickly eyeball the correlation coefficient.

You can just look at the data and see it is elongated a little bit.

Maybe it is .4, but if it looks like a tighter ellipse, maybe it is .8.

If it looks very close to a line, perhaps it is close to 1.
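
If you want to train this eyeballing skill yourself, you can simulate clouds with a chosen correlation and compare the computed r to the picture. This is a minimal sketch assuming NumPy is available; the mixing formula y = rho·x + sqrt(1 − rho²)·noise is a standard way to generate data whose population correlation with x is rho.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)

# Mixing trick: y = rho*x + sqrt(1 - rho^2)*noise has population
# correlation rho with x, so each cloud should "look like" its target.
for rho in (0.0, 0.4, 0.8, 1.0):
    noise = rng.normal(size=n)
    y = rho * x + np.sqrt(1 - rho ** 2) * noise
    r = np.corrcoef(x, y)[0, 1]
    print(f"target rho = {rho:3.1f}   sample r = {r:5.2f}")
```

Scatter-plotting each (x, y) cloud next to its sample r (for example with matplotlib) is a quick way to calibrate your eye for .4 versus .8 versus 1.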

I want you to notice something.

The correlation coefficient, other than caring about positive versus negative slope, does not otherwise care very much about slope.

For instance, look at all of these situations: these are all lines.

They are all perfectly line-like, maximally line-like.

Notice that these lines have different positive slopes, but they all have a correlation coefficient of 1.

It does not matter whether, as x changes, y changes very quickly, or as x changes by 1, y changes very slowly.

The slope does not matter, except for the positive or negative part.

The same thing goes for the negative slopes: even though the slopes are all different, they all have a correlation coefficient of -1.

There is an exception to this rule, and it is this line right here, the perfect horizontal line.

Let us think about what the equation for a regular horizontal line is.

For a horizontal line, it does not matter what x is: y is always the same.

Let us say y = 3; that would be a horizontal line, or y = -2, or y = .1.

Those are all examples of horizontal lines.

Let us think about the case of a horizontal line.

It gives perfect prediction, because no matter where x is, we know y.

You can tell me whatever x you want.

I can tell you the y exactly, because y is always the same; in this case, y is 3.

It is perfect prediction, and it is perfectly line-like, but the correlation coefficient is 0.

We will try to figure out why for the horizontal line as we work out the formula for the correlation coefficient, and hopefully that will become more clear.
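
We can already see the horizontal-line exception numerically. The numerator of r sums (x − x̄)(y − ȳ), and for a constant y every (y − ȳ) term is 0; the denominator contains s sub y, which is also 0, so r is really 0/0, undefined. A small check, assuming NumPy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.full_like(x, 3.0)  # horizontal line: y is always 3

# Numerator of r: every (y - ybar) term is exactly 0.
numerator = np.sum((x - x.mean()) * (y - y.mean()))

# Denominator contains s_y, which is also 0, so r is 0/0.
s_y = y.std(ddof=1)
print(numerator, s_y)
```

Calling np.corrcoef(x, y) here typically emits a divide-by-zero warning and returns nan for the off-diagonal entry, which matches the idea that perfect prediction with no variation in y is not a correlation of 1.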

There are many, many ways in which data can have seemingly no linear pattern, or a very weak linear pattern, because that is all the correlation coefficient tells us.

If you see that our data have 0 as their correlation coefficient, do we know that they look just like a cloud?

No; in fact they can look like any one of these crazy shapes down here.

All of these distributions, all of these scatter plots, have a very, very weak correlation, because remember, correlation just means: how line-like is it?

None of these is line-like.

Even though some of these are very, very regular shapes, the correlation coefficient cannot tell you that this has an interesting shape.

All it tells you is whether the data cohere to that regression line or not.

Although these are very interesting sets of data (for instance, here there are four rough clusters, which we can see and eyeball), the correlation coefficient would not tell us that.

Or in this case, this sort of data set: even here the correlation coefficient would not tell us that either.

For all of these data, the correlation coefficient is very close to 0.

I want you to see that there are many ways in which you can have a correlation coefficient of 1 or -1.

There are many ways in which you can have a correlation coefficient of 0.

Just because we get the correlation coefficient does not mean we can see the shape of the distribution.

That is why it is often useful to do a scatter plot anyway, even just for ourselves, so that we know what the numbers are probably describing.
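
A quick way to convince yourself that r only measures linear pattern is a perfectly regular but symmetric shape, such as a parabola. A sketch assuming NumPy:

```python
import numpy as np

# A very regular, clearly patterned shape: y is exactly x squared,
# symmetric about 0, so there is no *linear* trend at all.
x = np.linspace(-1, 1, 101)
y = x ** 2

r = np.corrcoef(x, y)[0, 1]
print(r)  # essentially 0, despite the obvious pattern
```

This is exactly the situation above: an interesting, completely deterministic shape whose correlation coefficient is still about 0, which is why the scatter plot matters.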

Let us say we have this graph, and it shows us this nice correlation.

It is probably pretty high, like r = .8.

It is closer to 1 than 0, but not quite 1.

This is a pretty good correlation, and you might have two variables here.

For instance, perhaps this gives us the z-scores for some variable: say we are looking at twins, and we want to know, does the IQ of one twin help us predict the IQ of the other twin?

Maybe it is true.

Maybe that does seem to be the case.

Here we might put something like the intelligence of twin 1, as the z-score of their IQ score, on this axis, and the intelligence of twin 2, again as the z-score of their IQ score, on the y-axis.

When we have the scatter plot, it is very important that we can toggle between the trees, the individual little dots, and the forest, the big overall pattern.

When we look at correlation coefficients, we are standing back.

We are sort of getting a bird's eye view, looking from very far away and trying to see the overall pattern, and that is the forest.

It is really important to remember: what are my trees?

What are my cases?

It is important to remember what this dot means.

That is what I mean by the trees: you want to remember, what are your cases?

What are your variables?

That is always step one of looking at a scatter plot.

In this case, it is not that each of these dots represents just one twin; each dot represents a set of twins, a pair of twins.

This one represents both twin 1, who is a little bit below average, and twin 2, who is actually a little bit above average.

Let us pick out another one.

Let us say this one.

This twin is a little bit above average and, guess what, so is their twin.

Their twin is also a little bit above average.

Each of these dots actually represents two people in this case, a set of twins.

You want to be able to switch your perspective: zoom in and see the trees, but also zoom out and see the forest, and try to estimate things like the correlation coefficient, or even eyeball where the regression line might be.

Okay, now let us get down to the business of calculating r.

You can think of the correlation coefficient as roughly the average product of the z-scores for x and y.

Let us recap a little bit what z-scores are.

z-scores just give you how many standard deviations away you are; we do not want to know it in terms of the raw numbers.

We want to know it in terms of standard deviations.

We do not want to know, say, how many feet away; we want to know how many standard deviations away.

We can think of the standard deviation as jumps away from the mean.

How many of those jumps away are you?

That is the z-score.

Here is how we calculate r.

It is the average product of the z-scores for x and y.

Let us take the z-scores for x and y and multiply them together, because we are getting the product.

The product is z(x) × z(y); I am going to sum those products together and then divide by n - 1.

Later on we will talk more about why n - 1 appears so frequently.

You can roughly see it is about the average; because we are jumping from samples to populations, we need to make a little bit of a correction.

This formula of adding something up and dividing by n is an average, and the things we are averaging are the products of the two z-scores.

Now, for all of these formulas, you can think of the little z-scores as things you can double-click on to see what is inside.

Each z-score (let me write this in blue) is the distance away from the mean, but not the raw distance: I want it in terms of standard-deviation jumps away from the mean.

That would just be y minus y-bar (the mean), and that distance divided by the standard deviation.

Here I will just use little s; and for z(x), that is just x minus x-bar.

That is the raw distance away from the mean, divided by the standard deviation of x.

I will put a little x there to indicate the standard deviation of the x's, and a little y to indicate the standard deviation of the y's.

I am going to multiply those together and add them up for every single data point that I have.

If that is my twin data, that means for every single set of twins that I have.

Then we divide all of that by n - 1, where n is my number of cases.

How many twins have you got?

How many sets of twins have you got?

Although it often goes without saying, this sum implicitly has an i that goes from 1 all the way to n, because I need to do this for every single one of my data points.

Furthermore, we can double-click on each of these little standard deviations.

Now, how do we find a standard deviation?

The standard deviation is the square root of the average squared distance away from the mean.

For s sub y, think about the distance; we already know how to do distance because we have done it before.

It is the average squared distance, because remember, it is the sum of squares over n - 1.

It is the sum of squared distances, because if we just took the sum of the differences, we would get something very close to 0.

We divide by n - 1 because the sample sum of squares runs a little small, and we need to correct for that in going from samples to populations.

That is what dividing by n - 1 does.

Because we want the standard deviation and not the variance, we are going to square-root this whole thing.

Same thing for s(x); it is the same thing except we substitute an x instead of the y.

I forgot to put my little sigma notation, because I want to do this for every single y.

It looks sort of complicated if we write the whole thing out; if we wrote the fully double-clicked version of s sub y in there, it might look very crazy.

What you ultimately have to remember is much less: the main idea you want to get out of today, and then take a moment to think, what is a z-score?

Once you unpack the z-score, take a moment to think, what is a standard deviation, and hopefully you will be able to unlock those things as you go.

Then you do not have to remember all of that stuff at once; you can just remember them one at a time.
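
The whole unpacked formula, the z-scores and the n − 1 standard deviations inside them, fits in a few lines of code. A minimal sketch assuming NumPy; `pearson_r` is just an illustrative name, not a library function:

```python
import numpy as np

def pearson_r(x, y):
    """r = sum(z_x * z_y) / (n - 1), using the n - 1 (sample) standard deviation."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    zx = (x - x.mean()) / x.std(ddof=1)  # double-clicked z-score for x
    zy = (y - y.mean()) / y.std(ddof=1)  # double-clicked z-score for y
    return np.sum(zx * zy) / (len(x) - 1)

# Sanity check against NumPy's built-in correlation:
x = [1.0, 2.0, 4.0, 7.0]
y = [2.0, 3.0, 5.0, 11.0]
print(pearson_r(x, y), np.corrcoef(x, y)[0, 1])
```

The `ddof=1` argument is what gives the n − 1 version of the standard deviation, the same correction discussed above for going from samples to populations.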

Now that you know the formula for the correlation coefficient, let us talk about the relationship between correlation and slope.

We already know that b1 and r have the same sign.

If b1 is negative, r will be negative.

If b1 is positive, r will be positive, and vice versa.

We already know that they have the same sign, and because of that they already slant the correct way.

Remember, r does not have anything about rise and run in it.

All it cares about is how much like a line it is.

But b1 and r have a very strict relationship: when you multiply r by the ratio of the standard deviation of y over the standard deviation of x (you can almost see rise over run in it), you get the slope, b1.

Let us just think about this in our heads: if r is 1, then whatever this ratio is, that exactly gives us b1.

Also, if r is 1 and these two have very similar standard deviations (the spread of y is very similar to the spread of x), then we should get a slope of about 1.

In that case you can sort of say that makes sense: if y is varying in a similar way to x, then the slope should be about 1.

If y is changing more slowly than x, then for every bit of x you only go up a tiny bit of y.

In that case the top number would be smaller than the bottom one, and that would give us less rise, more run.

Something that looks less slanted.

Something like this, versus a slope of 1.

Something a little more shallow, and that makes sense: less rise, more run.

On the other side, if for every little bit of x you go a long way up in y, then that would look something like this: more rise, less run.

This gives us this perfect relationship between r and b1.
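
This identity, b1 = r × (s sub y / s sub x), is easy to verify numerically against an ordinary least-squares slope. A sketch assuming NumPy; the data here are made up purely for the check:

```python
import numpy as np

# Hypothetical data, just to check the identity b1 = r * (s_y / s_x).
x = np.array([1.0, 2.0, 4.0, 7.0, 9.0])
y = np.array([2.0, 3.0, 5.0, 11.0, 12.0])

r = np.corrcoef(x, y)[0, 1]
b1_from_r = r * y.std(ddof=1) / x.std(ddof=1)

# The least-squares slope, computed directly:
b1_lstsq = np.polyfit(x, y, 1)[0]
print(b1_from_r, b1_lstsq)  # the two slopes agree
```

This works because the least-squares slope is cov(x, y) / var(x), which algebraically equals r times s sub y over s sub x.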

Using that information, let us try to solve this problem.

Example 1: here are the three pizza companies we have looked at before: Papa John's, Domino's, and Pizza Hut.

It says find the correlation between grams of fat and cost.

I think these are for a whole pizza, so let us make this 17.50.

Let us make this $18 and $20, because it would be really cheap to have a $1.75 pizza.

It would be ridiculous to have 100 g of fat in one slice of pizza.

If you look at the examples provided in the download below we can use the data in order to find correlation coefficient.1498

In order to find correlation coefficient I will break it down into the component pieces and the big component pieces1513

and the big component pieces I’m going to need are the z scores for x and the z scores for y.1519

I will say that the score for fat and that the z score for cost.1525

Z score for fat and z score for cost.1529

In order to find the z score I would need to put in the difference between this and the average.1541

One thing that might be easier is if we actually just create a column for averages because we are probably going to need this again and again.1549

Let me go ahead and get those averages.1564

I’m just getting the average cost, as well as average grams of fat.1571

I’m going to color it in a different color so that we know that this is the entirely different thing here.1578

We have that it would be easier for us to find the score for fat.1586

Here we want to get x of fat - the average and probably want to lock that in place and we want to divide that by standard deviation1591

and the nice thing about Excel is that it already has the function for standard deviation.1620

This one will give us the n -1 version so I can just lock that data down in order to move.1624

I probably want to copy it over to E later so I’m just going to unlock the B part.1645

As long as they in the same column, as long as I stay in column D it will use column B.1652

If I move over to Column E it should use column C.1657

Let us try that.1663

Here we see that the z score is -1, that is 0 and 1 and that makes sense.1664

Your z score is totaled together shared roughly equal 0 because you are getting how many distance away on the positive side.1669

How many distance away on the negative side, and they should balance out if you really have the mean.1677

Let us check this formula yet it is using B3 that it has average that is getting that Standard deviation perfect.1683

Once I have that I can actually just copy and paste this over here.1692

Here we see now it is using C and this average and getting the standard deviation of this data.1700

We see roughly the negative side as to the positive side.1711

We have these individual z scores, now we need to get the z scores for fat multiplied by the z score for costs.1719

That is real easy, this times this for every single data point or case that we have and we have 3 cases here.1729

The 3 different brands of pizza.1736

Once we have that instead of the average actually we could just get the average all at once because we could put it in-one formula.1739

We could just sum these together, sum those together, and we want to divide by n -1.1751

In this case, it is 2.1762

If you wanted to put in a formula you could put in counts -1, but I'm just going to put for our purposes 2 here.1765

We get a very high correlation, very close to 1: as cost goes up, fat goes up.

As cost goes down, fat goes down.

They have a strongly positive correlation, and the data is very linear.

The points fall very close to the line.

Here we can see that this data is very highly correlated.

It has a strong correlation.

We do not have a lot of points, but they fall very, very close to the line.
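To make the recipe concrete, here is a sketch of the correlation coefficient as the sum of z-score products divided by n − 1. The costs are the ones mentioned in this lesson; the fat values are hypothetical, since the actual data lives in the downloadable example.

```python
# r = sum of z-score products / (n - 1); the fat values are hypothetical.
from statistics import mean, stdev  # stdev is the n - 1 version, like Excel

def corr(xs, ys):
    mx, sx = mean(xs), stdev(xs)
    my, sy = mean(ys), stdev(ys)
    products = [((x - mx) / sx) * ((y - my) / sy) for x, y in zip(xs, ys)]
    return sum(products) / (len(products) - 1)

cost = [17.50, 18.00, 20.00]  # dollars, from the lesson
fat  = [10.0, 11.0, 14.0]     # hypothetical grams of fat
print(round(corr(cost, fat), 3))  # close to 1: a strong positive correlation
```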

Previously, we found the regression line for this data.

I believe in that case the costs were $17.50, $18.00, and $20.00.

In the regression example we already found b1, so let us check the relationship between the r from this example and that b1.

The question is whether it is really true that, in this example, b1 equals r times s sub y over s sub x; we are not going to do the formula proof, just see it for ourselves.

Is b1 really equal to r times the variation in y over the variation in x?

Is this relationship really true?

Well, we already know b1 = .125, and we already know r.

We know r is .94, so the question is: does .94 multiplied by s sub y over s sub x equal .125?

Let us see.

That is not too hard; let me move that up here.

We have r over here; I am just going to find s sub y and s sub x and multiply.

I will just create another column for standard deviations, and get the standard deviation for x and the standard deviation for y.

Now compute r × the standard deviation of y over the standard deviation of x, and that does indeed equal .125.

The relationship holds; we have b1 and r over on this side, so we know what these values are.

There you have it.

We see that the relationship between r and b1 holds.

There is only a little bit of rise for a lot of run, so we know this line is pretty shallow, and that makes sense.

This is a pretty shallow slope.

There is little rise over run, and because of that, the slope is a fraction less than 1.
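The identity we just checked, b1 = r × (s sub y / s sub x), can also be verified numerically. Here is a small sketch with hypothetical data, computing the least-squares slope directly and comparing it to r × s_y / s_x:

```python
# Verify b1 = r * (s_y / s_x) on hypothetical (x, y) pairs.
from statistics import mean, stdev

x = [17.50, 18.00, 20.00]  # hypothetical costs
y = [10.0, 11.0, 14.0]     # hypothetical fat values
mx, my = mean(x), mean(y)

# least-squares slope: sum of deviation products over sum of squared x-deviations
b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

# r as the sum of z-score products divided by n - 1
sx, sy = stdev(x), stdev(y)
r = sum(((a - mx) / sx) * ((b - my) / sy) for a, b in zip(x, y)) / (len(x) - 1)

print(abs(b1 - r * sy / sx) < 1e-9)  # True -- the identity holds
```

Algebraically the two expressions are the same quantity, so they agree for any data set, not just this one.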

Example 3: the mean score on a math achievement test for a community college was 504, with a standard deviation of 112.

For the corresponding reading achievement test, the mean was 515 and the standard deviation was 116.

The correlation coefficient between them is .70.

Use this information to find the regression line.

Here we have the correlation coefficient, but they do not give us the data.

Can we still do this?

Yes, we can, because there is a relationship between the correlation coefficient and the slope, and all we need to know are the standard deviations in order to find it.

b1 = r × s sub y / s sub x.

We know s sub y, s sub x, and r, so we can find b1.

Once we know b1, we also have the point of averages.

The point of averages is (x-bar, y-bar).

In fact, let us say math is x and reading is y, so here we have (504, 515).

We can get the slope, we have the one point of averages, and from those we can find the intercept.

Let us go ahead: r is .7, and s sub y, the reading one, is 116.

s sub x is 112.

We can find b1; I am just going to use a little bit of space down here to do the calculations.

Feel free to do this on your calculator.

.7 × 116 / 112, and that is .725.

So I have .725 as my slope.

Once I have my slope, I can put it into slope-intercept form.

My y-bar is 515, and I am looking for the intercept.

So 515 equals the intercept plus .725 × x-bar, which is 504.

When I go ahead and solve that, the intercept is 515 − .725 × 504.

I get 149.6.

My b sub 0 = 149.6.

Now that we have these two pieces, we can write the regression line.

The regression line for predicting y is the intercept, 149.6, plus .725 × x, and that is our regression line.

Here we see that the slope is less than 1: less rise, more run.

It is a shallower slope, and notice that you did not need all the data points in order to find the regression line.
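Example 3 can be reproduced in a few lines, since only the summary statistics are needed. This sketch uses the values from the lesson: r = .7, s_x = 112, s_y = 116, and the point of averages (504, 515).

```python
# Regression line from summary statistics only (values from the lesson).
r, s_x, s_y = 0.7, 112, 116
x_bar, y_bar = 504, 515      # the point of averages

b1 = r * s_y / s_x           # slope: b1 = r * (s_y / s_x)
b0 = y_bar - b1 * x_bar      # intercept: the line passes through (x_bar, y_bar)

print(round(b1, 3))  # 0.725
print(round(b0, 1))  # 149.6 -> predicted y = 149.6 + 0.725 * x
```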

Example 4: find the correlation coefficient for this set of data; the data is provided for you in the download below.

If you go ahead and click on Example 4, the data is all there.

Previously we looked at this data and thought it showed a pretty strong linear correlation.

Let us see if our eyeballing was actually right.

I am just going to move this over a little bit, because we will not need it as much.

Let me shrink this down a little bit.

It always helps me to think about what I am trying to find: the correlation coefficient, which I know is the average product of the z-scores.

So I need to find the z-scores.

I need to find the z-scores for the student-faculty ratio, the SFR.

And I want to find the z-scores for the cost per unit, the CPU.

Let us go ahead and do that.

In order to do that, it is often helpful to have the mean and standard deviation.

Let us find the mean and standard deviation somewhere.

Let me get the means here, moving this over by one column so that I can write "mean" and "standard deviation".

Sometimes we get confused about what we are doing, and it is often helpful to write these things down.

I like to put it in a different color, because that helps me know it is not part of my data.

Let us get the mean of all of our data here.

The data for the student-faculty ratio, as well as the cost per unit.

Select that same data and find the standard deviation, because we are going to need it for the z-scores.

It is just useful to have it in advance.

Now we have the mean and the standard deviation.

Here I am just going to put a little divider for now so that I can move this down.

Notice that the data goes from row 7 to row 34.

Let us find the z-scores.

Now that we have the mean and standard deviation, it should be really easy.

It is just the difference between my point and my average, all divided by the standard deviation.

I want to lock down that mean and standard deviation, but only the row: as long as I stay in the same column, the formula will use the same data column, but when I move over I want it to shift to the next column.

So I am not going to lock down the column part; I am just locking down the row.

I get a z-score of -1.556.

If my z-score calculations, my mean, and all of that are correct, I should have z-scores that are both positive and negative, and they should roughly balance out.

Let us take a look at our data: it seems like roughly half of them are negative and half are positive.

They should balance out.

Once I have that, I can take all of these and drag them over.

Let us check one of these formulas.

This one gives me the deviation, the difference between this point and the mean, divided by the standard deviation.

Perfect.

Once we have that, I know I need to multiply and get the product of the z-scores.

z(SFR) × z(CPU).

Let us see what we can do here.

I am just going to multiply this times this for every single one of my points.

Once I get down here, I know I need to find the mean of these products.

But I do not want to use the built-in formula for the mean, because that divides by n.

We want to divide by n − 1.

So I will split it up: I am going to sum them all up and divide by the count minus 1.

Instead of counting them by hand, I am going to use the same points and say COUNT of all of this, subtract one, and put it all in parentheses.

We get a negative correlation that is pretty strong.

It is above .8 in magnitude, so let us take a look at our data to see if that makes sense to us.

We certainly understand why it is negative.

It makes sense that r is negative, and we did think the correlation looked pretty strong.

And it does end up being pretty strong, well beyond .6 or .7.
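Example 4's whole recipe fits in a short script. The real SFR and CPU data is in the lesson download; the numbers below are hypothetical stand-ins that are negatively related, like a student-faculty ratio versus cost per unit:

```python
# Correlation via z-score products; the SFR/CPU values here are hypothetical.
from statistics import mean, stdev

sfr = [12, 15, 18, 22, 25, 30]  # hypothetical student-faculty ratios
cpu = [95, 80, 70, 55, 50, 35]  # hypothetical costs per unit

ms, ss = mean(sfr), stdev(sfr)
mc, sc = mean(cpu), stdev(cpu)

# sum of the z-score products, divided by n - 1
r = sum(((s - ms) / ss) * ((c - mc) / sc)
        for s, c in zip(sfr, cpu)) / (len(sfr) - 1)

print(round(r, 2))  # strongly negative: higher ratios go with lower costs
```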

That is the correlation coefficient. See you next time on www.educator.com.
