Multivariable techniques produce two major kinds of information: information about how well the model (all the independent variables together) fits the data, and information about the relationship of each independent variable to the outcome (with adjustment for all other independent variables in the analysis). Common measures of the strength of the relationship between an independent variable and the outcome are the odds ratio, relative hazard, and relative risk. Adjusting for multiple comparisons is challenging; most important is to decide ahead of time whether adjustments for multiple comparisons will be made. A common convention is not to adjust the primary outcome, but to adjust secondary outcomes for multiple comparisons.
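As a rough illustration of these ideas (not taken from the chapter), the R sketch below fits a multivariable logistic regression to simulated data and reports adjusted odds ratios; the variable names (age, trt, outcome) and the Holm adjustment are hypothetical choices.

```r
## Minimal sketch, assuming hypothetical simulated data: adjusted odds ratios
## from a multivariable logistic regression.
set.seed(42)
n       <- 200
age     <- rnorm(n, mean = 50, sd = 10)
trt     <- rbinom(n, 1, 0.5)
outcome <- rbinom(n, 1, plogis(-2 + 0.04 * age + 0.8 * trt))

fit <- glm(outcome ~ age + trt, family = binomial)

summary(fit)                # overall model fit and per-variable tests
exp(coef(fit))              # adjusted odds ratio for each independent variable
exp(confint.default(fit))   # Wald-type 95% CIs on the odds-ratio scale

## One possible adjustment of secondary-outcome p-values for multiple
## comparisons (Holm method); the p-values here are made up.
p.adjust(c(0.01, 0.04, 0.20), method = "holm")
```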
This chapter is devoted to extensive instruction regarding bivariate regression, also known as ordinary least squares (OLS) regression. Students are presented with a scatterplot of data with a best-fitting line drawn through it. They are instructed on how to calculate the equation of this line (the least squares line) by hand and with the R Commander. Interpretation of the statistical output of the y-intercept, beta coefficient, and R-squared value is discussed. Statistical significance of the beta coefficient and its implications for the relationship between an independent and dependent variable are described. Finally, the use of the regression equation for prediction is illustrated.
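A minimal R sketch of this workflow, using simulated data rather than the chapter's own examples (x and y are hypothetical), might look like the following.

```r
## Minimal sketch, assuming hypothetical simulated data: fitting and
## interpreting a bivariate OLS regression in base R.
set.seed(1)
x <- rnorm(100, mean = 20, sd = 5)
y <- 3 + 0.5 * x + rnorm(100, sd = 2)

fit <- lm(y ~ x)

summary(fit)      # y-intercept, beta coefficient, its p-value, and R-squared
coef(fit)         # the least squares line: intercept and slope
plot(x, y)        # scatterplot of the data
abline(fit)       # best-fitting (least squares) line drawn through it

## Using the regression equation for prediction at new x values
predict(fit, newdata = data.frame(x = c(15, 25)))
```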
Pearson’s correlation describes the relationship between two interval- or ratio-level variables. Positive correlation values indicate that individuals who have high X scores tend to have high Y scores (and that individuals with low X scores tend to have low Y scores). A negative correlation indicates that individuals with high X scores tend to have low Y scores (and that individuals with low X scores tend to have high Y scores). Correlation values closer to +1 or –1 indicate stronger relationships between the variables; values close to zero indicate weaker relationships. A correlation between two variables does not imply a causal relationship between them.
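For illustration only, the following R sketch computes Pearson's r for two simulated pairs of variables, one positively and one negatively related; the data are hypothetical.

```r
## Minimal sketch, assuming hypothetical simulated data: Pearson's r for
## positively and negatively related interval-level variables.
set.seed(2)
x     <- rnorm(50)
y_pos <- x + rnorm(50, sd = 0.5)    # high X tends to go with high Y
y_neg <- -x + rnorm(50, sd = 0.5)   # high X tends to go with low Y

cor(x, y_pos)   # positive value, closer to +1 for a stronger relationship
cor(x, y_neg)   # negative value, closer to -1 for a stronger relationship
```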
It is also possible to test a correlation coefficient for statistical significance, where the null hypothesis is r = 0. This follows the same steps as any null hypothesis significance test (NHST). The effect size for Pearson’s r is calculated by squaring the r value (r²).
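A brief R sketch of such a test on simulated, hypothetical data could be:

```r
## Minimal sketch, assuming hypothetical simulated data: NHST of Pearson's r
## and the r-squared effect size.
set.seed(3)
x <- rnorm(40)
y <- 0.6 * x + rnorm(40)

test <- cor.test(x, y)   # significance test of Pearson's r against r = 0
test$p.value             # compare to the chosen alpha level
test$estimate^2          # effect size: r squared
```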
A correlation is visualized with a scatterplot. Scatterplots for strong correlations have dots that are closely grouped together; scatterplots showing weak correlations have widely spaced dots. Positive correlations have dots that cluster in the lower-left and upper-right quadrants of a scatterplot. Negative correlations have the reverse pattern.
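As an illustrative sketch (not taken from the chapter), the R code below draws scatterplots for a strong and a weak positive correlation using simulated data.

```r
## Minimal sketch, assuming hypothetical simulated data: scatterplots of a
## strong and a weak positive correlation, side by side.
set.seed(4)
x        <- rnorm(100)
y_strong <- x + rnorm(100, sd = 0.2)   # dots closely grouped around the line
y_weak   <- x + rnorm(100, sd = 3)     # dots widely spaced

op <- par(mfrow = c(1, 2))
plot(x, y_strong, main = "Strong positive correlation")
plot(x, y_weak,   main = "Weak positive correlation")
par(op)
```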