In most of our research, we are interested in whether one variable of interest (e.g., mentoring relationship quality) predicts another (e.g., mentoring satisfaction). Although a correlation analysis can tell us whether the two variables are related, it can't tell us how well relationship quality predicts satisfaction.
To examine predictive relations, we use regression. While a correlation quantifies the extent to which one variable tends to change when another changes, a regression fits a line to the data, as in the example below.
The regression analysis provides a beta coefficient (b), which captures the strength of the relation between the predictor and the outcome. When there is more than one predictor (or independent variable), a beta coefficient is provided for each predictor. In addition to the betas, the regression provides an overall estimate of the strength of the association: how much of the change (or variance) in the outcome (mentoring satisfaction) is explained by the predictor (or set of predictors). This overall regression effect is R-squared (R²).
- Regressions are similar to correlations in that they examine the relations between variables.
- In addition to indicating whether two variables are associated, regressions allow researchers to quantify the strength of that association.
- Regressions can incorporate more than one predictor (e.g., relationship quality, youth age) of the outcome (e.g., satisfaction).