Simple Linear Regression Statistical Analysis Tool · Ordinary Least Squares · t & F Distributions · APA 7th
[Interactive tool panels: Configure data · Model Coefficients · Analysis of Variance (ANOVA) · Regression Line & Scatterplot · APA 7th edition reporting statement · Calculation Details]
Interpretation Report
Step-by-step analysis & plain-language conclusions
1. Hypotheses
2. Assumptions check
3. Model fit & equation
4. Statistical significance
5. Conclusion & reporting
What is Simple Linear Regression?
Simple linear regression models the relationship between a single independent variable (predictor, $X$) and a continuous dependent variable (criterion, $Y$). It uses the Ordinary Least Squares (OLS) method to find the "line of best fit" that minimizes the sum of squared differences (residuals) between the observed $Y$ values and the values predicted by the model ($\hat{Y}$).
Regression Equation: $\hat{Y} = bX + a$, where:
  • $b = SP_{xy} / SS_x$ (slope)
  • $a = \bar{y} - b\bar{x}$ (Y-intercept)
  • $SP_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y})$
  • $SS_x = \sum (x_i - \bar{x})^2$
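The formulas above translate directly into code. A minimal Python sketch (not the tool's implementation) that computes the slope and intercept from raw data, using illustrative sample values:

```python
def ols_fit(x, y):
    """Fit Y-hat = b*X + a by ordinary least squares."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Sum of cross-products and sum of squares of X (deviation form)
    sp_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    ss_x = sum((xi - x_bar) ** 2 for xi in x)
    b = sp_xy / ss_x        # slope
    a = y_bar - b * x_bar   # Y-intercept
    return b, a

# Illustrative data (not from the tool)
b, a = ols_fit([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
# b = 0.6, a = 2.2, so Y-hat = 0.6*X + 2.2
```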
Interpreting the Coefficients
  • Slope (b) — Represents the estimated change in the dependent variable ($Y$) for every one-unit increase in the independent variable ($X$). If the slope is positive, $Y$ increases as $X$ increases. If negative, $Y$ decreases as $X$ increases.
  • Intercept (a) — Represents the estimated value of $Y$ when $X$ equals zero. It anchors the line on the Y-axis. Note: If $X=0$ is theoretically impossible or outside the range of your data, the intercept may simply act as an adjustment constant rather than a meaningful prediction.
  • R-squared ($R^2$) — Also known as the Coefficient of Determination. It represents the proportion of variance in the dependent variable that can be explained by the predictor. E.g., $R^2 = 0.45$ means 45% of the variation in $Y$ is explained by $X$.
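As a check on the definition of $R^2$, it can be computed directly from the sums of squares and cross-products. A minimal Python sketch (not part of the tool), again with illustrative data:

```python
def r_squared(x, y):
    """Coefficient of determination for simple linear regression:
    R^2 = SP_xy^2 / (SS_x * SS_y), equivalent to 1 - SS_res / SS_total."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    sp_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    ss_x = sum((xi - x_bar) ** 2 for xi in x)
    ss_y = sum((yi - y_bar) ** 2 for yi in y)
    return sp_xy ** 2 / (ss_x * ss_y)

r2 = r_squared([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
# r2 = 0.6 -> "60% of the variation in Y is explained by X"
```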
Assumptions (L.I.N.E.)
  • Linearity — The relationship between $X$ and the mean of $Y$ is linear. Check via scatterplot.
  • Independence — Observations are independent of each other (no autocorrelation).
  • Normality of Residuals — The errors (residuals) are normally distributed at any value of $X$.
  • Equal Variance (Homoscedasticity) — The variance of the residuals is constant across all levels of $X$.
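The last two assumptions are checked on the residuals themselves. A minimal sketch of computing residuals for such diagnostic plots, using an illustrative fit ($b = 0.6$, $a = 2.2$; not tool output):

```python
def residuals(x, y, b, a):
    """Observed minus predicted: e_i = y_i - (b*x_i + a)."""
    return [yi - (b * xi + a) for xi, yi in zip(x, y)]

res = residuals([1, 2, 3, 4, 5], [2, 4, 5, 4, 5], b=0.6, a=2.2)
# Under OLS the residuals sum to (numerically) zero; normality and
# constant variance are judged from their distribution across X,
# e.g. via a residuals-vs-fitted plot or a Q-Q plot.
```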
APA 7th edition reporting format
Template: "A simple linear regression was calculated to predict [Y] based on [X]. A significant regression equation was found (F(1, [df]) = [F-val], p = [p-val]), with an $R^2$ of .[R2]. Participants' predicted [Y] is equal to [intercept] + [slope] (X). [Y] increased [slope] units for each [unit] of [X]."
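Filling that template by hand invites formatting slips (APA drops the leading zero for statistics bounded by 1, such as $p$ and $R^2$). A hypothetical helper, not part of the tool, that assembles the statement from computed values (assumes $p \ge .001$; APA reports smaller values as "p < .001"):

```python
def apa_report(y_name, x_name, df_res, f_val, p_val, r2, intercept, slope):
    """Fill the APA 7th template with computed regression results."""
    r2_str = f"{r2:.2f}".lstrip("0")    # APA: no leading zero (max value is 1)
    p_str = f"{p_val:.3f}".lstrip("0")
    return (
        f"A simple linear regression was calculated to predict {y_name} "
        f"based on {x_name}. A significant regression equation was found "
        f"(F(1, {df_res}) = {f_val:.2f}, p = {p_str}), with an R\u00b2 of {r2_str}. "
        f"Predicted {y_name} = {intercept:.2f} + {slope:.2f}({x_name})."
    )

# Illustrative values only (names and numbers are assumptions)
report = apa_report("exam score", "study hours", 18, 12.34, 0.002, 0.407, 42.10, 3.75)
```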