Multiple Predictor Variables
Multiple Linear Regression Calculator
Evaluate how two continuous independent variables jointly and uniquely predict a single dependent variable using Ordinary Least Squares.
Configure data
Model Coefficients
Analysis of Variance (ANOVA)
Model Fit: Actual vs. Predicted
APA 7th edition reporting statement
Copy-paste template
Calculation Details
Interpretation Report
Step-by-step analysis & plain-language conclusions
1. Hypotheses
2. Assumptions & Multicollinearity
3. Overall Model Fit (R²)
4. ANOVA (Overall Significance)
5. Individual Predictors
6. Conclusion & Reporting
All interpretations are generated from your data. Review before use in academic submissions.
Reference · Statistical Theory
How Multiple Linear Regression Works
Theoretical foundation, assumptions, partial coefficients, and primary references for Ordinary Least Squares (OLS) Multiple Regression.
What is Multiple Linear Regression?
Multiple Linear Regression (MLR) extends simple linear regression to include two or more independent variables (predictors). It models the linear relationship between the predictors and a single continuous dependent variable, aiming to find the plane (or hyperplane) of best fit.
Multiple Regression Equation:
Ŷ = b₀ + b₁X₁ + b₂X₂ + ... + bₖXₖ
where:
b₀ = Y-intercept (value of Y when all Xs are 0)
bᵢ = Partial slope for predictor i
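The equation above can be fit with ordinary least squares. A minimal sketch using NumPy's least-squares solver, on hypothetical data constructed so the true coefficients (b₀ = 2, b₁ = 3, b₂ = 0.5) are recoverable:

```python
import numpy as np

# Hypothetical predictors (not from the calculator's data)
x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y = 2 + 3 * x1 + 0.5 * x2          # exact linear relationship, no noise

# Design matrix: a column of ones for the intercept, then the predictors
X = np.column_stack([np.ones_like(x1), x1, x2])

# OLS solution minimizing ||y - Xb||^2
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)   # ≈ [2.0, 3.0, 0.5] — intercept, b1, b2
```

Because the data here contain no noise, the solver recovers the generating coefficients exactly; with real data the estimates would carry sampling error.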
Interpreting the Coefficients
- Partial Slope (bᵢ) — Represents the estimated change in the dependent variable ($Y$) for a one-unit increase in $X_i$, holding all other predictors constant. This "controlling for" aspect is what makes multiple regression so powerful.
- R-squared ($R^2$) — The proportion of variance in the dependent variable explained by the entire set of predictors.
- Adjusted $R^2$ — $R^2$ never decreases when adding new predictors, even if they are useless. Adjusted $R^2$ penalizes the addition of unnecessary variables, providing a more accurate estimate of the population $R^2$.
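The $R^2$ and adjusted $R^2$ definitions above can be computed directly from the residuals. A sketch on hypothetical noisy data (the predictors and noise level are illustrative, not from the calculator), using the formula Adjusted $R^2 = 1 - (1 - R^2)\frac{n-1}{n-k-1}$ with $k$ predictors:

```python
import numpy as np

rng = np.random.default_rng(42)
x1 = np.arange(1., 11.)                                    # hypothetical predictor 1
x2 = np.array([3., 1., 4., 1., 5., 9., 2., 6., 5., 3.])    # hypothetical predictor 2
y = 2 + 3 * x1 + 0.5 * x2 + rng.normal(0, 1.5, size=10)    # true model plus noise

X = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ b
ss_res = resid @ resid                       # residual sum of squares
ss_tot = ((y - y.mean()) ** 2).sum()         # total sum of squares
n, k = len(y), 2

r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 3), round(adj_r2, 3))
```

Note that adjusted $R^2$ is always below $R^2$ whenever the fit is imperfect and at least one predictor is in the model.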
Assumptions (L.I.N.E. + M)
- Linearity — The relationship between the predictors and the criterion is linear.
- Independence — Observations are independent of each other.
- Normality of Residuals — The errors (residuals) are normally distributed.
- Equal Variance (Homoscedasticity) — The variance of the residuals is constant across all predicted values.
- No Perfect Multicollinearity — Predictors should not be too highly correlated with each other. This is checked using the Variance Inflation Factor (VIF). A VIF > 10 is a common rule of thumb for severe multicollinearity, which inflates the standard errors of the coefficients.
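The VIF check can be computed by regressing each predictor on the others: VIF_j = 1 / (1 − R²_j), where R²_j comes from that auxiliary regression. A minimal sketch (the `vif` helper and the sample predictors are illustrative, not part of the calculator):

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of predictor matrix X
    (columns are predictors; no intercept column). VIF_j = 1 / (1 - R²_j),
    where R²_j is from regressing column j on all remaining columns."""
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])   # intercept + remaining predictors
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ coef
        ss_res = resid @ resid
        ss_tot = ((X[:, j] - X[:, j].mean()) ** 2).sum()
        r2_j = 1 - ss_res / ss_tot
        out.append(1.0 / (1.0 - r2_j))
    return out

# Hypothetical predictors: moderately (not perfectly) correlated
x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
vifs = vif(np.column_stack([x1, x2]))
print(vifs)   # both values well below the VIF > 10 threshold
```

With only two predictors the two VIFs are identical, since each auxiliary regression shares the same R².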
APA 7th edition reporting format
Template: "A multiple linear regression was calculated to predict [Y] based on [X1] and [X2]. A significant regression equation was found (F([df_reg], [df_res]) = [F-val], p = [p-val]), with an $R^2$ of .[R2]. Both [X1] (p = [p1]) and [X2] (p = [p2]) were significant predictors of [Y]."
Primary references
Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple regression/correlation analysis for the behavioral sciences (3rd ed.). Lawrence Erlbaum Associates.
Field, A. (2018). Discovering statistics using IBM SPSS statistics (5th ed.). SAGE Publications.
Montgomery, D. C., Peck, E. A., & Vining, G. G. (2021). Introduction to linear regression analysis (6th ed.). Wiley.