### What's the difference between multiple R and R squared?

• In linear regression, we often get multiple R and R squared. What are the differences between them?

• Capital $R^2$ (as opposed to $r^2$) should generally be the multiple $R^2$ in a multiple regression model. In bivariate linear regression, there is no multiple $R$, and $R^2=r^2$. So one difference is applicability: "multiple $R$" implies multiple regressors, whereas "$R^2$" doesn't necessarily.

Another simple difference is interpretation. In multiple regression, the multiple $R$ is the coefficient of multiple correlation, whereas its square is the coefficient of determination. $R$ can be interpreted somewhat like a bivariate correlation coefficient, the main difference being that the multiple correlation is between the dependent variable and a linear combination of the predictors, not just any one of them, and not just the average of those bivariate correlations. $R^2$ can be interpreted as the percentage of variance in the dependent variable that can be explained by the predictors; as above, this is also true if there is only one predictor.
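The relationship described above can be checked numerically: fit a multiple regression, compute the multiple $R$ as the correlation between the dependent variable and the fitted values, and confirm that its square equals $R^2$. A minimal numpy sketch (the simulated data and variable names are my own, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two predictors and a noisy linear response.
n = 200
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ beta

# Multiple R: correlation between y and the fitted linear
# combination of the predictors (not any single predictor).
multiple_R = np.corrcoef(y, y_hat)[0, 1]

# R^2: proportion of variance explained, 1 - SSE/SST.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
R2 = 1 - ss_res / ss_tot

# The coefficient of determination is the squared multiple correlation.
assert np.isclose(multiple_R ** 2, R2)
```

With a single predictor, the same computation reduces to the bivariate case, and `multiple_R` equals the absolute value of the ordinary Pearson $r$ between $y$ and that predictor.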

So if in a multiple regression $R^2$ is .76, we can say the model explains 76% of the variance in the dependent variable, whereas if $r^2$ is .86, we can say the model explains 86% of the variance in the dependent variable. What's the difference in their interpretation?

As the answer suggests, "multiple $R$" implies multiple regressors. Is it possible to have a multiple $R$ value in a single-regressor model?