RT Book, Section
A1 Glantz, Stanton A.
A1 Slinker, Bryan K.
A1 Neilands, Torsten B.
SR Print(0)
ID 1141898841
T1 Multicollinearity and What to Do About It
T2 Primer of Applied Regression and Analysis of Variance, 3e
YR 2017
FD 2017
PB McGraw-Hill Education
PP New York, NY
SN 9780071824118
LK accessbiomedicalscience.mhmedical.com/content.aspx?aid=1141898841
RD 2022/01/22
AB Multiple regression allows us to study how several independent variables act together to determine the value of a dependent variable. The coefficients in the regression equation quantify the nature of these dependencies. Moreover, we can compute the standard errors associated with each of these regression coefficients to quantify the precision with which we estimate how the different independent variables affect the dependent variable. These standard errors also permit us to conduct hypothesis tests about whether the different proposed independent variables affect the dependent variable at all. The conclusions we draw from regression analyses will be unambiguous when the independent variables in the regression equation are statistically independent of each other, that is, when the value of one of the independent variables does not depend on the values of any of the other independent variables. Unfortunately, as we have already seen in Chapter 3, the independent variables often contain at least some redundant information and so tend to vary together, a situation called multicollinearity. Severe multicollinearity indicates that a substantial part of the information in one or more of the independent variables is redundant, which makes it difficult to separate the effects of the different independent variables on the dependent variable.
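The redundancy the abstract describes is commonly quantified with the variance inflation factor (VIF): regress one predictor on the others and compute 1 / (1 − R²). The chapter's own material is not reproduced in this record, so the following is only a minimal NumPy sketch with simulated data and hypothetical variable names, showing that a nearly redundant predictor produces a large VIF while an independent one does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two nearly redundant predictors: x2 is x1 plus a little noise,
# so they carry almost the same information (multicollinearity).
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)

def vif(x, others):
    """Variance inflation factor for predictor x given the other
    predictors: 1 / (1 - R^2) from regressing x on the others."""
    X = np.column_stack([np.ones(len(x))] + list(others))
    beta, *_ = np.linalg.lstsq(X, x, rcond=None)
    resid = x - X @ beta
    r2 = 1.0 - resid.var() / x.var()
    return 1.0 / (1.0 - r2)

print(vif(x2, [x1]))   # large: x2 is almost fully explained by x1

x3 = rng.normal(size=n)  # an unrelated predictor for comparison
print(vif(x3, [x1]))     # near 1: little shared information
```

A VIF near 1 means a predictor shares little information with the others; large values (a common rule of thumb flags 10 or more) signal the severe multicollinearity the abstract warns about, which inflates the coefficient standard errors discussed above.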