TY - CHAP
M1 - Book, Section
TI - Multicollinearity and What to Do About It
A1 - Glantz, Stanton A.
A1 - Slinker, Bryan K.
A1 - Neilands, Torsten B.
PY - 2017
T2 - Primer of Applied Regression and Analysis of Variance, 3e
AB - Multiple regression allows us to study how several independent variables act together to determine the value of a dependent variable. The coefficients in the regression equation quantify the nature of these dependencies. Moreover, we can compute the standard errors associated with each of these regression coefficients to quantify the precision with which we estimate how the different independent variables affect the dependent variable. These standard errors also permit us to conduct hypothesis tests about whether the different proposed independent variables affect the dependent variable at all. The conclusions we draw from regression analyses will be unambiguous when the independent variables in the regression equation are statistically independent of each other, that is, when the value of one of the independent variables does not depend on the values of any of the other independent variables. Unfortunately, as we have already seen in Chapter 3, the independent variables often contain at least some redundant information and so tend to vary together, a situation called multicollinearity. Severe multicollinearity indicates that a substantial part of the information in one or more of the independent variables is redundant, which makes it difficult to separate the effects of the different independent variables on the dependent variable.
SN -
PB - McGraw-Hill Education
CY - New York, NY
Y2 - 2024/04/19
UR - accessbiomedicalscience.mhmedical.com/content.aspx?aid=1141898841
ER -