
Multicollinearity in Multiple Regression




Help on Multicollinearity in Multiple Regression

In multiple regression we fit a model that predicts a dependent variable Y from more than one independent variable:


Y = β0 + β1X1 + β2X2 + β3X3 + β4X4 + ... + random error


To see whether the model fits the data, we examine R2 and the corresponding overall p value. A high R2 together with a low p value indicates a good fit. Apart from the overall p value, multiple regression also reports an individual p value for each independent variable. A low individual p value signifies that the corresponding independent variable significantly improves the fit of the model. This is judged by comparing the goodness of fit of the entire model with the fit obtained when that independent variable is omitted: if the fit turns noticeably worse after the variable is omitted, its p value will be low, indicating that the variable has a significant impact.
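For illustration, here is a minimal sketch in Python of fitting a multiple regression and reading off these quantities. It assumes simulated data and the statsmodels package; the variable names (x1, x2, x3, y) are hypothetical, not taken from any particular dataset.

```python
# Minimal sketch: fit a multiple regression and inspect R^2 and p values.
# The data below are simulated purely for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.7 * x2 + 0.3 * x3 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))  # adds the intercept column
model = sm.OLS(y, X).fit()

print(model.rsquared)   # overall R^2 (goodness of fit)
print(model.f_pvalue)   # overall p value for the model
print(model.pvalues)    # individual p value for each coefficient
```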



Multicollinearity

Sometimes the overall p value is very low and yet the individual p values are all quite high. That is, although the model as a whole fits well, no single independent variable appears to have a statistically significant impact on predicting the dependent variable. This happens when two (or more) independent variables are highly correlated: they supply essentially the same information, so neither makes a significant additional contribution once the other is included in the model, even though together they contribute a great deal, and the fit becomes much worse only if both are removed. To assess multicollinearity, you examine how well each independent variable can be predicted from the other independent variables, as in the sketch below.
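One common way to quantify this is the variance inflation factor (VIF), which equals 1/(1 − R2) from regressing one independent variable on all the others, so large values flag multicollinearity. The sketch below is a hedged illustration using Python's statsmodels package; the data (x2 made nearly a copy of x1) are invented for the example.

```python
# Minimal sketch: variance inflation factors for each independent variable.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly a copy of x1 -> collinear
x3 = rng.normal(size=n)
y = 1.0 + x1 + x2 + x3 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))

# VIF = 1 / (1 - R^2) from regressing that column on all the other columns;
# values far above roughly 5-10 are usually taken as a sign of multicollinearity.
for i in range(1, X.shape[1]):            # skip the constant column
    print(f"x{i}: VIF = {variance_inflation_factor(X, i):.1f}")
```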


When Is Multicollinearity a Problem?

Multicollinearity is not a problem if your goal is simply to predict the dependent variable from a set of independent variables: the overall R2 (or adjusted R2) still tells you how well the model fits. It becomes a problem, as described above, when you want to judge the individual contribution of each independent variable, because the individual p values then no longer identify which variables matter.


Eliminating Multicollinearity

To eliminate multicollinearity, one should first understand why it occurs: it arises when two or more independent variables are related because they measure essentially the same thing. Identify the variable that is redundant or does not logically belong in the model, remove it, and refit the model; this eliminates the multicollinearity, as in the sketch below.
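A minimal sketch of this remedy, again with hypothetical data and the statsmodels package: x2 is a near duplicate of x1, so dropping it tends to restore significant individual p values while barely changing the overall fit.

```python
# Minimal sketch: compare the full model with a model that drops the
# redundant variable. Data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 150
df = pd.DataFrame({"x1": rng.normal(size=n)})
df["x2"] = df["x1"] + rng.normal(scale=0.05, size=n)  # measures the same thing as x1
df["x3"] = rng.normal(size=n)
df["y"] = 1.0 + df["x1"] + df["x3"] + rng.normal(size=n)

full = sm.OLS(df["y"], sm.add_constant(df[["x1", "x2", "x3"]])).fit()
reduced = sm.OLS(df["y"], sm.add_constant(df[["x1", "x3"]])).fit()

print(full.pvalues)      # x1 and x2 may both look insignificant individually
print(reduced.pvalues)   # after dropping x2, x1 is clearly significant
print(full.rsquared, reduced.rsquared)  # overall fit is essentially unchanged
```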


Our Help

Our experts will help you understand the phenomenon of multicollinearity in multiple regression.


