I think a high R^2 combined with insignificant coefficients on the independent variables is a red flag for multicollinearity, but it doesn't necessarily guarantee it... of course, I'm not positive about this, but I think I remember reading it.
But in a single-variable regression, R^2 can detect strong correlation: regress one variable on the other, and a high R^2 tells you the two are strongly correlated. Then, if you use both the dependent and independent variable from that first regression as regressors to explain a third variable, you should suspect multicollinearity straightaway, because of the strong correlation in the first equation.
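A minimal sketch of that pairwise check (not from this thread; the variable names are made up): in a simple regression with an intercept, R^2 is just the squared correlation between the two variables, so a high R^2 in the auxiliary regression flags the collinearity directly.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
x1 = rng.normal(size=200)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=200)  # nearly collinear with x1

# Auxiliary regression of one predictor on the other
aux = sm.OLS(x2, sm.add_constant(x1)).fit()
print(f"auxiliary R^2:  {aux.rsquared:.3f}")                   # close to 1
print(f"corr(x1,x2)^2:  {np.corrcoef(x1, x2)[0, 1]**2:.3f}")   # same number
```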
R^2 measures the proportion of variation explained by the model. It uses similar inputs to the F-test but is a different statistic. The classic multicollinearity signal is a significant F-test combined with insignificant t-tests.
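Here's a quick simulated sketch of that signature (again, made-up data, not from the exam): regress y on two nearly identical predictors and you'll typically see a high R^2 and a tiny F-test p-value alongside individually insignificant t-tests, exactly the pattern described above.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.02 * rng.normal(size=n)            # x2 is almost a copy of x1
y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()
print(f"R^2 = {res.rsquared:.3f}, F-test p-value = {res.f_pvalue:.2e}")
print("t-test p-values:", np.round(res.pvalues[1:], 3))  # usually both > 0.05
```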
Try Q42 in the CFAI 2010 Afternoon Mock. You'll know what I mean.
Here's the question:
Because there are only two independent variables in her regression, Hamilton’s most appropriate conclusion is that multicollinearity is least likely a problem, based on the observation that the:
A. model R^2 is relatively low.
B. correlation between S&P500 and SPREAD is low.
C. model F-value is high and the p-values for S&P500 and SPREAD are low.