Paraguay Wrote:
-------------------------------------------------------
> If there are only 2 independent variables pairwise
> correlation is all you need.
>
> When adding variables past two you need to look
> for significant f, non-significant t-values.
+1
I answered C in the Mock, went back to the CFAI book, and had to read the part on how to detect multicollinearity several times before finding the little sentence saying that with only two independent variables it's OK to use the correlation.
I just looked it over: R^2 in a linear regression with one independent variable is the coefficient of determination, and the correlation is the square root of that (in the single-regressor model only).
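A quick NumPy sketch of that identity (not from the CFAI text; the data here are made up): in a one-regressor OLS model with an intercept, R^2 equals the squared correlation between x and y.

```python
# Sketch: in simple (one-regressor) OLS with an intercept,
# R^2 is exactly the squared correlation between x and y.
import numpy as np

rng = np.random.default_rng(7)
x = rng.standard_normal(100)
y = 2.0 + 0.5 * x + rng.standard_normal(100)

# Fit y = a + b*x by ordinary least squares
slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
intercept = y.mean() - slope * x.mean()
resid = y - (intercept + slope * x)

# Coefficient of determination vs. squared correlation
r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
corr = np.corrcoef(x, y)[0, 1]
print(f"R^2 = {r2:.4f}, corr^2 = {corr**2:.4f}")  # identical
```

Note this only works in the single-regressor case; with two or more independent variables, R^2 is no longer the square of any single pairwise correlation.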
The classic symptom is a high R^2 and a significant F-statistic even though the t-statistics on the slope coefficients are themselves not significant (indicating inflated standard errors for the slope coefficients).
They also specifically write that the magnitude of pairwise correlations between independent variables has occasionally been suggested as a way to assess multicollinearity, but is generally not adequate: it is not necessary that the pairwise correlations be high for there to be a multicollinearity problem.
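You can simulate that classic symptom with plain NumPy (a sketch with made-up data, not anything from the CFAI material): two nearly collinear regressors produce a high R^2 and a huge F-statistic, while the individual slope standard errors blow up.

```python
# Sketch: the multicollinearity signature -- high R^2, significant F,
# but inflated standard errors on the individual slope coefficients.
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.001 * rng.standard_normal(n)   # nearly collinear with x1
y = 1.0 + x1 + x2 + rng.standard_normal(n)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

k = 2                                      # number of slope coefficients
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))

sigma2 = ss_res / (n - k - 1)              # residual variance estimate
cov_beta = sigma2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov_beta))            # inflated by near-collinearity
t_stats = beta / se

print(f"R^2 = {r2:.3f}, F = {f_stat:.1f}")
print(f"slope std errors: {se[1]:.1f}, {se[2]:.1f}")
print(f"slope t-stats: {t_stats[1]:.2f}, {t_stats[2]:.2f}")
```

The regression fits well overall (high R^2, significant F), yet the slope standard errors are orders of magnitude larger than they would be without the collinearity, so the individual t-statistics tend to look insignificant even though each variable matters.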
So based on the above:
B) is eliminated.
C) The F-value is high, but the p-values are low, which means S&P500 and SPREAD are significant variables, so that one is out too.
Try q. 42 in the CFAI Mock 2010 Afternoon. You'll know what I mean.
Here's the question:
Because there are only two independent variables in her regression, Hamilton’s most appropriate conclusion is that multicollinearity is least likely a problem, based on the observation that the:
A. model R2 is relatively low.
B. correlation between S&P500 and SPREAD is low.
C. model F-value is high and the p-values for S&P500 and SPREAD are low.