I don't think that is how R^2 is used.

R^2 in a multiple regression is

RSS/SST

(regression sum of squares over total sum of squares). It measures how much of the variation the model explains. It uses similar inputs to the F-test but is a different statistic. Multicollinearity is flagged only by a significant F-test combined with insignificant t-tests on the individual coefficients.
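
To make the definitions above concrete, here is a minimal sketch (simulated data, numpy assumed) computing R^2 as RSS/SST, in the CFA convention where RSS is the regression (explained) sum of squares and SSE is the sum of squared errors:

```python
import numpy as np

# Simulated data for illustration only: y depends on two regressors plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
y = 1.0 + x @ np.array([2.0, -1.0]) + rng.normal(scale=0.5, size=100)

X = np.column_stack([np.ones(len(y)), x])        # add intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS fit
y_hat = X @ beta

sst = np.sum((y - y.mean()) ** 2)                # total sum of squares
rss = np.sum((y_hat - y.mean()) ** 2)            # regression (explained) SS
sse = np.sum((y - y_hat) ** 2)                   # sum of squared errors
r2 = rss / sst                                   # equivalently 1 - sse/sst
print(r2)
```

The same number falls out of 1 - SSE/SST, which is the form some texts use.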

TOP

But in a single-variable regression, R^2 can detect strong correlation. Then, if you use that dependent variable and independent variable together to explain a third dependent variable, you should claim multicollinearity straightaway, because of the strong correlation in the first equation.

Kind of basic stuff

TOP

R^2 is just how well the model as a whole explains the variation in the dependent variable.

TOP

Right, right, but I saw a question where the t-stats were low and R^2 was 81%. I thought anything over about 90% was high? What's the cutoff?

TOP

CFAI Mock afternoon had exactly this question, and I used CPK's logic to answer it. I didn't even RTFQ.

Of course I got it wrong since there was no F stat given


Instead, the fine print gave the R^2 between the TWO independent variables as 0.3. The answer claimed that this was low, hence no multicollinearity.
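
For what it's worth, that "R^2 between the two independent variables" is just the squared pairwise correlation of the regressors. A quick sketch with simulated data (numpy assumed; the 0.3 relationship below is made up for illustration):

```python
import numpy as np

# Two mildly related regressors (assumed data, not from the mock question).
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = 0.3 * x1 + rng.normal(size=200)

r = np.corrcoef(x1, x2)[0, 1]   # pairwise correlation between the regressors
r2_between = r ** 2             # R^2 of regressing one regressor on the other
print(round(r2_between, 3))     # small value -> little collinearity concern
```

A small value like this is what the answer key treated as "low"; there is no universal cutoff, which is why the F-vs-t symptom is the usual test.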

TOP

nm, I think cpk is correct.

TOP

I think a high R^2 in combination with insignificant coefficients for the independent variables is a red flag for multicollinearity, but doesn't necessarily guarantee it. Of course, I'm not positive about this, but I think I remember reading it.

TOP

I thought a high F-stat with low t-stats on the regression coefficients was multicollinearity.

CP
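
That symptom can be demonstrated directly. A minimal sketch (simulated data, numpy assumed): two nearly identical regressors make the overall F-test large while each slope's t-stat is small, because the standard errors are inflated.

```python
import numpy as np

# Simulate near-perfect collinearity: x2 is almost a copy of x1.
rng = np.random.default_rng(42)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
y = 3.0 + x1 + x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
k = X.shape[1] - 1                               # number of slope coefficients
sse = resid @ resid                              # sum of squared errors
sst = np.sum((y - y.mean()) ** 2)                # total sum of squares
rss = sst - sse                                  # regression sum of squares

f_stat = (rss / k) / (sse / (n - k - 1))         # overall F-test
cov = sse / (n - k - 1) * np.linalg.inv(X.T @ X) # OLS covariance of beta
t_stats = beta / np.sqrt(np.diag(cov))

print(f_stat)        # large: the model as a whole is highly significant
print(t_stats[1:])   # small: each slope looks insignificant on its own
```

The regression explains y well jointly, but neither regressor's individual contribution can be pinned down, which is exactly the high-F / low-t pattern discussed above.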

TOP
