
Reading 12- LOS g: Q6-10

6
Which of the following is most likely to indicate that two or more of the independent variables, or linear combinations of independent variables, may be highly correlated with each other? Unless otherwise noted, significant and insignificant mean significantly different from zero and not significantly different from zero, respectively.
A)    The R2 is high, the F-statistic is significant and the t-statistics on the individual slope coefficients are significant.
B)    The R2 is low, the F-statistic is insignificant and the Durbin-Watson statistic is significant.
C)    The R2 is high, the F-statistic is significant and the t-statistics on the individual slope coefficients are insignificant.
D)    The R2 is larger than the F-statistic and the t-statistics on the individual slope coefficients are significant.
The correct answer was C)
Multicollinearity occurs when two or more of the independent variables, or linear combinations of independent variables, are highly correlated with each other. The classic symptom of multicollinearity is a high R2 and a significant F-statistic combined with insignificant t-statistics on the individual slope coefficients.
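This symptom pattern can be reproduced in a short numpy sketch. The data, seed, and coefficients below are my own invented example, not from the question: two regressors are made nearly identical, and the regression then shows a high R2 and large F-statistic while the individual t-statistics are small because the standard errors are inflated.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100

# Two nearly collinear regressors: x2 is x1 plus a tiny amount of noise.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]

# OLS fit and residuals
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# R-squared and the F-statistic for overall significance
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
f_stat = (r2 / (k - 1)) / ((1 - r2) / (n - k))

# Standard errors and t-statistics on the coefficients; collinearity
# inflates the diagonal of inv(X'X), hence the standard errors.
sigma2 = ss_res / (n - k)
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta / se

print(f"R^2 = {r2:.3f}, F = {f_stat:.1f}")            # high R^2, large F
print(f"t(x1) = {t_stats[1]:.2f}, t(x2) = {t_stats[2]:.2f}")  # small t's
```

The individual slopes are poorly identified because x1 and x2 carry nearly the same information, even though the regression as a whole fits well.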

7
Suppose there is evidence that two or more of the independent variables, or linear combinations of independent variables, may be highly correlated with each other. The most likely effect on the statistical inferences Smith can make from the regression results is to commit a:
A)    Type I error by incorrectly failing to reject the null hypothesis that the regression parameters are equal to zero.
B)    Type II error by incorrectly failing to reject the null hypothesis that the regression parameters are equal to zero.
C)    Type II error by incorrectly rejecting the null hypothesis that the regression parameters are equal to zero.
D)    Type I error by incorrectly rejecting the null hypothesis that the regression parameters are equal to zero.
The correct answer was B)
One problem with multicollinearity is that the standard errors of the parameter estimates will be too large and the t-statistics too small. This will lead Smith to incorrectly fail to reject the null hypothesis that a parameter equals zero; in other words, Smith will conclude that a parameter is not statistically significant when in fact it is. Incorrectly failing to reject a false null hypothesis is a Type II error.

8
Using the Durbin-Watson test statistic, Smith rejects the null hypothesis suggested by the test. This is evidence that:
A)    the error terms are correlated with each other.
B)    two or more of the independent variables are highly correlated with each other.
C)    the variance of the error term is correlated with the values of the independent variables.
D)    the error term is normally distributed.
The correct answer was A)
Serial correlation (also called autocorrelation) exists when the error terms are correlated with each other.
Conditional heteroskedasticity exists when the variance of the error term is correlated with the values of the independent variables. Multicollinearity, on the other hand, occurs when two or more of the independent variables are highly correlated with each other. One assumption of multiple regression is that the error term is normally distributed.
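The Durbin-Watson statistic itself is simple to compute from the residuals. A minimal sketch on synthetic residuals (the sample size and the AR(1) coefficient of 0.9 are my own illustrative choices):

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum of squared first differences of the residuals divided
    by the sum of squared residuals. Values near 2 suggest no serial
    correlation; values near 0 suggest positive serial correlation;
    values near 4 suggest negative serial correlation."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(0)

# Independent residuals: DW should come out near 2.
e_indep = rng.normal(size=500)

# AR(1) residuals with rho = 0.9: DW should fall well below 2.
rho = 0.9
e_ar = np.empty(500)
e_ar[0] = rng.normal()
for t in range(1, 500):
    e_ar[t] = rho * e_ar[t - 1] + rng.normal()

print(durbin_watson(e_indep))  # near 2
print(durbin_watson(e_ar))     # well below 2
```

Since DW is approximately 2(1 − r), where r is the first-order autocorrelation of the residuals, positively correlated errors push the statistic toward 0.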

9
Which of the following conditions does NOT by itself affect statistical inference about regression parameters?
A)    Conditional heteroskedasticity.
B)    Unconditional heteroskedasticity.
C)    Serial correlation.
D)    Multicollinearity.
The correct answer was B)
Unconditional heteroskedasticity does not impact statistical inference about the parameters: although the error variance is not constant, it is not related to the levels of the independent variables, so it creates no systematic bias in the standard errors.

10
Which of the following statements regarding heteroskedasticity is FALSE?
A)    Heteroskedasticity may occur in cross-section or time-series analyses.
B)    Heteroskedasticity results in an estimated variance that is too large and, therefore, affects statistical inference.
C)    The assumption of linear regression is that the residuals are heteroskedastic.
D)    Conditional heteroskedasticity is the case in which the residuals are correlated with the values of the independent variables.
The correct answer was C)
The assumption of regression is that the residuals are homoskedastic (i.e., the variance of the residuals is constant across observations).
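Conditional heteroskedasticity, the case flagged in choice D, can also be simulated directly. In this invented numpy example the error's spread is made to grow with the regressor, and a crude check in the spirit of the Breusch-Pagan test (relating squared residuals to x) picks it up:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(1, 10, size=n)

# Conditional heteroskedasticity: each error's standard deviation
# equals its own x value, so the spread grows with the regressor.
e = rng.normal(scale=x)
y = 2.0 + 0.5 * x + e

# Fit OLS and compute residuals.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Crude diagnostic: if the squared residuals are correlated with x,
# the error variance depends on the regressor, i.e. conditional
# heteroskedasticity is present.
corr = np.corrcoef(resid ** 2, x)[0, 1]
print(f"corr(resid^2, x) = {corr:.2f}")  # clearly positive
```

With homoskedastic errors this correlation would hover near zero; a clearly positive value signals that the error variance depends on x.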
