Answers and explanations follow:

Q6. Which of the following conditions will, by itself, least likely affect statistical inference about the regression parameters?
A) Multicollinearity.
B) Unconditional heteroskedasticity.
C) Conditional heteroskedasticity.

Correct answer: B. Unconditional heteroskedasticity does not affect statistical inference about the parameters, because the changing error variance is not correlated with the values of the independent variables.

Q7. George Smith, an analyst with Great Lakes Investments, has created a comprehensive report on the pharmaceutical industry at the request of his boss. The Great Lakes portfolio currently has significant exposure to the pharmaceutical industry through large equity positions in the top two pharmaceutical manufacturers. His boss asked Smith to find a way to forecast pharmaceutical sales accurately so that Great Lakes can identify further investment opportunities in the industry and minimize its exposure to market downturns. Recognizing that many factors could affect sales, Smith needed a method to quantify their effects, so he used a multiple regression analysis with five independent variables to predict industry sales. His goal is to identify relationships that are not only statistically significant but economically significant as well. The assumptions of his model are fairly standard: a linear relationship exists between the dependent and independent variables, the independent variables are not random, and the expected value of the error term is zero.

Smith is confident in the results presented in his report. He has already done some hypothesis testing for statistical significance, including calculating a t-statistic and conducting a two-tailed test in which the null hypothesis is that the regression coefficient equals zero versus the alternative that it does not. He feels he has done a thorough job on the report and is ready to answer any questions posed by his boss. However, Smith's boss, John Sutter, is concerned that Smith has ignored several potential problems with the regression model that may affect his conclusions.
Sutter knows that when any of the basic assumptions of a regression model are violated, any results drawn from the model are questionable. He asks Smith to go back and carefully examine the effects of heteroskedasticity, multicollinearity, and serial correlation on his model. Specifically, he wants Smith to suggest how to detect these problems and how to correct any that he encounters.

Suppose there is evidence that the residual terms in the regression are positively correlated. The most likely effect on the statistical inferences drawn from the regression results is for Smith to commit a:
A) Type II error by incorrectly failing to reject the null hypothesis that the regression parameters are equal to zero.
B) Type I error by incorrectly rejecting the null hypothesis that the regression parameters are equal to zero.
C) Type I error by incorrectly failing to reject the null hypothesis that the regression parameters are equal to zero.

Correct answer: B. One problem with positive autocorrelation (also known as positive serial correlation) is that the standard errors of the parameter estimates will be too small and the t-statistics too large. This may lead Smith to incorrectly reject the null hypothesis that the parameters are equal to zero; in other words, he will conclude that the parameters are statistically significant when in fact they are not. This is a Type I error: rejecting the null hypothesis when it is true.

Q8. Sutter has detected the presence of conditional heteroskedasticity in Smith's report. This is evidence that:
A) the variance of the error term is correlated with the values of the independent variables.
B) two or more of the independent variables are highly correlated with each other.
C) the error terms are correlated with each other.
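The mechanics behind this answer can be illustrated with a small Monte Carlo sketch (not part of the exam material; it assumes NumPy and a simple one-regressor model of my own construction). When both the regressor and the errors are positively autocorrelated, conventional OLS standard errors understate the true sampling variability of the slope estimate, so a nominal 5% two-tailed test rejects a true null far more often than 5% of the time:

```python
import numpy as np

rng = np.random.default_rng(42)

def ols_t_stat(x, y):
    """OLS slope t-statistic for y = a + b*x + e, testing H0: b = 0,
    using the conventional (non-robust) standard error."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)              # residual variance estimate
    cov = s2 * np.linalg.inv(X.T @ X)         # conventional OLS covariance
    return beta[1] / np.sqrt(cov[1, 1])

def ar1(n, rho, rng):
    """Stationary AR(1) series with unit-variance innovations."""
    e = rng.standard_normal(n)
    out = np.empty(n)
    out[0] = e[0] / np.sqrt(1 - rho**2)
    for t in range(1, n):
        out[t] = rho * out[t - 1] + e[t]
    return out

n, reps, crit = 100, 2000, 1.984              # 1.984 ~ 5% two-tailed t critical value, df = 98
rejections = 0
for _ in range(reps):
    x = ar1(n, 0.9, rng)                      # persistent regressor
    e = ar1(n, 0.9, rng)                      # positively autocorrelated errors
    y = 0.0 * x + e                           # true slope is zero: H0 is true
    if abs(ols_t_stat(x, y)) > crit:
        rejections += 1                       # each rejection here is a Type I error

rate = rejections / reps
print(f"empirical Type I error rate: {rate:.3f} (nominal: 0.05)")
```

Because the true null is rejected far more often than the nominal 5%, the simulation reproduces exactly the Type I risk described in the answer.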
Correct answer: A. Conditional heteroskedasticity exists when the variance of the error term is correlated with the values of the independent variables. Multicollinearity, on the other hand, occurs when two or more of the independent variables are highly correlated with each other. Serial correlation exists when the error terms are correlated with each other.
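A standard way to detect conditional heteroskedasticity is a Breusch-Pagan-style test: regress the squared residuals on the independent variables and compare n times the R-squared of that auxiliary regression against a chi-square critical value. The sketch below is illustrative only (variable names and parameters are my own; it assumes NumPy): it simulates an error variance that grows with the regressor and shows the statistic flagging it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(1.0, 5.0, n)
e = rng.standard_normal(n) * x            # error std dev grows with x: conditional heteroskedasticity
y = 1.0 + 2.0 * x + e

# First-stage OLS fit of y on x.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Auxiliary regression: squared residuals on the regressor.
u2 = resid**2
g, *_ = np.linalg.lstsq(X, u2, rcond=None)
fitted = X @ g
r2 = 1 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)

# n * R^2 is approximately chi-square(1) under homoskedasticity;
# the 5% critical value is about 3.84, so a large value rejects it.
bp_stat = n * r2
print(f"BP-style statistic: {bp_stat:.1f} (5% chi-square(1) critical value: 3.84)")
```

If the same auxiliary regression were run on homoskedastic errors, the statistic would typically fall below 3.84, so the test distinguishes the two cases that Sutter asked Smith to check.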