1. Suppose that your linear regression model includes a constant term, so that in the linear regression model
$$y = X\beta + \varepsilon \,, \tag{1}$$
the matrix of explanatory variables $X$ can be partitioned as $X = [\,i \;\; X_1\,]$, where $i$ is a column of ones. The OLS estimator of $\beta$ can thus be partitioned accordingly as $b' = [\,b_0 \;\; b_1'\,]$, where $b_0$ is the OLS estimator of the constant term and $b_1$ is the OLS estimator of the slope coefficients.
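For reference, the hint in part (a) below amounts to writing out the cross-product matrix in partitioned form. With $X = [\,i \;\; X_1\,]$ and $\bar{x} = X_1'i/n$ the vector of column means of $X_1$, the blocks follow directly from the definitions:
$$X'X = \begin{bmatrix} i'i & i'X_1 \\ X_1'i & X_1'X_1 \end{bmatrix} = \begin{bmatrix} n & n\bar{x}' \\ n\bar{x} & X_1'X_1 \end{bmatrix},$$
to which the partitioned-inverse result (A-74) can then be applied.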
(a) Find the inverse of the matrix $X'X$. (Hint: apply result (A-74).)
(b) Use partitioned regression to derive formulas for $b_1$ and $b_0$. (Note: Question 5 of Problem Set 1 asks you to do this without using partitioned regression.)
(c) Derive $\mathrm{var}(b_1 \mid X)$ using (B-87). How is your answer related to your answer to part (a)?
(d) What is $\mathrm{var}(b_0 \mid X)$? (You should be able to answer this question without doing any further derivations, using your answers to parts (a)-(c).)
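A minimal NumPy sketch for checking the algebra in parts (a) and (b) numerically; the data-generating process, sample size, and coefficient values here are arbitrary assumptions for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3                        # assumed sample size and number of slopes
X1 = rng.normal(size=(n, k))         # arbitrary simulated regressors
i = np.ones((n, 1))                  # column of ones for the constant term
X = np.hstack([i, X1])
y = X @ np.array([1.0, 0.5, -2.0, 0.3]) + rng.normal(size=n)

# Direct inverse of X'X, for comparison.
XtX_inv = np.linalg.inv(X.T @ X)

# Blockwise inverse in the spirit of result (A-74): invert the scalar
# (1,1) block n and the Schur complement X1'M0X1, where M0 = I - i i'/n.
xbar = X1.mean(axis=0).reshape(-1, 1)        # column means of X1
M0X1 = X1 - i @ xbar.T                       # X1 in deviations from means
S_inv = np.linalg.inv(M0X1.T @ M0X1)         # inverse of the Schur complement
top_left = 1.0 / n + xbar.T @ S_inv @ xbar   # (1,1) block of (X'X)^{-1}
top_right = -(xbar.T @ S_inv)                # 1 x k block
block_inv = np.block([[top_left, top_right],
                      [top_right.T, S_inv]])
assert np.allclose(block_inv, XtX_inv)

# Partitioned OLS: b1 from demeaned data, then b0 = ybar - xbar'b1.
b = XtX_inv @ X.T @ y
b1 = S_inv @ M0X1.T @ (y - y.mean())
b0 = y.mean() - (xbar.T @ b1).item()
assert np.allclose(b[1:], b1) and np.isclose(b[0], b0)
```

The blockwise computation is one way to instantiate the (A-74) pattern: the scalar block $n$ is trivial to invert, and everything else reduces to the $k \times k$ matrix $X_1'M^0X_1$.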
2. Suppose that instead of estimating the full regression model including the constant term, you have instead estimated a model in deviations from means; i.e., you have regressed $M^0 y$ on $M^0 X_1$, where $M^0 = I - i(i'i)^{-1}i'$ is the idempotent matrix that transforms observations into deviations from their means. We can write the estimating equation in this case as
$$M^0 y = M^0 X_1 \beta_1 + M^0 \varepsilon \,. \tag{2}$$
Call the OLS estimator of $\beta_1$ in this equation $\tilde{b}_1$.
(a) Derive $\tilde{b}_1$. How does it compare to $b_1$ in Question 1?
(b) Let the residuals vector for equation (2) be $\tilde{e}$. Show that $\tilde{e}$ is identical to $e$, the vector of OLS residuals for equation (1).
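Question 2 is an instance of the Frisch-Waugh-Lovell result. A short numerical check along the same lines as the sketch above (again with arbitrary simulated data, not part of the problem) is:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 3
X1 = rng.normal(size=(n, k))
X = np.hstack([np.ones((n, 1)), X1])
y = X @ np.array([1.0, 0.5, -2.0, 0.3]) + rng.normal(size=n)

# Full regression (1): y on [i, X1].
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b                                # residuals e of equation (1)

# Deviations-from-means regression (2): M0 y on M0 X1.
M0y = y - y.mean()
M0X1 = X1 - X1.mean(axis=0)
b1_tilde, *_ = np.linalg.lstsq(M0X1, M0y, rcond=None)
e_tilde = M0y - M0X1 @ b1_tilde              # residuals of equation (2)

assert np.allclose(b1_tilde, b[1:])          # slope estimates coincide
assert np.allclose(e_tilde, e)               # residual vectors are identical
```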