WAGE1

Suppose that the random variables x and y are related by the simple linear regression model, y = β0 + β1x + u, with the assumption E(u|x) = 0.
a. How does the unconditional expectation E(y) relate to the unconditional expectation E(x)?
b. Using the fact that E(u|x) = 0 implies Cov(u,x) = 0, show that
β1 = Cov(x,y) / Var(x)
(Hint: Figure out Cov(x,y) by plugging in the model for y. Some useful facts about covariances that should help you… for constants k1 and k2 and random variables X, Y, Z, we have (i) Cov(k1,X)=0, (ii) Cov(k1X, k2Y)= k1 k2Cov(X,Y), (iii) Cov(k1X+ k2Y,Z)= k1Cov(X,Z)+ k2Cov(Y,Z), (iv) Cov(X,X)=Var(X).)
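As a numerical sanity check (not a substitute for the algebraic derivation the question asks for), the identity β1 = Cov(x,y)/Var(x) can be illustrated by simulating the model with arbitrarily chosen parameter values and comparing the sample moment ratio to the true β1. All parameter values below are made up for illustration.

```python
import numpy as np

# Simulate y = beta0 + beta1*x + u with u drawn independently of x,
# so the assumption E(u|x) = 0 holds by construction.
rng = np.random.default_rng(0)
beta0, beta1 = 2.0, 0.5            # arbitrary population parameters
n = 1_000_000                      # large n so sample moments approximate population moments
x = rng.normal(loc=5.0, scale=2.0, size=n)
u = rng.normal(loc=0.0, scale=1.0, size=n)
y = beta0 + beta1 * x + u

# Sample analogue of Cov(x, y) / Var(x)
ratio = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
print(ratio)  # should be very close to beta1 = 0.5
```

With a million draws the sample covariance and variance are extremely close to their population counterparts, so the printed ratio sits within a few thousandths of 0.5.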
c. How does the result in part b relate to the result from class for the estimated slope parameter? What is the important difference between the two results?
d. Do you think that it's possible to have a positive value for the population parameter β1 but a negative value for the estimated slope parameter?
e. If x and y are independent random variables, how does the simple linear regression model simplify? Be specific.