Answer:
See proof below.
Step-by-step explanation:
If we assume the following linear model:
[tex] y = \beta_o + \beta_1 X +\epsilon[/tex]
And if we have n pairs of observations [tex] (x_i, y_i) , i =1,2,...,n[/tex], the model can be written as:
[tex] y_i = \beta_o +\beta_1 x_i + \epsilon_i , i =1,2,...,n[/tex]
And the least squares procedure gives us the following estimates, [tex] b_o [/tex] for [tex]\beta_o[/tex] and [tex] b_1[/tex] for [tex]\beta_1[/tex]:
[tex] b_o = \bar y - b_1 \bar x[/tex]
[tex] b_1 = \frac{s_{xy}}{s_{xx}}[/tex]
Where:
[tex] s_{xy} =\sum_{i=1}^n (x_i -\bar x) (y_i-\bar y)[/tex]
[tex] s_{xx} =\sum_{i=1}^n (x_i -\bar x)^2[/tex]
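Just to illustrate these formulas (this is not part of the proof), here is a short Python sketch that computes [tex] b_o[/tex] and [tex] b_1[/tex] from a small made-up data set; the numbers are arbitrary:

import numpy as np

# Made-up data, only to illustrate the least squares formulas above
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()
s_xy = np.sum((x - x_bar) * (y - y_bar))   # s_xy = sum of (x_i - x_bar)(y_i - y_bar)
s_xx = np.sum((x - x_bar) ** 2)            # s_xx = sum of (x_i - x_bar)^2

b1 = s_xy / s_xx            # slope estimate b_1
b0 = y_bar - b1 * x_bar     # intercept estimate b_o
print(b0, b1)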
Then [tex] b_1[/tex] is a random variable (it depends on the observed values [tex] y_i[/tex]), and it estimates the fixed parameter [tex]\beta_1[/tex]. We can express this estimator as:
[tex] b_1 = \sum_{i=1}^n a_i y_i [/tex]
Where [tex] a_i =\frac{(x_i -\bar x)}{s_{xx}}[/tex], and if we look carefully we notice that [tex] \sum_{i=1}^n a_i =0[/tex] and [tex]\sum_{i=1}^n a_i x_i =1[/tex].
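These two identities can also be checked numerically; a minimal sketch, reusing the same made-up x values as before:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # same made-up x values as above
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)
a = (x - x_bar) / s_xx                     # the weights a_i

print(np.sum(a))       # ~ 0, because the deviations (x_i - x_bar) sum to 0
print(np.sum(a * x))   # ~ 1, because the sum of (x_i - x_bar) x_i equals s_xx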
So when we take the expected value, using [tex] E(y_i) = \beta_o +\beta_1 x_i[/tex], we get:
[tex] E(b_1) = \sum_{i=1}^n a_i E(y_i)[/tex]
[tex] E(b_1) = \sum_{i=1}^n a_i (\beta_o +\beta_1 x_i)[/tex]
[tex] E(b_1) = \beta_o \sum_{i=1}^n a_i + \beta_1 \sum_{i=1}^n a_i x_i[/tex]
And since [tex] \sum_{i=1}^n a_i =0[/tex] and [tex]\sum_{i=1}^n a_i x_i =1[/tex]:
[tex] E(b_1) = \beta_1[/tex]
And as we can see, [tex]b_1[/tex] is an unbiased estimator for [tex]\beta_1[/tex].
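This unbiasedness can also be illustrated with a small simulation. The following Python sketch uses arbitrary values [tex] \beta_o =1 , \beta_1 =2 , \sigma =1[/tex] (my own choice, purely for illustration), generates many samples from the model, and averages the resulting slope estimates:

import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 1.0, 2.0, 1.0         # arbitrary "true" parameters for the illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # fixed design points
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)

b1_samples = []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    b1_samples.append(np.sum((x - x_bar) * (y - y.mean())) / s_xx)

print(np.mean(b1_samples))   # should be close to beta1 = 2, consistent with E(b_1) = beta_1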
In order to find the variance of the estimator [tex]b_1[/tex] we have:
[tex] Var(b_1) = \sum_{i=1}^n a_i^2 Var(y_i) +\sum_i \sum_{j \neq i} a_i a_j Cov (y_i, y_j) [/tex]
And since the observations are assumed independent, [tex] Cov(y_i,y_j) =0[/tex] for [tex] i \neq j[/tex], and using [tex] Var(y_i) =\sigma^2[/tex] we have:
[tex] Var (b_1) =\sigma^2 \frac{\sum_{i=1}^n (x_i -\bar x)^2}{s^2_{xx}}[/tex]
And if we simplify we get:
[tex] Var(b_1) = \frac{\sigma^2 s_{xx}}{s^2_{xx}} = \frac{\sigma^2}{s_{xx}}[/tex]
And with this we complete the required proof.
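Finally, continuing the simulation sketch above (reusing b1_samples, sigma and s_xx from that snippet), the variance formula can be checked empirically too:

print(np.var(b1_samples))   # empirical variance of the simulated slopes
print(sigma**2 / s_xx)      # theoretical value sigma^2 / s_xx; the two should be close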