How do I calculate the variance of the OLS estimator $\hat{\beta}_0$, conditional on $x_1, \ldots , x_n$?

  • I know that $$\hat{\beta_0}=\bar{y}-\hat{\beta_1}\bar{x}$$ and this is how far I got when I calculated the variance:

    \begin{align*} Var(\hat{\beta_0}) &= Var(\bar{y} - \hat{\beta_1}\bar{x}) \\ &= Var((-\bar{x})\hat{\beta_1}+\bar{y}) \\ &= Var((-\bar{x})\hat{\beta_1})+Var(\bar{y}) \\ &= (-\bar{x})^2 Var(\hat{\beta_1}) + 0 \\ &= (\bar{x})^2 Var(\hat{\beta_1}) + 0 \\ &= \frac{\sigma^2 (\bar{x})^2}{\displaystyle\sum\limits_{i=1}^n (x_i - \bar{x})^2} \end{align*}

but that's as far as I got. The final formula I'm trying to calculate is

    \begin{align*} Var(\hat{\beta_0}) &= \frac{\sigma^2 n^{-1}\displaystyle\sum\limits_{i=1}^n x_i^2}{\displaystyle\sum\limits_{i=1}^n (x_i - \bar{x})^2} \end{align*}

    I'm not sure how to get $$(\bar{x})^2 = \frac{1}{n}\displaystyle\sum\limits_{i=1}^n x_i^2$$ assuming my math is correct up to there.

    Is this the right path?

    \begin{align} (\bar{x})^2 &= \left(\frac{1}{n}\displaystyle\sum\limits_{i=1}^n x_i\right)^2 \\ &= \frac{1}{n^2} \left(\displaystyle\sum\limits_{i=1}^n x_i\right)^2 \end{align}

    I'm sure it's simple, so the answer can wait for a bit if someone has a hint to push me in the right direction.

This is not the right path. The equality $(\bar{x})^2 = \frac{1}{n}\sum_{i=1}^n x_i^2$ doesn't hold. For example, with $x_1=-1$, $x_2=0$, and $x_3=1$, the left-hand side is zero, whilst the right-hand side is $2/3$. The problem comes from the step where you split the variance (the third line of your first derivation). See why?
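    A quick numerical check of that counterexample (a minimal sketch in Python with numpy; the values are the ones from the comment above):

    ```python
    import numpy as np

    # Values from the counterexample above
    x = np.array([-1.0, 0.0, 1.0])

    lhs = x.mean() ** 2       # (xbar)^2            -> 0.0
    rhs = (x ** 2).mean()     # (1/n) * sum(x_i^2)  -> 0.666...

    print(lhs, rhs)           # the two expressions clearly differ
    ```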

Hint towards QuantIbex's point: variance is not a linear function; it satisfies neither additivity, ${\rm Var}(X+Y) \neq {\rm Var}(X) + {\rm Var}(Y)$ in general, nor homogeneity under scalar multiplication, ${\rm Var}(aX) \neq a\,{\rm Var}(X)$.

    @DavidMarx That step should be $$=Var((-\bar{x})\hat{\beta_1}+\bar{y})=(\bar{x})^2Var(\hat{\beta_1})+\bar{y}$$, I think, and then once I substitute in for $\hat{\beta_1}$ and $\bar{y}$ (not sure what to do for this but I'll think about it more), *that* should put me on the right path I hope.

    This is not correct. Think about the condition required for the variance of a sum to be equal to the sum of the variances.

    I thought that $\bar{y}$ is considered non-random when you condition on the $x$'s, so it can be treated as a constant, i.e. $$Var(aX+b)=a^2 Var(X) + b$$.

No, $\bar{y}$ is random since $y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, where $\epsilon_i$ denotes the (random) noise. But OK, my previous comment was maybe misleading. Also, ${\rm Var}(aX + b)= a^2{\rm Var}(X)$ if $a$ and $b$ denote constants.

  • QuantIbex (accepted answer):

This is a self-study question, so I'll provide hints that will hopefully help you find the solution, and I'll edit the answer based on your feedback/progress.

    The parameter estimates that minimize the sum of squares are \begin{align} \hat{\beta}_0 &= \bar{y} - \hat{\beta}_1 \bar{x} , \\ \hat{\beta}_1 &= \frac{ \sum_{i = 1}^n(x_i - \bar{x})y_i }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } . \end{align} To get the variance of $\hat{\beta}_0$, start from its expression and substitute the expression of $\hat{\beta}_1$, and do the algebra $$ {\rm Var}(\hat{\beta}_0) = {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) = \ldots $$
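    As a quick sanity check on these closed-form expressions, here is a minimal sketch (Python with numpy; the data values and coefficients are made up) comparing them against a standard least-squares fit:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up data from the assumed model y = b0 + b1 * x + noise
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=x.size)

    # Closed-form OLS estimates from the answer
    b1_hat = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)
    b0_hat = y.mean() - b1_hat * x.mean()

    # Cross-check against numpy's least-squares fit (returns [slope, intercept])
    b1_np, b0_np = np.polyfit(x, y, deg=1)
    print(b0_hat, b0_np)  # intercepts agree
    print(b1_hat, b1_np)  # slopes agree
    ```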

    Edit:
We have \begin{align} {\rm Var}(\hat{\beta}_0) &= {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) \\ &= {\rm Var} (\bar{Y}) + (\bar{x})^2 {\rm Var} (\hat{\beta}_1) - 2 \bar{x} {\rm Cov} (\bar{Y}, \hat{\beta}_1). \end{align} The two variance terms are $$ {\rm Var} (\bar{Y}) = {\rm Var} \left(\frac{1}{n} \sum_{i = 1}^n Y_i \right) = \frac{1}{n^2} \sum_{i = 1}^n {\rm Var} (Y_i) = \frac{\sigma^2}{n}, $$ and (using the independence of the $Y_i$) \begin{align} {\rm Var} (\hat{\beta}_1) &= \frac{ 1 }{ \left[\sum_{i = 1}^n(x_i - \bar{x})^2 \right]^2 } \sum_{i = 1}^n(x_i - \bar{x})^2 {\rm Var} (Y_i) \\ &= \frac{ \sigma^2 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } , \end{align} and the covariance term is \begin{align} {\rm Cov} (\bar{Y}, \hat{\beta}_1) &= {\rm Cov} \left\{ \frac{1}{n} \sum_{i = 1}^n Y_i, \frac{ \sum_{j = 1}^n(x_j - \bar{x})Y_j }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } \right\} \\ &= \frac{1}{n} \frac{ 1 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } {\rm Cov} \left\{ \sum_{i = 1}^n Y_i, \sum_{j = 1}^n(x_j - \bar{x})Y_j \right\} \\ &= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 } \sum_{i = 1}^n \sum_{j = 1}^n (x_j - \bar{x}) {\rm Cov}(Y_i, Y_j) \\ &= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 } \sum_{j = 1}^n (x_j - \bar{x}) \sigma^2 \\ &= 0 , \end{align} since ${\rm Cov}(Y_i, Y_j) = \sigma^2$ if $i = j$ and $0$ otherwise (the $Y_i$ are independent), and $\sum_{j = 1}^n (x_j - \bar{x})=0$.
    And since $$\sum_{i = 1}^n(x_i - \bar{x})^2 = \sum_{i = 1}^n x_i^2 - 2 \bar{x} \sum_{i = 1}^n x_i + \sum_{i = 1}^n \bar{x}^2 = \sum_{i = 1}^n x_i^2 - n \bar{x}^2, $$ we have \begin{align} {\rm Var}(\hat{\beta}_0) &= \frac{\sigma^2}{n} + \frac{ \sigma^2 \bar{x}^2}{ \sum_{i = 1}^n(x_i - \bar{x})^2 } \\ &= \frac{\sigma^2 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 } \left\{ \sum_{i = 1}^n(x_i - \bar{x})^2 + n \bar{x}^2 \right\} \\ &= \frac{\sigma^2 \sum_{i = 1}^n x_i^2}{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }. \end{align}
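    Both the intermediate terms and the final formula can be verified numerically. A minimal sketch (Python with numpy; the design points, coefficients, $\sigma$, and seed are all arbitrary choices for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # fixed design points (conditioned on)
    n, sigma, reps = x.size, 2.0, 200_000
    S = np.sum((x - x.mean()) ** 2)

    # The algebraic identity: sum (x_i - xbar)^2 = sum x_i^2 - n * xbar^2
    print(S, np.sum(x ** 2) - n * x.mean() ** 2)

    # Simulate y = b0 + b1 * x + eps many times with the same design
    eps = rng.normal(0.0, sigma, size=(reps, n))
    y = 1.0 + 3.0 * x + eps

    ybar = y.mean(axis=1)
    b1 = (y @ (x - x.mean())) / S
    b0 = ybar - b1 * x.mean()

    print(ybar.var(), sigma ** 2 / n)   # Var(Ybar)        ~ sigma^2 / n
    print(b1.var(), sigma ** 2 / S)     # Var(b1hat)       ~ sigma^2 / S
    print(np.cov(ybar, b1)[0, 1])       # Cov(Ybar, b1hat) ~ 0

    # Final formula: Var(b0hat) = sigma^2 * sum(x_i^2) / (n * S)
    print(b0.var(), sigma ** 2 * np.sum(x ** 2) / (n * S))
    ```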

Edit 2:

Why do we have ${\rm Var} \left( \sum_{i = 1}^n Y_i \right) = \sum_{i = 1}^n {\rm Var} (Y_i)$?

The assumed model is $ Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$, where the $\epsilon_i$ are independent and identically distributed random variables with ${\rm E}(\epsilon_i) = 0$ and ${\rm Var}(\epsilon_i) = \sigma^2$.

Once we have a sample, the $X_i$ are known and the only random terms are the $\epsilon_i$. Recall that for a random variable $Z$ and a constant $a$, we have ${\rm Var}(a+Z) = {\rm Var}(Z)$. Thus, \begin{align} {\rm Var} \left( \sum_{i = 1}^n Y_i \right) &= {\rm Var} \left( \sum_{i = 1}^n (\beta_0 + \beta_1 X_i + \epsilon_i) \right)\\ &= {\rm Var} \left( \sum_{i = 1}^n \epsilon_i \right) = \sum_{i = 1}^n \sum_{j = 1}^n {\rm Cov} (\epsilon_i, \epsilon_j)\\ &= \sum_{i = 1}^n {\rm Cov} (\epsilon_i, \epsilon_i) = \sum_{i = 1}^n {\rm Var} (\epsilon_i)\\ &= \sum_{i = 1}^n {\rm Var} (\beta_0 + \beta_1 X_i + \epsilon_i) = \sum_{i = 1}^n {\rm Var} (Y_i). \end{align} The fourth equality holds since ${\rm Cov} (\epsilon_i, \epsilon_j) = 0$ for $i \neq j$ by the independence of the $\epsilon_i$.
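    A simulation sketch of this fact (Python with numpy; design points, coefficients, $\sigma$, and seed are arbitrary illustrative choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # known design points
    n, sigma, reps = x.size, 2.0, 200_000

    # Independent errors => Var(sum Y_i) should equal n * sigma^2
    eps = rng.normal(0.0, sigma, size=(reps, n))
    y = 1.0 + 3.0 * x + eps

    print(y.sum(axis=1).var())  # simulated Var(sum Y_i)
    print(n * sigma ** 2)       # = sum of the Var(Y_i) = 20
    ```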

    I think I got it! The book has suggested steps, and I was able to prove each step separately (I think). It's not as satisfying as just sitting down and grinding it out from this step, since I had to prove intermediate conclusions for it to help, but I think everything looks good.

    See edit for the development of the suggested approach.

The variance of the sum equals the sum of the variances in this step: $$ {\rm Var} (\bar{Y}) = {\rm Var} \left(\frac{1}{n} \sum_{i = 1}^n Y_i \right) = \frac{1}{n^2} \sum_{i = 1}^n {\rm Var} (Y_i) $$ because the $X_i$ are independent, which implies that the $Y_i$ are independent as well, right?

    Also, you can factor out a constant from the covariance in this step: $$ \frac{1}{n} \frac{ 1 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } {\rm Cov} \left\{ \sum_{i = 1}^n Y_i, \sum_{j = 1}^n(x_j - \bar{x})Y_j \right\} $$ even though it's not in both elements because the formula for covariance is multiplicative, right?

    Is my reasoning correct on those two points? I think it is, but I want to make sure I don't mislearn something. Since I proved the formula for the variance of $\hat{\beta_0}$ a different way, I want to make sure mine lines up with yours.

    Please see edit 2 for your first point.

    For your second point, recall that for random variables $X$ and $Y$, and constants $a$ and $b$, we have ${\rm cov} (aX, bY) = ab\,{\rm cov} (X, Y)$.

    I don't know what you mean by "multiplicative formula". Anyway, I guess I answered this in my previous comment.

    Thanks for the clarification. This is really helpful. (I would vote you up but I don't have enough rep). Was the solution I used correct, as far as you can see, or did it work by chance?

    I'm planning to; I just want to make sure the other solution is actually correct too and didn't just work by chance.

In the accepted answer, at the end you seem to ignore the $\bar{x}^2$ term multiplied by the variance of $\hat{\beta}_1$. I can't find where you account for it. You seem to treat it as merely $\bar{x}$ when simplifying the equation. Could you show the missing steps accounting for this? Otherwise, the equation you conclude with does not make sense.

    @Gabriel, thanks for spotting this typo. The term $\bar{x}^2$ was missing after the first equality for $\mbox{Var}(\hat{\beta}_0)$ at the very end of Edit 1, but the subsequent expressions seem correct.

For the \begin{align} {\rm Var}(\hat{\beta}_0) &= {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) \\ &= {\rm Var} (\bar{Y}) + (\bar{x})^2 {\rm Var} (\hat{\beta}_1) - 2 \bar{x} {\rm Cov} (\bar{Y}, \hat{\beta}_1). \end{align} line, do you have to work out the covariance part manually to show that it is zero, or are the elements independent with zero covariance by definition?

Why is $$\frac{1}{n} \frac{ 1 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } {\rm Cov} \left\{ \sum_{i = 1}^n Y_i, \sum_{j = 1}^n(x_j - \bar{x})Y_j \right\} \\ = \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 } \sum_{i = 1}^n \sum_{j = 1}^n (x_j - \bar{x}) {\rm Cov}(Y_i, Y_j)? $$

@user1603548, this is simply by the bilinearity of covariance: ${\rm Cov} \left( \sum_i U_i, \sum_j a_j V_j \right) = \sum_i \sum_j a_j \, {\rm Cov}(U_i, V_j)$ for constants $a_j$.

If $Var(\bar{y}) = \frac{1}{n} \sum_{i=1}^n Var(y_i)$, which then becomes $\frac{1}{n} Var(\sum_{i=1}^n \epsilon_i)$, why is $Var(\bar{y}) = \sigma^2 / n$ and not $\sigma^2 / n^2$?

    @oort, your starting point is wrong: $\mbox{Var}(\bar{Y}) = n^{-2} \sum_{i = 1}^n \mbox{Var}(Y_i)$, provided the $Y_i$s are independent.

    @QuantIbex, you're absolutely right, a typo on my part, but I'm confused as to where the $n$ in the numerator comes from that cancels out the $n^{-2}$ to become $n^{-1}$

    @oort, in the numerator you have the sum of $n$ terms that are identical (and equal to $\sigma^2$), so the numerator is $n \sigma^2$.

Not to beat a dead horse... but then $Var(\epsilon_i) = \sigma^2$, and not $Var(\epsilon)$ where $\epsilon = \sum_{i=1}^n \epsilon_i$? In ISLR they state $Var(\epsilon) = \sigma^2$, which is why I thought that $Var(\sum_{i=1}^n \epsilon_i)$ was equal to $\sigma^2$.

    I think the last term $n \bar{x}$ in the second equation before Edit 2 should be $n \bar{x}^2$.

I think the last term in the equation below "And since" should be $n \bar{x}^2$ rather than $n \bar{x}$. However, excellent post; I spent days on this until I saw your post.

I'm trying to understand this. Why is $\bar{Y}$ treated as a random variable while $\bar{x}$ is not?
