Covariance and independence?

  • I read from my textbook that $\text{cov}(X,Y)=0$ does not guarantee X and Y are independent. But if they are independent, their covariance must be 0. I could not think of any proper example yet; could someone provide one?

    You might also enjoy a quick review of Anscombe's Quartet, which illustrates some of the many different ways in which a particular nonzero covariance can be realized by a bivariate dataset.

The thing to note is that covariance is a measure of linear association. Calculating the covariance answers the question 'Do the data follow a straight-line pattern?' If they do, then the variables are dependent. BUT this is only one way in which data can be dependent. It's like asking 'Am I driving recklessly?' One question might be 'Are you travelling 25 mph over the speed limit?' But that isn't the only way to drive recklessly. Another question could be 'Are you drunk?', and so on. There is more than one way to drive recklessly.

This so-called measure of linearity gives structure to the relationship. What is important is that the relationship can be nonlinear, which is not uncommon. In practice a sample covariance is generally not exactly zero; exact zero is an idealized case. Note also that the covariance indicates magnitude in the units of the data, not a dimensionless ratio (that is what the correlation provides).
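A quick numerical illustration of the point above (a sketch, not from the original answers): take $X$ symmetric around zero and $Y = X^2$, a deterministic (hence maximally dependent) but nonlinear function of $X$. The covariance is zero because $\mathbb{E}[XY] = \mathbb{E}[X^3] = 0$ and $\mathbb{E}[X] = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# X is uniform on [-1, 1], so symmetric around 0.
# Y = X^2 is completely determined by X (strong dependence),
# yet the relationship is nonlinear, so the covariance vanishes.
x = rng.uniform(-1.0, 1.0, size=100_000)
y = x ** 2

sample_cov = np.cov(x, y)[0, 1]  # close to 0 despite total dependence
```

The sample covariance hovers near zero (up to Monte Carlo noise), even though $Y$ is a function of $X$, because covariance only detects the linear part of the relationship.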

• jpillow (Correct answer, 9 years ago)

    Easy example: Let $X$ be a random variable that is $-1$ or $+1$ with probability 0.5. Then let $Y$ be a random variable such that $Y=0$ if $X=-1$, and $Y$ is randomly $-1$ or $+1$ with probability 0.5 if $X=1$.

    Clearly $X$ and $Y$ are highly dependent (since knowing $Y$ allows me to perfectly know $X$), but their covariance is zero: They both have zero mean, and

$$\begin{aligned} \mathbb{E}[XY] &= (-1)\cdot 0\cdot P(X=-1) \\ &\quad + 1\cdot 1\cdot P(X=1,Y=1) \\ &\quad + 1\cdot(-1)\cdot P(X=1,Y=-1) \\ &= 0. \end{aligned}$$

    Or more generally, take any distribution $P(X)$ and any $P(Y|X)$ such that $P(Y=a|X) = P(Y=-a|X)$ for all $X$ (i.e., a joint distribution that is symmetric around the $x$ axis), and you will always have zero covariance. But you will have non-independence whenever $P(Y|X) \neq P(Y)$; i.e., the conditionals are not all equal to the marginal. Or ditto for symmetry around the $y$ axis.

License under CC-BY-SA with attribution


Content dated before 6/26/2020 9:53 AM