What's the difference between variance and standard deviation?
I was wondering what the difference between the variance and the standard deviation is.
If you calculate the two values, it is clear that you get the standard deviation out of the variance, but what does that mean in terms of the distribution you are observing?
Furthermore, why do you really need a standard deviation?
You probably got the answer by now. Still, this link has one of the simplest and best explanations: http://www.mathsisfun.com/data/standard-deviation.html
Standard deviation is useful because the value is on the same scale as the data from which it was computed. If measuring in meters, the standard deviation will be in meters. Variance, in contrast, will be in meters squared.
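A minimal sketch of the relationship, using a hypothetical sample of heights in meters (the values are made up for illustration): the variance comes out in m&sup2;, and taking its square root brings it back to meters.

```python
import math

# Hypothetical sample of adult heights in meters (illustrative values only).
heights_m = [1.62, 1.75, 1.68, 1.81, 1.70]

n = len(heights_m)
mean = sum(heights_m) / n

# Population variance: average squared deviation from the mean (units: m^2).
variance = sum((x - mean) ** 2 for x in heights_m) / n

# Standard deviation: square root of the variance (units: m, same as the data).
std_dev = math.sqrt(variance)

print(f"mean     = {mean:.4f} m")
print(f"variance = {variance:.6f} m^2")
print(f"std dev  = {std_dev:.4f} m")
```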
The standard deviation is the square root of the variance.
The standard deviation is expressed in the same units as the mean is, whereas the variance is expressed in squared units, but for looking at a distribution, you can use either just so long as you are clear about what you are using. For example, a Normal distribution with mean = 10 and sd = 3 is exactly the same thing as a Normal distribution with mean = 10 and variance = 9.
Yeah, that's the mathematical way to explain these two parameters, BUT what's the logical explanation? Why do I really need two parameters to show the same thing (the deviation around the arithmetic mean)?
You don't really need both. If you report one, you don't need to report the other.
We need both: the standard deviation is good for interpretation and reporting; for developing the theory, the variance is better.
For reporting purposes, you don't need both. For developing theory, I agree you need the variance, but that doesn't seem to be what this is about. (Anyone who is developing statistical theory would know that they need the variance).
The benefit of reporting the standard deviation is that it stays on the scale of the data. Say a sample of adult heights is measured in meters; then the standard deviation will also be in meters.
2 years late but @kjetilbhalvorsen, could you please explain why variance would be better for developing theory? (A one/two line answer would be perfect)
@RushatRai When dealing with sums of random variables, variances get added together. For independent random variables, $Var(\sum X_i) = \sum Var(X_i)$. A similar expression exists in the general case without independence (with a correction using covariance terms). In general, the square root transformation complicates things and makes standard deviation more difficult to work with analytically.
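The additivity claim above is easy to check by simulation. A small sketch (the uniform distributions are arbitrary choices for illustration): for independent X and Y, the sample variance of X + Y matches Var(X) + Var(Y) up to simulation noise, while standard deviations do not add.

```python
import random
import statistics

random.seed(0)  # reproducible illustrative simulation

# Two independent random variables: X ~ Uniform(0, 1), Y ~ Uniform(0, 2).
n = 200_000
xs = [random.uniform(0, 1) for _ in range(n)]
ys = [random.uniform(0, 2) for _ in range(n)]
sums = [x + y for x, y in zip(xs, ys)]

var_x = statistics.pvariance(xs)
var_y = statistics.pvariance(ys)
var_sum = statistics.pvariance(sums)

# For independent X and Y, Var(X + Y) = Var(X) + Var(Y).
print(f"Var(X) + Var(Y) = {var_x + var_y:.4f}")
print(f"Var(X + Y)      = {var_sum:.4f}")

# Standard deviations do NOT add: sd(X + Y) != sd(X) + sd(Y) in general.
sd_x = statistics.pstdev(xs)
sd_y = statistics.pstdev(ys)
sd_sum = statistics.pstdev(sums)
print(f"sd(X) + sd(Y) = {sd_x + sd_y:.4f}  vs  sd(X + Y) = {sd_sum:.4f}")
```

This is exactly why the variance is the convenient object in theory: it behaves linearly under sums of independent variables, while the square root destroys that linearity.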