### Variance of product of multiple random variables

• We know the answer for two independent variables: $${\rm Var}(XY) = E(X^2Y^2) - (E(XY))^2={\rm Var}(X){\rm Var}(Y)+{\rm Var}(X)(E(Y))^2+{\rm Var}(Y)(E(X))^2$$

However, if we take the product of more than two variables, ${\rm Var}(X_1X_2 \cdots X_n)$, what would the answer be in terms of variances and expected values of each variable?
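The two-variable identity can be sanity-checked exactly on small discrete distributions, since every expectation is a finite sum. The distributions below are hypothetical, chosen only for illustration:

```python
import itertools

# Two hypothetical independent discrete distributions as (value, probability) pairs.
X = [(1, 0.5), (3, 0.5)]
Y = [(2, 0.25), (4, 0.75)]

def E(dist):
    """Expected value of a finite discrete distribution."""
    return sum(v * p for v, p in dist)

def var(dist):
    """Variance via E[Z^2] - (E[Z])^2."""
    return E([(v * v, p) for v, p in dist]) - E(dist) ** 2

# Left side: Var(XY), by enumerating the joint distribution (independence
# means joint probabilities are just products of marginals).
XY = [(x * y, px * py) for (x, px), (y, py) in itertools.product(X, Y)]
lhs = var(XY)

# Right side: Var(X)Var(Y) + Var(X)(E(Y))^2 + Var(Y)(E(X))^2
rhs = var(X) * var(Y) + var(X) * E(Y) ** 2 + var(Y) * E(X) ** 2

print(lhs, rhs)  # → 16.0 16.0
```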

Because $X_1X_2\cdots X_{n-1}$ is a random variable and (assuming all the $X_i$ are independent) it is independent of $X_n$, the answer is obtained inductively: nothing new is needed. Lest this seem too mysterious, the technique is no different than pointing out that since you can add two numbers with a calculator, you can add $n$ numbers with the same calculator just by repeated addition.
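The inductive argument translates directly into code: fold the two-variable identity over a list of (mean, variance) pairs, treating the running product as the "first" variable at each step. This is a minimal sketch assuming all variables are independent; `product_var` is a hypothetical helper name:

```python
def product_var(moments):
    """Mean and variance of a product of independent random variables,
    given as a list of (mean, variance) pairs, by repeatedly applying
    Var(PX) = Var(P)Var(X) + Var(P)(E(X))^2 + Var(X)(E(P))^2,
    where P is the running product and X the next factor."""
    m, v = moments[0]
    for mi, vi in moments[1:]:
        # Update the variance first, using the *old* mean of the product.
        v = v * vi + v * mi ** 2 + vi * m ** 2
        m = m * mi
    return m, v

print(product_var([(2.0, 1.0), (3.5, 0.75), (0.5, 2.25)]))  # → (3.5, 150.25)
```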

Could you write out a _proof_ of your displayed equation? I am curious to find out what happened to the $(E[XY])^2$ term which _should_ give you some terms involving $\operatorname{cov}(X,Y)$.

@DilipSarwate, I suspect this question tacitly assumes $X$ and $Y$ are independent. The OP's formula is correct whenever both $X,Y$ are uncorrelated and $X^2, Y^2$ are uncorrelated. See my answer to a related question here.

@Macro I am well aware of the points that you raise. What I was trying to get the OP to understand and/or figure out for himself/herself was that for _independent_ random variables, just as $E[X^2Y^2]$ simplifies to $$E[X^2Y^2]=E[X^2]E[Y^2]=(\sigma_X^2+\mu_X^2)(\sigma_Y^2+\mu_Y^2),$$ $E[(X_1\cdots X_n)^2]$ simplifies to $$E[(X_1\cdots X_n)^2]=E[X_1^2]\cdots E[X_n^2]=\prod_{i=1}^n(\sigma_{X_i}^2+\mu_{X_i}^2)$$ which I think is a more direct way of getting to the end result than the inductive method that whuber pointed out.

@DilipSarwate, nice. I suggest you post that as an answer so I can upvote it!

@Macro OK. I have added a few words of explanation too.

@Dilip Your "more direct way" implicitly uses induction. I find it *less* direct insofar as it works with expectations of squares rather than variances, and therefore requires subsequent algebraic manipulations (which, once again, are implicit inductions).

• I will assume that the random variables $X_1, X_2, \cdots , X_n$ are independent, a condition the OP has not included in the problem statement. With this assumption, we have that \begin{align} \operatorname{var}(X_1\cdots X_n) &= E[(X_1\cdots X_n)^2]-\left(E[X_1\cdots X_n]\right)^2\\ &= E[X_1^2\cdots X_n^2]-\left(E[X_1]\cdots E[X_n]\right)^2\\ &= E[X_1^2]\cdots E[X_n^2] - (E[X_1])^2\cdots (E[X_n])^2\\ &= \prod_{i=1}^n \left(\operatorname{var}(X_i)+(E[X_i])^2\right) - \prod_{i=1}^n \left(E[X_i]\right)^2 \end{align} If the first product term above is multiplied out, one of the terms in the expansion cancels the second product term. Thus, for the case $n=2$, we have the result stated by the OP. As @Macro points out, for $n=2$, we need not assume that $X_1$ and $X_2$ are independent: the weaker condition that $X_1$ and $X_2$ are uncorrelated and $X_1^2$ and $X_2^2$ are uncorrelated suffices. But for $n \geq 3$, lack of correlation is not enough. Independence suffices, but is not necessary. What is required is the factoring of the expectation of products shown above into products of expectations, which independence guarantees.
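The closed form $\prod_{i=1}^n(\sigma_{X_i}^2+\mu_{X_i}^2) - \prod_{i=1}^n \mu_{X_i}^2$ can be checked against brute-force enumeration of the joint distribution. The three independent discrete distributions below are hypothetical, chosen only to make every expectation an exact finite sum:

```python
import itertools
import math

# Three hypothetical independent discrete distributions, (value, probability) pairs.
dists = [
    [(1, 0.5), (3, 0.5)],
    [(2, 0.25), (4, 0.75)],
    [(-1, 0.5), (2, 0.5)],
]

def E(dist):
    """Expected value of a finite discrete distribution."""
    return sum(v * p for v, p in dist)

def var(dist):
    """Variance via E[Z^2] - (E[Z])^2."""
    return E([(v * v, p) for v, p in dist]) - E(dist) ** 2

# Left side: Var(X1 X2 X3) by enumerating the joint distribution, whose
# probabilities factor because the variables are independent.
joint = [
    (x * y * z, px * py * pz)
    for (x, px), (y, py), (z, pz) in itertools.product(*dists)
]
lhs = var(joint)

# Right side: prod(var_i + mean_i^2) - prod(mean_i^2)
rhs = math.prod(var(d) + E(d) ** 2 for d in dists) - math.prod(E(d) ** 2 for d in dists)

print(lhs, rhs)  # → 150.25 150.25
```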

thanks a lot! I really appreciate it. Yes, the question was for independent random variables.

Is it also possible to do the same thing for dependent variables? I am trying to figure out what would happen to the variance if $X_1=X_2=\cdots=X_n=X$. Can we derive a variance formula in terms of the variance and expected value of $X$?

Dilip, is there a generalization to an arbitrary $n$ number of variables that are not independent? (This is a different question than the one asked by damla in their new question, which is about the variance of arbitrary powers of a single variable.)

@Alexis To the best of my knowledge, there is no generalization to non-independent random variables, not even, as pointed out already, for the case of $3$ random variables.

@Alexis I withdraw my comment above. See this answer for the case of correlated random variables.