A well-known fact about jointly normally distributed random variables is that they are independent if and only if their covariance is zero. In one direction, this statement is trivial: *any* independent pair of random variables has zero covariance (assuming that they are integrable, so that the covariance is well defined). The strength of the statement is in the other direction. In general, knowing the value of the covariance tells us little about the joint distribution, so the fact that, for jointly normal variables, it determines independence is a rather strong statement.

**Theorem 1** *A jointly normal pair of random variables is independent if and only if their covariance is zero.*

*Proof:* Suppose that $X,Y$ are jointly normal, with $X\sim N(\mu,\sigma^2)$ and $Y\sim N(\nu,\tau^2)$, and that their covariance is $c$. Then, the characteristic function of $(X,Y)$ can be computed as

$$\begin{aligned}
\mathbb{E}\left[e^{iaX+ibY}\right] &= \exp\left(ia\mu+ib\nu-\frac{1}{2}\left(a^2\sigma^2+2abc+b^2\tau^2\right)\right)\\
&= \mathbb{E}\left[e^{iaX}\right]\mathbb{E}\left[e^{ibY}\right]e^{-abc}
\end{aligned}$$

for all $a,b\in\mathbb{R}$. It is standard that the joint characteristic function of a pair of random variables is equal to the product of their marginal characteristic functions if and only if they are independent, which, by the identity above, holds precisely when the covariance $c$ is zero. ⬜
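The algebra behind the factorization can be double-checked symbolically. The following Python/sympy sketch (my own illustration, not part of the original post) verifies that the exponent of the joint characteristic function differs from the sum of the marginal exponents by exactly $-abc$, so the characteristic function factors if and only if $c=0$:

```python
# Symbolic check that the joint characteristic function of a jointly normal
# pair factors into the product of the marginals exactly when c = 0.
import sympy as sp

a, b, c = sp.symbols("a b c", real=True)
mu, nu = sp.symbols("mu nu", real=True)
sigma, tau = sp.symbols("sigma tau", positive=True)

# Exponent of the joint characteristic function E[exp(iaX + ibY)].
e_joint = sp.I*(a*mu + b*nu) \
    - sp.Rational(1, 2)*(a**2*sigma**2 + 2*a*b*c + b**2*tau**2)

# Exponents of the marginal characteristic functions E[exp(iaX)], E[exp(ibY)].
e_X = sp.I*a*mu - sp.Rational(1, 2)*a**2*sigma**2
e_Y = sp.I*b*nu - sp.Rational(1, 2)*b**2*tau**2

# The joint exponent exceeds the sum of the marginal exponents by -abc,
# which vanishes identically (for all a, b) if and only if c = 0.
correction = sp.expand(e_joint - (e_X + e_Y))
print(correction)  # -a*b*c
```

Since the correction term is $-abc$, the factorization holds for all $a,b$ exactly when $c=0$, which is the content of the proof above.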

To demonstrate that joint normality is a necessary condition here, consider the example from the previous post.

**Example 1** *A pair of standard normal random variables $X,Y$ which have zero covariance, but whose sum $X+Y$ is not normal.*

As their sum is not normal, $X$ and $Y$ cannot be independent. This example was constructed by setting $Y = X\,1_{\{\lvert X\rvert\le K\}} - X\,1_{\{\lvert X\rvert > K\}}$ for some fixed $K > 0$, which is standard normal whenever $X$ is. As explained in the previous post, the intermediate value theorem ensures that there is a unique value of $K$ making the covariance equal to zero.
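To make this concrete, here is a short numerical sketch in Python (my own illustration, not code from the post), assuming the construction $Y = X$ on $\{\lvert X\rvert\le K\}$ and $Y = -X$ on $\{\lvert X\rvert > K\}$. It locates the zero-covariance value of $K$ by root-finding, as the intermediate value theorem argument suggests, and checks by simulation that $X+Y$ has an atom at zero, so it cannot be normal:

```python
# Numerical sketch of Example 1, assuming Y = X on {|X| <= K}, Y = -X otherwise.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def covariance(K):
    # Cov(X, Y) = E[X^2; |X| <= K] - E[X^2; |X| > K] for standard normal X,
    # which evaluates in closed form to 4*Phi(K) - 3 - 4*K*phi(K).
    return 4*norm.cdf(K) - 3 - 4*K*norm.pdf(K)

# covariance(K) increases continuously from -1 (at K = 0) towards +1 as
# K -> infinity, so the intermediate value theorem gives a unique root.
K = brentq(covariance, 0.1, 5.0)
print(round(K, 2))  # 1.54

# Monte Carlo check: the empirical covariance is near zero, yet
# X + Y = 2X on {|X| <= K} and 0 otherwise, so the sum has an atom at 0.
rng = np.random.default_rng(0)
X = rng.standard_normal(10**6)
Y = np.where(np.abs(X) <= K, X, -X)
print(abs(np.mean(X*Y)) < 0.01)  # empirical covariance is near zero
print(np.mean(X + Y == 0))       # positive probability mass at zero
```

The atom at zero (of mass $P(\lvert X\rvert > K) \approx 0.12$) is what rules out normality of the sum, even though both marginals are standard normal.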
