
Computing the product of Gaussian distributions

Suppose you have $K$ multivariate Gaussian distributions, each of dimensionality $D$. It turns out that the product of these distributions, after normalization, is also multivariate Gaussian. What is the algorithmic complexity of computing this product?

Let's first assume that each of the Gaussian distributions is parameterized by mean $\mu_k$ and covariance $\Sigma_k$. The product is proportional to a Gaussian with mean $\mu$ and covariance $\Sigma$, where

$$\Sigma = \left( \sum_{k=1}^K \Sigma_k^{-1} \right)^{-1}, \qquad \mu = \Sigma \sum_{k=1}^K \Sigma_k^{-1} \mu_k.$$

Assuming we have memory to store and reuse all intermediate results, we will need to perform $K + 1$ matrix inversions (one per $\Sigma_k$, plus one for their sum) and to solve $K$ linear systems. Thus, the runtime complexity is $O(K D^3)$.
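
To make the operation count concrete, here is a minimal sketch in NumPy (my own illustration, not code from this post; the function name `product_of_gaussians_cov` is hypothetical) of the covariance-parameterized computation:

```python
import numpy as np

def product_of_gaussians_cov(mus, Sigmas):
    """mus: (K, D) means; Sigmas: (K, D, D) covariances of the K factors.
    Returns the mean and covariance of the normalized product."""
    # K inversions, O(K D^3): the precision of each factor.
    Sigma_invs = np.linalg.inv(Sigmas)                  # (K, D, D)
    # One more O(D^3) inversion for the product's covariance.
    Sigma = np.linalg.inv(Sigma_invs.sum(axis=0))       # (D, D)
    # K linear solves for Sigma_k^{-1} mu_k, another O(K D^3).
    weighted = np.linalg.solve(Sigmas, mus[..., None])  # (K, D, 1)
    mu = Sigma @ weighted.sum(axis=0)[:, 0]             # (D,)
    return mu, Sigma
```

The batched inversions and solves each cost $O(K D^3)$, which dominates the $O(K D^2)$ accumulation steps, matching the complexity above.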

Now, let's instead assume that each of the Gaussian distributions is parameterized by mean $\mu_k$ and precision (i.e. inverse covariance) $\Lambda_k$. The product is proportional to a Gaussian with mean $\mu$ and precision $\Lambda$, where

$$\Lambda = \sum_{k=1}^K \Lambda_k, \qquad \mu = \Lambda^{-1} \sum_{k=1}^K \Lambda_k \mu_k.$$

Here, we only need to perform $K$ matrix-vector products and solve one linear system, so the runtime complexity is $O(K D^2 + D^3)$. But this analysis understates the likely speedup from using precision matrices rather than covariance matrices. Precision matrices are often sparse even when their inverses are dense; in such cases this latter approach is faster and requires much less memory.
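
A corresponding sketch under the precision parameterization, again with a hypothetical function name and assuming SciPy's sparse module, shows why sparsity is preserved: the product's precision is just a sum, and the only $O(D^3)$-ish work is a single sparse solve for the mean.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def product_of_gaussians_prec(mus, Lambdas):
    """mus: list of K length-D means; Lambdas: list of K sparse D x D precisions.
    Returns the mean and the (sparse) precision of the normalized product."""
    # The product's precision is just a sum: no inversions, sparsity preserved.
    Lambda = sum(Lambdas)
    # K (sparse) matrix-vector products for sum_k Lambda_k mu_k.
    rhs = sum(L @ m for L, m in zip(Lambdas, mus))
    # One linear solve for the mean; Lambda is never explicitly inverted.
    mu = spla.spsolve(sp.csc_matrix(Lambda), rhs)
    return mu, Lambda
```

If the $\Lambda_k$ are dense NumPy arrays instead, the same code (with `np.linalg.solve` in place of `spsolve`) realizes the $O(K D^2 + D^3)$ cost; with sparse precisions, both the sum and the solve are typically far cheaper.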

#ML