Solved – Distance measure between two multivariate normal distributions (with differing means and covariances)

covariance-matrix, distance, normal-distribution

I have two point observations in space, each consisting of a 3D position estimate and a cigar-shaped 3×3 covariance matrix, and I am checking the hypothesis that both observations actually refer to one and the same point. So I would like to calculate how well the two observations agree with this assumption.

A search brings up the Bhattacharyya distance or the Kullback–Leibler divergence as candidates. I am not looking for the most correct estimate, but rather for an easy-to-implement function that takes two positions and two 3×3 covariance matrices and returns a percentage or a distance in standard deviations.
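For reference, a function with exactly that interface can be built from the Mahalanobis distance alone: under the same-point hypothesis the difference of the two position estimates is approximately N(0, C1 + C2), so its Mahalanobis length is the separation in standard deviations, and the squared length follows a chi-squared distribution with 3 degrees of freedom, whose upper tail probability can be reported as a percentage. A minimal sketch, assuming Eigen for the linear algebra (the function names are illustrative, not taken from the answer below):

#include <cmath>
#include <Eigen/Dense>

// Separation of two point estimates in standard deviations. Under the
// same-point hypothesis, the difference pos1 - pos2 is ~N(0, cov1 + cov2),
// so its Mahalanobis length is the distance in "sigmas".
double separationInSigmas(const Eigen::Vector3d& pos1, const Eigen::Matrix3d& cov1,
                          const Eigen::Vector3d& pos2, const Eigen::Matrix3d& cov2)
{
    const Eigen::Vector3d d = pos1 - pos2;
    const Eigen::Vector3d y = (cov1 + cov2).ldlt().solve(d);  // (cov1 + cov2)^-1 * d
    return std::sqrt(d.dot(y));
}

// p-value of the squared separation under a chi-squared distribution with
// 3 degrees of freedom; its CDF has the closed form
// F(x; 3) = erf(sqrt(x/2)) - sqrt(2x/pi) * exp(-x/2).
// A small result means the same-point hypothesis fits the data poorly.
double samePointPValue(double squaredSigmas)
{
    const double pi  = 3.14159265358979323846;
    const double cdf = std::erf(std::sqrt(squaredSigmas / 2.0))
                     - std::sqrt(2.0 * squaredSigmas / pi) * std::exp(-squaredSigmas / 2.0);
    return 1.0 - cdf;
}

The LDLT solve avoids forming an explicit matrix inverse, and the closed-form chi-squared CDF for 3 degrees of freedom keeps the sketch dependency-free.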

Here are some similar threads:

Mahalanobis distance between two bivariate distributions with different covariances

Measures of similarity or distance between two covariance matrices

Best Answer

In the end I went for the Bhattacharyya distance. I adapted the R code referenced here:

// In the following, Vec3 and Mat3 are C++ Eigen types:
#include <cmath>        // std::log, std::sqrt
#include <Eigen/Dense>

using Vec3 = Eigen::Vector3d;
using Mat3 = Eigen::Matrix3d;

/// Squared Mahalanobis distance of the difference vector `dist` under
/// covariance `cov`.
/// See: https://en.wikipedia.org/wiki/Mahalanobis_distance
double mahalanobis(const Vec3& dist, const Mat3& cov)
{
    // dist^T * cov^-1 * dist; .eval()(0) extracts the scalar from the 1x1 result.
    return (dist.transpose() * cov.inverse() * dist).eval()(0);
}

/// Bhattacharyya distance between N(mu1, cov1) and N(mu2, cov2),
/// where dist = mu1 - mu2.
/// See: https://en.wikipedia.org/wiki/Bhattacharyya_distance
double bhattacharyya(const Vec3& dist, const Mat3& cov1, const Mat3& cov2)
{
    const Mat3 cov = (cov1 + cov2) / 2;
    const double d1 = mahalanobis(dist, cov) / 8;
    const double d2 = std::log(cov.determinant() /
                               std::sqrt(cov1.determinant() * cov2.determinant())) / 2;
    return d1 + d2;
}
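To turn this into the percentage-like number asked for in the question, the distance can be mapped to the Bhattacharyya coefficient BC = exp(-D_B), a similarity score between 0 (no overlap) and 1 (identical distributions). A small usage sketch with made-up inputs, reusing Vec3, Mat3 and bhattacharyya from above:

#include <iostream>

int main()
{
    // Two hypothetical observations of (possibly) the same point.
    const Vec3 mu1(1.0, 2.0, 3.0);
    const Vec3 mu2(1.2, 1.9, 3.4);
    Mat3 cov1;  cov1 << 0.10, 0.01, 0.00,
                        0.01, 0.15, 0.02,
                        0.00, 0.02, 0.80;   // elongated ("cigar") along z
    Mat3 cov2;  cov2 << 0.20, 0.00, 0.05,
                        0.00, 0.10, 0.00,
                        0.05, 0.00, 0.60;

    const double db = bhattacharyya(mu1 - mu2, cov1, cov2);
    const double bc = std::exp(-db);   // Bhattacharyya coefficient in [0, 1]

    std::cout << "Bhattacharyya distance:    " << db << "\n"
              << "Bhattacharyya coefficient: " << bc * 100 << " %\n";
    return 0;
}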