Squared exponential kernel with Manhattan distance does not result in positive semi-definite matrix

covariance, metric-spaces, positive-semidefinite

Here it is stated that the squared exponential covariance function
$$C(d) = e^{-(\frac{d}{V})^2},$$
where $V$ is a scaling parameter and $d$ is a distance between two points, is a stationary covariance function with smooth sample paths.
When I use the Manhattan distance or the Chebyshev distance for a specific set of points, I get a matrix
$$K_{i,j} = C(d(x_i, x_j))$$
which is not positive semi-definite. In other words, not all of its eigenvalues are non-negative.
Is the squared exponential kernel a PSD function only when the Euclidean distance is used, or should this hold for all valid metrics? Moreover, what does it mean for a covariance matrix not to be PSD (which, by definition, would mean it is not a covariance matrix…)?
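For concreteness, here is a minimal NumPy sketch that reproduces the failure. The four corner points of the unit square and the length scale $V = \sqrt{10}$ are my own choices, not from the question above; at this scale the Manhattan-distance kernel matrix acquires a negative eigenvalue, while the Euclidean one stays PSD:

```python
import numpy as np

# Four corners of the unit square (hypothetical example points)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.sqrt(10.0)  # length scale chosen so the defect shows up

def kernel(X, metric, V):
    """K[i, j] = exp(-(d(x_i, x_j) / V)^2) for a given metric d."""
    n = len(X)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-(metric(X[i], X[j]) / V) ** 2)
    return K

manhattan = lambda a, b: np.abs(a - b).sum()
euclidean = lambda a, b: np.linalg.norm(a - b)

K1 = kernel(X, manhattan, V)
K2 = kernel(X, euclidean, V)

print(np.linalg.eigvalsh(K1).min())  # ≈ -0.139 -> not PSD
print(np.linalg.eigvalsh(K2).min())  # ≈ +0.009 -> PSD
```

The negative eigenvalue belongs to the eigenvector $(1, -1, -1, 1)$: its eigenvalue is $1 - 2e^{-1/10} + e^{-4/10} \approx -0.139$, because the two diagonals of the square have Manhattan length $2$ while the sides have length $1$.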

Best Answer

I found a sufficient answer in other posts:

Gaussian kernels for arbitrary metric spaces

Is the exponential of −d a positive definite kernel for a general metric space (X,d)?

For further details on which metric spaces result in PSD kernels of this type, I highly recommend taking a look at the following papers:

Open Problem: Kernel methods on manifolds and metric spaces

Geodesic exponential kernels: When Curvature and Linearity Conflict
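In brief, the criterion behind those references goes back to Schoenberg's classical theorem (stated here from memory for orientation): for a metric space $(X, d)$,
$$e^{-(d(x,y)/V)^2} \text{ is PSD for every } V > 0 \iff d^2 \text{ is conditionally negative definite} \iff (X, d) \text{ embeds isometrically into a Hilbert space.}$$
Euclidean space satisfies this, while $(\mathbb{R}^n, \ell_1)$ for $n \ge 2$ does not, which is why the matrices in the question can have negative eigenvalues.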