Solved – Trouble with kernel in kernlab R package

Tags: kernel trick, r, svm

I'm using the kernlab package.

Here are two examples:
First:

library(kernlab)
x <- runif(1020, 1, 5000)   # 1020 points drawn uniformly from [1, 5000]
y <- sqrt(x)
model.vanilla <- rvm(x, y, kernel = 'vanilladot')   # linear (vanilla) kernel

I get this error:

Error in chol.default(crossprod(Kr)/var + diag(1/thetatmp)) :
the leading minor of order 2 is not positive definite

Second:

library(kernlab)
x <- runif(1020, 1, 5000)
y <- sqrt(x)
model.rbf <- rvm(x[1:1000], y[1:1000], kernel = 'rbfdot')   # Gaussian RBF kernel
print(model.rbf)
py.rbf <- predict(model.rbf, x[1001:1020])                  # predict on the 20 held-out points
print(paste("MSE: ", sum((py.rbf - y[1001:1020]) ^ 2) / length(py.rbf)))

This one works and prints:

Using automatic sigma estimation (sigest) for RBF or laplace kernel 
Relevance Vector Machine object of class "rvm" 
Problem type: regression 

Gaussian Radial Basis kernel function. 
 Hyperparameter : sigma =  5.44268665122008e-06 

Number of Relevance Vectors : 247 
Variance :  4.368e-06
Training error : 3.418e-06 
[1] "MSE:  4.921706631013e-05"

Why doesn't the linear kernel work here? polydot (the polynomial kernel function) doesn't work either.
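For reference, my polydot attempt looks roughly like this (the kpar values shown are just one example configuration, not necessarily the exact ones I used):

library(kernlab)
x <- runif(1020, 1, 5000)
y <- sqrt(x)
# polynomial kernel; degree/scale/offset below are only an illustrative choice
model.poly <- rvm(x, y, kernel = 'polydot',
                  kpar = list(degree = 2, scale = 1, offset = 1))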

Can this be fixed?

Best Answer

Evidently, your data is too sparse for that combination of method and kernel. If you change your

x <- runif(1020, 1, 5000)

to either of

x <- runif(10200, 1, 5000)
x <- runif(1020, 1, 100)

it works, at least for me.
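For example, plugging the second variant into the train/test split from your question should then run without the Cholesky error; a minimal sketch (I have only checked the fit on the full data, so treat the split version as a starting point):

library(kernlab)
x <- runif(1020, 1, 100)    # narrower range, as suggested above
y <- sqrt(x)
# linear kernel now fits without the "leading minor ... not positive definite" error
model.vanilla <- rvm(x[1:1000], y[1:1000], kernel = 'vanilladot')
py.vanilla <- predict(model.vanilla, x[1001:1020])
print(paste("MSE: ", sum((py.vanilla - y[1001:1020]) ^ 2) / length(py.vanilla)))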