Many authors of papers I read affirm that SVMs are a superior technique for their regression/classification problems, claiming that they couldn't get similar results with NNs. Often the comparison states that
SVMs, unlike NNs,
- Have a strong founding theory
- Reach the global optimum due to quadratic programming
- Pose no issue in choosing a proper number of parameters
- Are less prone to overfitting
- Need less memory to store the predictive model
- Yield more readable results and a geometrical interpretation
Is this seriously a broadly accepted view? Don't quote the No Free Lunch theorem or similar statements; my question is about the practical usage of these techniques.
On the other hand, for which kinds of problems would you definitely reach for an NN?
Best Answer
It is a matter of trade-offs. SVMs are in right now, NNs used to be in. You'll find a rising number of papers claiming that Random Forests, Probabilistic Graphical Models, or Nonparametric Bayesian methods are in. Someone should publish a forecasting model in the Annals of Improbable Research on which models will be considered hip.
Having said that, for many famously difficult supervised problems, the best-performing single models are some type of NN, some type of SVM, or a problem-specific stochastic gradient descent method implemented using signal processing techniques.
Pros of NN:
Pros of SVM:
Fewer hyperparameters. Generally, SVMs require less grid-searching to get a reasonably accurate model; an SVM with an RBF kernel usually performs quite well out of the box.
Global optimum guaranteed, since training reduces to a convex quadratic program.
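To illustrate the "fewer hyperparameters" point, here is a minimal sketch using scikit-learn (assumed installed): an RBF-kernel SVM needs only `C` and `gamma` tuned, versus architecture, learning rate, regularization, and initialization for a typical NN. The dataset and grid values are just illustrative choices.

```python
# Minimal sketch: grid-searching only two hyperparameters (C, gamma)
# for an RBF-kernel SVM on a toy dataset. Assumes scikit-learn is available.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The entire search space: two parameters, a handful of values each.
param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.1, 1.0]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=3)
search.fit(X_train, y_train)

print(search.best_params_)
print(search.score(X_test, y_test))
```

Because the underlying optimization is convex, rerunning the fit with the same data and parameters gives the same model, which is part of why the search stays cheap.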
Cons of NN and SVM: