Machine Learning – Should Dimensionality Be Reduced or Increased?


I had a question about machine learning and dimensionality.

In many machine learning methods, we try to reduce the dimensionality and find a latent space / manifold in which the data can be represented, e.g. neural networks taking in images.

In other methods like SVMs/kernels, we try to find a higher-dimensional space in which we can separate/classify our data.

Why are these two machine learning methods doing completely different things with dimensionality? Or does it instead depend on the task at hand (e.g., images being extremely high-dimensional data, so we want to lower it)? Or am I misunderstanding something? Thanks

Best Answer

SVMs are a kernel method. A kernel corresponds to an inner product in a feature space, which can even be infinite-dimensional (Mercer's theorem). After evaluating data points in this "higher" dimension, we can perform linear inference on them. SVMs specifically use the higher dimensionality to aid in drawing a hyperplane that can separate the data (the kernel trick).

The bad part about this is that we need to maintain a kernel (Gram) matrix $K \in \mathbb{R}^{N \times N}$ and increase its size for every new data point that arrives in our dataset.
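To make this concrete, here is a minimal sketch (my own illustration, using scikit-learn and a toy dataset the answer doesn't mention): an RBF-kernel SVM separates data that no hyperplane can separate in the original 2-D space, and the Gram matrix it rests on is $N \times N$.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Two concentric circles: not linearly separable in the original 2-D space.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# The RBF kernel implicitly maps points into a higher-dimensional feature
# space where a separating hyperplane exists.
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))  # close to 1.0

# The kernel (Gram) matrix is N x N -- it grows with the dataset.
K = rbf_kernel(X, X)
print("Gram matrix shape:", K.shape)  # (200, 200)
```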

Neural networks maintain a fixed dimensionality through stacked, parameterized linear/non-linear functions with a linear output. We then learn the (approximate) maximum-likelihood parameters through gradient descent.
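By contrast, here is a sketch (again my own illustration; the hidden size and dataset are arbitrary choices) showing that a network's parameter count is set by its architecture, not by $N$. Minimizing the log loss by gradient descent, as `MLPClassifier` does, is the maximum-likelihood fit mentioned above.

```python
from sklearn.datasets import make_circles
from sklearn.neural_network import MLPClassifier

for n in (200, 2000):
    X, y = make_circles(n_samples=n, factor=0.3, noise=0.05, random_state=0)
    # A fixed architecture: 2 inputs -> 16 hidden units -> 1 output.
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X, y)
    n_params = (sum(W.size for W in net.coefs_)
                + sum(b.size for b in net.intercepts_))
    print(f"N={n}: parameters={n_params}")  # 65 both times, independent of N
```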

The similarities between NNs and SVMs:

  • Both use combinations of linear/non-linear features of the inputs $\phi(x)$ (see the worked example after this list)
  • Both utilize some adaptation to fit the training data (either in the linear or non-linear functions)
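A concrete way to see the shared $\phi(x)$: for the degree-2 polynomial kernel, $k(x, z) = (x^\top z)^2$ equals an inner product of explicit non-linear features. (The feature map below is my own worked example, not from the answer.)

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for 2-D inputs:
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
print(np.dot(x, z) ** 2)       # kernel value: 16.0
print(np.dot(phi(x), phi(z)))  # same value via explicit features: 16.0
```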

The differences:

  • The SVM kernel matrix grows as you add more training points, and every new inference point must be evaluated against the training points (see the sketch after this list)
  • The NN (once an architecture is decided) keeps the same dimensionality no matter how much the dataset grows.
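The growth difference shows up directly in a small sketch (sizes here are arbitrary choices of mine): kernel-based prediction touches every training point, while a network's forward pass uses a fixed set of weights.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 2))  # N = 500 training points
x_new = rng.normal(size=(1, 2))      # one new inference point

# Kernel-method inference: evaluate k(x_new, x_i) against the training
# points, so the cost scales with N (a 1 x 500 row of kernel values).
k_row = rbf_kernel(x_new, X_train)
print("kernel evaluations per prediction:", k_row.shape[1])  # 500

# NN inference with (untrained, random) weights: a fixed cost set by the
# architecture, independent of N.
W1, b1 = rng.normal(size=(2, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)
y_hat = np.tanh(x_new @ W1 + b1) @ W2 + b2
print("weights used per prediction:",
      W1.size + b1.size + W2.size + b2.size)  # 65, fixed
```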

They do different things with dimensionality because their designs specify how they handle inputs: kernel methods lift the data into a higher-dimensional space to separate it, while neural networks compress it into a fixed-size learned representation. The two behave differently because they are designed differently.