Solved – Theoretical: Minimum Number of Support Vectors

machine-learning, svm

I wanted to check whether I am understanding the concept of support vectors correctly. Let's say I am not using any kernel, and it is a hard-margin SVM.

In this case, whatever the number of feature dimensions, the minimum possible number of support vectors is 2 (one for the + class, one for the − class). Am I correct? Do we still need more support vectors even in such an ideal case in $\mathbb{R}^{d}$?

Best Answer

Yes. The minimum number of support vectors is two for your scenario. You don't need more than two here.
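To make the geometry explicit (a sketch, with $x_+$ and $x_-$ denoting the two support vectors, notation introduced here): when exactly one point per class touches the margin, the maximum-margin hyperplane is the perpendicular bisector of the segment joining them,

$$
w = \frac{2\,(x_+ - x_-)}{\lVert x_+ - x_- \rVert^2}, \qquad
b = -\,\frac{(x_+ - x_-)^\top (x_+ + x_-)}{\lVert x_+ - x_- \rVert^2},
$$

so that $w^\top x_+ + b = +1$, $w^\top x_- + b = -1$, and the margin is $\lVert x_+ - x_- \rVert / 2$. This hyperplane is optimal provided every other training point $(x_i, y_i)$ satisfies $y_i(w^\top x_i + b) \ge 1$.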

In a hard-margin SVM, all of the support vectors lie exactly on the margin boundaries. Regardless of the number of dimensions or the size of the data set, the number of support vectors can be as few as 2: one point from each class, with the separating hyperplane as the perpendicular bisector of the segment joining them.
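To check this empirically, here is a minimal sketch (assuming scikit-learn and NumPy; the dimension, point counts, and dataset construction are illustrative choices, not from the original). A hard-margin SVM is approximated with a linear kernel and a very large $C$, and the data are built so that exactly one point per class sits on a margin boundary while every other point is strictly farther from the separating hyperplane:

```python
# A minimal sketch: approximate a hard-margin SVM with a linear kernel
# and a very large C, then count the support vectors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
d = 5          # feature dimension (arbitrary illustrative choice)
n_extra = 20   # additional non-support points per class

# One point per class exactly on the margin boundaries x1 = +1 and x1 = -1.
anchor_pos = np.zeros(d)
anchor_pos[0] = 1.0
anchor_neg = -anchor_pos

# Extra points with |x1| >= 2, so they are strictly farther from the
# hyperplane x1 = 0 than the anchors and cannot become support vectors.
extra_pos = rng.normal(scale=0.1, size=(n_extra, d))
extra_pos[:, 0] = rng.uniform(2.0, 4.0, size=n_extra)
extra_neg = rng.normal(scale=0.1, size=(n_extra, d))
extra_neg[:, 0] = rng.uniform(-4.0, -2.0, size=n_extra)

X = np.vstack([anchor_pos, extra_pos, anchor_neg, extra_neg])
y = np.hstack([np.ones(n_extra + 1), -np.ones(n_extra + 1)])

clf = SVC(kernel="linear", C=1e10)  # large C approximates the hard margin
clf.fit(X, y)

print("support vectors per class:", clf.n_support_)    # expected: [1 1]
print("total support vectors:", clf.n_support_.sum())  # expected: 2
```

With this construction, the closest pair between the two class hulls is exactly the two anchor points, so the fit should report a single support vector per class.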

Reference: https://stackoverflow.com/questions/9480605/what-is-the-relation-between-the-number-of-support-vectors-and-training-data-and
