Erik Altman, Jovan Blanusa, et al.
NeurIPS 2023
The support vector machine (SVM) is a widely used tool for classification. Many efficient implementations exist for fitting a two-class SVM model. The user has to supply values for the tuning parameters: the regularization cost parameter and the kernel parameters. A common practice seems to be to use a default value for the cost parameter, often leading to the least restrictive model. In this paper we argue that the choice of the cost parameter can be critical. We then derive an algorithm that can fit the entire path of SVM solutions for every value of the cost parameter, with essentially the same computational cost as fitting one SVM model. We illustrate our algorithm on some examples, and use our representation to give further insight into the range of SVM solutions.
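As a rough illustration of why the cost parameter matters (a minimal sketch assuming scikit-learn, not the path algorithm derived in the paper), the sweep can be brute-forced by refitting an SVM at several values of C and observing how the fitted model changes:

    # Illustrative sketch only: refit a two-class SVM over a grid of cost values C.
    # The cited work instead computes the entire solution path at roughly the cost
    # of a single fit; this brute-force loop just shows the effect of C.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=400, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for C in np.logspace(-3, 3, 7):  # sweep the regularization cost parameter
        model = SVC(kernel="rbf", C=C).fit(X_train, y_train)
        print(f"C={C:g}  support vectors={model.n_support_.sum()}  "
              f"test accuracy={model.score(X_test, y_test):.3f}")

Small C yields a heavily regularized model with many support vectors; large C fits the training data more tightly, so accuracy can vary considerably across the grid.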
Rangachari Anand, Kishan Mehrotra, et al.
IEEE Transactions on Neural Networks
Ryan Johnson, Ippokratis Pandis
CIDR 2013
Salvatore Certo, Anh Pham, et al.
Quantum Machine Intelligence