Aurélie C. Lozano, Naoki Abe, et al.
KDD 2009
The support vector machine (SVM) is a widely used tool for classification. Many efficient implementations exist for fitting a two-class SVM model. The user has to supply values for the tuning parameters: the regularization cost parameter and the kernel parameters. A common practice seems to be to use a default value for the cost parameter, often resulting in the least restrictive model. In this paper we argue that the choice of the cost parameter can be critical. We then derive an algorithm that can fit the entire path of SVM solutions for every value of the cost parameter, with essentially the same computational cost as fitting one SVM model. We illustrate our algorithm on some examples, and use our representation to give further insight into the range of SVM solutions.
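The abstract's point can be seen even without the path algorithm itself. As a naive illustration (not the paper's method, which traces the exact solution path at roughly the cost of a single fit), the sketch below fits a linear SVM by subgradient descent on the hinge-loss objective 0.5·||w||² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b)) at several cost values C on toy 1-D data; all names and the dataset are illustrative assumptions.

```python
# Hypothetical sketch: how the cost parameter C shapes the fitted SVM.
# Subgradient descent on 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b)).

def train_linear_svm(X, y, C, epochs=300, lr=0.01):
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        gw = list(w)          # gradient of the 0.5*||w||^2 term is w itself
        gb = 0.0
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:    # hinge subgradient only for violated margins
                for j in range(d):
                    gw[j] -= C * yi * xi[j]
                gb -= C * yi
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

# Toy 1-D, linearly separable data.
X = [[-2.0], [-1.0], [1.0], [2.0]]
y = [-1, -1, 1, 1]

for C in (0.01, 0.1, 1.0, 10.0):
    w, b = train_linear_svm(X, y, C)
    norm = sum(wj * wj for wj in w) ** 0.5
    print(f"C={C:>5}: ||w|| = {norm:.3f}")
# Small C keeps w shrunk toward 0 (heavily regularized, soft margins);
# large C drives the margins toward 1 (a far less restrictive model).
```

Refitting on a grid of C values like this is exactly the expensive approach the paper's path-following algorithm replaces: the solution changes piecewise-linearly in a transformed cost parameter, so the whole path can be traced in one sweep.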
Conrad Albrecht, Jannik Schneider, et al.
CVPR 2025
Ken C.L. Wong, Satyananda Kashyap, et al.
Pattern Recognition Letters
Xiaoxiao Guo, Shiyu Chang, et al.
AAAI 2019