Publication
Neural Computation
Paper

Leave-one-out bounds for kernel methods

Abstract

In this article, we study leave-one-out style cross-validation bounds for kernel methods. The central ingredient of our analysis is a bound on the parameter estimation stability of regularized kernel formulations. Using this result, we derive bounds on the expected leave-one-out cross-validation error, which in turn yield expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for the leave-one-out error. We apply the analysis to several classification and regression problems and compare the resulting bounds with previous results.
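
As background for the abstract (a minimal sketch; the notation below is assumed for illustration and not taken from the paper itself), a regularized kernel method selects its predictor from a reproducing kernel Hilbert space H by minimizing a penalized empirical loss, and the leave-one-out error evaluates each sample with the predictor trained on the remaining ones:

% Regularized kernel estimator over an RKHS H, given samples (x_1, y_1), ..., (x_n, y_n),
% a loss L, and a regularization parameter \lambda > 0 (standard setup, assumed here):
\[
\hat{f} = \arg\min_{f \in H} \; \frac{1}{n} \sum_{i=1}^{n} L\bigl(f(x_i), y_i\bigr) + \lambda \|f\|_H^2 .
\]
% Leave-one-out error: \hat{f}^{(-i)} denotes the estimator trained with (x_i, y_i) removed.
\[
\mathrm{LOO}_n = \frac{1}{n} \sum_{i=1}^{n} L\bigl(\hat{f}^{(-i)}(x_i), y_i\bigr) .
\]

The link to generalization referenced in the abstract rests on the classical observation, often attributed to Luntz and Brailovsky, that the expectation of the leave-one-out error over an n-sample training set equals the expected risk of the same algorithm trained on n - 1 samples; a bound on the expected leave-one-out error therefore directly yields an expected generalization bound.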