Efficient, Geometrically Adaptive Techniques for Multiscale Gaussian-Kernel SVM Classification
Advanced Studies in Classification and Data Science
Tadashi Imaizumi, Akinori Okada, Sadaaki Miyamoto, Fumitake Sakaori, Yoshiro Yamamoto, Maurizio Vichi
Single-scale Gaussian-kernel support vector machines (SVMs) have achieved competitive accuracy in many practical tasks; however, a fundamental limitation of the underlying model is its use of a single bandwidth parameter, which implicitly assumes that the training and test data have a uniform scale everywhere. When the data contain multiple scales, a single parameter may be unable to fully capture the heterogeneous scales present. In this paper, we present two efficient approaches to constructing multiscale Gaussian kernels for SVM classification, following the multiple-kernel learning research of Gonen and Alpaydin (J Mach Learn Res 12:2211–2268, 2011) and the self-tuning spectral clustering procedure introduced by Zelnik-Manor and Perona (Advances in Neural Information Processing Systems 17:1601–1608, 2004) in the unsupervised setting, respectively. The resulting kernels adapt to the different scales of the data and are directly computable from the training data, thus avoiding expensive hyperparameter tuning. Numerical experiments demonstrate that our multiscale kernels deliver superior accuracy at fast speed.
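The two kernel constructions named in the abstract can be illustrated with a short sketch. This is not the chapter's exact algorithm; it shows (a) an MKL-style convex combination of Gaussian kernels at several fixed bandwidths, and (b) a locally scaled kernel in the spirit of Zelnik-Manor and Perona, where each point's bandwidth is its distance to its k-th nearest neighbor. The function names, the choice of k, and the bandwidth grid are illustrative assumptions.

```python
import numpy as np

def _sq_dists(X):
    # Pairwise squared Euclidean distances; clamp tiny negatives from rounding.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.maximum(d2, 0.0)

def multiscale_kernel_sum(X, bandwidths, weights=None):
    """MKL-style kernel: a convex combination of single-scale Gaussian kernels.

    Assumption: uniform weights by default; the chapter's method may learn
    or prescribe them differently.
    """
    d2 = _sq_dists(X)
    if weights is None:
        weights = np.full(len(bandwidths), 1.0 / len(bandwidths))
    K = np.zeros_like(d2)
    for w, s in zip(weights, bandwidths):
        K += w * np.exp(-d2 / (2.0 * s**2))
    return K

def self_tuning_gaussian_kernel(X, k=7):
    """Locally scaled Gaussian kernel (self-tuning, Zelnik-Manor & Perona style).

    Each point x_i gets its own scale sigma_i = distance to its k-th nearest
    neighbor, and K_ij = exp(-||x_i - x_j||^2 / (sigma_i * sigma_j)).
    """
    d2 = _sq_dists(X)
    d = np.sqrt(d2)
    # Column 0 of the sorted rows is the self-distance 0, so column k is
    # the k-th nearest neighbor (excluding the point itself).
    sigma = np.sort(d, axis=1)[:, k]
    sigma = np.maximum(sigma, 1e-12)  # guard against duplicate points
    return np.exp(-d2 / np.outer(sigma, sigma))
```

Both kernels are symmetric with unit diagonal and are computed directly from the training data, so they can be passed to an SVM solver that accepts precomputed Gram matrices (e.g. scikit-learn's `SVC(kernel="precomputed")`).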
Mathematics and Statistics
Guangliang Chen. "Efficient, Geometrically Adaptive Techniques for Multiscale Gaussian-Kernel SVM Classification" Advanced Studies in Classification and Data Science (2020): 45-56. https://doi.org/10.1007/978-981-15-3311-2_4