Efficient, Geometrically Adaptive Techniques for Multiscale Gaussian-Kernel SVM Classification

Publication Date

1-1-2020

Document Type

Conference Proceeding

Publication Title

Advanced Studies in Classification and Data Science

Editor

Tadashi Imaizumi, Akinori Okada, Sadaaki Miyamoto, Fumitake Sakaori, Yoshiro Yamamoto, Maurizio Vichi

DOI

10.1007/978-981-15-3311-2_4

First Page

45

Last Page

56

Abstract

Single-scale Gaussian-kernel support vector machines (SVM) have achieved competitive accuracy in many practical tasks; however, a fundamental limitation of the underlying model is its use of a single bandwidth parameter, which essentially assumes that the training and test data have a uniform scale everywhere. When the data contain multiple scales, a single parameter may fail to capture the heterogeneous scales present. In this paper, we present two efficient approaches to constructing multiscale Gaussian kernels for SVM classification, following the multiple-kernel learning research of Gonen and Alpaydin (J Mach Learn Res 12:2211–2268, 2011) and the self-tuning spectral clustering procedure introduced by Zelnik-Manor and Perona (Advances in Neural Information Processing Systems 17:1601–1608, 2004) in the unsupervised setting, respectively. The resulting kernels adapt to the different scales of the data and are directly computable from the training data, thus avoiding expensive hyperparameter tuning. Numerical experiments demonstrate that our multiscale kernels deliver superior accuracy while remaining fast to compute.
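To make the second ingredient concrete, below is a minimal NumPy sketch of the self-tuning local-scaling kernel of Zelnik-Manor and Perona, one of the building blocks the abstract refers to. It sets each point's bandwidth to its distance to the k-th nearest neighbor and forms K_ij = exp(-||x_i - x_j||^2 / (sigma_i * sigma_j)). The function names and the choice k=7 are illustrative, not taken from the paper.

```python
import numpy as np

def pairwise_distances(X):
    """Euclidean distance matrix between all rows of X."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def local_scales(X, k=7):
    """Self-tuning scales: sigma_i = distance from x_i to its k-th nearest neighbor."""
    D = pairwise_distances(X)
    # Row-wise sort; column 0 is the point itself (distance 0), so
    # column k is the k-th nearest neighbor.
    return np.sort(D, axis=1)[:, k], D

def multiscale_gaussian_kernel(X, k=7):
    """Locally scaled Gaussian kernel: K_ij = exp(-d_ij^2 / (sigma_i * sigma_j))."""
    sigma, D = local_scales(X, k)
    return np.exp(-(D ** 2) / np.outer(sigma, sigma))
```

The resulting matrix is symmetric with unit diagonal and can be passed directly to an SVM solver that accepts precomputed kernels (e.g. `SVC(kernel="precomputed")` in scikit-learn), so no bandwidth grid search is needed.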

Department

Mathematics and Statistics
