Scalable Variable Selection for Reproducing Kernel Hilbert Spaces Methods

Grant number: 21/02178-8
Support type: Scholarships in Brazil - Master
Effective date (Start): October 01, 2021
Effective date (End): September 30, 2023
Field of knowledge: Physical Sciences and Mathematics - Probability and Statistics - Statistics
Principal researcher: Rafael Izbicki
Grantee: Mateus Piovezan Otto
Home Institution: Centro de Ciências Exatas e de Tecnologia (CCET), Universidade Federal de São Carlos (UFSCAR), São Carlos, SP, Brazil

Abstract

Reproducing Kernel Hilbert Space (RKHS) methods are a broad family of statistical learning models built upon the notion of a kernel, which can be interpreted as a similarity measure between data observations. For instance, ridge regression and support vector machines (in their kernelized versions), as well as smoothing splines, can all be cast as a generic function optimization problem in an RKHS. However, these methods suffer from two major drawbacks. First, they involve inverting an n-by-n kernel matrix (where n is the sample size), an operation whose cost scales between roughly O(n^{2.37}) and O(n^3), and is therefore computationally expensive. Fortunately, this problem can be addressed by approximating the kernel matrix via Random Fourier Features, as described in Rahimi and Recht (2007). Second, since most commonly adopted kernels are isotropic, RKHS methods do not automatically perform variable selection, which leads to poor performance in applications with many irrelevant features. Our main objective is to extend Rahimi and Recht's framework to non-isotropic kernels, construct a neural network to optimize the variable-selection parameters (e.g., the bandwidths of a Gaussian kernel), and apply the resulting model to several established benchmark datasets.
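To make the two ideas in the abstract concrete, the following is a minimal sketch (not the project's actual implementation) of Random Fourier Features for an anisotropic Gaussian kernel: each input dimension gets its own bandwidth, so shrinking or growing a bandwidth acts as a soft variable-selection parameter. The function name `rff_features` and the per-dimension `lengthscales` parameterization are illustrative choices, not names from the project.

```python
import numpy as np

def rff_features(X, lengthscales, n_features=500, rng=None):
    """Random Fourier Features for an anisotropic (ARD) Gaussian kernel.

    Approximates k(x, y) = exp(-0.5 * sum_j ((x_j - y_j) / lengthscales[j])**2)
    so that K ~ Z @ Z.T, where Z = rff_features(X, ...).
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Frequencies are drawn from the kernel's spectral density (Bochner's
    # theorem). Dividing row j by lengthscales[j] encodes the per-variable
    # bandwidths: a large lengthscale downweights that feature.
    W = rng.standard_normal((d, n_features)) / np.asarray(lengthscales)[:, None]
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: build the feature map once, then work with the n x D matrix Z
# instead of the n x n kernel matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Z = rff_features(X, lengthscales=[1.0] * 5, n_features=2000, rng=1)
K_approx = Z @ Z.T  # approximates the exact Gaussian kernel matrix
```

Downstream models (e.g., ridge regression on Z) then cost O(nD^2) rather than the O(n^{2.37})-O(n^3) of a direct kernel-matrix inversion, and the `lengthscales` vector is exactly the kind of parameter the proposed neural network would optimize.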

News published in Agência FAPESP Newsletter about the scholarship:
Articles published in other media outlets (0 total):