Author: Duan, Weikang
Date: 2022-05-10
URI: https://hdl.handle.net/2097/42235

Abstract:
In the past decades, statistical learning has been an increasingly popular topic that has drawn a significant amount of attention from researchers. Kernel-based nonlinear models, in particular, are powerful tools due to their flexibility in extracting information from complex datasets. A major challenge for kernel modeling in the current big-data era is the curse of dimensionality. Although an abundance of variable selection methods has been proposed, the development of high-dimensional Bayesian kernel models is still in its infancy. In addition to the variable selection problem, the innate structure of kernel-based models incurs heavy computational costs, which further limits the application of related methods. The goal of this dissertation is to develop new, fast variable selection and prediction procedures that address high-dimensional nonlinear regression and classification from the Bayesian perspective. To reduce the computational cost, we propose a novel hybrid search algorithm and Bayesian doubly-sparse frameworks for kernel-based models. In Chapter 1, we discuss the background, existing methods, and their limitations, and give the motivation for our study. In Chapter 2, we propose a Bayesian hybrid model search algorithm for Gaussian process (GP) regression models, which quickly scans through the model space to find a set of models with high posterior probabilities. In addition, we address the problem of massive, high-dimensional data for GPs by proposing an approach that combines a quantile-subsample hybrid search with a nearest-neighbor GP scheme. In Chapter 3, we propose a novel Bayesian doubly-sparse framework for reproducing kernel Hilbert space (RKHS) regression models; the proposed framework performs both variable selection and sparse kernel matrix estimation. In Chapter 4, we extend our proposed Bayesian doubly-sparse framework to the nonlinear Bayesian support vector machine.

Language: en-US
Subjects: Bayesian; Gaussian process; Reproducing kernel Hilbert space; Variable selection
Title: Sparse Bayesian kernel learning for high-dimensional regression and classification
Type: Dissertation