My Master's at TU Chemnitz

Thesis

Title

Comparison of preconditioners for kernel ridge regression

Supervisor

Prof. Dr. Martin Stoll

Chemnitz University of Technology (TUC)

Motivation

Kernel ridge regression (KRR) is a widely used machine learning technique for regression tasks. However, KRR can be computationally expensive, particularly on large datasets, because it requires solving a large, dense linear system. Preconditioners can reduce this cost by accelerating the convergence of iterative solvers. This thesis compares several preconditioners for KRR, assessing their effectiveness on both synthetic and real-world datasets. The findings indicate that preconditioning can greatly reduce the computational cost of KRR.
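To make the linear system concrete, here is a minimal KRR sketch in NumPy (the toy data, bandwidth, and regularization values are illustrative, not the thesis settings): fitting solves (K + μI)α = y for the dual coefficients α, where K is the kernel matrix.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def krr_fit(X, y, gamma, mu):
    # Direct solve of the dense system (K + mu * I) alpha = y;
    # this O(N^3) cost is what preconditioned iterative solvers avoid.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + mu * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_test, gamma):
    # Prediction: f(x) = sum_i alpha_i * k(x_i, x)
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy data: noisy sine curve
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha = krr_fit(X, y, gamma=1.0, mu=1e-3)
pred = krr_predict(X, alpha, X, gamma=1.0)
```

For large N, both forming K and the direct solve become prohibitive, which motivates iterative methods such as CG combined with a preconditioner.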

Summary

The performance of various preconditioners for kernel ridge regression (KRR) was evaluated on three datasets: SUSY, Cod-RNA, and HIGGS, using a sample of size N = 10000, a Gaussian (RBF) kernel with bandwidth γ = 0.001, regularization parameter μ = 0.001, and rank 50. We first solve each problem with unpreconditioned CG and then apply the preconditioners RPCholesky, uniform sampling, the greedy method, randomized Nyström, and random Fourier features.
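As one example of the setup, the following sketch builds a rank-50 Nyström preconditioner from uniformly sampled columns and passes it to SciPy's CG. This is a simplified illustration with synthetic data, not the thesis code; the Woodbury identity is used to apply the inverse of the low-rank approximation μI + C W⁻¹ Cᵀ cheaply.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_preconditioner(K, mu, rank, rng):
    # Uniform-sampling Nystrom: K ~ C W^{-1} C^T with C = K[:, S], W = K[S, S].
    n = K.shape[0]
    S = rng.choice(n, size=rank, replace=False)
    C = K[:, S]
    W = K[np.ix_(S, S)] + 1e-8 * np.eye(rank)  # small jitter for stability
    # Woodbury: (mu*I + C W^{-1} C^T)^{-1} = (1/mu) * (I - C (mu*W + C^T C)^{-1} C^T)
    inner = np.linalg.inv(mu * W + C.T @ C)
    def apply(v):
        return (v - C @ (inner @ (C.T @ v))) / mu
    return LinearOperator((n, n), matvec=apply)

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))   # synthetic stand-in for SUSY / Cod-RNA / HIGGS
y = rng.standard_normal(500)
mu = 1e-3
K = rbf_kernel(X, X, gamma=0.1)
A = K + mu * np.eye(500)

# Preconditioned CG on (K + mu*I) x = y
M = nystrom_preconditioner(K, mu, rank=50, rng=rng)
x, info = cg(A, y, M=M, maxiter=500)
```

The other preconditioners studied (RPCholesky, greedy pivoting, random Fourier features) plug into CG the same way; only the construction of the low-rank factor changes.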

Implementation

The preconditioners are implemented in Python. The code is available on GitHub.

The courses: