My Master's at TU Chemnitz
Thesis
Title
Comparison of preconditioners for kernel ridge regression
Institution
Chemnitz University of Technology (TUC)
Motivation
Kernel ridge regression (KRR) is a widely used machine learning technique for regression tasks. Its computational cost can, however, be substantial on large datasets: solving the regularized kernel system directly scales cubically in the number of samples. Preconditioners can be used to accelerate iterative solvers for this system and thereby reduce the cost of KRR. This thesis compares several preconditioners for KRR, evaluating their performance on both synthetic and real-world datasets. The findings indicate that suitable preconditioners can greatly reduce the computational cost of KRR.
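For context, KRR fits a function f(x) = Σᵢ αᵢ k(x, xᵢ) by solving the linear system (K + λI)α = y, where K is the kernel matrix. Below is a minimal sketch of this unpreconditioned CG baseline using NumPy/SciPy on synthetic data; all sizes and parameter values are illustrative, not the settings used in the thesis.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf_kernel(X, Y, bandwidth=1.0):
    # K[i, j] = exp(-||x_i - y_j||^2 / (2 * bandwidth^2))
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
n = 500
X = rng.standard_normal((n, 5))                     # synthetic inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)  # noisy targets

lam = 1e-3                                          # regularization parameter
K = rbf_kernel(X, X)

# The system matrix K + lam * I is symmetric positive definite, so CG applies.
A = LinearOperator((n, n), matvec=lambda v: K @ v + lam * v)
alpha, info = cg(A, y, maxiter=500)                 # info == 0 signals convergence

# Predict at new points: f(x) = sum_i alpha_i * k(x, x_i)
X_test = rng.standard_normal((10, 5))
y_pred = rbf_kernel(X_test, X) @ alpha
```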
Summary
The performance of various preconditioners for kernel ridge regression (KRR) was evaluated on three datasets: SUSY, Cod-RNA, and Higgs, each subsampled to a fixed size and using a Gaussian (RBF) kernel with fixed bandwidth, a fixed regularization parameter, and preconditioner rank 50. Each problem was first solved with unpreconditioned conjugate gradients (CG) and then with the preconditioners RPCholesky, uniform sampling, the greedy method, randomized Nyström, and random Fourier features.
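To illustrate how such a preconditioner enters the solver, here is a sketch of a rank-50 Nyström preconditioner built from uniformly sampled landmarks (one of the methods compared) and passed to SciPy's preconditioned CG. The data, bandwidth, and regularization value below are made up for the example; the other landmark-selection rules such as RPCholesky or the greedy method would plug into the same template.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf_kernel(X, Y, bandwidth=1.0):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
n, rank, lam = 1000, 50, 1e-2        # illustrative sizes, not the thesis settings
X = rng.standard_normal((n, 5))
y = rng.standard_normal(n)
K = rbf_kernel(X, X)

# Nystrom approximation K ~ C W^{-1} C^T from `rank` uniformly chosen landmarks.
S = rng.choice(n, size=rank, replace=False)
C = K[:, S]
W = K[np.ix_(S, S)] + 1e-10 * np.eye(rank)  # small shift for numerical stability
L = np.linalg.cholesky(W)
B = np.linalg.solve(L, C.T).T               # so that K ~ B @ B.T
U, s, _ = np.linalg.svd(B, full_matrices=False)
d = s ** 2                                   # eigenvalues of the Nystrom approximation

def apply_precond(v):
    # Woodbury identity: (U diag(d) U^T + lam*I)^{-1} v, applied in O(n * rank).
    w = U.T @ v
    return (v - U @ ((d / (d + lam)) * w)) / lam

A = LinearOperator((n, n), matvec=lambda v: K @ v + lam * v)
M = LinearOperator((n, n), matvec=apply_precond)
alpha, info = cg(A, y, M=M, maxiter=500)
print("converged" if info == 0 else f"info = {info}")
```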
Implementation
The preconditioners are implemented in Python; the code is available on GitHub.
Courses
- An introduction to Data Science with Prof. Dr. Martin Stoll
- Scientific Computing with Python with Prof. Dr. Martin Stoll
- Matrix Methods in Data Science with Prof. Dr. Martin Stoll
- Statistics in Data Science with Prof. Dr. Martin Stoll
- A Mathematical Introduction to Learning Theory with Prof. Dr. Tino Ullrich
- Mathematical Foundations of Big Data Analytics with Prof. Dr. Vladimir Shikhman
- Optimization in Machine Learning with Dr. Sara Grundel
- Neurocomputing with Dr. habil. Julien Vitay