
Fast Evaluation of Kernel Distances

I worked on this project in Spring 2021, after being awarded the Provost Undergraduate Research Scholarship.

Kernel-based methods play a key role in many applications of computational science and engineering, and in particular in machine learning. A major drawback of kernel methods is their computational complexity and scalability: implemented naively, they require linear storage, with cubic computational complexity for direct methods and quadratic complexity for iterative methods. In this project, I looked at methods for evaluating sums of Gaussian kernels with linear complexity; this class of algorithms is referred to as the Fast Gauss Transform (FGT).
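Concretely, the quantity the FGT accelerates is the discrete Gauss transform: a weighted sum of Gaussian kernels evaluated at a set of target points. As a sketch in my own notation (not taken from any particular paper):

```latex
G(y_j) = \sum_{i=1}^{N} q_i \, e^{-\lVert y_j - x_i \rVert^2 / h^2},
\qquad j = 1, \dots, M
```

Here the $x_i$ are source points with weights $q_i$, the $y_j$ are targets, and $h$ is the bandwidth. Evaluating this directly costs $O(NM)$ operations, while the FGT brings it down to roughly $O(N + M)$ by grouping sources into boxes and replacing their contributions with truncated Hermite and Taylor expansions about the box centers.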

I initially implemented this algorithm in MATLAB, and once I was comfortable with the idea behind it, I reimplemented it in C++ using NVIDIA's CUDA framework.
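My actual code isn't reproduced here, but the direct (quadratic) evaluation that the FGT is measured against is easy to sketch in CUDA. The kernel below is a minimal, hypothetical baseline, not my implementation and not the FGT itself: one thread per target point, looping over all sources; all names and parameters are illustrative.

```cuda
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// Direct O(N*M) evaluation of a 1-D Gaussian kernel sum:
//   G(y[j]) = sum_i q[i] * exp(-(y[j] - x[i])^2 / h^2)
// One thread accumulates the full sum for one target point.
__global__ void direct_gauss_sum(const float* x, const float* q, int n_sources,
                                 const float* y, float* g, int n_targets,
                                 float h) {
    int j = blockIdx.x * blockDim.x + threadIdx.x;
    if (j >= n_targets) return;

    float inv_h2 = 1.0f / (h * h);
    float sum = 0.0f;
    for (int i = 0; i < n_sources; ++i) {
        float d = y[j] - x[i];
        sum += q[i] * expf(-d * d * inv_h2);
    }
    g[j] = sum;
}

int main() {
    const int N = 1 << 14, M = 1 << 14;   // sources, targets
    const float h = 0.1f;                 // bandwidth

    // Sources and targets spread on [0, 1), unit weights.
    std::vector<float> x(N), q(N, 1.0f), y(M), g(M);
    for (int i = 0; i < N; ++i) x[i] = (float)i / N;
    for (int j = 0; j < M; ++j) y[j] = (float)j / M;

    float *dx, *dq, *dy, *dg;
    cudaMalloc(&dx, N * sizeof(float));
    cudaMalloc(&dq, N * sizeof(float));
    cudaMalloc(&dy, M * sizeof(float));
    cudaMalloc(&dg, M * sizeof(float));
    cudaMemcpy(dx, x.data(), N * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dq, q.data(), N * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y.data(), M * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256, blocks = (M + threads - 1) / threads;
    direct_gauss_sum<<<blocks, threads>>>(dx, dq, N, dy, dg, M, h);
    cudaMemcpy(g.data(), dg, M * sizeof(float), cudaMemcpyDeviceToHost);

    printf("G(y[0]) = %f\n", g[0]);
    cudaFree(dx); cudaFree(dq); cudaFree(dy); cudaFree(dg);
    return 0;
}
```

Even this brute-force version parallelizes nicely on a GPU, which is what makes the comparison with the linear-complexity FGT interesting as the problem size grows.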

I learned quite a bit while working on this project, including getting familiar with kernel methods, getting a taste of the array programming paradigm, and exploring machine learning methods for multivariate datasets.

More of this work can be found here.