Kernel Matrices: From Physics to Machine Learning
Edmond Chow, Georgia Institute of Technology
22 January 2025
Kernel matrices, defined by a set of points and a pairwise interaction function, have garnered significant attention recently due to rising interest in Gaussian process regression and other kernel methods in machine learning. However, kernel matrices have a long history, particularly in computational physics and integral equation problems, often under different names.
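To make the definition concrete, here is a minimal sketch (illustrative Python/NumPy, not material from the talk) that builds a kernel matrix from a set of points and a pairwise interaction function; the Gaussian (RBF) kernel used here is one common choice:

```python
import numpy as np

def rbf_kernel_matrix(points, length_scale=1.0):
    """Kernel matrix K[i, j] = k(x_i, x_j) for the Gaussian (RBF) kernel."""
    # Pairwise squared distances via broadcasting
    diff = points[:, None, :] - points[None, :, :]
    sq_dists = np.sum(diff ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * length_scale ** 2))

rng = np.random.default_rng(0)
X = rng.random((5, 2))        # 5 points in 2D
K = rbf_kernel_matrix(X)      # 5 x 5 symmetric positive definite matrix
```

The same construction, with a different interaction function (e.g., a Green's function), produces the dense matrices that arise in integral equation problems.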
In machine learning, kernel methods are often perceived as limited by the computational cost of processing the data, primarily due to the need to solve systems of equations involving the kernel matrix. Recently, the intersection of algorithms from physical applications and statistical ideas has led to innovative methods for kernel matrix problems.
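The computational cost mentioned above can be made concrete with a small sketch of the linear system at the heart of Gaussian process regression (the kernel, length scale, noise level, and data below are illustrative assumptions, not from the talk):

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=0.2):
    # Cross-kernel matrix between two point sets (Gaussian/RBF kernel)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

rng = np.random.default_rng(0)
X_train = rng.random((100, 1))
y_train = np.sin(2.0 * np.pi * X_train[:, 0])

noise = 1e-4                           # noise variance / jitter (assumed)
K = rbf_kernel(X_train, X_train)
# The bottleneck: a dense n x n direct solve costs O(n^3) time, O(n^2) memory.
alpha = np.linalg.solve(K + noise * np.eye(len(X_train)), y_train)

# GP posterior mean at new points: K(X_*, X) @ alpha
X_test = np.linspace(0.0, 1.0, 5)[:, None]
mean = rbf_kernel(X_test, X_train) @ alpha
```

In practice, such systems are often attacked with iterative methods, whose convergence depends strongly on preconditioning, which is where the ideas from physical applications enter.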
In this presentation, we will explore hierarchical approximations of kernel matrices, which reduce storage and matrix operations to linear complexity in the number of points. We will also present a preconditioner designed for the iterative solution of kernel matrix systems, specifically targeting Gaussian process hyperparameter estimation.
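The mechanism behind hierarchical approximations can be seen in a small sketch (illustrative Python/NumPy, not code from the talk): the interaction block of a smooth kernel between two well-separated clusters of points is numerically low-rank, so it can be stored and applied in compressed form.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=3.0):
    # Cross-kernel matrix between two point sets (Gaussian/RBF kernel)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

rng = np.random.default_rng(1)
A = rng.random((200, 2))           # one cluster of points
B = rng.random((200, 2)) + 5.0     # a well-separated cluster

K_ab = rbf_kernel(A, B)            # off-diagonal interaction block
s = np.linalg.svd(K_ab, compute_uv=False)

# Numerical rank at a relative tolerance: far below the full size 200,
# so the block admits a compact low-rank factorization.
rank = int(np.sum(s > 1e-6 * s[0]))
```

Hierarchical matrix formats partition the full kernel matrix so that all sufficiently separated blocks are compressed this way, which is what yields (near-)linear storage and matrix-vector products.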
About the speaker
Edmond Chow is Professor and Associate Chair in the School of Computational Science and Engineering at Georgia Institute of Technology. His research is in developing numerical methods specialized for high-performance computers and applying these methods to enable the solution of large-scale physical simulation problems in science and engineering. Dr. Chow previously held positions at D. E. Shaw Research and Lawrence Livermore National Laboratory. He was chair of the 2022 ACM Gordon Bell Prize committee, and was co-chair of the 2022 SIAM Annual Meeting. He is currently Vice-Chair of the SIAM Activity Group on Computational Science and Engineering. Dr. Chow is a Fellow of SIAM.