Mathematics of Data Science Seminar Series

About the seminar series

The Mathematics of Data Science Seminar Series is hosted by DTU Compute at the Technical University of Denmark and is organized by Martin S. Andersen, Allan P. Engsig-Karup, and Jakob Lemvig. The seminars are open to everyone and aim to bring together researchers, students, and industry practitioners with an interest in mathematics and data science. With this activity, we wish to promote the exchange of ideas, foster cross-disciplinary and inter-organizational collaboration, and create a nurturing academic environment for emerging data scientists. Each seminar will be given by an invited speaker from academia or industry and is intended for a broad audience. Topics of interest include:

  • mathematical, statistical, and computational methods for data science,
  • application areas (e.g., solution of a practical data science problem), and
  • current problems and open challenges.

The seminar series is supported by the Danish Data Science Academy.

Upcoming seminars

Hayden Schaeffer (University of California, Los Angeles)

Title: Multi-Task Neural Operators and Autoregressive Methods for PDEs

Date: May 11, 2026
Time: TBA
Location: TBA
Stream: Zoom

Abstract

Learning collections of solution operators for nonlinear PDEs remains challenging due to high dimensionality, multiscale phenomena, and limited observational data. Recent advances span both PDE foundation models (utilizing transformer-based or autoregressive spatiotemporal architectures) and multi-operator neural operator methods, which leverage shared structure across tasks. When trained on a wide range of datasets, these approaches enable zero-shot and few-shot generalization across parameter regimes. In this talk, we present both perspectives, highlighting their strengths in capturing complex dependencies, achieving high accuracy, and generalizing beyond the training distribution. We further present recent theoretical results establishing statistical generalization guarantees for multi-task and multiple operator learning, including explicit approximation-estimation tradeoffs and scaling laws that characterize dependence on dataset size, model capacity, and hierarchical sampling. Focusing on incompressible and compressible flows, we provide empirical and theoretical insights into model accuracy, robustness, and scalability on large datasets.

About the speaker

Hayden Schaeffer is the Director of Applied Mathematics and a Professor of Mathematics at the University of California, Los Angeles. His research is in mathematical foundations of AI and scientific machine learning, optimization, and dynamics. He has received an NSF CAREER award, an AFOSR Young Investigator Award, an NSF Mathematical Sciences Postdoctoral Research Fellowship, a UC President’s Postdoctoral Fellowship, and an NDSEG Fellowship.

Mailing list

If you would like to receive information about upcoming seminars, you can subscribe to our mailing list by signing up below. We do not share your information with any third parties, and you can unsubscribe at any time by clicking the link in the footer of our emails.
