Mattes Mollenhauer

mattes.mollenhauer
(at) merantix-momentum.com




About


I am a senior machine learning researcher at Merantix Momentum. My current work focuses on the theory of machine learning in infinite-dimensional spaces and draws on ideas from inverse problems, nonparametric statistics and uncertainty quantification. I am particularly interested in models with applications in physics and stochastic processes, especially molecular dynamics and fluid dynamics.

In 2022, I obtained my PhD under the supervision of Christof Schütte in the Biocomputing group at the Department of Mathematics and Computer Science of Freie Universität Berlin. My dissertation investigates theoretical aspects of nonparametric models for Markov processes. During my PhD, I spent some time doing R&D in the aircraft engine development division of Rolls-Royce Deutschland, where I worked on security-critical machine learning software for system identification and control. As a postdoctoral researcher in the group of Claudia Schillings at Freie Universität Berlin, I focused on the theory of operator learning and uncertainty quantification. I also spent five years with the people at dida, doing more practical work on the design and DevOps of deep learning systems for a variety of problems arising in weather forecasting, remote sensing and image segmentation.

Publications and preprints


Reports and preprints

Mattes Mollenhauer and Claudia Schillings.
"On the concentration of subgaussian vectors and positive quadratic forms in Hilbert spaces".
Preprint, 2023. link

Mattes Mollenhauer, Nicole Mücke and Tim Sullivan.
"Learning linear operators: Infinite-dimensional regression as a well-behaved non-compact inverse problem".
Preprint, 2022. link

Mattes Mollenhauer and Péter Koltai.
"Nonparametric approximation of conditional expectation operators."
Preprint, 2020. link

Accepted/published

Dimitri Meunier, Zikai Shen, Mattes Mollenhauer, Arthur Gretton and Zhu Li.
"Optimal Rates for Vector-Valued Spectral Regularization Learning Algorithms".
Advances in Neural Information Processing Systems, Vol. 37, 2024 (to appear). link

Zhu Li, Dimitri Meunier, Mattes Mollenhauer and Arthur Gretton.
"Towards Optimal Sobolev Norm Rates for the Vector-Valued Regularized Least-Squares Algorithm".
Journal of Machine Learning Research, 25(181):1−51, 2024. link

Andreas Bittracher, Mattes Mollenhauer, Péter Koltai and Christof Schütte.
"Optimal Reaction Coordinates: Variational Characterization and Sparse Computation".
SIAM Multiscale Modeling & Simulation, 21:449-488, 2023. link

Zhu Li, Dimitri Meunier, Mattes Mollenhauer and Arthur Gretton.
"Optimal Rates for Regularized Conditional Mean Embedding Learning".
Advances in Neural Information Processing Systems, Vol. 35, 2022. link

Mattes Mollenhauer, Stefan Klus, Christof Schütte and Péter Koltai.
"Kernel autocovariance operators of stationary processes: Estimation and convergence".
Journal of Machine Learning Research, 23(327):1−34, 2022. link

Mattes Mollenhauer, Ingmar Schuster, Stefan Klus and Christof Schütte.
"Singular Value Decomposition of Operators on Reproducing Kernel Hilbert Spaces".
Advances in Dynamics, Optimization and Computation, 109-130, Springer, 2020. link

Ingmar Schuster, Mattes Mollenhauer, Stefan Klus and Krikamol Muandet.
"Kernel Conditional Density Operators".
23rd International Conference on Artificial Intelligence and Statistics (AISTATS), 2020. link

Stefan Klus, Brooke E. Husic, Mattes Mollenhauer and Frank Noé.
"Kernel methods for detecting coherent structures in dynamical data".
Chaos: An Interdisciplinary Journal of Nonlinear Science, 29(12), 2019. link


Miscellaneous

"On the Statistical Approximation of Conditional Expectation Operators".
PhD thesis, Freie Universität Berlin, 2022. link





Design courtesy of Vasilios Mavroudis: Plain Academic