AI Colloquium with Lamarr Fellow Prof. Dr. Arnulf Jentzen
We are pleased to welcome Prof. Arnulf Jentzen for a talk in the AI Colloquium series at the Lamarr Institute.
The AI Colloquium, organized by the Lamarr Institute, provides a platform for leading researchers to present groundbreaking work in the field of Machine Learning and Artificial Intelligence. Unlike other colloquia, these 90-minute sessions focus on interactive dialogue and international collaboration, consisting of a one-hour lecture followed by a 30-minute Q&A session. The colloquium will be held mainly in English.
Abstract
Stochastic gradient descent (SGD) optimization methods are nowadays the method of choice for the training of deep neural networks (DNNs) in artificial intelligence systems. In practically relevant training problems, the employed optimization scheme is often not the plain vanilla SGD method but instead a suitably accelerated and adaptive SGD optimization method such as the famous Adam optimizer. In this talk we show that Adam typically does not converge to minimizers or critical points of the objective function (the function one intends to minimize) but instead converges to zeros of another function, which we refer to as the Adam vector field. Moreover, we establish convergence rates in terms of the number of Adam steps and the size of the mini-batch for all strongly convex stochastic optimization problems. Finally, we present acceleration techniques for Adam in the context of deep learning approximations for partial differential equation (PDE) and optimal control problems. The talk is based on joint works with Steffen Dereich, Thang Do, Robin Graeber, and Adrian Riekert.
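For readers less familiar with the optimizer discussed in the talk, the sketch below shows the standard Adam update rule (Kingma & Ba, 2015) applied to a simple strongly convex quadratic. It is a minimal illustrative example only: the function and parameter names are our own, and it does not reproduce the speaker's results on the Adam vector field or the acceleration techniques presented in the talk.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update (Kingma & Ba, 2015).

    theta: current parameters, grad: stochastic gradient at theta,
    m, v: running first/second moment estimates, t: step counter (1-based).
    """
    m = beta1 * m + (1 - beta1) * grad          # exponential moving average of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2     # exponential moving average of squared gradients
    m_hat = m / (1 - beta1 ** t)                # bias correction of the moment estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Illustrative use on the strongly convex quadratic f(theta) = 0.5 * ||theta||^2,
# with additive noise mimicking mini-batch gradient estimates.
theta, m, v = np.ones(3), np.zeros(3), np.zeros(3)
for t in range(1, 1001):
    grad = theta + 0.01 * np.random.randn(3)    # noisy gradient of f at theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```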
Arnulf Jentzen
Arnulf Jentzen (born November 1983) is a professor at the Chinese University of Hong Kong, Shenzhen (CUHK-Shenzhen) (since 2021) and a professor at the University of Münster (since 2019). In 2004 he started his undergraduate studies in mathematics at Goethe University Frankfurt in Germany, where he received his diploma degree in 2007 and completed his PhD in mathematics in 2009. The core research topics of his research group are machine learning approximation algorithms, computational stochastics, numerical analysis for high-dimensional partial differential equations (PDEs), stochastic analysis, and computational finance. Currently, he serves on the editorial boards of several scientific journals such as the Journal of Machine Learning, the SIAM Journal on Scientific Computing, the SIAM Journal on Numerical Analysis, and the SIAM/ASA Journal on Uncertainty Quantification. His research activities have been recognized through several major awards such as the Felix Klein Prize of the European Mathematical Society (EMS) (2020), an ERC Consolidator Grant from the European Research Council (ERC) (2022), the Joseph F. Traub Prize for Achievement in Information-Based Complexity (IBC) (2022), and a Frontier of Science Award in Mathematics (jointly with Jiequn Han and Weinan E) by the International Congress of Basic Science (ICBS) (2024). Details on the activities of his research group can be found at http://www.ajentzen.de.
Details
Date
3 July 2025
10:00 - 12:00
Location
University of Bonn
Friedrich-Hirzebruch-Allee 6
Bonn
Topics
Science