Date: Wednesday, May 5, 2021
Organized by: Chair in Applied Analysis – Alexander von Humboldt Professorship at FAU Erlangen-Nürnberg
Title: Variational problems on L-infinity and continuum limits on graphs

Speaker: Dr. Leon Bungert
Affiliation: FAU Erlangen-Nürnberg, Germany

Abstract. Modern machine learning techniques, and in particular Deep Learning, have surpassed classical methods in both efficiency and accuracy. On the other hand, many (semi-)supervised learning methods are inherently unstable with respect to noise or so-called adversarial examples, which hinders their use in safety-critical applications. A potential remedy for this drawback is to design Lipschitz-continuous, and hence stable, inference models. In this talk I will first speak about a graph-based semi-supervised learning approach called Lipschitz learning and study its continuum limit as the number of data points tends to infinity. Using Gamma-convergence, one can prove that minimizers converge to solutions of a variational problem in L-infinity. Then I will present a novel regularization algorithm for neural networks called CLIP, which penalizes large Lipschitz constants of a neural network during training by keeping track of a set of unstable points.
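The core quantity behind such a regularizer is the Lipschitz constant of the model, which can be estimated from below by the largest difference quotient over a finite set of point pairs. The following is a minimal NumPy sketch of that estimate only, not the CLIP algorithm itself; the function and variable names are illustrative.

```python
import numpy as np

def lipschitz_estimate(f, points, eps=1e-12):
    """Lower bound for the Lipschitz constant of f:
    the largest quotient |f(x) - f(y)| / ||x - y|| over all point pairs.
    A training-time penalty could be built on top of this estimate."""
    best = 0.0
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            dist = np.linalg.norm(points[i] - points[j])
            if dist > eps:  # skip (near-)duplicate points
                quot = abs(f(points[i]) - f(points[j])) / dist
                best = max(best, quot)
    return best

# Toy check: f(x) = 3x is Lipschitz with constant exactly 3.
pts = [np.array([0.0]), np.array([1.0]), np.array([2.0])]
f = lambda x: 3.0 * float(x[0])
print(lipschitz_estimate(f, pts))  # prints 3.0
```

In a CLIP-like scheme, the point set would additionally be updated during training toward pairs that maximize this quotient (the "unstable points"), so that the penalty tracks where the network is least stable.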

