Date: Wednesday, March 24, 2021
Organized by: Chair in Applied Analysis – Alexander von Humboldt Professorship at FAU Erlangen-Nürnberg
Title: Optimal sampling in least-squares methods – applications in PDEs and inverse problems

Speaker: Prof. Dr. Albert Cohen
Affiliation: Laboratoire Jacques-Louis Lions, Sorbonne Université, France

Abstract. Recovering an unknown function from point samples is a ubiquitous task in various applied settings: non-parametric regression, machine learning, reduced modeling, response surfaces in computer or physical experiments, data assimilation and inverse problems. In the first part of this lecture, I shall present a theoretical setting that yields recovery bounds in the context where the user is allowed to select the measurement points, sometimes referred to as active learning. These results allow us to derive an optimal sampling point distribution when the approximation is sought in a linear space of finite dimension n and computed by weighted least squares. Here optimal means both that the approximation is comparable to the best possible in this space, and that the sampling budget m barely exceeds n. The main tools involved are inverse Christoffel functions and matrix concentration inequalities.
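As an illustration (not part of the talk itself), here is a minimal Python sketch of the weighted least-squares strategy described above, for the model case of a polynomial space on [-1, 1] with the uniform reference measure dx/2. Sample points are drawn from the density proportional to k_n(x) = Σ_j φ_j(x)², where the φ_j are the orthonormal Legendre polynomials, and the least-squares system is reweighted by w(x) = n / k_n(x). All function names are illustrative.

```python
import numpy as np
from numpy.polynomial import legendre as leg


def orthonormal_legendre(x, n):
    """First n Legendre polynomials, orthonormal w.r.t. the uniform
    measure dx/2 on [-1, 1]: phi_j = sqrt(2j + 1) * P_j."""
    return leg.legvander(x, n - 1) * np.sqrt(2 * np.arange(n) + 1)


def sample_optimal(n, m, rng):
    """Draw m points from the density (1/n) k_n(x) dx/2, where
    k_n(x) = sum_j phi_j(x)^2, by rejection sampling from the
    uniform proposal.  Since |P_j| <= 1 on [-1, 1], k_n <= n^2,
    so the acceptance probability is k_n(x) / n^2."""
    out = []
    while len(out) < m:
        x = rng.uniform(-1.0, 1.0, size=4 * m)
        k = (orthonormal_legendre(x, n) ** 2).sum(axis=1)
        keep = rng.uniform(size=x.size) < k / n**2
        out.extend(x[keep])
    return np.array(out[:m])


def weighted_lsq(f, n, m, rng):
    """Fit f in span{phi_0, ..., phi_{n-1}} by weighted least squares
    with weights w(x) = n / k_n(x), i.e. the reciprocal of the
    sampling density relative to dx/2."""
    x = sample_optimal(n, m, rng)
    Phi = orthonormal_legendre(x, n)
    sw = np.sqrt(n / (Phi**2).sum(axis=1))
    coeffs, *_ = np.linalg.lstsq(Phi * sw[:, None], sw * f(x), rcond=None)
    return coeffs
```

For instance, with f(x) = x², which lies in the span for n = 3, the weighted fit recovers f exactly up to floating-point error, for any realization of the random sample with m ≥ n.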

In the second part, I shall cover some novel and ongoing developments building upon this approach. The first addresses the setting where the approximation space is not fixed but adaptively generated with growing dimension n, which requires a particular online sampling methodology. The second discusses the setting where the measurements are not point values but more general functionals, which may be thought of as point evaluations in a transformed space; the typical applied settings are inverse problems, as well as collocation methods for solving PDEs.


© 2019-2021 Chair for Dynamics, Control and Numerics - Alexander von Humboldt Professorship at FAU Erlangen-Nürnberg, Germany | Imprint