FIL Scientific Seminars – June 2022
27 June 2022, 14:30 – 15:30
Date and time:
Monday 27 June 2022, 14:30 – 15:30
Location:
LIP – ENS Lyon
Monod site – 3rd floor – Amphitheatre A
46 allée d’Italie, 69364 Lyon Cedex 07
Videoconference:
Meeting ID: 992 5554 8823
Passcode: EV60j5
Programme:
- 14:30 – 15:00: Elisa Riccietti « Multilevel Physics Informed Neural Networks »
Abstract:
We consider the approximation of solutions of partial differential equations (PDEs) by physics-informed neural networks (PINNs). It is known that their training converges slowly when the solution to be approximated contains high frequencies. The purpose of this talk is to study how ideas from the classical multigrid (MG) approach for solving PDEs can be extended to the PINN context. This is of interest because MG techniques are by far the most effective methods for linear problems: their alternating relaxations on fine and coarse grids enforce an optimally fast reduction of all frequency components of the error. Inspired by this approach, we propose a multilevel PINN (MPINN) approach, based on writing the solution of the PDE as a sum of several terms, each targeting a different frequency range. Each term is a PINN with a different number of parameters, trained on a different training set and optimized in a cyclic fashion. We show that this approach to the training of PINNs enjoys the acceleration typically observed in classical MG methods, especially when the convergence of standard PINNs is slow.
Speaker bio:
Elisa Riccietti is currently an associate professor at the École Normale Supérieure de Lyon, France. Her research revolves around numerical optimization and machine learning. In the past her research focused on the solution of ill-posed, large-scale least-squares problems with inexact function and gradient evaluations. More recently she has been working on the application of multilevel optimization methods to the training of artificial neural networks, and on mixed-precision methods for machine learning.
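As a rough, hypothetical sketch of the multilevel idea described in the abstract above (not the speaker's code: linear least-squares fits on sinusoidal features stand in for actual PINN training), the decomposition of a target into a sum of terms, each restricted to a different frequency band and refined cyclically, might look like:

```python
import numpy as np

# Target with both low- and high-frequency content (illustrative only).
x = np.linspace(0, 1, 200)
target = np.sin(2 * np.pi * x) + 0.3 * np.sin(16 * np.pi * x)

def features(x, freqs):
    # Sine/cosine features for the given frequency band.
    cols = [f(2 * np.pi * k * x) for k in freqs for f in (np.sin, np.cos)]
    return np.stack(cols, axis=1)

# Two "levels": a coarse term (low frequencies) and a fine term (high frequencies).
bands = [features(x, [1, 2]), features(x, [8, 16])]
coeffs = [np.zeros(F.shape[1]) for F in bands]

# Cyclic optimization over the levels: each term is fitted to the residual
# left by the other terms, mimicking the alternation between coarse and
# fine corrections in multigrid.
for cycle in range(10):
    for i, F in enumerate(bands):
        residual = target - sum(Fj @ cj
                                for j, (Fj, cj) in enumerate(zip(bands, coeffs))
                                if j != i)
        coeffs[i], *_ = np.linalg.lstsq(F, residual, rcond=None)

approx = sum(F @ c for F, c in zip(bands, coeffs))
err = np.max(np.abs(approx - target))
print(f"max error after cyclic fitting: {err:.2e}")
```

Each "level" here has a different number of parameters and captures a different frequency range, as in the MPINN decomposition; in the actual method each term is a neural network trained on the PDE residual.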
- 15:00 – 15:30: Rémi Eyraud « Opening the Black-Box: What Formal Language Theory Teaches Us About Recurrent Neural Networks »
Abstract:
Recent practical successes of machine learning – for instance in signal processing, natural language processing, or image retrieval – heavily rely on the training of powerful models such as deep neural networks. However, the decisions taken by these models are hard to interpret – they are usually seen as black boxes – and using them can require a significant amount of computing power.
Speaker bio:
Rémi Eyraud is a junior professor at Jean Monnet University, France. After defending his Ph.D., he moved to a postdoctoral position at the University of Amsterdam, The Netherlands. He was then hired at Aix-Marseille University, France, as a Maître de Conférences, and later spent a couple of years as an invited researcher at two US East Coast universities: the University of Maryland, Baltimore County, and the University of Delaware. He joined Jean Monnet University in 2020.
- 15:30 – 16:00: Informal discussion over coffee / tea