Marc Mézard
Professor, Director of École Normale Supérieure, Paris, France.
Title: Statistical physics of machine learning: successes, limitations, and perspectives
Abstract: In the last fifty years, statistical physics has elaborated a large corpus of concepts and methods for studying probabilistic models in large dimensions. The connections to constraint satisfaction problems have also led to interesting algorithmic developments. This corpus provides a useful approach for theoretical studies of machine learning, complementary to traditional approaches in learning theory. While the latter are more focused on worst-case analysis, statistical physics can provide a detailed analytic description of the behaviour of large systems trained with a typical database drawn from some ensemble. This talk will review some of the historical and recent successes of this approach. It will also underline its main limitations and propose some perspectives for future developments.
Bio: Marc Mézard stands behind many deep results in the physics of disordered systems. Since the 1980s he has pioneered the use of statistical physics methods in computer science and machine learning. His works and books established a multidisciplinary direction that uncovered many deep connections between fields that implicitly study systems of many interacting elements. His work impacts, among other areas, the theory of glasses and spin glasses, combinatorial optimization, computational complexity, neural network theory, information theory, and signal processing. He is currently the director of the École normale supérieure in Paris.
Qianxiao Li
Assistant Professor, Department of Mathematics, National University of Singapore
Title: Machine learning and dynamical systems - approximation, optimization and beyond
Abstract: In this talk, we discuss some recent work on the connections between machine learning and dynamical systems. These come broadly in three categories, namely machine learning via, for, and of dynamical systems. In the first direction, we introduce a dynamical approach to deep learning theory, with particular emphasis on its connections with approximation and control theory. In the second direction, we discuss the approximation and optimization theory of learning input-output temporal relationships using recurrent neural networks and their variants, with the goal of highlighting key new phenomena that arise when learning in dynamic settings. In the last direction, we discuss some principled methods that learn stable and interpretable dynamical models from data arising in scientific applications. We conclude with some discussion of future directions in this rapidly developing field.
Bio: Qianxiao Li is an assistant professor in the Department of Mathematics, National University of Singapore, and a research scientist at the Institute of High Performance Computing, A*STAR. He graduated with a BA in mathematics from the University of Cambridge and a PhD in applied mathematics from Princeton University. His research interests include the interplay of machine learning and dynamical systems, stochastic gradient algorithms, and the application of data-driven methods to problems in the physical sciences. He is a recipient of the Singapore NRF Fellowship, class of 2021.