Stéphane d'Ascoli

Research Scientist

Meta AI, Paris

Biography

Hi! I’m a Research Scientist at Meta AI, working in the Brain & AI team. Previously, I was an AI4Science research fellow at EPFL, and completed a Ph.D. in deep learning, during which I split my time between the Center for Data Science of ENS Paris and Facebook AI Research – you can find my thesis here. Before that, I studied Theoretical Physics at ENS Paris and worked with NASA on black hole mergers. You can download my CV here.

My current research focuses on decoding neural activity, with the aim of better understanding how the brain works and, perhaps one day, helping those who have difficulty speaking or typing. I am also interested in understanding large neural networks and applying them to computer vision, symbolic regression, and the natural sciences in general. Outside work, I love communicating about science (I have written a few books for the general public), playing the clarinet, and travelling very far on my bicycle!

Education

  • PhD in Artificial Intelligence, 2022

École Normale Supérieure, Paris

  • Master's in Theoretical Physics, 2018

École Normale Supérieure, Paris

  • Bachelor's in Physics, 2016

École Normale Supérieure, Paris

Research

Boolformer: Symbolic Regression of Logic Functions with Transformers

Length generalization in arithmetic transformers

ODEFormer: Symbolic Regression of Dynamical Systems with Transformers

Deep symbolic regression for recurrence prediction

End-to-end symbolic regression with transformers

Optimal learning rate schedules in high-dimensional non-convex optimization problems

Align, then memorise: the dynamics of learning with feedback alignment

ConViT: Improving Vision Transformers with Soft Convolutional Inductive Biases

On the interplay between data structure and loss function in classification problems

Transformed CNNs: recasting pre-trained convolutional layers with self-attention

Conditioned Text Generation with Transfer for Closed-Domain Dialogue Systems

Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime

Scaling description of generalization with number of parameters in deep learning

Triple descent and the two kinds of overfitting: where and why do they appear?

A jamming transition from under- to over-parametrization affects generalization in deep learning

Finding the Needle in the Haystack with Convolutions: on the benefits of architectural bias

Jamming transition as a paradigm to understand the loss landscape of deep neural networks

Electromagnetic Emission from Supermassive Binary Black Holes Approaching Merger

Outreach

Podcasts

L’espace-temps est courbe: qu’est-ce à dire?

Podcast “La Conversation Scientifique” with Etienne Klein, France Culture, June 2021.

Big Bang et Trous Noirs

Podcast “Minute Papillon” with Sidonie Bonnec, France Bleu, May 2021.

Books

Voyage au Coeur de l’Atome

Book on quantum mechanics, co-written with Adrien Bouscal, published by First Editions, May 2022.

Voyage au Coeur de l’Espace-Temps

Book on relativity, co-written with Arthur Touati, published by First Editions, March 2021. Also available on Audible as an audio book.

Comprendre la révolution de l’Intelligence Artificielle

Book on AI, published by First Editions, March 2020.

L’Intelligence Artificielle en 5 minutes par jour

Short book on AI, published by First Editions, September 2020.

Videos

Les Intelligences Artificielles les plus flippantes

Co-wrote the script for Dr. Nozman, November 2022.

Deep Symbolic Regression for Recurrent Sequences

Interview with Yannic Kilcher, January 2022.

Qu’est-ce que l’Intelligence Artificielle?

Conference “Les assises du livre numérique”, organized by the Syndicat National de l’Edition, December 2021.

Comprendre la révolution de l’Intelligence Artificielle

Conference “Les Mardis Scientifiques”, organized by Université du Temps Libre, November 2021.

Simulation Reveals Spiraling Supermassive Black Holes

Explanatory video on black hole mergers, co-produced with NASA, October 2018.

360-degree Simulated View of the Sky Between Two Supermassive Black Holes

VR visualization of binary black holes, co-produced with NASA, October 2018.

Travel

From September 2022 to March 2023, I cycled through South America with two friends, from Quito (Ecuador) to Punta Arenas (Chile). You may find a step-by-step narrative of our trip here.

Along the way, I filmed the most beautiful places we encountered with my drone and created these videos, with pieces by Debussy and Ravel as the soundtrack. Enjoy!