
Wednesdays, 5 pm CET, i.e., 11 am ET


Organized by David Hansel, Ran Darshan & Carl van Vreeswijk* 



About the Seminar

VVTNS is a weekly digital seminar, held on Zoom, for the theoretical neuroscience community. Created as the World Wide Theoretical Neuroscience Seminar (WWTNS) in November 2020 and renamed in homage to Carl van Vreeswijk (in memoriam, April 20, 2022), it aims to provide a platform for theoreticians to exchange ideas. Speakers have the opportunity to discuss theoretical aspects of their work that cannot be covered in a setting where the majority of the audience consists of experimentalists. Seminars are 45 minutes long, followed by a discussion, and are held on Wednesdays at 11 am ET. Talks are recorded with the speaker's authorization and are available to everyone on our YouTube channel.


To participate in the seminar, you need to fill out a registration form, after which you will receive an email telling you how to connect.

  • Twitter
  • YouTube

Blake Bordelon

Harvard University

November 29, 2023


Mean Field Approaches to Learning Dynamics in Deep Networks

The learning dynamics of deep neural networks are complex, involving large numbers of learnable weights and many sources of disorder. In this talk, I will discuss mean field approaches to analyzing the learning dynamics of neural networks in large system size limits when starting from random initial conditions. The result of this analysis is a dynamical mean field theory (DMFT) in which all neurons obey independent stochastic single-site dynamics. Correlation functions (kernels) and response functions for the features and gradients at each layer can be computed self-consistently from these stochastic processes. Depending on the choice of scaling of the network output, the network can operate in a kernel regime or a feature learning regime in the infinite width limit. I will discuss how this theory can be used to analyze various learning rules for deep architectures (backpropagation, feedback-alignment-based rules, Hebbian learning, etc.), where the weight updates do not necessarily correspond to gradient descent on an energy function. I will then present recent extensions of this theory to residual networks at infinite depth and discuss the utility of deriving scaling limits to obtain consistent optimal hyperparameters (such as learning rate) across widths and depths. Feature learning in other types of architectures will be discussed if time permits. Lastly, I will discuss open problems and challenges associated with this theoretical approach to neural network learning dynamics.
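The kernel-versus-feature-learning dichotomy mentioned in the abstract can be illustrated with a toy numerical experiment (a minimal sketch of the standard scaling argument, not material from the talk; the function name and all parameter choices are illustrative). For a two-layer network f(x) = N^(-a) v·φ(Wx), "NTK-style" scaling (a = 1/2, width-independent learning rate) makes the hidden pre-activations move by O(1/√N) in a single gradient step, so features freeze as N → ∞, while mean-field scaling (a = 1, learning rate proportional to N) keeps the feature movement O(1) at all widths:

```python
import numpy as np

def feature_update_scale(N, scaling, eta0=0.1, d=32, seed=0):
    """One SGD step on a toy two-layer net f(x) = N^{-a} * v . tanh(W x).

    Returns the mean absolute change of the hidden pre-activations h = W x
    after the step, as a proxy for how much the features move.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)          # single input with ||x|| = 1
    y = 10.0                        # target far from the initial output, so the
                                    # error signal has similar size at all widths
    W = rng.standard_normal((N, d))
    v = rng.standard_normal(N)
    if scaling == "ntk":
        a, eta = 0.5, eta0          # kernel regime: O(1/sqrt(N)) output scaling
    else:
        a, eta = 1.0, eta0 * N      # mean-field regime: O(1/N) scaling, lr ~ N
    h = W @ x
    f = (v @ np.tanh(h)) / N**a
    # gradient of 0.5 * (f - y)^2 with respect to W
    grad_W = (f - y) * (v * (1.0 - np.tanh(h)**2))[:, None] * x[None, :] / N**a
    dh = (W - eta * grad_W) @ x - h  # change in pre-activations after one step
    return np.mean(np.abs(dh))
```

With this sketch, `feature_update_scale(N, "ntk")` should shrink roughly like 1/√N as the width grows (features barely move), whereas `feature_update_scale(N, "meanfield")` should stay roughly constant in N, which is the feature-learning regime the abstract refers to.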


The organizers of the VVTNS express their abhorrence of the barbaric attack on Israeli civilians that took place on 7 October 2023. We wish to express our support of the Israeli neuroscience community in this dark and frightening time.


David Hansel

I am a theoretical neuroscientist at the National Center for Scientific Research in Paris, France, and a visiting professor at The Hebrew University in Jerusalem, Israel. I am mainly interested in the recurrent dynamics in the cortex and basal ganglia.

Carl van Vreeswijk *

I am a theoretical neuroscientist working at the National Center for Scientific Research in Paris, France. My main interest is the dynamics of recurrent networks of neurons in the sensory system.



Ran Darshan

I am a theoretical neuroscientist working at the Faculty of Medicine, the Sagol School of Neuroscience, and the School of Physics and Astronomy at Tel Aviv University, Israel. I am interested in the learning and dynamics of neural networks. My main goal is to achieve a mechanistic understanding of brain functions.

©2020 by WWTNS
