Soledad Gonzalo Cogno

NTNU, Trondheim

February 1, 2023

Minute-scale periodic sequences in medial entorhinal cortex

The medial entorhinal cortex (MEC) hosts many of the brain’s circuit elements for spatial navigation and episodic memory, operations that require neural activity to be organized across long durations of experience. While location is known to be encoded by a plethora of spatially tuned cell types in this brain region, little is known about how the activity of entorhinal cells is tied together over time. Among the brain’s most powerful mechanisms for neural coordination are network oscillations, which dynamically synchronize neural activity across circuit elements. In MEC, theta and gamma oscillations provide temporal structure to the neural population activity at subsecond time scales. It remains an open question, however, whether similar coordination occurs in MEC at behavioural time scales, in the second-to-minute regime. In this talk I will show that MEC activity can be organized into a minute-scale oscillation that entrains nearly the entire cell population, with periods ranging from 10 to 100 seconds. Throughout this ultraslow oscillation, neural activity progresses in periodic and stereotyped sequences. The oscillation sometimes advances uninterruptedly for tens of minutes, transcending epochs of locomotion and immobility. Similar oscillatory sequences were not observed in the neighbouring parasubiculum or in visual cortex. The ultraslow periodic sequences in MEC have the potential to couple its neurons and circuits across extended time scales and to serve as a scaffold for processes that unfold at behavioural time scales.

Lenka Zdeborová

EPFL, Lausanne

February 8, 2023

Understanding Machine Learning via Exactly Solvable Statistical Physics Models

The affinity between statistical physics and machine learning has a long history. I will describe the main lines of this long-lasting friendship in the context of current theoretical challenges and open questions about deep learning. Theoretical physics often proceeds in terms of solvable synthetic models; I will describe the related line of work on solvable models of simple feed-forward neural networks. I will highlight a path forward to capture the subtle interplay between the structure of the data, the architecture of the network, and the optimization algorithms commonly used for learning.

German Mato

CONICET, Bariloche

February 15, 2023

Orientation selectivity in rodent V1: theory vs experiments

TBA

Sophie Denève

CNRS, Paris

February 22, 2023

TBA

TBA

Richard Naud

University of Ottawa

March 1, 2023

TBA

TBA

March 8 & March 15, 2023

Eve of and day following Cosyne 2023

No Seminar

Stefano Fusi

Columbia University

March 22, 2023

TBA

TBA

David Dahmen

Jülich Research Center

March 29, 2023

TBA

TBA

Gabrielle Gutierrez

Columbia University

April 5, 2023

TBA

TBA

TBA

April 12, 2023

TBA

TBA

Eric Shea-Brown

University of Washington, Seattle

April 19, 2023

TBA

TBA

Sonja Grün

Forschungszentrum Jülich

April 26, 2023

TBA

TBA

Ashok Litwin-Kumar

Columbia University

May 3, 2023

TBA

TBA

Gary Cottrell

UCSD

May 10, 2023

TBA

TBA

Angela J. Langdon

National Institute of Mental Health at NIH

May 17, 2023

TBA

TBA

Iain Couzin

University of Konstanz

May 24, 2023

TBA

TBA

TBA

May 31, 2023

TBA

TBA

TBA

June 7, 2023

TBA

TBA

TBA

June 14, 2023

TBA

TBA

TBA

June 21, 2023

TBA

TBA

TBA

June 28, 2023

TBA

TBA
