I am an associate professor in theoretical and computational neuroscience in the hearing lab at Ecole Normale Supérieure in Paris, and a member of Institut Universitaire de France. My main interest is neural modeling of perception, in particular of the auditory system. My work is primarily funded by the European Research Council.
Please don't hesitate to contact me if you are interested in working in my group. I am also an editor for the Springer Series in Computational Neuroscience: contact me if you would like to write a book.
The goal of my research is to understand the neural basis of perception, in particular auditory perception, by developing theoretical and computational neural models. The theories are tested with psychophysical and physiological experiments. They are also tested computationally, by evaluating the performance of the models on complex sensory tasks. My approach has two distinctive features.
1) The ecological approach. This means two things. First, we study functional models, which address a specific ecological task (e.g. locating a sound source from real acoustical signals), rather than models aimed at reproducing a specific set of experiments. Second, we investigate how neurons extract the invariant structure of sensory signals (e.g. periodicity in pitch perception), rather than how they recognize patterns. Most of computational neuroscience and machine learning has followed the idea that the function of a sensory system is to recognize patterns (for example, patterns of pixels). Instead, we try to understand how a sensory system can identify the laws that govern sensory or sensorimotor signals. This view is related to Gibson's ecological theory and to O'Regan's sensorimotor theory of perception. To give a simple example, consider the sound wave produced by a musical instrument (e.g. a piano), which could look like this:
There are two ways to describe this sound wave. One is to see it as a temporal pattern (a time-varying amplitude) or a spectral pattern (a sum of sine waves with specific frequencies and amplitudes). The other is to observe that the wave follows a particular law: it is periodic, with a specific period. The second description is more interesting, because sounds with the same pitch (the same musical note) follow the same law but may have different spectra. It is also computationally more powerful.
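The distinction can be made concrete with a small sketch. Below, two hypothetical "piano-like" tones share the same period (the same law, hence the same pitch) but have different spectra; estimating the period from the autocorrelation recovers the common law. The signals, parameter values, and the `estimate_period` helper are illustrative inventions, not taken from any actual model of mine:

```python
import numpy as np

fs = 8000          # sampling rate (Hz)
f0 = 220.0         # fundamental frequency (Hz); the period is 1/f0
t = np.arange(0, 0.5, 1 / fs)

# Same period, different spectra (different harmonic content):
tone_a = np.sin(2 * np.pi * f0 * t) + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)
tone_b = np.sin(2 * np.pi * f0 * t) + 0.8 * np.sin(2 * np.pi * 3 * f0 * t)

def estimate_period(x, fs, fmin=50.0):
    """Estimate the period from the main autocorrelation peak,
    searching lags between 1 ms and 1/fmin."""
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    min_lag = int(fs / 1000.0)        # ignore lags shorter than 1 ms
    max_lag = int(fs / fmin)
    lag = min_lag + np.argmax(ac[min_lag:max_lag])
    return lag / fs

# Both tones satisfy the same law (periodicity with period 1/220 s),
# even though their waveforms and spectra differ:
print(estimate_period(tone_a, fs))   # ≈ 1/220 ≈ 0.0045 s
print(estimate_period(tone_b, fs))   # same period
```

A pattern-based description (the exact waveform or spectrum) would treat the two tones as different objects; the law-based description identifies what they share.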
Yet this view has received little attention in theoretical neuroscience, and developing it is what I am trying to do. The first task is to understand the structure of sensory signals in an ecological environment; the second is to understand how neurons can detect this structure.
2) The spike-based point of view. Neurons communicate primarily with spikes, which are discrete events, but most computational neuroscience deals with spike rates, which are continuous variables. I am interested in understanding neural computation at the spike level. Some of this is described in my Habilitation thesis, written in 2009: "Spike-based models of neural computation".
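To illustrate the distinction between spikes and rates, here is a minimal leaky integrate-and-fire neuron: the membrane potential is a continuous variable, but the output is a sequence of discrete events. This is a generic textbook sketch with illustrative parameter values, not a model from my own work:

```python
# Minimal leaky integrate-and-fire (LIF) neuron.
# All parameter values below are illustrative.
dt = 0.1e-3        # time step: 0.1 ms
tau = 10e-3        # membrane time constant: 10 ms
v_thresh = 1.0     # spike threshold (dimensionless units)
v_reset = 0.0      # reset potential
I = 1.2            # constant suprathreshold input drive

v = 0.0
spike_times = []
for step in range(int(0.1 / dt)):      # simulate 100 ms
    v += dt / tau * (I - v)            # leaky integration (continuous variable)
    if v >= v_thresh:                  # threshold crossing: a discrete event
        spike_times.append(step * dt)
        v = v_reset                    # reset after the spike

print(len(spike_times))                # a handful of discrete spikes in 100 ms
```

A rate-based description would summarize this output as a single number (spikes per second) and discard the spike times; spike-based computation is about what those precise times can encode.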
The ecological approach and the spike-based point of view are related by the idea that, in a diverse population of neurons, synchrony patterns reflect the similarity structure of sensory signals (1), and that synchrony can be detected by coincidence detection (2). I am developing these ideas mainly in the auditory system (3). To simulate these models, I have developed the Brian simulator (4) with Dan Goodman. I am also developing experimental techniques to test these theories (5).
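The link between synchrony and coincidence detection can be sketched as follows: two spike trains driven by the same signal fire at similar times, and a downstream detector can read this out simply by counting near-simultaneous spikes. This is a schematic NumPy illustration with invented numbers (in practice such models would be simulated with Brian), and the `coincidences` helper is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def jittered_train(base_times, jitter, rng):
    """Spike train locked to shared event times, with Gaussian jitter."""
    return np.sort(base_times + rng.normal(0.0, jitter, size=base_times.size))

base = np.arange(0.0, 1.0, 0.02)            # shared drive: one event per 20 ms
train_a = jittered_train(base, 0.001, rng)  # 1 ms jitter: a synchronous pair
train_b = jittered_train(base, 0.001, rng)
train_c = np.sort(rng.uniform(0.0, 1.0, base.size))  # unrelated spike train

def coincidences(x, y, window=0.005):
    """Count spikes in x that have a spike in y within +/- window seconds."""
    return int(np.sum(np.min(np.abs(x[:, None] - y[None, :]), axis=1) <= window))

print(coincidences(train_a, train_b))   # high count: shared temporal structure
print(coincidences(train_a, train_c))   # lower count: no shared structure
```

A neuron performing coincidence detection (firing only when inputs arrive within a short window) thus responds selectively to the synchrony pattern, i.e. to the structure shared by the input signals.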