Our lab works on theoretical neuroscience, with the fundamental goal of understanding how networks of neurons and synapses cooperate across multiple scales of space and time to mediate important brain functions, like sensory perception, motor control, and memory. To achieve this goal, we employ and extend tools from disciplines like statistical mechanics, dynamical systems theory, machine learning, information theory, control theory, and high-dimensional statistics, and we collaborate with experimental neuroscience laboratories that collect physiological data from a range of model organisms. Some topics of interest include: how birds learn to sing, spatial memory in the rodent hippocampus, attention and motor control in macaques, memory properties of complex synapses, dynamics of plasticity in recurrent networks, signal propagation in neural circuits, the emergence of categorization in multi-layered networks, and the statistical mechanics of compressed sensing.
Condensed Matter Physics
We employ techniques from statistical mechanics, like replica theory and random matrix theory, to analyze the complex dynamics of learning, signal propagation, and memory in neuronal networks. We are also interested in using statistical mechanics to analyze the performance of machine-learning algorithms that could be implemented in neuronal architectures.
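As a minimal illustration of the random-matrix viewpoint mentioned above, the sketch below computes the eigenvalue spectrum of a random synaptic connectivity matrix. For an N x N matrix with i.i.d. Gaussian entries of variance g^2/N, the circular law predicts that the eigenvalues fill a disk of radius g, so a linear rate network dx/dt = -x + Jx loses stability as the gain g crosses 1. The values of N and g are illustrative choices, not parameters from any of the lab's papers.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 800, 0.9  # network size and synaptic gain (illustrative)

# Random connectivity: i.i.d. Gaussian entries with variance g^2 / N.
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

# The circular law says the spectrum of J fills a disk of radius g,
# up to finite-size fluctuations at the edge.
eigvals = np.linalg.eigvals(J)
spectral_radius = np.abs(eigvals).max()

print(f"spectral radius: {spectral_radius:.3f} (circular-law prediction: {g})")
```

Because g < 1 here, every eigenvalue of J sits inside the unit disk and the linearized dynamics are stable; raising g above 1 pushes eigenvalues across the stability boundary.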
- Holographic Protection of Chronology in Universes of the Gödel Type
- Twisted Six Dimensional Gauge Theories, Matrix Models and Integrable Systems
- E10 Orbifolds
- Function Constrains Network Architecture and Dynamics: A Case Study on the Yeast Cell Cycle Network
- One Dimensional Dynamics of Attention and Decision Making in LIP
- Memory Traces in Dynamical Systems
- Feedforward to the past: the relation between neuronal connectivity, amplification, and short-term memory
- Statistical Mechanics of Compressed Sensing
- Short-term memory in neuronal networks through dynamical compressed sensing
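The compressed sensing topics above center on recovering a sparse signal from many fewer random measurements than its dimension. A minimal sketch of that idea, using iterative soft-thresholding (ISTA), a standard solver for the l1-regularized (lasso) recovery problem: the dimensions, seed, support, and regularization strength below are illustrative assumptions, not values from the papers listed.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
n, m, k = 100, 40, 3                      # signal dim, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
x_true = np.zeros(n)
x_true[[10, 40, 80]] = [1.5, -2.0, 1.0]   # k-sparse signal (illustrative)
y = A @ x_true                            # m << n noiseless measurements

# ISTA: gradient step on the least-squares term, then soft-thresholding.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    x = soft_threshold(x + step * A.T @ (y - A @ x), lam * step)

rel_error = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
support = set(np.argsort(np.abs(x))[-k:])
print(f"relative error: {rel_error:.3f}, recovered support: {sorted(support)}")
```

With only 40 random measurements of a 100-dimensional signal, the l1 penalty drives the estimate to the correct 3-sparse support; statistical mechanics tools such as replica theory characterize exactly when this kind of recovery succeeds as n, m, and k grow large together.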