How does our brain allow us to see? What goes wrong in visual disorders? How can we replicate human visual functions in machines? These are some of the questions investigated in the Active Perception Laboratory.

Research in the Active Perception Laboratory integrates approaches from experimental and theoretical/computational neuroscience to elucidate the mechanisms underlying visual perception in humans and to replicate similar processing strategies in artificial systems. Techniques include high-resolution eye and head tracking, human psychophysics under virtual reality with precise real-time control of retinal stimulation, electroencephalography, computational modeling of neural systems, and robotics.
Our research, supported by the National Science Foundation and the National Institutes of Health, has led to multiple findings on how humans and other species encode visual information and establish spatial representations. It has raised specific hypotheses on the influence of eye movements during development, has produced new tools for experimental studies in visual neuroscience, and has yielded robots directly controlled by models of neural pathways.
Interested in joining our team? Please see the OPEN POSITIONS page to apply.