Part of the research conducted in the Active Perception Laboratory involves the embodiment of neuronal models in robotic systems. Robots are useful tools in active perception studies, as they allow exposure of neural models to the real sensory inputs present during behavior.

Recent biorobotic studies conducted in the Active Perception Laboratory include:
General approach
M. Rucci, D. Bullock, and F. Santini, Integrating robotics and neuroscience: brains for robots, bodies for brains, Advanced Robotics (in press).
Abstract: Researchers in robotics and artificial intelligence have often looked at biology as a source of inspiration for solving their problems. From the opposite perspective, neuroscientists have recently turned their attention to the use of robotic systems as a way to quantitatively test and analyze theories that would otherwise remain at a speculative stage. Computational models of neurons and networks of neurons are often activated with simplified artificial patterns that bear little resemblance to natural stimuli. The use of robotic systems has the advantage of introducing phenotypic and environmental constraints similar to those that brains of animals have to face during development and in everyday life. Consideration of these constraints is particularly important in light of modern brain theories, which emphasize the importance of closing the perception/action loop between the agent and the environment. To provide concrete examples of the use of robotic systems in neuroscience, this paper reviews our work in the areas of sensory perception and motor learning. The interdisciplinary approach followed by this research establishes a direct link between natural sciences and engineering. This research can lead to the understanding of basic biological problems while producing robust and flexible systems that operate in the real world.


Active 3D vision in a humanoid robot
F. Santini and M. Rucci (2007), Active estimation of distance in a robotic system that replicates human eye movements, Robotics and Autonomous Systems, 55, 107-121.
Abstract: In a moving agent, the different apparent motion of objects located at various distances provides an important source of depth information. While motion parallax is evident for large translations of the agent, a small parallax also occurs in most head/eye systems during rotations of the cameras. A similar parallax is also present in the human eye, so that a redirection of gaze shifts the projection of an object on the retina by an amount that depends not only on the amplitude of the rotation, but also on the distance of the object with respect to the observer. This study examines the accuracy of distance estimation on the basis of the parallax produced by camera rotations. Sequences of human eye movements were used to control the motion of a pan/tilt system specifically designed to reproduce the oculomotor parallax present in the human eye. We show that the oculomotor strategies by which humans scan visual scenes produce parallaxes that provide accurate estimation of distance. This information simplifies challenging visual tasks such as image segmentation and figure/ground segregation.
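The geometry behind this oculomotor parallax can be illustrated with a simple 2D sketch (an illustration only, not the paper's actual algorithm). Because the eye's nodal point lies a few millimeters in front of its center of rotation (the 6 mm offset below is an assumed value), a pure rotation also translates the nodal point, and the resulting deviation of a target's retinal bearing from the rotation angle encodes its distance:

```python
import math

def retinal_bearing(distance, theta, r=0.006):
    """Bearing (rad) of a target, relative to the optical axis, after the
    camera rotates by theta about a center located r meters behind the
    nodal point. The target starts on the optical axis, at the given
    distance (m) from the nodal point."""
    # World frame: rotation center at the origin, initial axis along +y.
    px, py = 0.0, r + distance                          # target position
    nx, ny = r * math.sin(theta), r * math.cos(theta)   # nodal point after rotation
    ax, ay = math.sin(theta), math.cos(theta)           # rotated optical axis
    vx, vy = px - nx, py - ny                           # nodal point -> target
    # Signed angle from the optical axis to the target direction.
    return math.atan2(vx * ay - vy * ax, vx * ax + vy * ay)

def distance_from_parallax(theta, beta, r=0.006):
    """Invert the projection geometry: with r = 0 the bearing would be
    exactly -theta; the deviation (the oculomotor parallax) determines
    target distance."""
    t = math.tan(beta)
    return (r * (t * (1.0 - math.cos(theta)) - math.sin(theta))
            / (t * math.cos(theta) + math.sin(theta)))

# A 0.2 rad gaze shift toward a target 1 m away:
beta = retinal_bearing(1.0, 0.2)
print(distance_from_parallax(0.2, beta))  # recovers 1.0 (up to float error)
```

In practice the measured bearing is noisy, so (as the paper quantifies) accuracy depends on the amplitude of the rotation and the distance of the target; the sketch only shows that the geometry is invertible.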

A movie (AVI, 13.6 MB) illustrates the results obtained with the humanoid robot, Mr.T.

See also:
F. Santini and M. Rucci, Depth perception in an anthropomorphic robot that replicates human eye movements, IEEE International Conference on Robotics and Automation, Orlando, FL, May 2006 - Best Vision Paper Award.

Spatial localization in a robotic barn owl
M. Rucci, G.M. Edelman, and J. Wray (1999), Adaptation of orienting behavior: from the barn owl to a robotic system, IEEE Transactions on Robotics and Automation 15(1), 96-110.
Abstract: Autonomous robotic systems need to adjust their sensorimotor coordinations so as to maintain good performance in the presence of changes in their sensory and motor characteristics. Biological systems are able to adapt to large variations in their physical and functional properties. In the last decade, the adjustment of orienting behavior has been carefully investigated in the barn owl, a nocturnal predator with highly developed auditory capabilities. We have recently proposed that the development and maintenance of the barn owl's accurate orienting behavior can be explained through a process of learning based on the saliency of sensorimotor events. In this paper we consider the application of a detailed computer model of the principal neural structures involved in the process of spatial localization in the barn owl to the control of the orienting behavior of a robotic system, in the presence of auditory and visual stimulation. The system is composed of a robotic head equipped with two lateral microphones and a camera. We show that the model produces accurate orienting behavior toward both auditory and visual targets and is able to quickly recover good performance after alterations of the sensory inputs and motor outputs. The results illustrate that an architecture specifically designed to account for biological phenomena can produce flexible and robust motor control of a robotic system operating in the real world.
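The auditory side of this localization task can be sketched with a standard signal-processing technique (not the model's neural circuitry): estimate the interaural time difference (ITD) by cross-correlating the two microphone signals, then convert it to azimuth with a far-field approximation. The microphone spacing and sampling rate below are assumed values for illustration:

```python
import numpy as np

def itd_azimuth(left, right, fs, mic_distance=0.15, c=343.0):
    """Estimate source azimuth (deg, positive to the right) from the
    interaural time difference between two microphone signals."""
    corr = np.correlate(left, right, mode="full")
    # Lag (in samples) by which the left signal trails the right one.
    lag = int(np.argmax(corr)) - (len(right) - 1)
    itd = lag / fs
    # Far-field approximation: itd = (mic_distance / c) * sin(azimuth).
    s = np.clip(itd * c / mic_distance, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Demo: broadband noise from ~30 deg to the right reaches the left
# microphone about 10 samples late at 48 kHz.
rng = np.random.default_rng(0)
sig = rng.standard_normal(4800)
delay = 10
left = np.concatenate([np.zeros(delay), sig[:-delay]])
print(itd_azimuth(left, sig, fs=48000))  # roughly 28 deg
```

Broadband stimuli matter here: the cross-correlation of narrowband signals has many near-equal peaks, which is one reason localization in noise is the hard part that the neural model addresses.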


M. Rucci, J. Wray, and G.M. Edelman (2000), Robust localization of auditory and visual targets in a robotic barn owl, Robotics and Autonomous Systems, 30(1-2), 181-194.
Abstract: In the last two decades, the barn owl, a nocturnal predator with accurate visual and auditory capabilities, has become a common experimental system for neuroscientists investigating the biological substrate of spatial localization and orienting behavior. As a result, much data are now available regarding the anatomy and physiology of many neural structures involved in such processes. On the basis of this growing body of knowledge, we have recently built a computer model that incorporates detailed replicas of several important neural structures participating in the production of orienting behavior. In order to expose this model to sensorimotor and environmental conditions similar to those experienced by a barn owl, the computer simulations of the neural structures were coupled to a robot emulating the head of a barn owl, which was presented with auditory and visual stimulation. By using this system we have performed a number of studies on the mechanisms underlying the barn owl's calibration of orienting behavior and accurate localization of auditory targets in noisy environments. In this paper we review the main results that have emerged from this line of research. This work provides a concrete example of how, by coupling computer simulations of brain structures with robotic systems, it is possible to gain a better understanding of the basic principles of biological systems while producing robust and flexible control of robots operating in the real world.
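The calibration process studied in these papers can be caricatured with a deliberately simplified error-driven sketch (an illustration only, not the saliency-based learning mechanism of the model): after each orienting movement toward an auditory target, the visually measured residual error nudges an additive motor correction, so accurate orienting recovers after a simulated displacement of the visual field, as in the classic prism experiments on barn owls:

```python
def recalibrate(trials=200, lr=0.1, shift_deg=20.0):
    """Toy delta-rule recalibration of auditory orienting. All values
    (learning rate, number of trials, displacement) are hypothetical."""
    correction = 0.0
    for _ in range(trials):
        target = 10.0                                  # auditory azimuth estimate (deg)
        command = target + correction                  # orienting movement
        visual_error = (target + shift_deg) - command  # seen vs. aimed direction
        correction += lr * visual_error                # error-driven update
    return correction

print(recalibrate())  # converges to ~20.0, compensating the displacement
```

The correction converges geometrically toward the imposed shift; the actual model replaces this scalar update with plasticity distributed across replicated neural maps.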

See the robotic barn owl (MPG, 10.6 MB) after it has learned to orient toward audio-visual stimuli.
