Implicit Sensorimotor Mapping of the Peripersonal Space by Gazing and Reaching

Title: Implicit Sensorimotor Mapping of the Peripersonal Space by Gazing and Reaching
Publication Type: Journal Article
Year of Publication: 2011
Authors: Chinellato, E., Antonelli, M., Grzyb, B. J., del Pobil, A. P.
Journal: IEEE Transactions on Autonomous Mental Development
Keywords: arm motor control, arm movement control, artificial agent, control engineering computing, eye movement control, eye–arm coordination, gazing action, humanoid robot, implicit sensorimotor mapping, implicit visuomotor representation, joint-space representation, motion control, oculomotor control, peripersonal space, radial basis function framework, radial basis function networks, reaching actions, robotics, self-supervised learning, shared sensorimotor map, spatial awareness, stereo vision

Primates often perform coordinated eye and arm movements, contextually fixating and reaching towards nearby objects. This combination of looking and reaching to the same target is used by infants to establish an implicit visuomotor representation of the peripersonal space, useful for both oculomotor and arm motor control. In this work, taking inspiration from such behavior and from primate visuomotor mechanisms, a shared sensorimotor map of the environment, built on a radial basis function framework, is configured and trained by the coordinated control of eye and arm movements. Computational results confirm that the approach is especially suitable for the problem at hand and for implementation on a real humanoid robot. Through exploratory gazing and reaching actions, either free or goal-based, the artificial agent learns to perform direct and inverse transformations between stereo vision, oculomotor, and joint-space representations. The integrated sensorimotor map, which contextually represents the peripersonal space through different visual and motor parameters, is never made explicit; rather, it emerges through the agent's interaction with the environment.
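The abstract describes learning transformations between sensorimotor frames with a radial basis function framework. The paper's actual architecture, training signals, and robot interfaces are not given here; the following is a minimal, hypothetical sketch of the generic idea only: exploratory (gaze, joint-configuration) pairs are collected, and a Gaussian RBF network with a linear readout is fit to map one frame to the other. The toy forward model and all names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(x, centers, width):
    """Gaussian RBF activations for inputs x (n, d) given centers (k, d)."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Toy stand-in for the agent's embodiment: a smooth, initially unknown
# mapping from 2-D gaze angles to 2-D arm joint angles (hypothetical).
def toy_forward(gaze):
    return np.stack([np.sin(gaze[:, 0]) + 0.5 * gaze[:, 1],
                     np.cos(gaze[:, 1]) - 0.3 * gaze[:, 0]], axis=1)

# Exploratory gazing/reaching: sample gaze directions and record the
# joint configuration that reaches the fixated target.
gaze = rng.uniform(-1, 1, size=(500, 2))
joints = toy_forward(gaze)

# Fixed grid of RBF centers over the explored region; the linear readout
# is fit by least squares on the self-generated data.
g = np.linspace(-1, 1, 7)
centers = np.array([[a, b] for a in g for b in g])
Phi = rbf_features(gaze, centers, width=0.4)
W, *_ = np.linalg.lstsq(Phi, joints, rcond=None)

# The learned map now predicts joint angles for novel gaze directions.
test_gaze = rng.uniform(-1, 1, size=(100, 2))
pred = rbf_features(test_gaze, centers, 0.4) @ W
err = np.abs(pred - toy_forward(test_gaze)).mean()
print(f"mean absolute error: {err:.4f}")
```

The inverse transformation (joints to gaze) could be learned the same way by swapping the roles of input and output, which is in the spirit of the direct and inverse transformations mentioned in the abstract.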