TY - JOUR
T1 - Implicit Sensorimotor Mapping of the Peripersonal Space by Gazing and Reaching
JF - Autonomous Mental Development, IEEE Transactions on
Y1 - 2011
A1 - Eris Chinellato
A1 - Marco Antonelli
A1 - Beata J. Grzyb
A1 - Angel P. del Pobil
KW - arm motor control
KW - arm movement control
KW - artificial agent
KW - control engineering computing
KW - eye movement control
KW - Eye–arm coordination
KW - gazing action
KW - humanoid robot
KW - implicit sensorimotor mapping
KW - implicit visuomotor representation
KW - joint-space representation
KW - motion control
KW - oculomotor control
KW - peripersonal space
KW - radial basis function framework
KW - radial basis function networks
KW - reaching actions
KW - Robotics
KW - self-supervised learning
KW - shared sensorimotor map
KW - spatial awareness
KW - stereo vision
AB - Primates often perform coordinated eye and arm movements, contextually fixating on and reaching towards nearby objects. This combination of looking and reaching towards the same target is used by infants to establish an implicit visuomotor representation of the peripersonal space, useful for both oculomotor and arm motor control. In this work, taking inspiration from such behavior and from primate visuomotor mechanisms, a shared sensorimotor map of the environment, built on a radial basis function framework, is configured and trained by the coordinated control of eye and arm movements. Computational results confirm that the approach is especially suitable for the problem at hand and for implementation on a real humanoid robot. Through exploratory gazing and reaching actions, either free or goal-based, the artificial agent learns to perform direct and inverse transformations between stereo vision, oculomotor, and joint-space representations. The integrated sensorimotor map, which allows the peripersonal space to be contextually represented through different visual and motor parameters, is never made explicit, but rather emerges through the agent's interaction with the environment.

VL - 3
ER -
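
The abstract describes a radial basis function framework that learns direct and inverse transformations between oculomotor and joint-space codes from the agent's own exploratory movements. The sketch below is only an informal illustration of that general idea, not the authors' model: the 3-D "gaze" code (pan, tilt, vergence), the 3-D arm posture, the toy target function, the network size, and the ridge-regularised least-squares training are all assumptions made here for demonstration.

# Minimal sketch (not the published implementation): a Gaussian RBF readout
# mapping a hypothetical oculomotor code to a hypothetical arm posture,
# trained on self-generated samples in the spirit of exploratory gazing
# and reaching towards the same targets.
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(x, centers, width):
    """Gaussian radial basis activations for inputs x of shape (n, d)."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def toy_arm_posture(gaze):
    """Stand-in for the unknown sensorimotor relation sampled by the agent."""
    pan, tilt, verg = gaze.T
    return np.stack([0.8 * pan + 0.2 * verg,
                     0.9 * tilt - 0.1 * verg,
                     1.5 / (0.3 + verg)], axis=1)

# Exploratory samples: random gaze configurations paired with the arm
# postures that would bring the hand to the fixated location.
low, high = [-0.5, -0.4, 0.1], [0.5, 0.4, 1.0]
gaze = rng.uniform(low, high, size=(500, 3))
arm = toy_arm_posture(gaze)

# Fixed RBF centers scattered over the oculomotor space; only the linear
# readout weights are learned (ridge-regularised least squares).
centers = rng.uniform(low, high, size=(64, 3))
Phi = rbf_features(gaze, centers, width=0.25)
W = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(64), Phi.T @ arm)

# Direct transformation: predict the arm posture for a new gaze direction.
test_gaze = np.array([[0.1, -0.2, 0.6]])
pred_arm = rbf_features(test_gaze, centers, width=0.25) @ W
print("predicted arm posture:", pred_arm)
print("toy ground truth:     ", toy_arm_posture(test_gaze))

The same basis-function machinery could be fitted in the opposite direction (arm posture to gaze) to illustrate the inverse transformation mentioned in the abstract; here only the direct mapping is shown.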