Robot Hand-Eye Coordination Inspired By Infant Development

    February 10, 2014 • News, Research

    Researchers have developed a learning mechanism for robot hand-eye coordination based on the development of human infants. By implementing infant-like learning behaviors and development patterns, they have created learning algorithms that provide fast, incremental hand-eye coordination.

    Robotic systems will only be able to act in truly autonomous ways once robots have adaptive, continuously developing ways of interacting with the world around them. Most current robots rely on fixed, structured environments, even the cutting-edge machines found at competitions like the DARPA Robotics Challenge. These machines may be able to navigate difficult terrain and open doors, but replace that door with a different one and the robot is likely to run into trouble unless programmed otherwise.

    Through the study of very early human development, covering aspects such as developmental psychology, sensory-motor coordination, emergent behavior and social interaction, researchers from Xiamen University, China, and Aberystwyth University, UK, have produced a way to replicate how biological cognitive systems develop this skill.

    By creating a (relatively) simple simulation of the human brain, alongside sensorimotor maps (a calibration of sensors to the end effector through kinematics), the team taught the robot to recognize its own hand, enabling it to learn its own hand-eye coordination. Following observed infant behaviors, such as repeated hand movements and, later, reaching movements, the system continues to develop its self-calibrating coordination. This approach proved faster at learning than current systems and offers improved autonomy, akin to that of an infant.
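
    To give a rough idea of the style of learning described here, the sketch below shows a minimal, assumed version of infant-like "motor babbling" for 2D hand-eye calibration: the robot issues small random joint commands, observes where its hand appears in the camera image, and incrementally stores command-to-observation pairs in a sensorimotor map, which it can later invert to reach toward a visual target. This is not the authors' published algorithm; the robot and camera interfaces (move_to, locate_hand) and the two-joint arm are illustrative assumptions.

    import numpy as np

    class SensorimotorMap:
        """Incrementally learned mapping between joint angles and image coordinates."""

        def __init__(self):
            self.joint_samples = []   # motor commands tried so far during babbling
            self.image_samples = []   # where the hand was seen for each command

        def update(self, joints, hand_px):
            # Store one experience: (joint angles, observed hand pixel position).
            self.joint_samples.append(np.asarray(joints, dtype=float))
            self.image_samples.append(np.asarray(hand_px, dtype=float))

        def reach(self, target_px):
            # Nearest-neighbour inverse model: return the stored joint command
            # whose observed hand position was closest to the visual target.
            if not self.image_samples:
                raise RuntimeError("map is empty: babble first")
            dists = [np.linalg.norm(p - np.asarray(target_px)) for p in self.image_samples]
            return self.joint_samples[int(np.argmin(dists))]

    def babble(robot, camera, smap, n_trials=200, seed=0):
        # Repeated random hand movements, as infants do: move, look, remember.
        rng = np.random.default_rng(seed)
        for _ in range(n_trials):
            joints = rng.uniform(low=-1.0, high=1.0, size=2)  # hypothetical 2-DOF arm
            robot.move_to(joints)                 # assumed robot interface
            hand_px = camera.locate_hand()        # assumed hand-detection routine
            if hand_px is not None:               # only learn when the hand is visible
                smap.update(joints, hand_px)

    Because the map is built sample by sample, calibration improves incrementally as more movements are made, loosely mirroring how an infant's reaching becomes more accurate with practice.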

    The first implementation was done only in 2D space, using a single camera. The authors note that improvements are still needed, with further research planned on applying the approach to 3D environments alongside 3D sensing.

    To read the full details of the project, follow the link below to the journal article (open to everyone).

    Via: Intech
