It’s about more than just making a good
robot arm, though. It’s also about teaching a robotic arm tactile learning.
Essentially, Google outfitted the arms with basic
optical sensors and a neural network, an artificial brain capable of learning.
The robot was then tasked with finding
specific objects in a pile and learning how best to grasp them in order to
move them.
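The article doesn’t detail Google’s training method, but the trial-and-error loop it describes can be sketched as a simple learning problem: try a grasp, observe whether it succeeded, and update an estimate of each grasp’s success rate. The toy sketch below uses an epsilon-greedy strategy over a handful of candidate grasps; the candidates, success rates, and function names are all invented for illustration, not Google’s actual system.

```python
import random

def learn_grasp(true_success, trials=2000, epsilon=0.1, seed=0):
    """Toy epsilon-greedy learner over a discrete set of grasp candidates.

    true_success: hidden success probability of each candidate grasp
    (hypothetical values, standing in for real-world feedback).
    """
    rng = random.Random(seed)
    counts = [0] * len(true_success)
    values = [0.0] * len(true_success)  # running success-rate estimates
    for _ in range(trials):
        # Explore a random grasp occasionally; otherwise exploit the
        # grasp currently estimated to work best.
        if rng.random() < epsilon:
            g = rng.randrange(len(true_success))
        else:
            g = max(range(len(true_success)), key=lambda i: values[i])
        # Simulated attempt: success with the grasp's hidden probability.
        reward = 1.0 if rng.random() < true_success[g] else 0.0
        counts[g] += 1
        values[g] += (reward - values[g]) / counts[g]  # incremental mean
    return values

# Three candidate grasps with hidden success rates; after enough
# attempts the learner should rank the 0.9 grasp highest.
estimates = learn_grasp([0.2, 0.9, 0.5])
best = max(range(3), key=lambda i: estimates[i])
```

In a real robot the "reward" would come from the optical sensors confirming the object was lifted, and the discrete candidate set would be replaced by the neural network mapping camera input to grasp motions.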
This is easier said than done, given that
it’s a skill even humans acquire only gradually. The researchers said it’s
essentially hand-eye coordination.
The goal is a robot that could automate
sorting tasks. This could be a bot designed for warehouse work, or one that
could sort through single-stream recycling to pick out glass bottles and
aluminum cans from other items in the bin.
Granting machines spatial reasoning and the ability to learn and adapt to
objects could be a big boon going forward, with bots able to adjust their
grasp to be as firm or as gentle as the job requires.