By Bryan Clark
MIT researchers at the Computer Science and Artificial Intelligence Laboratory (CSAIL) have created a predictive AI that allows robots to link multiple senses in much the same way humans do.
“While our sense of touch gives us a channel to feel the physical world, our eyes help us immediately understand the full picture of these tactile signals,” writes Rachel Gordon of MIT CSAIL. In robots, this connection doesn’t exist. To bridge the gap, the researchers developed a predictive AI capable of learning to “see by touching” and “feel by seeing,” linking the senses of sight and touch in future robots.