
MIT has developed a robot that can “touch and feel” objects

The state of the art in artificial intelligence keeps improving steadily, especially in image, speech, and text recognition. In other words, we have reached a point where AI can “see” and “hear” given sufficient training, but that is certainly not the end of it. MIT is set to turn things up a notch by equipping a robot, for the very first time, with the ability to recognize objects by “touch”.

Touch is one of the easiest and most natural ways for humans to familiarize ourselves with an object and ultimately learn to recognize it. Unsurprisingly, it is a whole different ball game for machines, which lack our intricate nervous system, brain included.

Teaching a robot to “touch and feel” therefore proceeds like any other machine learning recognition task: feed the system enough relevant data for it to form connections between how an object looks and how it feels to touch. This is exactly what a group of students at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) set out to do.

The team started with a KUKA robot arm and fitted it with a special sensor called GelSight. Created by Ted Adelson’s group at CSAIL, GelSight is a tactile sensor: it records detailed information about a surface whenever it presses against an object. Giving the robot arm a way to gather touch data was the first step toward letting it connect tactile and visual information.
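
GelSight is itself camera-based: a small camera films a soft gel pad as it deforms against a surface, so the “touch” signal arrives as an image of the contact geometry. A minimal sketch of how synchronized visual/tactile pairs could be captured is shown below; the device indices, file names, and capture loop are illustrative assumptions, not the CSAIL team’s actual pipeline:

```python
import cv2  # OpenCV, assumed here for camera capture

# Hypothetical device indices: 0 = external scene camera, 1 = GelSight's camera.
scene_cam = cv2.VideoCapture(0)
touch_cam = cv2.VideoCapture(1)

for i in range(100):  # collect 100 synchronized visual/tactile frame pairs
    ok_v, visual = scene_cam.read()
    ok_t, tactile = touch_cam.read()
    if ok_v and ok_t:
        # Saving both frames under a shared index keeps them paired for training.
        cv2.imwrite(f"visual_{i:04d}.png", visual)
        cv2.imwrite(f"tactile_{i:04d}.png", tactile)

scene_cam.release()
touch_cam.release()
```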

The next step was training, which involved feeding the underlying machine learning model 12,000 videos of 200 household objects, such as fabrics and tools, being touched. These videos were then broken down into individual frames, which the model used to correlate touch data with visual data until it could begin recognizing those objects by “touch”.
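
At its core, this is paired-data supervised learning: each video frame showing the arm touching an object is matched with the GelSight image recorded at the same instant, and a model learns to predict one modality from the other. The following PyTorch-style sketch assumes a folder of paired frames like the ones captured above; the tiny network, loss, and file layout are placeholders for illustration, not the team’s actual architecture:

```python
import glob
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torchvision.io import read_image

class VisualTactilePairs(Dataset):
    """Pairs each visual frame with the GelSight frame captured at the
    same instant. The file layout is an assumption for illustration."""
    def __init__(self, root="."):
        self.visual = sorted(glob.glob(f"{root}/visual_*.png"))
        self.tactile = sorted(glob.glob(f"{root}/tactile_*.png"))

    def __len__(self):
        return len(self.visual)

    def __getitem__(self, i):
        v = read_image(self.visual[i]).float() / 255.0
        t = read_image(self.tactile[i]).float() / 255.0
        return v, t  # assumes both cameras share one resolution

# Placeholder encoder-decoder: maps a visual frame to a predicted tactile
# image, i.e. "imagining the feeling" of a surface it can only see.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1), nn.Sigmoid(),
)

loader = DataLoader(VisualTactilePairs(), batch_size=8, shuffle=True)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for visual, tactile in loader:
        pred = model(visual)           # vision -> predicted touch
        loss = loss_fn(pred, tactile)  # compare against the real GelSight frame
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Training in the opposite direction, predicting a visual frame from a tactile one, is the same loop with inputs and targets swapped; that is the “blindly touching around” direction Li describes below.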

“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, a CSAIL Ph.D. student and lead author of a new paper about the system. “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”

Currently, the robot can only recognize objects by touch in a controlled environment, but the achievement is a profound one nonetheless as we continue to expand the abilities of artificial intelligence.
