MIT Researchers Build AI System That Can Visualise Objects Using Touch
7:11:15   2019-06-19

A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a predictive artificial intelligence (AI) system that can learn to see by touching and to feel by seeing.

While our sense of touch lets us feel the physical world, our eyes help us understand the full picture of those tactile signals.

Robots that have been programmed to see or to feel, however, cannot use these signals nearly as interchangeably.

The new AI-based system can create realistic tactile signals from visual inputs and, working in the other direction, predict directly from tactile inputs which object is being touched and where.
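
As a rough illustration of what "predicting touch from vision" can mean in code, the sketch below frames it as image-to-image translation: a small encoder-decoder takes a camera frame of the contact region and outputs a GelSight-style tactile image. The VisionToTouch class, the layer sizes, and the 256x256 resolution are illustrative assumptions, not the researchers' actual architecture.

```python
# Illustrative sketch only: a tiny image-to-image model standing in for the
# vision-to-touch direction described in the article. The class name, layer
# sizes, and plain encoder-decoder design are assumptions for clarity.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Maps an RGB image patch of the contact region to a predicted
    GelSight-style tactile image (both treated as 3-channel images)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=4, stride=2, padding=1),   # 256 -> 128
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),  # 128 -> 64
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # 64 -> 128
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),   # 128 -> 256
            nn.Tanh(),  # tactile image scaled to [-1, 1]
        )

    def forward(self, rgb):
        return self.decoder(self.encoder(rgb))

# Example: predict a tactile image from a single 256x256 visual frame.
model = VisionToTouch()
visual_frame = torch.randn(1, 3, 256, 256)   # placeholder camera frame
predicted_touch = model(visual_frame)        # shape: (1, 3, 256, 256)
```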

In the future, this could lead to a more harmonious relationship between vision and touch in robotics, improving object recognition, grasping, and scene understanding, and supporting seamless human-robot integration in assistive or manufacturing settings.

"By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge", said Yunzhu Li, PhD student and lead author from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL).

 

"By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings," Li added.

 

The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.

 

Using a simple web camera, the team recorded nearly 200 objects, such as tools, household products, and fabrics, being touched more than 12,000 times.

Breaking those 12,000 video clips down into static frames, the team compiled "VisGel," a dataset of more than three million visual/tactile-paired images.
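
To make the idea of visual/tactile-paired images concrete, the sketch below shows one plausible way such pairs could be organised on disk and collected into a list. The directory layout, file names, and the collect_pairs helper are hypothetical and are not taken from the actual VisGel release.

```python
# Illustrative sketch of how visual/tactile frame pairs might be organised.
# The directory layout, file names, and TouchSample/collect_pairs names are
# assumptions; the real VisGel dataset's format may differ.
from dataclasses import dataclass
from pathlib import Path

@dataclass
class TouchSample:
    visual_frame: Path   # webcam frame at the moment of contact
    tactile_frame: Path  # GelSight image recorded at the same timestamp

def collect_pairs(root: str) -> list[TouchSample]:
    """Pair each visual frame with the tactile frame sharing its index,
    e.g. root/<object>/visual/0001.png and root/<object>/tactile/0001.png."""
    pairs = []
    for obj_dir in sorted(Path(root).iterdir()):
        visual_dir, tactile_dir = obj_dir / "visual", obj_dir / "tactile"
        if not (visual_dir.is_dir() and tactile_dir.is_dir()):
            continue
        for visual in sorted(visual_dir.glob("*.png")):
            tactile = tactile_dir / visual.name
            if tactile.exists():
                pairs.append(TouchSample(visual, tactile))
    return pairs

# Example: pairs = collect_pairs("visgel_raw"); for the full dataset described
# in the article, the number of pairs would be in the millions.
```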

 

"Bringing these two senses (vision and touch) together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects," said Li.

 

The current dataset only has examples of interactions in a controlled environment.

 

The team hopes to improve on this by collecting data in less structured settings, or by using a new MIT-designed tactile glove, to increase the size and diversity of the dataset.

"This is the first method that can convincingly translate between visual and touch signals", said Andrew Owens, a post-doc at the University of California at Berkeley.

 

The team is set to present the findings next week at the Conference on Computer Vision and Pattern Recognition in Long Beach, California.
