Meta Is Cracking The Robotics Space, One Touch At A Time
Meta is advancing the frontiers of AI in robotics and is exploring touch sensitivity further.
"We are fascinated by robots because they reflect ourselves." This quote from robotics researcher Ken Goldberg rings truer than ever. Robots are becoming more humanlike by the day, and Meta (formerly Facebook) is accelerating that trend. The company is pushing the boundaries of artificial intelligence (AI) in robotics and exploring new territory in touch sensing. Meta recently announced a new sensor to grow its tactile sensing ecosystem, a field of robotics aimed at understanding and reproducing human touch in the physical world. Touch is an essential part of perceiving your environment, and these advances will allow robots to interact more effectively with the world around them, giving them more capabilities. The tactile sensing ecosystem rests on several pillars: hardware, simulators, libraries, benchmarks, and datasets, all of which are needed to create artificial intelligence systems that can interact through touch.
After decades of research, scientists have yet to crack generalized manipulation; it remains an unsolved problem in robotics. As a step toward improving robot manipulation, Meta introduced DIGIT, a compact, high-resolution tactile sensor designed for in-hand manipulation. Early last year, Meta (then Facebook) released a fully open-source version of DIGIT. Compared with other commercially available tactile sensors, DIGIT is claimed to be cheaper to manufacture while providing hundreds of thousands more contact points, making it more useful and accessible to researchers. Meta AI has now announced that it is partnering with GelSight, an MIT spin-off, for the commercial production of DIGIT, helping researchers conduct and accelerate tactile sensing research.
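DIGIT is a vision-based sensor: a small camera images a gel pad that deforms on contact, so its raw output is an ordinary image stream. Below is a minimal sketch of reading and visualizing that stream, under the assumption that the sensor enumerates as a standard UVC webcam; the device index and resolution here are placeholders for illustration, not values from Meta's documentation.

```python
import cv2  # OpenCV for camera capture

# Assumption: the sensor shows up as a regular UVC video device;
# index 0 and 320x240 are illustrative placeholders.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)

ok, baseline = cap.read()  # reference image of the undeformed gel
if not ok:
    raise RuntimeError("could not read from tactile sensor")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Contact shows up as a deviation from the no-contact baseline;
    # a simple absolute difference highlights the touched region.
    contact = cv2.absdiff(frame, baseline)
    cv2.imshow("tactile frame", contact)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```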
Meta's open-source TACTO is a simulator for high-resolution vision-based tactile sensors, giving machine learning researchers a fast experimental platform that does not require physical hardware. Simulators like TACTO play an important role in prototyping, debugging, and testing new advances in robotics before running real-world experiments. TACTO can render high-resolution touch readings at hundreds of frames per second and lets researchers simulate vision-based tactile sensors in a variety of form factors that can be mounted on robots.
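To make the idea concrete, here is a toy illustration of what a vision-based tactile simulator computes: it turns the local geometry pressed into the gel (a height map) into a shaded tactile image. This is a deliberately simplified sketch of the concept, not TACTO's actual API.

```python
import numpy as np

def render_tactile(height_map, light_dir=(0.5, 0.5, 1.0)):
    """Shade a contact height map into a grayscale tactile image.

    height_map: 2D array of gel indentation depths (arbitrary units).
    Vision-based sensors effectively photograph this deformation;
    here we approximate that with simple Lambertian shading of the
    deformed surface normals.
    """
    gy, gx = np.gradient(height_map.astype(float))
    # Surface normals estimated from the height-map gradients.
    normals = np.dstack([-gx, -gy, np.ones_like(height_map, dtype=float)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    shading = np.clip(normals @ light, 0.0, 1.0)
    return (shading * 255).astype(np.uint8)

# Example: a spherical indenter pressed into the center of the gel.
yy, xx = np.mgrid[-1:1:160j, -1:1:120j]
indent = np.clip(0.3 - (xx**2 + yy**2), 0.0, None)
image = render_tactile(indent)
print(image.shape, image.dtype)  # (160, 120) uint8 tactile frame
```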
Significantly, Meta AI teamed up with Carnegie Mellon researchers to create ReSkin, a synthetic skin for robotic arms. This open-source touch skin is 2-3 mm thick and made of a deformable elastomer. Its slim form factor can help robots learn high-frequency tactile sensing over large surfaces. This generalized tactile skin helps gather data and train AI models to perform touch-based tasks. ReSkin provides tactile sensing for in-hand manipulation that can be used to develop AI that trains robots on tasks such as using a key to open a door or grasping a fragile object like a strawberry. It can also be useful for measuring forces while a person interacts with an object.
How does it work?
ReSkin is a deformable elastomer with embedded magnetic particles; in other words, it contains micromagnetic particles that generate a magnetic field. When the skin deforms on contact, the particles shift, and the sensor registers the resulting change in magnetic flux. Data collected this way is fed into an AI system to infer the forces behind a touch or interaction. ReSkin can also provide researchers with high-frequency three-axis tactile signals for fast manipulation tasks such as throwing, catching, clapping, and slipping. When the skin wears out, it can be peeled off and replaced.
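The mapping from magnetic-flux readings to contact force is learned from data rather than derived by hand. The sketch below illustrates that idea with synthetic data and a plain least-squares model; the sensor count, signal model, and dimensions are invented for illustration and do not come from the ReSkin paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for illustration: 5 magnetometers, each reporting
# 3-axis flux, so one skin reading is a 15-dimensional vector.
N_SENSORS, N_SAMPLES = 5, 2000
n_features = 3 * N_SENSORS

# Synthetic ground-truth contact forces (Fx, Fy, Fz) in newtons.
forces = rng.uniform(-2.0, 2.0, size=(N_SAMPLES, 3))

# Invented linear signal model plus noise: deforming the elastomer
# moves the magnetic particles, which perturbs the measured flux.
mixing = rng.normal(size=(3, n_features))
flux = forces @ mixing + 0.05 * rng.normal(size=(N_SAMPLES, n_features))

# Learn the inverse mapping flux -> force with least squares.
weights, *_ = np.linalg.lstsq(flux, forces, rcond=None)

# Evaluate on a held-out synthetic reading.
test_force = np.array([[0.5, -1.0, 1.5]])
test_flux = test_force @ mixing
print("true force:", test_force[0])
print("predicted:", (test_flux @ weights)[0])
```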
Meta showed that ReSkin can be produced for less than $6 per unit in quantities of 100. Each unit has a lifetime of 50,000 interactions, a high temporal resolution of up to 400 Hz, and a spatial resolution of 1 millimeter with 90 percent accuracy. These characteristics make ReSkin well suited for sleeves, tactile gloves, and robotic arms.
Meta AI researchers say improved tactile perception could help advance AI and produce robots with more sophisticated capabilities. It also opens up opportunities in AR and VR and could drive innovation in industrial, agricultural, and medical robotics. The team is working toward a future in which every robot has a sense of touch.