Facebook Metaverse: From emoji to 3D avatars and “black mirror” functions
Meta CEO Mark Zuckerberg announced that a new touch sensor and a deformable plastic material can work together to create a skin-like surface, which could make for a more immersive experience in the metaverse.
In collaboration with scientists at Carnegie Mellon, Meta's artificial intelligence researchers created a deformable plastic "skin" less than 3 millimeters thick. ReSkin, as the technology is called, can detect forces as small as 0.1 newtons from objects less than 1 millimeter in size.
The skin was tested on robots trained to handle soft fruits, such as grapes and blueberries. It was also placed inside a rubber glove while a human hand formed a bao bun.
The AI system had to be trained on 100 human touches to ensure that it had enough data to learn how changes in the magnetic field correspond to touch.
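The mapping from magnetic-field readings to contact force is learned from those example touches rather than hand-calibrated. A minimal sketch of that idea, assuming a plain least-squares fit from 3-axis magnetometer deltas to a reference force measurement (the actual ReSkin pipeline and feature set are not described in this article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: each "touch" yields a 3-axis
# magnetometer delta (the field shifts as the skin deforms)
# paired with a ground-truth force from a reference sensor.
n_touches = 100  # mirrors the ~100 human touches mentioned above
true_weights = np.array([0.8, -0.3, 0.5])  # made-up sensor response
field_deltas = rng.normal(size=(n_touches, 3))
forces = field_deltas @ true_weights + rng.normal(scale=0.01, size=n_touches)

# Learn the field-to-force mapping with ordinary least squares.
weights, *_ = np.linalg.lstsq(field_deltas, forces, rcond=None)

# Predict the force for a fresh reading; values near 0.1 N would
# sit around the detection threshold quoted for ReSkin.
new_reading = np.array([0.1, -0.05, 0.08])
predicted_force = new_reading @ weights
```

With enough touches, the recovered weights converge on the sensor's true response, which is why the training set size matters: too few contacts and the field-to-force mapping stays noisy.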
According to Zuckerberg, it is a high-resolution touch sensor, built together with Carnegie Mellon, that resulted in a thin robotic skin. The creation brings the company one step closer to realistic virtual objects and physical interactions in the metaverse.
The presentation also covered realistic Codec Avatars combined with a digital environment that supports them, which, according to the company, was rendered in real time and reacted to objects in the real world.
During the presentation, Meta showed its ongoing work on Codec Avatars, which gives users finer control over their avatar's eyes, facial expressions, hairstyle, and appearance.
The company also demonstrated simulations of how an avatar's hair and skin react to different lighting conditions and environments, as well as its work on interactive clothing.
As for the interface, Meta relies on something called electromyography, or EMG, to turn the signals the brain sends to the hand into computer commands; the company unveiled a wristband based on the technique earlier this year.
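In outline, an EMG interface rectifies and smooths the raw muscle signal, then maps its amplitude envelope to a discrete command. A toy sketch of that step, with made-up thresholds, command names, and signals (Meta has not published its wristband's processing pipeline):

```python
import numpy as np

def emg_to_command(samples: np.ndarray, threshold: float = 0.5) -> str:
    """Map a window of raw EMG samples to a command.

    Rectify (absolute value), smooth with a moving average to get an
    amplitude envelope, then threshold it. The threshold and command
    names are illustrative, not Meta's actual pipeline.
    """
    rectified = np.abs(samples)
    envelope = np.convolve(rectified, np.ones(8) / 8, mode="valid")
    return "click" if envelope.max() > threshold else "idle"

# A quiet window vs. a window containing a burst of muscle activity.
rest = 0.05 * np.sin(np.linspace(0, 20, 200))
burst = rest.copy()
burst[80:120] += 1.2  # simulated muscle contraction

print(emg_to_command(rest))   # low envelope, no command
print(emg_to_command(burst))  # contraction pushes envelope past threshold
```

Real systems classify far richer patterns than a single threshold, distinguishing individual finger movements, but the rectify-smooth-decode structure is the common starting point.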
Finally, the company showed a real-time rendering of the environment that could eventually serve as a place where users' avatars interact with one another. The system also lets people interact with real-life objects, with the changes reflected in the virtual world.
While realistic virtual environments are nothing new, making changes to objects in the real world and seeing those changes appear in the virtual one is a genuinely new experience for users.