
Researchers say their study shows how emotions, which are complex forms of human information, could be used in next-generation wearable systems



New artificial intelligence (AI) technologies are appearing at a rapid pace, offering a glimpse into an exciting future. The latest is a wearable device that can recognise human emotions in real time.

The personalised skin-integrated facial interface (PSiFI), developed by researchers at the Ulsan National Institute of Science and Technology (UNIST), uses machine learning to accurately identify emotions even when a person is wearing a mask. It combines verbal and non-verbal cues using a self-powered, stretchable sensor that processes data for wireless communication, a Neuroscience News report explains.

The technology has been demonstrated in a virtual reality "digital concierge" scenario, highlighting its ability to customise user experiences in smart environments. According to the researchers, the development is a significant step towards improving human-machine interaction by integrating complex emotional data.

The PSiFI features a first-of-its-kind bidirectional triboelectric strain and vibration sensor, which enables simultaneous sensing and integration of verbal and non-verbal expression data. The sensor is self-powered, stretchable, and transparent, Neuroscience News elaborates. The research was published recently in the journal Nature Communications.

The system generates power through the separation of charges upon friction, the triboelectric effect, so it needs no external power source or intricate measurement devices. Furthermore, the team used a semi-curing technique to manufacture a transparent conductor for the friction-charging electrodes.

The research team integrated the identification of facial muscle deformation and vocal cord vibrations, enabling real-time emotion recognition, the Neuroscience News report explains. 
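At a conceptual level, combining facial muscle deformation and vocal cord vibration signals amounts to fusing two feature streams and classifying the result. The sketch below is purely illustrative: the feature layout, synthetic data, and nearest-centroid model are assumptions for exposition, not the UNIST team's actual pipeline. It does, however, show how a classifier can work "with just a few learning steps", as the researchers describe.

```python
import math
import random

random.seed(0)

EMOTIONS = ["happy", "sad", "angry"]

def make_sample(emotion):
    # Hypothetical feature vector: 4 facial-strain channels plus
    # 4 vocal-vibration channels, drawn around an emotion-specific
    # centre (synthetic stand-in data, not real sensor readings).
    base = EMOTIONS.index(emotion)
    strain = [base + random.gauss(0, 0.1) for _ in range(4)]
    vibration = [base * 2 + random.gauss(0, 0.1) for _ in range(4)]
    return strain + vibration  # early fusion: one combined vector

def train_centroids(samples_per_class=5):
    # "A few learning steps": average a handful of labelled samples
    # into a single centroid per emotion class.
    centroids = {}
    for emo in EMOTIONS:
        samples = [make_sample(emo) for _ in range(samples_per_class)]
        centroids[emo] = [sum(col) / len(col) for col in zip(*samples)]
    return centroids

def classify(features, centroids):
    # Nearest-centroid decision over the fused feature vector.
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda emo: dist(features, centroids[emo]))

centroids = train_centroids()
print(classify(make_sample("sad"), centroids))  # prints "sad"
```

Fusing the two modalities before classification, rather than voting on each separately, mirrors the article's point that verbal and non-verbal data are sensed and integrated simultaneously.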

“With this developed system, it is possible to implement real-time emotion recognition with just a few learning steps and without complex measurement equipment. This opens up possibilities for portable emotion recognition devices and next-generation emotion-based digital platform services in the future,” study author Jin Pyo Lee said in a press statement.

The system showed high emotion-recognition accuracy with minimal training. Notably, its wireless and customisable design makes the device convenient to wear. According to the researchers, the study shows how emotions, which are complex forms of human information, can be used in next-generation wearable systems.

However, as AI advances rapidly towards richer human-machine interaction, privacy concerns and other challenges, such as bias in the system, remain.
