The multiple sensors were designed and placed on the device in such a way that they would not interfere with each other's measurements. For example, the researchers put a rigid layer under the temperature and humidity sensors to protect them from the stretching that the facial expression sensors would experience. They also used a waterproof layer to protect the temperature and strain sensors from humidity.
“We've engineered this device to measure these different signals independently, without them interfering with each other, providing a much clearer and more accurate picture of what's happening beneath the surface,” said Libo Gao, co-corresponding author on the paper and associate professor at Xiamen University.
Next, the team trained an artificial intelligence (AI) model to read and interpret signs of both performed and genuine human emotion. The researchers recruited eight people, a common sample size for pilot studies, to perform six common facial expressions: happiness, surprise, fear, sadness, anger and disgust.
The participants displayed each emotional expression 100 times while the device tracked their movement. The researchers then fed this data to an AI model, training it to correlate specific facial movements with different emotions. They then recruited an additional three participants to further evaluate the model's abilities. It classified performed facial expressions with 96.28% accuracy.
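The article does not describe the model's architecture or features, so the following is only a rough, hypothetical sketch of a pipeline with the same shape: sensor readings labeled with one of six expressions, a model fit on data from eight participants, and evaluation on trials from additional held-out participants. Synthetic data and a simple nearest-centroid classifier stand in for the team's actual sensors and model; all names and numbers other than the participant counts, repetitions and emotion labels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["happiness", "surprise", "fear", "sadness", "anger", "disgust"]
N_FEATURES = 16  # hypothetical: one value per sensor channel

def synthetic_trials(n_participants, n_reps):
    """Fake sensor readings: each emotion gets a distinct mean activation pattern."""
    X, y = [], []
    for _ in range(n_participants):
        for label in range(len(EMOTIONS)):
            center = np.zeros(N_FEATURES)
            center[label] = 4.0  # well-separated class means (synthetic only)
            X.append(rng.normal(center, 1.0, size=(n_reps, N_FEATURES)))
            y.append(np.full(n_reps, label))
    return np.vstack(X), np.concatenate(y)

# 8 participants x 100 repetitions per expression for training,
# 3 additional participants held out for evaluation, as in the study.
X_train, y_train = synthetic_trials(8, 100)
X_test, y_test = synthetic_trials(3, 100)

# Nearest-centroid classifier: assign each trial to the closest class mean.
centroids = np.stack([X_train[y_train == k].mean(axis=0) for k in range(len(EMOTIONS))])
pred = np.argmin(np.linalg.norm(X_test[:, None, :] - centroids, axis=2), axis=1)
accuracy = (pred == y_test).mean()
print(f"held-out accuracy: {accuracy:.2%}")
```

On this artificially separable data the toy classifier scores well above chance (chance is roughly 16.7% for six classes); the study's reported 96.28% figure comes from the researchers' own model and real sensor data, not from anything like this sketch.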
When it came to tracking real emotions, the researchers tested how well the device tracked the physiological responses of the same participants as they watched video clips designed to elicit emotions. The device correctly identified emotions with 88.83% accuracy, with the sensors confirming that the physiological responses were consistent with known links between emotions and bodily reactions, such as increases in skin temperature and heart rate during surprise and anger.
Cheng noted that the ability to wirelessly transmit the data means that health care professionals could potentially monitor individuals remotely and provide timely emotional support through telemedicine.
“This sensor can serve a vital function in bridging gaps in access to care,” he said. “Given the rising stress levels in modern society, the ability to monitor emotions can provide early indicators of debilitating conditions and allow for proactive support.”
He explained that the device also opens the door to other AI-powered systems for disease diagnostics and therapeutics beyond emotion recognition. He noted there may be potential applications for clinicians to better understand the mental and emotional state of non-verbal patients, better identify behavioral and psychological symptoms of dementia and recognize opioid overdose. The technology could one day even be used for chronic wound monitoring and disease management, he added, as well as to track neurodegenerative disease progression and athletic performance.
“While still in the research and development phase, this device is a significant step forward in our ability to monitor and understand human emotions, potentially paving the way for more proactive and personalized approaches to mental health care,” Cheng said.
Other contributors include Hongcheng Xu of Xi’an Jiaotong University. The U.S. National Institutes of Health and the U.S. National Science Foundation funded the Penn State researchers’ contributions to this work.