Using Facial Electromyogram to Control Emotions

Published on March 25, 2022

Imagine a magical puppeteer who can control your emotions just by looking at your face! Well, we’re not quite there yet, but researchers are making progress in tracking and regulating our emotional states. In this study, scientists propose using facial electromyogram measurements as a way to infer our internal emotional state and adjust stimuli accordingly. It’s like using the expression on your face as a window into your feelings. By modeling and tracking the dynamics of emotional valence (how pleasant or unpleasant we feel), they develop a system that can estimate our hidden emotional state in real time. They even use a fuzzy logic controller to make automatic adjustments that steer us toward desired emotional levels. The results show that the controller can effectively regulate the estimated emotional state using only physiological data. So, imagine a future where wearable devices could help us manage overwhelming emotions and maintain our well-being. If you’re fascinated by the idea of controlling emotions like a puppeteer, check out the full article for more on this groundbreaking research!

Affective studies provide essential insights to address emotion recognition and tracking. In traditional open-loop structures, a lack of knowledge about the internal emotional state makes the system incapable of adjusting stimulus parameters and automatically responding to changes in the brain. To address this issue, we propose to use facial electromyogram measurements as biomarkers to infer the internal hidden brain state and close the loop with this feedback. In this research, we develop a systematic way to track and control emotional valence, which codes emotions as being pleasant or unpleasant. To this end, we conduct a simulation study by modeling and tracking the subject’s emotional valence dynamics using state-space approaches. We employ Bayesian filtering to estimate the person-specific model parameters along with the hidden valence state, using continuous and binary features extracted from experimental electromyogram measurements. Moreover, we utilize a mixed-filter estimator to infer the hidden brain state in a real-time simulation environment. We close the loop with a fuzzy logic controller that supports two categories of regulation: inhibition and excitation. By designing a control action, we aim to automatically reflect any required adjustments within the simulation and reach the desired emotional state levels. The final results demonstrate that, by making use of physiological data, the proposed controller can effectively regulate the estimated valence state. Ultimately, we envision future outcomes of this research supporting alternative forms of self-therapy through wearable machine interface architectures capable of mitigating periods of pervasive emotions and maintaining daily well-being.
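To make the closed loop concrete, below is a minimal Python sketch of the kind of pipeline the abstract describes: a random-walk valence state observed through one continuous (Gaussian) and one binary (Bernoulli) EMG feature, a mixed-observation Bayesian filter that estimates the hidden state from both, and a simple Sugeno-style fuzzy rule that maps the valence error to an inhibition or excitation action. Everything here (the parameter values, the feature definitions, and the membership functions) is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch of the closed loop described above. A hidden valence state
# follows a random walk; it is observed through one continuous EMG feature
# (Gaussian) and one binary EMG feature (Bernoulli), and a fuzzy rule closes
# the loop. All constants below are hypothetical, for illustration only.
import numpy as np

# Hypothetical person-specific parameters (fit offline in the real study).
Q = 0.005                 # process-noise variance of the valence random walk
A, B, R = 0.0, 1.2, 0.1   # continuous feature: z_k = A + B*x_k + N(0, R)
C, D = -1.0, 2.0          # binary feature: P(n_k = 1) = sigmoid(C + D*x_k)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def mixed_filter_step(x_post, v_post, z_k, n_k, newton_iters=10):
    """One predict/update step of a mixed continuous-binary Bayesian filter:
    a Gaussian and a Bernoulli observation share one hidden state."""
    # Predict under the random-walk state equation.
    x_pred, v_pred = x_post, v_post + Q
    # Newton iterations for the posterior mode, combining both likelihoods.
    x = x_pred
    for _ in range(newton_iters):
        p = sigmoid(C + D * x)
        grad = (x_pred - x) / v_pred + B * (z_k - A - B * x) / R + D * (n_k - p)
        hess = -1.0 / v_pred - B**2 / R - D**2 * p * (1.0 - p)
        x -= grad / hess
    p = sigmoid(C + D * x)
    v = 1.0 / (1.0 / v_pred + B**2 / R + D**2 * p * (1.0 - p))
    return x, v

def fuzzy_control(error):
    """Sugeno-style fuzzy rule: triangular memberships on the valence error,
    singleton outputs for the inhibit (-1), hold (0), and excite (+1) actions."""
    neg = np.clip(-error, 0.0, 1.0)            # "error is negative" membership
    pos = np.clip(error, 0.0, 1.0)             # "error is positive" membership
    zero = np.clip(1.0 - abs(error), 0.0, 1.0) # "error is near zero" membership
    return (neg * -1.0 + zero * 0.0 + pos * 1.0) / (neg + zero + pos)

# Closed-loop simulation: drive a synthetic subject toward a target valence.
rng = np.random.default_rng(0)
x_true, x_est, v_est, target = -0.5, 0.0, 1.0, 0.6
for k in range(200):
    u = 0.05 * fuzzy_control(target - x_est)            # stimulus adjustment
    x_true += u + rng.normal(0.0, np.sqrt(Q))           # additive stimulus effect (assumed)
    z_k = A + B * x_true + rng.normal(0.0, np.sqrt(R))  # continuous EMG feature
    n_k = rng.random() < sigmoid(C + D * x_true)        # binary EMG feature
    x_est, v_est = mixed_filter_step(x_est, v_est, z_k, n_k)
print(f"target={target:.2f}  estimated valence={x_est:.2f}")
```

In the actual study, the person-specific parameters are estimated with Bayesian methods from experimental EMG recordings rather than fixed by hand, and the fuzzy controller is richer than a single three-rule table; the sketch only shows how a mixed filter and a fuzzy control law fit together in one loop.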

Read Full Article (External Site)
