Live demonstration of emotion recognition via a brain-computer interface
Jul 12, 2017
In the EMOIO project, funded by the German Federal Ministry of Education and Research (BMBF), scientists from Fraunhofer IAO are developing an emotionally responsive, neuro-adaptive brain-computer interface. The first live demonstration took place at the BMBF future congress “Bringing technology to people,” held in Bonn on June 26–27.
Emotions play a crucial role in our interaction with technology – but even smart systems are not yet able to react appropriately to human emotion. In the EMOIO project, Fraunhofer IAO is working with the University of Stuttgart’s Institute of Human Factors and Technology Management IAT to investigate how techniques from neuroscience might be applied to gather and interpret user emotion based on brain activity. The intention is to relay these emotions to computer systems via a brain-computer interface – meaning that their design and behavior could be adapted to the needs of individual users.
First live demonstration at the BMBF future congress
Together with their project partners, the University Hospital of Tübingen, Brain Products GmbH and NIRx Medizintechnik GmbH, Fraunhofer IAO and the University of Stuttgart’s IAT gave the first live demonstration of their brain-computer interface (BCI) at the BMBF future congress, held under the tagline “Bringing technology to people.” In the live demonstration, a test subject was presented with emotion-inducing content such as pictures of baby animals and scenes of war. While the subject looked at the pictures, their brain activity was monitored using electroencephalography (EEG) and near-infrared spectroscopy (NIRS) and analyzed by an algorithm in real time. The algorithm sifts the brain signals for patterns known to correlate with positive and negative emotions, allowing it to categorize the subject’s reaction to an image as positive or negative within a few seconds. The result can then be relayed to any computer system.
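The project's actual classification algorithm is not public, but the pipeline described above can be illustrated with a minimal sketch: each analysis window of brain activity is assumed to be reduced to a feature vector (for example, EEG band power per channel plus an fNIRS oxygenation change), and a simple nearest-centroid rule, calibrated on labeled windows, assigns each new window a positive or negative label. All names, features, and values here are illustrative assumptions, not the project's method.

```python
# Hypothetical sketch; the real EMOIO algorithm is not published.
from math import dist  # Euclidean distance (Python 3.8+)

def train_centroids(windows, labels):
    """Average the feature vectors of each class into one centroid."""
    sums, counts = {}, {}
    for vec, lab in zip(windows, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(window, centroids):
    """Label a new window with the class of the nearest centroid."""
    return min(centroids, key=lambda lab: dist(window, centroids[lab]))

# Toy calibration data: two assumed features per window
# (e.g. frontal alpha power, oxygenated-hemoglobin change).
train = [[0.9, 0.2], [0.8, 0.3], [0.1, 0.9], [0.2, 0.8]]
labels = ["positive", "positive", "negative", "negative"]
centroids = train_centroids(train, labels)

print(classify([0.85, 0.25], centroids))  # → positive
print(classify([0.15, 0.85], centroids))  # → negative
```

In a real system the feature extraction and classifier would be far more sophisticated, but the shape is the same: calibrate on labeled responses, then label each incoming window within seconds.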
Mobile app displays emoticon for how you feel
The emotional classification determined by the algorithm is not only relevant for computer systems, but may also interest users themselves. For this reason, Fraunhofer IAO has developed a mobile app that shows users their classified emotion in real time. The user’s current emotional state is represented by an emoticon, together with a dynamic chart showing how that state develops over time. In future versions, the app will let users comment on their emotional experiences and add contextual information such as the place, situation, or image.
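The app behavior described above can be sketched in a few lines: keep a rolling history of classifier results for the emotion-over-time chart, and map the latest result to an emoticon. This is a hypothetical illustration, not the actual app; the emoticon mapping and history length are assumptions.

```python
# Hypothetical sketch of the app's display logic (the real app is not public).
from collections import deque

EMOTICONS = {"positive": "🙂", "negative": "🙁"}  # assumed mapping

class EmotionDisplay:
    def __init__(self, history_len=60):
        # Bounded history backs the dynamic emotion-over-time chart.
        self.history = deque(maxlen=history_len)

    def update(self, label):
        """Record one classifier result and return the emoticon to show."""
        self.history.append(1 if label == "positive" else -1)
        return EMOTICONS[label]

display = EmotionDisplay()
print(display.update("positive"))  # 🙂
print(display.update("negative"))  # 🙁
print(list(display.history))       # [1, -1]
```

The bounded deque keeps only the most recent window of results, which is what a scrolling real-time chart needs.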