Encoding of multi-modal emotional information via personalized skin-integrated wireless facial interface

Jin Pyo Lee, Hanhyeok Jang, Yeonwoo Jang, Hyeonseo Song, Suwoo Lee, Pooi See Lee*, Jiyun Kim*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)

Abstract

Human affects, such as emotions, moods, and feelings, are increasingly considered key parameters for enhancing the interaction of humans with diverse machines and systems. However, their intrinsically abstract and ambiguous nature makes it challenging to accurately extract and exploit emotional information. Here, we develop a multi-modal human emotion recognition system that efficiently utilizes comprehensive emotional information by combining verbal and non-verbal expression data. The system is built on a personalized skin-integrated facial interface (PSiFI) that is self-powered, facile, stretchable, and transparent, featuring a first bidirectional triboelectric strain and vibration sensor that enables verbal and non-verbal expression data to be sensed and combined for the first time. It is fully integrated with a data-processing circuit for wireless data transfer, allowing real-time emotion recognition. With the help of machine learning, various human emotion recognition tasks are performed accurately in real time, even while the user wears a mask, and a digital concierge application in a virtual-reality (VR) environment is demonstrated.
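The abstract describes fusing two sensing modalities (non-verbal facial strain and verbal vocal vibration) into a single feature representation that a machine-learning model classifies into emotions. The sketch below illustrates that general feature-fusion idea only; the feature dimensions, emotion labels, synthetic data, and the nearest-centroid classifier are all illustrative assumptions, not the authors' actual pipeline or data.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "angry"]  # assumed label set for illustration
N_STRAIN, N_VIB = 4, 3                # assumed feature sizes per modality

def make_samples(n_per_class):
    """Synthetic fused (strain ++ vibration) feature vectors, one cluster per emotion."""
    X, y = [], []
    for label in range(len(EMOTIONS)):
        center = rng.normal(label * 3.0, 0.2, size=N_STRAIN + N_VIB)
        X.append(center + rng.normal(0, 0.3, size=(n_per_class, N_STRAIN + N_VIB)))
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

def fit_centroids(X, y):
    """Per-class mean of the fused multimodal features."""
    return np.stack([X[y == k].mean(axis=0) for k in range(len(EMOTIONS))])

def predict(centroids, X):
    """Assign each sample to the emotion with the nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

X_train, y_train = make_samples(50)
centroids = fit_centroids(X_train, y_train)
X_test, y_test = make_samples(20)
acc = float((predict(centroids, X_test) == y_test).mean())
print(f"accuracy on synthetic data: {acc:.2f}")
```

The point of the sketch is the concatenation step: strain and vibration features occupy one joint vector, so the classifier can exploit correlations between facial motion and speech that neither modality shows alone.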

Original language: English
Article number: 530
Journal: Nature Communications
Volume: 15
Issue number: 1
DOIs
Publication status: Published - Dec 2024
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2024, The Author(s).

ASJC Scopus Subject Areas

  • General Chemistry
  • General Biochemistry, Genetics and Molecular Biology
  • General Physics and Astronomy
