Facial expression generation of 3D avatar based on semantic analysis

Dinmukhamed Mukashev, Merey Kairgaliyev, Ulugbek Alibekov, Nurziya Oralbayeva, Anara Sandygulova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

3D avatars are widely used in various fields of emerging technology, from augmented reality to social robots. To interact with users in a natural way, they must be able to show at least some basic emotions. However, generating animation for these virtual avatars is a time-consuming and creative task. The main goal of this work is to facilitate the generation of facial animations of basic emotions on a 3D avatar. To this end, we developed and compared two approaches. The first method generates animation by tuning the Blendshape features of the 3D model, whereas the second captures expressions from a real face and maps them onto the model. Additionally, the text from which the emotion was estimated was passed to lip-synchronization software to generate realistic lip movements for the avatar. Animations of six basic emotions were then shown in different variations in a survey, and respondents were asked to guess the emotion shown in each video. In addition, anthropomorphic features of the avatar such as human-likeness, life-likeness and pleasantness were examined. Overall, the analysis of the survey yielded the following findings: a) participants showed no significant differences in recognizing emotions across the two animation generation methods; b) inclusion of voice significantly enhanced emotion recognition. With respect to participants' accuracy of emotion recognition, Excitement and Happiness were confused with each other more than any other pair of emotions, while Anger was the easiest emotion to recognize.
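The first approach described in the abstract, posing basic emotions by tuning Blendshape weights, can be illustrated with a minimal sketch. The blendshape names and per-emotion weight presets below are hypothetical placeholders (the paper's actual work was done on a 3D model in Unity); the sketch only shows the general idea of blending from a neutral face toward a target expression.

```python
# Minimal sketch of emotion posing via blendshape weights.
# All blendshape names and preset values are hypothetical; the paper's
# implementation tuned Blendshapes on a 3D avatar in Unity.

# Hypothetical per-emotion presets: blendshape name -> target weight in [0, 1].
EMOTION_PRESETS = {
    "happiness": {"mouthSmile": 0.9, "cheekRaise": 0.6, "browUp": 0.2},
    "anger":     {"browDown": 0.9, "jawClench": 0.5, "lipPress": 0.6},
    "sadness":   {"browInnerUp": 0.7, "mouthFrown": 0.8},
}

def blend_toward(current, emotion, t):
    """Linearly interpolate current blendshape weights toward an emotion preset.

    current: dict of blendshape name -> current weight
    emotion: key into EMOTION_PRESETS
    t:       interpolation factor in [0, 1] (e.g. animation progress)
    """
    target = EMOTION_PRESETS[emotion]
    names = set(current) | set(target)
    return {
        name: (1 - t) * current.get(name, 0.0) + t * target.get(name, 0.0)
        for name in names
    }

if __name__ == "__main__":
    neutral = {}  # neutral face: all weights at 0
    # Pose halfway toward "happiness", then fully.
    print(blend_toward(neutral, "happiness", 0.5))
    print(blend_toward(neutral, "happiness", 1.0))
```

In an engine such as Unity, weights produced this way would be written to the mesh's blendshapes each frame, so that the face animates from neutral to the target expression over time.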

Original language: English
Title of host publication: 2021 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 89-94
Number of pages: 6
ISBN (Electronic): 9781665404921
DOIs
Publication status: Published - Aug 8 2021
Event: 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021 - Virtual, Vancouver, Canada
Duration: Aug 8 2021 → Aug 12 2021

Publication series

Name: 2021 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021

Conference

Conference: 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021
Country/Territory: Canada
City: Virtual, Vancouver
Period: 8/8/21 → 8/12/21

Keywords

  • 3D avatar
  • Animations
  • Blendshapes
  • Emotions
  • Face
  • Unity

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Communication
  • Artificial Intelligence
