Designing an Expressive Avatar of a Real Person

Lee, S., Carlson, G., Jones, S., Johnson, A., Leigh, J., Renambot, L.

  • Caption: Skin Texture Enhancement. Top: default texture; bottom: texture acquired by projecting high-quality photographs of the target person onto the 3D mesh model (a rough sketch of this projection step follows below).
  • Credit: S. Lee, EVL
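
The texture-acquisition step described in the caption amounts to projective texture mapping: each mesh vertex is projected into a calibrated photograph and takes its color from the pixel it lands on. The sketch below illustrates that idea with NumPy; the camera intrinsics, pose, mesh, and photograph are hypothetical placeholders and do not reflect LRAF's actual pipeline.

```python
# Minimal sketch of projecting a photo onto a 3D mesh (assumed setup, not LRAF's code).
import numpy as np

def project_vertices(vertices, K, R, t):
    """Project (N, 3) world-space mesh vertices into the photo's pixel coordinates.

    K: (3, 3) camera intrinsics; R, t: camera rotation and translation.
    """
    cam = R @ vertices.T + t.reshape(3, 1)   # world -> camera space, shape (3, N)
    uv = K @ cam                             # camera -> image plane
    uv = uv[:2] / uv[2]                      # perspective divide
    return uv.T                              # (N, 2) pixel coordinates

def sample_photo_colors(photo, uv):
    """Assign each vertex the photo color at its projected pixel (nearest-neighbor)."""
    h, w, _ = photo.shape
    px = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    py = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return photo[py, px]                     # (N, 3) per-vertex colors

if __name__ == "__main__":
    # Synthetic example: a random "photo" and two vertices in front of a camera at the origin.
    photo = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    vertices = np.array([[0.0, 0.0, 2.0], [0.1, -0.05, 2.1]])
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    uv = project_vertices(vertices, K, np.eye(3), np.zeros(3))
    print(sample_photo_colors(photo, uv))
```

In a full pipeline, photographs from several viewpoints would typically be visibility-tested and blended into a single texture atlas; the projection above is only the core of the enhancement shown in the figure.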

The human ability to express and recognize emotions plays an important role in face-to-face communication, and as technology advances it will be increasingly important for computer-generated avatars to be similarly expressive. In this paper, we present the detailed development process for the Lifelike Responsive Avatar Framework (LRAF) and a prototype application for modeling a specific individual to analyze the effectiveness of expressive avatars. In particular, the goals of our pilot study (n = 1,744) are to determine whether the specific avatar being developed is capable of conveying emotional states (Ekman's six classic emotions) via facial features and whether a realistic avatar is an appropriate vehicle for conveying the emotional states accompanying spoken information. The results of this study show that happiness and sadness are correctly identified with a high degree of accuracy, while the other four emotional states show mixed results.
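
One common way to convey emotional states via facial features on an avatar is to map each of Ekman's six emotions to a set of facial blendshape (morph-target) weights. The abstract does not specify LRAF's facial rig, so the sketch below is purely illustrative: the blendshape names and weight values are assumptions, not LRAF's actual parameters.

```python
# Hypothetical emotion-to-blendshape mapping for an expressive avatar (illustrative only).
from typing import Dict

EMOTION_BLENDSHAPES: Dict[str, Dict[str, float]] = {
    "happiness": {"mouth_smile": 0.9, "cheek_raise": 0.6},
    "sadness":   {"brow_inner_up": 0.8, "mouth_frown": 0.7},
    "anger":     {"brow_lower": 0.9, "lip_press": 0.6},
    "fear":      {"brow_inner_up": 0.7, "eye_widen": 0.8, "jaw_open": 0.3},
    "surprise":  {"brow_raise": 0.9, "eye_widen": 0.9, "jaw_open": 0.6},
    "disgust":   {"nose_wrinkle": 0.8, "upper_lip_raise": 0.6},
}

def blend_expression(emotion: str, intensity: float) -> Dict[str, float]:
    """Scale an emotion's blendshape weights by an intensity clamped to [0, 1]."""
    weights = EMOTION_BLENDSHAPES.get(emotion, {})
    clamped = max(0.0, min(1.0, intensity))
    return {name: w * clamped for name, w in weights.items()}

if __name__ == "__main__":
    # e.g. a half-intensity smile to drive the avatar's face rig
    print(blend_expression("happiness", 0.5))
```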