
Theatre of the Future | Listening Together | 2017

CreativeComputingGroup

Updated: Nov 16, 2019

Humans tend to reference other people's feelings and adjust their own accordingly. This project explores the possibility of using a virtual character to help an audience understand music better. The facial expressions of a person who understands the music well are captured with motion capture and projected in the concert hall, so that during the concert the audience and the character listen to the music together.
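At its core, the system retargets the captured listener's expression onto the virtual character in real time (see the publications below). As a rough, hypothetical sketch only, since the project's actual capture pipeline and character rig are not published in this post, and every name here (read_capture_frame, apply_to_character, the blendshape list) is an assumption for illustration, the retargeting loop could look something like this in Python:

```python
import random
import time

# Hypothetical sketch of a real-time facial-expression retargeting loop.
# Data formats and function names are assumptions for illustration.

BLENDSHAPES = ["browRaise", "smile", "jawOpen", "eyeBlink"]

def read_capture_frame():
    """Stand-in for one frame of motion-capture data: blendshape weights in [0, 1]."""
    return {name: random.random() for name in BLENDSHAPES}

def retarget(weights, smoothed, alpha=0.3):
    """Exponentially smooth the captured weights before driving the character,
    so small capture jitter does not make the projected face twitch."""
    for name, value in weights.items():
        smoothed[name] = alpha * value + (1.0 - alpha) * smoothed.get(name, value)
    return smoothed

def apply_to_character(smoothed):
    """Stand-in for sending weights to the rendered character's blendshape rig."""
    print(" ".join(f"{name}:{weight:.2f}" for name, weight in smoothed.items()))

def run(frames=5, fps=30):
    smoothed = {}
    for _ in range(frames):
        weights = read_capture_frame()          # capture the listener's expression
        smoothed = retarget(weights, smoothed)  # map it onto the character
        apply_to_character(smoothed)            # character mirrors the expression live
        time.sleep(1.0 / fps)

if __name__ == "__main__":
    run()
```

The point of the loop is simply that the character's face updates at the music's pace: each capture frame is smoothed and forwarded to the rig before the next frame arrives, so the projected character appears to react to the music along with the audience.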




Publications

1. Saebom Kwon, Hyang Sook Kim, and Jusub Kim. (2017). Guided Music Listening: Can a Virtual Character Help Us Appreciate Music Better? HCI Korea 2017, Feb. 2017.

2. Saebom Kwon, Jusub Kim. (2019). Enhancing Music Listening Experience Based on Emotional Contagion and Real-time Facial Expression Retargeting. Journal of Digital Contents Society, 20(6), 1117-1124.


    Creative Computing Group | Dept. of Art & Technology | School of Media, Arts, and Science

    Sogang University | 35 Baekbeom-ro, Seoul, South Korea
