
3D Auditory Stage

A research project on networked performance based on 3D spatial sound

by Danqing Shi, Ke Fang, Muxi Gao, Xin Wen, Haozhe Liu (Tsinghua University)

The research uses three-dimensional sound to create a virtual stage on which online audiences and performers are placed at specific coordinates. Each audience member can move around virtually in the soundscape, interact with the scene, and experience the development of the story.
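To make the coordinate-based soundscape concrete, here is a minimal sketch of how a source at one stage position might be rendered relative to a listener at another. This is an illustrative assumption, not the project's actual audio engine: it combines inverse-distance attenuation with constant-power stereo panning, and all names (`spatialize`, `ref_dist`) are hypothetical.

```python
import math

def spatialize(listener, source, ref_dist=1.0):
    """Compute (left, right) stereo gains for a source relative to a listener.

    listener: (x, y, facing_radians); source: (x, y).
    Hypothetical sketch: inverse-distance rolloff plus constant-power
    panning derived from the source's azimuth.
    """
    dx, dy = source[0] - listener[0], source[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Inverse-distance rolloff, clamped so gain never exceeds 1.
    atten = ref_dist / max(dist, ref_dist)
    # Azimuth relative to the listener's facing: 0 = straight ahead.
    azimuth = math.atan2(dx, dy) - listener[2]
    # Constant-power pan: map sin(azimuth) in [-1, 1] to an angle in [0, pi/2].
    pan = max(-1.0, min(1.0, math.sin(azimuth)))
    theta = (pan + 1.0) * math.pi / 4.0
    return atten * math.cos(theta), atten * math.sin(theta)
```

For example, a source directly ahead of the listener yields equal left/right gains, while a source to the listener's right pans fully to the right channel; moving twice the reference distance away halves the overall gain.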

A director's UI supports creating auditory scenes by animating sound samples on the virtual stage along a timeline, and by setting plot points and conditions that trigger different sound samples. Besides pre-recorded sounds, live performers can interact with the audience through real-time voice chat.
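The timeline-plus-trigger model above could be represented along these lines. This is a hedged sketch of one possible data model, not the project's implementation; `SoundCue`, `position_at`, and `active_cues` are hypothetical names, with keyframed stage positions interpolated linearly and triggers expressed as predicates on the stage state.

```python
from dataclasses import dataclass
from bisect import bisect_right
from typing import Callable, Optional

@dataclass
class SoundCue:
    """A sound sample keyframed on the stage timeline (hypothetical model)."""
    sample: str
    keyframes: list  # [(time_sec, (x, y)), ...] sorted by time
    condition: Optional[Callable[[dict], bool]] = None  # trigger predicate

    def position_at(self, t):
        """Linearly interpolate the cue's stage position at time t."""
        times = [k[0] for k in self.keyframes]
        i = bisect_right(times, t)
        if i == 0:
            return self.keyframes[0][1]
        if i == len(times):
            return self.keyframes[-1][1]
        (t0, p0), (t1, p1) = self.keyframes[i - 1], self.keyframes[i]
        a = (t - t0) / (t1 - t0)
        return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))

def active_cues(cues, state):
    """Cues whose trigger condition holds for the current stage state."""
    return [c for c in cues if c.condition is None or c.condition(state)]
```

A cue keyframed from one side of the stage to the other would then glide smoothly between those coordinates, and conditional cues (for plot branches) fire only when their predicate on the stage state is satisfied.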
