This project was undertaken as a prelude to an interaction design project that aimed to develop user interfaces (hardware and software) for performing loop-based, live-sequenced electronic music (e.g., techno, house) with performer-device interactions that are emotive and legible to audiences.
We were interested in how watching a live-sequenced electronic music performance, compared to merely hearing the music, contributes to spectators’ experiences of tension. We explored this question via an experiment based on Vines, Krumhansl, Wanderley & Levitin’s 2006 work on cross-modal interactions in the perception of musical performance. We also explored the role of the performer’s “effective” and “ancillary” gestures in conveying tension when these gestures were visible.
We conducted an experiment in which 30 participants heard, saw, or both heard and saw a recording of a live-sequenced techno music performance while producing continuous judgments of their experience of tension. Eye-tracking data were also recorded from participants who saw the visuals, to reveal which aspects of the performance influenced their tension judgments. We analysed the data to explore how the auditory and visual components, and the performer’s movements, contribute to spectators’ experience of tension. Our results show that spectators’ perception of emotional intensity is consistent across hearing and sight, suggesting that gestures in “non-instrumental” live-sequencing can be a medium for expressive performance.
Baytaş, M. A., Göksun, T., & Özcan, O. (2016). The Perception of Live-sequenced Electronic Music via Hearing and Sight. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2016).