Keyword(s): Gesture, Controller, Synthesis and Resynthesis Techniques, Interactivity, Gestural Interface, Real-time, MIDI, Computer Music, Multimedia
The author presents his experience with virtual instruments that combine sound synthesis and interactive video: SuperPolm, BodySuit, and BigEye. He details the connections among the levels of his system (gesture, gestural interface, mapping, algorithm), the characteristics of interactivity, and its use in performance contexts. He also discusses the nature of human perception during a performance.
Other references by the same author:
(English) Goto, Suguru (2006). The Case Study of An Application of The System, "BodySuit" and "RoboticMusic": Its Introduction and Aesthetics