Sonic Plasma Sphere

In this project a normal plasma sphere is "hacked" and transformed into an interactive controller for realtime audiovisual performances. Placing a hand or fingers on the glass surface creates larger light blobs, which are easier to identify with a simple webcam and a computer vision system that analyzes the interactions.
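The blob-identification step described above can be sketched as a simple threshold-and-group pass over a grayscale camera frame. This is a minimal illustration, not the project's actual pipeline: the frame here is a toy 2D list of 0-255 intensities, and the function name and parameters are assumptions. A real setup would grab frames from the webcam (e.g. with OpenCV) and likely use an optimized connected-components routine.

```python
def find_blobs(frame, threshold=128, min_size=3):
    """Return blobs as (size, centroid_x, centroid_y), largest first.

    Pixels brighter than `threshold` are grouped into 4-connected
    components; components smaller than `min_size` are discarded as noise.
    A finger pressed on the glass produces a bigger, brighter blob, so
    sorting by size puts the strongest interactions first.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] > threshold and not seen[y][x]:
                # Flood-fill this bright region to collect its pixels.
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    px, py = stack.pop()
                    pixels.append((px, py))
                    for nx, ny in ((px + 1, py), (px - 1, py),
                                   (px, py + 1), (px, py - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and frame[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                if len(pixels) >= min_size:
                    cx = sum(p[0] for p in pixels) / len(pixels)
                    cy = sum(p[1] for p in pixels) / len(pixels)
                    blobs.append((len(pixels), cx, cy))
    return sorted(blobs, reverse=True)

# Toy frame: two bright regions on a dark background.
frame = [[0] * 8 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    frame[y][x] = 200                     # 4-pixel blob (one fingertip)
for y in (3, 4, 5):
    frame[y][5] = frame[y][6] = 220       # 6-pixel blob (a firmer press)

blobs = find_blobs(frame)  # → [(6, 5.5, 4.0), (4, 1.5, 1.5)]
```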

The light blobs created on the glass surface are tracked by size, and the system can recognize multiple interactions simultaneously and locate each blob in XYZ space. Once the blobs are tracked, the system can control realtime sound synthesis engines (additive, pulse, and granular synthesis), trigger sound samples, and drive a whole music set interactively. The sphere can also act as a 360-degree controller that positions the sound source in the projected sound space, with dedicated modes for spatializing the generated sound in stereo, surround, or ambisonics.
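One way a tracked blob could drive both a synthesis engine and the stereo spatialization is sketched below. The mapping, function name, and ranges are hypothetical illustrations, not the project's actual design: horizontal position is mapped to pitch and reused as the azimuth for an equal-power stereo pan, and blob size (how firmly the hand presses) is mapped to amplitude.

```python
import math

def blob_to_params(size, cx, cy, frame_w=640, frame_h=480,
                   f_lo=110.0, f_hi=1760.0, max_size=2000):
    """Map a blob (size, centroid) to (frequency, amplitude, pan gains).

    - horizontal position -> pitch, swept exponentially from f_lo to f_hi
    - blob size -> amplitude in 0..1 (bigger blob = firmer press = louder)
    - horizontal position also sets the azimuth for equal-power panning
    The vertical position `cy` is left unused here; in a fuller version it
    could drive a second parameter such as a filter cutoff.
    """
    u = min(max(cx / frame_w, 0.0), 1.0)     # normalized x, 0..1
    freq = f_lo * (f_hi / f_lo) ** u         # exponential pitch sweep
    amp = min(size / max_size, 1.0)
    theta = u * math.pi / 2                  # 0 = hard left, pi/2 = hard right
    left, right = math.cos(theta), math.sin(theta)  # equal-power pan law
    return freq, amp, (left, right)

# A medium blob at the center of the frame: center pitch, centered pan.
freq, amp, (l, r) = blob_to_params(size=500, cx=320, cy=240)
```

The equal-power pan law keeps the total energy constant (left² + right² = 1) as a blob slides around the sphere, which avoids the loudness dip at center that simple linear crossfading produces.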

Other features of this project include abstract visual events generated and projected on nearby surfaces, voice recognition for flexible, instant control of the controller's various modes, and automation of "senseless" functions such as posting messages to Twitter or controlling the lights of an entire building.

The plasma sphere controller has so far been used in electro-acoustic performances, DJing, and live EDM sets. The following is a short demonstration of how it can be used to control a range of sound synthesis techniques.

Tags: computer vision, interaction, visuals, controller, synthesis
