It’s a recording of the sound sculptures generated at our recent danceroom Spectroscopy installation at the Barbican. It’s a gentle, ambient sound that ebbs, flows, and washes over you. I’ve been enjoying it. All the sounds were generated in real-time, from the motion of people’s virtual energy fields within the exhibition space. There are three primary components that contribute to the sound:
The vibrational energy of people’s fields. This is measured in real-time by taking a Fourier transform of the atomic dynamics, and it generates the deep wave-like sounds you hear in the recording.
The location and motion of different particle clusters. The motion of people’s fields creates transient atomic clusters, which we detect and assign to different sonic channels. The cluster positions and velocities generate different sounds.
The atom-atom collisions. The motion of people’s fields causes different atoms to collide. In the recording, these collisions generate the delicate tinkling sounds.
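For the curious, here’s a rough sketch of how two of those components might look in code. This is not our actual installation code – the function names, array shapes, and the naive O(n²) collision check are illustrative assumptions – but it shows the basic idea: a Fourier transform over the time history of atomic velocities picks out the dominant vibrational frequencies (which drive the deep wave-like sounds), and a simple distance test flags atom-atom collisions (which trigger the tinkling sounds):

```python
import numpy as np

def vibrational_spectrum(velocities, dt):
    """Estimate dominant vibrational frequencies of the ensemble.

    velocities: (n_steps, n_atoms, 2) array of 2D atomic velocities
    dt: timestep between frames, in seconds
    Returns (frequencies, power) from an FFT over the time axis.
    """
    # Sum over atoms to get one collective signal per spatial dimension,
    # then FFT over time to find the dominant vibrational frequencies.
    signal = velocities.sum(axis=1)                      # (n_steps, 2)
    power = (np.abs(np.fft.rfft(signal, axis=0)) ** 2).sum(axis=1)
    freqs = np.fft.rfftfreq(signal.shape[0], d=dt)
    return freqs, power

def detect_collisions(positions, radius):
    """Return the set of atom index pairs closer than 2 * radius.

    positions: (n_atoms, 2) array of 2D atomic positions
    Naive all-pairs distance check -- fine for a sketch, far too
    slow for thousands of particles in real-time.
    """
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    i, j = np.where((dists < 2 * radius) & (dists > 0))
    return {(a, b) for a, b in zip(i, j) if a < b}
```

In the real system each detected event (a spectral peak, a collision) would be mapped onto a synthesizer parameter or sample trigger; the mapping itself is where most of the sound design lives.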
On 5 December, I attended the UK Many Core Developer’s Conference (UKMAC 2012), a supercomputing conference organized by Simon McIntosh-Smith. I gave a presentation and demo of danceroom Spectroscopy, with emphasis on the algorithms and heterogeneous parallelization strategies we’ve implemented to build it (see video above). There were several interesting presentations, including: a keynote lecture from Adapteva’s Andreas Olofsson about designing small, energy-efficient parallel architectures; Alan Gray (Edinburgh, EPCC) talking about scaling soft matter physics code to more than one thousand (!) GPUs; Zheng Wang (Edinburgh) talking about auto-generating OpenCL code from OpenMP pragmas; and Pedro Gonnet (Durham) talking about task-based parallelization algorithms applied to molecular dynamics simulations.
The team and I just got back from running our interactive quantum mechanics art installation “danceroom Spectroscopy” as part of the London 2012 Olympic events! It was awesome! We put it in an immersive 360-degree projection dome (19 meters in diameter). Professor Vader, our custom-built 12-core supercomputer (who also has 2048 graphics cores), managed to run seven 3D capture cameras and talk to six graphics outputs, all while doing sonic analysis, graphics rendering, and solving equations of motion for 10,000 quantum particles – all in real-time! We also ran a dance performance of Hidden Fields, the show we’ve been developing using the danceroom Spectroscopy technology. Stay tuned for some video footage…