From 9–13 Jan 2017, I and a group of collaborators came together at London’s Barbican to participate in their ‘Open Lab’ programme. The idea of the Open Lab is to provide an experimental arts space where collaborators can investigate directions for future work. With support from EPSRC, Arts Council England, the University of Bristol, and Interactive Scientific (iSci), we came to the Barbican with two primary aims. First, we wanted to develop our interactive molecular virtual reality platform into a multi-person experience – i.e., allowing multiple people to simultaneously inhabit and share the same virtual reality. This is an area that I think is really important for VR moving forward; otherwise it risks becoming yet another sophisticated technology for alienating people. Second, we wanted to investigate the potential for this multi-person VR framework to support new approaches toward molecular aesthetics that might operate in artistic and performative contexts…
We had a talented group of collaborators come together to support this research – some old faces (from the original danceroom Spectroscopy team), and also some new. The team comprised three of my University of Bristol-based PhD students: Mike O’Connor, Rob Arbon, and Lisa May Thomas. Also involved were digital artist Phill Tew (iSci); electronic musician Prof. Joseph Hyde (Bath Spa University); sonic interaction expert Dr. Tom Mitchell (University of the West of England); contemporary dancer Isabelle Cressy; visual artist Dr. Gemma Anderson (Exeter University); and finally Benjamin de Kosnik, a San Francisco-based digital artist and activist.
It was a great experience! We all lived together for a week in a massive house in Shoreditch, which allowed us to interact as a group after our days spent at the Barbican. It also led to a few nights without much sleep as various team members worked late into the night hacking together various bits and pieces. As a team, we all interacted with one another over the entire period, but our efforts were loosely subdivided into three groups:
- Mike, Phill, and I were responsible for stabilising the multi-person technology framework that enabled multiple people to simultaneously inhabit the same interactive molecular virtual reality (there’s a video link here that gives you an idea of how the framework we built at the Barbican works);
- Tom and Joe focussed on developing an 8-channel audio system integrated into the VR system, giving virtual reality users spatially localised audio feedback depending on their position in space;
- Lisa, Izzie, and Gemma were focussed on developing:
- a choreographic movement vocabulary with the potential to describe the range of ways in which a dancer embedded in VR might use their body to interact with real-time biomolecular dynamics simulations, for potential scientific as well as performative applications;
- a series of sketches which could articulate this biomolecular choreographic vocabulary.
- Rob and Benjamin worked with Gemma to design neural network (NN) algorithms aimed at analysing the style contained in her analogue drawings of biomolecular dynamics. The question here was: can we use the stylistic character of analogue drawings to explore aesthetic possibilities in the digital space – i.e., can it lead us to alternative algorithmic strategies for digitally rendering biomolecular structures? For example, the images below show NN analysis they carried out on one of her drawings.
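To give a flavour of the spatialised audio idea from Tom and Joe’s strand, here is a minimal sketch of distance-based amplitude panning across a ring of eight speakers. The speaker layout, radius, and roll-off model are assumptions for illustration only – not the actual system they built:

```python
import math

# Hypothetical 8-speaker ring, radius 3 m, centred on the origin.
SPEAKERS = [
    (3.0 * math.cos(2 * math.pi * i / 8), 3.0 * math.sin(2 * math.pi * i / 8))
    for i in range(8)
]

def speaker_gains(x, y, rolloff=1.5):
    """Return one gain per speaker for a sound source at (x, y).

    Gains fall off with distance to each speaker and are normalised so
    the total output power stays constant as the source moves around.
    """
    raw = [1.0 / (math.hypot(x - sx, y - sy) ** rolloff + 1e-6)
           for sx, sy in SPEAKERS]
    norm = math.sqrt(sum(g * g for g in raw))
    return [g / norm for g in raw]

# Example: a source standing near speaker 0 drives that speaker hardest.
gains = speaker_gains(2.5, 0.0)
```

As a VR user moves through the tracked space, recomputing these gains each frame makes a sound appear to follow them around the room.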
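As for the NN style analysis, one standard way to quantify the ‘style’ of an image (following the neural style transfer literature) is via Gram matrices of convolutional feature maps – correlations between feature channels that capture texture while discarding spatial layout. The actual architecture Rob and Benjamin used isn’t detailed here, so the following numpy sketch of the general technique assumes feature maps have already been extracted, with purely illustrative shapes:

```python
import numpy as np

def gram_matrix(features):
    """Style representation of a (channels, height, width) feature map.

    The Gram matrix holds the correlations between feature channels,
    which captures texture/style while discarding spatial arrangement.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)   # one row per channel
    return flat @ flat.T / (h * w)      # (c, c), normalised

def style_distance(feat_a, feat_b):
    """Squared Frobenius distance between two style representations."""
    return float(np.sum((gram_matrix(feat_a) - gram_matrix(feat_b)) ** 2))

# Example: identical feature maps have zero style distance.
features = np.random.rand(16, 32, 32)
zero = style_distance(features, features)
```

Minimising such a distance between a digital rendering and an analogue drawing is one route to steering rendering algorithms toward a hand-drawn aesthetic.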
After five amazing days together, I am glad to report that we made lots of exciting progress in tackling the strands outlined above! We built a framework that allows multiple people to inhabit the same virtual reality. At the moment, we know that it can handle up to four people, and we think that it can go up to eight without many problems… We also made great progress in developing what we have started to call a ‘bio-inspired’ choreography – i.e., a set of human movement principles gleaned from analysing the intricate and beautiful dynamics of biomolecular structures. We also made progress in developing some NN frameworks that allow us to begin exploring alternative digital rendering algorithms for biomolecular structures. The video link above gives a little taster of our journey during the week, but this is just the beginning – and I’m very excited! Stay tuned…