Etextile VR data glove experiments


This is some fun stuff that we’ve been building in the Intangible Realities Laboratory! This video shows my perspective as I tie a knot in 17-alanine using the Narupa framework while wearing the new customized e-textile VR glove prototype designed by Becca Rose & Rachel Freire. Alex Binnie did a great job building the software interface for the gloves. The gloves have been designed specifically to enable efficient pinching and grasping of the sort required in a VR-enabled real-time simulation environment. Stay tuned – we’ll soon be open-sourcing the designs, for both the gloves and the software interface. Meanwhile, we’ve just posted to the arXiv a paper we’ve written about the design. More soon!

setting our {open-source} VR framework free!

Sounding the gong & ringing the bells the moment we set the project free!

After nearly three years of research & development, I’m happy to announce that we have finally open-sourced our multi-person, virtual-reality-enabled, real-time simulation framework under the GPL. It lets groups of researchers go into VR and touch molecular objects as if they were tangible objects. Our working name for the project is ‘Narupa’, which we arrived at by combining the prefix ‘nano’ and the suffix ‘arupa’. Wikipedia explains how arūpa is a Sanskrit word describing non-physical and non-material objects. It seemed to us a good concept for describing what it’s like to interact with simulated nanoscale objects. It’s still a name-in-progress, so let us know how you find it.

We first prototyped this technology in Jan 2016 at London’s Barbican Arts Centre, as part of an ‘Open Lab’ residency which I organised using funding from the EPSRC and the Royal Society. It involved several participants from my University of Bristol research lab and also Phil Tew from my company Interactive Scientific Ltd., all of whom feature in a blog post & video which we made at the time. We’ve been hard at work in the intervening years, for example using it to carry out the studies described in our 2018 open-access paper in Science Advances. A number of my academic and industrial research colleagues have already joined us in our community efforts. I look forward to announcing a whole host of interesting partnerships over the next few months. It’s been particularly exciting for me to observe how our consciously open-source ethos has inspired my international research colleagues to release open-source versions of their own simulation codes, in order to participate in a community. We’ve had several emails from excited collaborators who have cloned the repo and are getting things running.

I’m absolutely delighted to have set this project free into the intellectual & cultural commons. Getting it out there has taken some doing, but now it’s done: an open resource enabling communities to cooperatively learn from one another. The video I’ve embedded in this post marks the moment at which we set Narupa free, complete with an ad hoc ceremony involving gongs and bells. It’s a credit to the fantastically creative VR researchers that I have the privilege to work with – Mike O’Connor, Alex Jones, Helen Deeks, Lisa May Thomas, Rebecca Walters, Simon Bennie, and Alex Binnie – these are people who know when to sound the gongs and ring the bells!

Stay tuned over the next few months, because there’s plenty more on the horizon. We’re building loads of cool new open-source features like quantum mechanical force engines, real-time data sonification and audio, and also code enabling you to stream your own real-time simulations from the cloud! We’ve also got a number of papers in the pipeline which will be coming out soon, where we will demonstrate a whole host of interesting application domains. We’ve even included instructions on how to build your own multi-person VR lab.

launch of a new {open-source} community!

In order to focus my energy on open-source & community projects, I’ve recently decided to step away from Interactive Scientific, a company which I co-founded in 2013. In this short blog, I describe the origins of my old company and why I stepped away. My intent here is to empower others to learn from my experience. It’s been quite a learning curve for me, especially useful in the way that it has evolved my own thinking regarding the intellectual commons – and I think others might stand to benefit from the thoughts and observations outlined herein.
Continue reading

VR-enabled real-time simulation: NYT, Nature, BBC

Really excited to report that the open access Science Advances paper published by O’Connor et al. during the summer, entitled “Sampling molecular conformations and dynamics in a multiuser virtual reality framework”, has since generated significant media exposure, having been picked up by a number of scientific media outlets. Nature, the New York Times, and the BBC’s “Science in Action” show (the VR piece begins 7 mins in) all contacted me to discuss the implications this work could have for nanotech research. It’s been exciting to witness the interest which the paper has generated. It certainly seems to be captivating people’s imaginations, and is attracting lots of attention from researchers across academia & industry.

Science Advances virtual reality paper


Working with academic colleagues from high-performance computing (HPC) and human-computer interaction (HCI), we recently published an open access paper entitled “Sampling molecular conformations and dynamics in a multiuser virtual reality framework” in the AAAS journal Science Advances. The paper describes a scientifically rigorous, VR-enabled, multi-person, real-time interactive molecular dynamics (iMD) framework, which lets researchers use virtual reality to literally reach out & touch real-time molecular physics using cloud-mounted supercomputing.
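The core idea behind interactive molecular dynamics is simple to sketch: at each integration step, a force supplied by the user (e.g. via VR controllers) is added to the physical forces before the integrator advances the atoms. Here is a minimal, illustrative velocity-Verlet sketch in Python — this is not Narupa’s actual API, and the function and parameter names are my own:

```python
import numpy as np

def imd_step(pos, vel, forces_fn, user_force, dt=1e-3, mass=1.0):
    """One velocity-Verlet step with an external 'user' force added,
    mimicking how an iMD framework biases a simulation in real time.

    pos, vel, user_force: (n_atoms, 3) arrays
    forces_fn: maps positions -> (n_atoms, 3) physical forces
    """
    # Total force = physical forces + force applied by the user.
    f = forces_fn(pos) + user_force
    # Half-step velocity update, then full-step position update.
    vel_half = vel + 0.5 * dt * f / mass
    pos_new = pos + dt * vel_half
    # Recompute forces at the new positions and finish the velocity update.
    f_new = forces_fn(pos_new) + user_force
    vel_new = vel_half + 0.5 * dt * f_new / mass
    return pos_new, vel_new
```

In a real framework the `user_force` term is updated continuously from the controller pose, and the physical forces come from a full molecular mechanics (or quantum) engine rather than a toy `forces_fn`.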

The paper presents the results of HCI experiments showing that VR (specifically the HTC Vive setup) enables users to carry out 3D molecular simulation tasks extremely efficiently compared to other platforms. Specifically, we asked users to perform three separate molecular manipulations, and timed how long each took on various platforms: in VR, on a touchscreen, and using a computer/mouse. The tasks included threading a molecule of methane through a simulated carbon nanotube; unwinding a left-handed helical molecule and rewinding it into a right-handed helix; and tying a knot in a simulated protein. The results showed that in VR, users were able to accomplish all of the tasks more quickly. The knot task, in particular, was completed nearly ten times as rapidly! It seems pretty clear that 2D screen-based molecular simulation tools leave us a lot less efficient than we could be.


interactive MD art installation

In collaboration with Bristol-based tech startup Interactive Scientific, more than 37,000 people had the chance to experience the acclaimed real-time interactive molecular dynamics art installation ‘danceroom Spectroscopy’ (dS) at the ‘We the Curious’ science museum in central Bristol. dS – whose architecture is described in a 2014 Faraday Discussion paper – fuses rigorous methods from computational physics, GPU computing, and computer vision to interpret people as fields whose movement creates ripples and waves in an unseen energy field. The result is a gentle piece comprised of interactive graphics and soundscapes, both of which respond in real-time to people’s movements – enabling them to sculpt the invisible fields in which they are embedded. Offering a unique and subtle glimpse into the beauty of our everyday movements, dS allows us to imagine how we interact with the hidden energy matrix and atomic world which forms the fabric of nature, but is too small for our eyes to see. It’s as much a next-generation digital arts installation as it is an invitation to contemplate the interconnected dynamism of the natural world and processes of emergence, fluctuation, and dissipation – from the microscopic to the cosmic. The installation ran from October 2017 through January 2018, and was open to anybody; you can read more about it here.