Multi-person glove-based interaction in VR

[Animated GIF: six glove-wearing VR users during the final phase of the v3 glove development workshop]

(c) Rachel Freire, Becca Rose, & the Intangible Realities Lab, licensed under CC-BY-SA 4.0

Following on from some preliminary work we published in early 2019, I've spent an amazing few days in the Intangible Realities Laboratory with the fabulous e-textile artists Rachel Freire and Becca Rose, making v3 of our open-source VR data gloves. We have designed the gloves specifically to facilitate the kind of multi-person interaction required to carry out careful molecular manipulations during interactive simulations. It's part of a larger research study investigating strategies to move beyond controller-based interaction in VR – ultimately, we want to simply reach out, touch, and move simulated molecular objects using our hands and fingers. The GIF shows the final phase of our development workshop, with six glove-wearing VR users from the IRL. Stay tuned – we'll be testing and publishing more over the next several weeks!

Narupa: open-source beta executable


We’ve just published an open-access article (arXiv:1902.01827) describing Narupa, our open-source multi-person iMD-VR (interactive molecular dynamics in virtual reality) framework. To go along with the article, we’ve published a stable beta executable of Narupa, available at irl.itch.io/narupaxr.

The executable is effectively a build of the source at gitlab.com/intangiblerealities. If you want to set up your own multi-participant VR lab, instructions are available here.

Narupa provides a few key upgrades over the proof-of-principle iMD-VR framework we published in our 2018 Science Advances paper. Specifically, Narupa enables: (1) multiple participants to cohabit the same iMD-VR environment [as shown in the video at vimeo.com/244670465]; (2) molecular simulation experts to set up their own simulations using a flexible force API; (3) both the VR client and the force server to run on local networks; and (4) researchers to access a range of tools (e.g., a flexible selection interface) designed to streamline the use of iMD-VR for research applications like protein-drug binding [as shown in the video at vimeo.com/29630079].
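To give a flavour of what a flexible force API makes possible, here's a minimal, self-contained sketch of a user-defined force of the sort you might supply to a running simulation. To be clear, this is not the actual Narupa API: the function name, signature, and units are hypothetical, purely to illustrate the idea that a custom potential and its forces can be computed from atomic positions at each timestep.

```python
# Illustrative only: a user-defined force of the kind a flexible force API
# can feed into a running simulation. The function name and signature are
# hypothetical; this is not the actual Narupa API.
import numpy as np

def harmonic_restraint(positions, anchor, k=100.0):
    """Pull every atom toward an anchor point with a harmonic spring.

    positions : (n_atoms, 3) array of atomic coordinates (nm)
    anchor    : (3,) array, the restraint centre (nm)
    k         : spring constant (kJ mol^-1 nm^-2)

    Returns (potential energy, (n_atoms, 3) array of forces).
    """
    displacement = positions - anchor             # vector from anchor to each atom
    energy = 0.5 * k * np.sum(displacement ** 2)  # U = (k/2) * sum |r - r0|^2
    forces = -k * displacement                    # F = -dU/dr
    return energy, forces

# Example: restrain a random 5-atom configuration to the origin.
energy, forces = harmonic_restraint(np.random.rand(5, 3), np.zeros(3))
```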

There are lots of exciting updates in the pipeline, which will improve the application of iMD-VR to a range of nanotech research problems!

Real-time audio from interactive molecular dynamics

Over the past few years, we’ve been exploring sound as a sensory channel for understanding the physics of real-time interactive molecular dynamics simulations. In the molecular sciences, sound is a vastly underutilized medium for representing data, partly because audio representational standards are less well defined than graphical ones. For example, depicting an atom as a sphere is an arbitrary decision, but it’s intelligible because an atom and a sphere are both spatially delimited objects. Defining clearly delimited objects in the audio realm is not as straightforward, either spatially or compositionally: it is difficult to imagine what constitutes an ‘atomistic’ object in a piece of audio design or music. Our work suggests that sound is best utilized for representing properties which are non-local: potential energy, electrostatic energy, local temperature, strain energy, etc. The non-locality of these properties makes them extremely difficult to visualize using conventional graphical rendering strategies (and even effective strategies would likely produce significant visual congestion).

The video shows a real-time interactive simulation of a 17-alanine peptide, in which our resident sonification experts (PhD student Alex Jones, Prof. Tom Mitchell, & Prof. Joseph Hyde) track the potential energy in real time and use it to interactively generate sound. The video shows Alex manipulating the peptide from within Narupa, and highlights how the sound responds to dynamical changes in the potential energy: it shifts as Alex takes the peptide from its native folded state to a high-energy knotted state, and when Alex eventually unties the knot, you can hear the peptide relax back to a lower-energy state.
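As a rough illustration of how such a mapping can work, here's a minimal NumPy sketch that converts a time series of potential energies into a pitch-modulated tone, with higher energy mapped to higher pitch. The linear energy-to-frequency mapping and all the parameter values are assumptions for illustration; this is not the synthesis engine used in the video.

```python
# Illustrative only: map a stream of potential-energy values onto the pitch
# of a simple sine oscillator (higher energy -> higher pitch). The mapping
# and parameters are assumptions, not the synthesis used in the video.
import numpy as np

SAMPLE_RATE = 44100  # audio samples per second

def sonify_energy(energies, frame_duration=0.05, f_low=220.0, f_high=880.0):
    """Convert a sequence of energies into a mono audio waveform.

    Each energy is normalized to [0, 1] and mapped linearly onto a
    frequency between f_low and f_high (Hz), one short frame per value.
    """
    energies = np.asarray(energies, dtype=float)
    levels = (energies - energies.min()) / (np.ptp(energies) + 1e-12)
    n = int(SAMPLE_RATE * frame_duration)
    t = np.arange(n) / SAMPLE_RATE
    frames, phase = [], 0.0
    for level in levels:
        freq = f_low + level * (f_high - f_low)
        # carry the oscillator phase across frames to avoid audible clicks
        frames.append(np.sin(phase + 2.0 * np.pi * freq * t))
        phase += 2.0 * np.pi * freq * frame_duration
    return np.concatenate(frames)

# Example: a rising-then-falling "energy" trace produces a rising/falling tone.
waveform = sonify_energy(np.concatenate([np.linspace(0, 1, 40),
                                         np.linspace(1, 0, 40)]))
```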

Machine learning from VR-generated data

We’ve posted a new paper to the arXiv, which I’m really excited about: it describes how user-guided real-time interactive quantum mechanics (QM) simulations in virtual reality (iMD-VR) can be used to train neural networks to learn QM energy functions. The paper is entitled “Training neural nets to learn reactive potential energy surfaces using interactive quantum chemistry in virtual reality”.

As far as I know, this is the first demonstration of real-time interactive QM in VR, something we were able to accomplish through collaboration with Markus Reiher and Alain Vaucher, our colleagues at ETH Zurich, who worked with us to develop an interface between our VR framework Narupa and their excellent SCINE quantum chemistry package. Silvia Amabilino and Lars Bratholm have also done a great job making their GPU-accelerated neural network (NN) framework for learning QM potential energy surfaces available as an open-source package on GitHub – see the paper for more details!
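To make the basic idea concrete, here is a minimal NumPy-only sketch of fitting a small neural network to (geometry → energy) pairs, the kind of data that VR-guided interactive sampling can generate. Everything here (the toy descriptor, the synthetic energy labels, the tiny network) is an illustrative assumption; the actual framework described in the paper is a GPU-accelerated package and far more sophisticated.

```python
# Illustrative only: fit a tiny neural network to (geometry -> energy) pairs.
# The descriptor, the synthetic energy labels, and the network are toy
# assumptions; the real framework in the paper is GPU-accelerated.
import numpy as np

rng = np.random.default_rng(0)

def descriptor(positions):
    """Toy permutation-invariant descriptor: sorted pairwise distances."""
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(positions), k=1)
    return np.sort(dists[iu])

# Toy dataset: random 4-atom geometries with a smooth stand-in "energy".
geometries = rng.normal(scale=1.5, size=(200, 4, 3))
X = np.stack([descriptor(g) for g in geometries])
y = np.exp(-X).sum(axis=1)                   # stand-in for QM energies
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize the inputs

# One-hidden-layer network trained by full-batch gradient descent on MSE.
n_in, n_hidden, lr = X.shape[1], 16, 0.01
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden);         b2 = 0.0
for step in range(5000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    pred = h @ W2 + b2                       # predicted energies
    err = pred - y
    gW2 = h.T @ err / len(y); gb2 = err.mean()        # backprop: output layer
    gh = np.outer(err, W2) * (1.0 - h ** 2)           # backprop through tanh
    gW1 = X.T @ gh / len(y);  gb1 = gh.mean(axis=0)   # backprop: hidden layer
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("final RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```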