As virtual technologies improve, it is becoming possible to explore increasingly realistic environments. In this project, sponsored by agricultural company Syngenta Australia, the 3DVAL created a digital educational experience that replicates the on-site learning offered at Syngenta’s physical learning centres across the country.
In this immersive VR experience, users explore wheat fields and learn about the impact of Crown Rot – a fungal disease that can devastate the global wheat industry. Users learn about the disease cycle, the mechanisms that lead to poor crop yields, and how Syngenta products can be used to treat the disease.
In parallel with the development of the VR application, UNSW is completing a research study evaluating the effectiveness of different learning scenarios in a cohort of volunteers. The study will compare learning outcomes in VR with those from traditional screen-based methods, and shed light on the interfaces and immersive VR scenarios that offer the greatest knowledge transfer to viewers.
Collaborators: Richard Packard (Syngenta Australia), Pat Younis & Jordan East (Tilde Visual)
Funded by: Syngenta Australia
Nanoscape is a data-driven 3D real-time interactive virtual cell environment.
The biological world at the molecular level is a complex and chaotic place. Striving for increased authenticity in real-time 3D computer visualisations and animations is aesthetically and computationally challenging due to extreme molecular crowding and large variations in spatio-temporal scales.
In this work, a data-first approach has been employed to visualise activity on the surface of a cancer cell in situ. Leveraging advances in GPU instancing and parallelism combined with emerging data regarding densities, scales and dynamics of cellular entities, Nanoscape aims to immerse the viewer in an unparalleled interactive real-time-rendered cellular exploration. This pioneering work will allow reflection, speculation and a much deeper fundamental understanding of the complex world inside the human body.
Built upon the desktop application, Nanoscape VR immerses the viewer within the cellular environment through a head-mounted display. Surrounded by jostling proteins, cascading strands of connective tissue and towering cellular protrusions, the viewer is equipped with a torch and invited to explore the landscape to hunt for nanoparticles entering the tumour surface.
Accompanied by a moving soundscape, Nanoscape VR aims to better convey the orchestrated chaos occurring at the cellular level and to give the viewer an unparalleled insight into the scale of various cellular components.
The work attempts to push the limits of current hardware to display thousands of molecular entities in VR and employs creative solutions to cache and instance protein and cellular simulations.
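As a purely illustrative sketch of the instancing idea mentioned above (not the lab's actual pipeline – the function name, layout and all numbers here are hypothetical), per-instance data for thousands of molecular entities can be packed once on the CPU into a single flat buffer, which an engine would then upload to the GPU and draw with one instanced call:

```python
import math
import random
import struct

def build_instance_buffer(n_instances, cell_radius, seed=0):
    """Pack hypothetical per-protein [x, y, z, scale] float data into one
    bytes buffer, as would be uploaded once to a GPU instance buffer and
    rendered with a single instanced draw call."""
    rng = random.Random(seed)
    floats = []
    for _ in range(n_instances):
        # Scatter instances over a sphere approximating the cell surface.
        x, y, z = (rng.gauss(0.0, 1.0) for _ in range(3))
        norm = math.sqrt(x * x + y * y + z * z) or 1.0
        floats += [cell_radius * x / norm,
                   cell_radius * y / norm,
                   cell_radius * z / norm]
        # Random per-instance scale to break up visual repetition.
        floats.append(rng.uniform(0.8, 1.2))
    # Little-endian 32-bit floats, 4 per instance (16 bytes each).
    return struct.pack(f"<{len(floats)}f", *floats)

buf = build_instance_buffer(10_000, cell_radius=500.0)
assert len(buf) == 10_000 * 16
```

The point of the pattern is that only one small mesh plus this compact buffer crosses the CPU–GPU boundary, rather than one draw call per molecule.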
Desktop and VR versions of Nanoscape have been released for public use and are available on the Steam store.
Collaborators: Prof Kris Thurecht and Prof Rob Parton (University of Queensland), A/Prof Angus Johnston (Monash University), Prof Maria Kavallaris, Prof Matthew Kearnes, Dr Declan Kuch (UNSW)
When the world was confronted with the COVID-19 pandemic, an imperative health message circulated urging the public to wash their hands regularly with soap and water to help stem the spread of the virus. Yet little had been described about exactly how soap interacts with these infectious agents at the molecular level.
In collaboration with biochemistry professor Pall Thordarson, the 3DVAL undertook a visualisation project to unveil these mysterious mechanisms and to increase public awareness of the importance of handwashing. The piece was selected for the prestigious Computer Animation Festival at the 2020 SIGGRAPH Asia conference.
Collaborator: Prof Pall Thordarson (UNSW)
The ability to convincingly immerse a viewer in an alternate reality remains the cornerstone of VR technology and the foundation of this collaborative project. We are exploring how distraction through gamified exploration of virtual worlds affects the perception of acute pain experienced by hospital patients.
In collaboration with medical experts at St Vincent’s Hospital Sydney, the 3DVAL team are developing applications that run on the low-cost and ultra-portable Samsung GearVR headsets as a novel approach to pain management.
The project investigates the role of aesthetics and design principles (such as colour and interactivity) in boosting the psycho-physiological defences against the perception of pain.
Collaborators: Prof Steven Faux, Dr Christine Shiner (St Vincent’s Hospital)
Sponsored by Samsung
Interactive cell exploration
In 2016, an interactive virtual reality cell environment, the first of its kind, was generated at the 3D Visualisation Aesthetics Lab. The prototype uses the latest room-scale virtual reality technology and high-resolution electron microscopy data to allow researchers to observe the processes by which nanoparticles carrying cancer therapies are internalised and trafficked within a cancer cell. It is anticipated that this work will shift the paradigm of education while accelerating the scientific discovery process by offering researchers novel perspectives on drug delivery. An initial assessment of learning outcomes in undergraduate students, comparing immersive media education with traditional screen-based methods, suggests that VR can improve learning, but further work is needed.
Based on ground-breaking research being undertaken by scientific collaborators at the Monash Institute of Pharmaceutical Science (MIPS) and the University of Queensland (UQ), the 3DVAL developed a thought-provoking computer-generated animation describing the stages of novel nanoparticle drug delivery to cancer cells isolated in the laboratory. The sequence incorporates accurate high-resolution microscopy data of a breast cancer cell and structural data from the Protein Data Bank to enhance scientific authenticity.
Design-led visualisation of nanomedicines in virtual reality
This research explores how VR can be used as a platform to understand and interact with pre-clinical imaging data – specifically PET-CT – in an immersive and intuitive manner. This interdisciplinary research work is being carried out between the 3D Visualisation Aesthetics Lab (3DVAL) at UNSW Sydney and the Australian Institute for Bioengineering and Nanotechnology/Centre for Advanced Imaging at The University of Queensland, Australia. The project focuses on how carefully considered aesthetic and design choices, such as colour, environment, sound and interface features, can enhance the user experience and comprehension of 3D data sets.
Collaborators: Prof Kris Thurecht, Dr Zach Houston, Dr Nick Fletcher (University of Queensland)
Funded by the ARC Centre of Excellence in Convergent Bio-Nano Science & Technology (CBNS)
The Visual Microscope is an innovative, fully interactive and ultra-high-resolution navigation tool developed to browse and analyse gene expression levels from human cancer cells – in effect, a visual microscope on data. The tool uses high-performance visualisation and computer graphics technology to enable genome scientists to observe the evolution of regulatory elements across time and gain valuable insights from their datasets as never before.
Following the Human Genome Project, extensive effort was dedicated to building a comprehensive annotation of the functional elements of the genome, led by the Roadmap Epigenomics and ENCODE consortia [Stanford-University 2018]. More recently, studies have shown that these regulatory elements produce small RNA molecules called enhancer RNAs (eRNAs), which are readily detectable through modern RNA-sequencing technologies. Computational approaches to making sense of eRNAs have relied on static snapshots from a broad range of cell types. By taking repeated measurements of a single cell type across time, it is possible to identify changes in eRNAs that correspond closely with changes in the activity of specific classes of genes. Interpretation of these complex datasets is hindered by the scarcity of tools for their visualisation and interpretation. A unique challenge is presented by the dynamic nature of the sequencing data and the vast differences in scale between gene size and the intervening spaces between genes and regulatory elements.
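The idea of linking eRNA dynamics to gene activity can be sketched as a simple correlation over repeated time points. This is a hypothetical illustration only, not the study's actual method: the gene names, values and the 0.9 threshold are all invented for the example.

```python
import math

def pearson(x, y):
    # Pearson correlation between two equal-length time series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def linked_genes(erna_series, gene_series, threshold=0.9):
    # Flag genes whose temporal expression profile tracks the eRNA's.
    return [gene for gene, series in gene_series.items()
            if pearson(erna_series, series) >= threshold]

erna = [1.0, 2.0, 4.0, 8.0, 7.0]  # hypothetical eRNA levels at 5 time points
genes = {
    "GENE_A": [2.0, 4.1, 8.2, 15.9, 14.0],  # rises and falls with the eRNA
    "GENE_B": [9.0, 7.0, 5.0, 2.0, 3.0],    # anti-correlated profile
}
assert linked_genes(erna, genes) == ["GENE_A"]
```

In practice such analyses operate genome-wide and must be explored visually, which is exactly the gap the tool described here addresses.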
Understanding complex physiological processes demands the integration of diverse insights derived from visual and quantitative analysis of bio-image data, such as microscopy images. This process is currently constrained by disconnects between methods for interpreting data, as well as by language barriers that hamper the necessary cross-disciplinary collaborations. Using immersive analytics, we leveraged bespoke immersive visualisations to integrate bio-images and derived quantitative data, enabling deeper comprehension and seamless interaction with multi-dimensional cellular information. We designed and developed a visualisation platform that combines time-lapse confocal microscopy recordings of cancer cell motility with image-derived quantitative data spanning 52 parameters. The integrated data representations enable rapid, intuitive interpretation, bridging the divide between bio-images and quantitative information. Moreover, the immersive visualisation environment promotes collaborative data interrogation, supporting vital cross-disciplinary collaborations capable of deriving transformative insights from rapidly emerging bio-image big data.
This work presents our strategy and pipeline architecture for visualising very large-scale graphs in an immersive environment using a high-performance graphics approach. The innovation lies not only in utilising GPUs for real-time, cluster-based interactive rendering, but also in an intermediate graph representation built on the industry-standard glTF file format. The current tool can visualise large-scale graph datasets (millions of nodes), running smoothly in an immersive and interactive environment.
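To make the intermediate-representation idea concrete, here is a minimal, hedged sketch of embedding graph node positions in a glTF 2.0 document as a POINTS-mode mesh primitive. This is not the lab's actual exporter – the function name and structure are illustrative, and a real pipeline would also encode edges, clusters and attributes:

```python
import base64
import json
import struct

def graph_to_gltf(points):
    """Build a minimal glTF 2.0 document embedding graph node positions
    as one POINTS-mode primitive (mode 0), one vec3 per graph node."""
    blob = struct.pack(f"<{3 * len(points)}f",
                       *(coord for p in points for coord in p))
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    return {
        "asset": {"version": "2.0"},
        "buffers": [{
            "byteLength": len(blob),
            # Positions base64-embedded; large graphs would use a .bin file.
            "uri": "data:application/octet-stream;base64,"
                   + base64.b64encode(blob).decode("ascii"),
        }],
        "bufferViews": [{"buffer": 0, "byteOffset": 0,
                         "byteLength": len(blob),
                         "target": 34962}],          # ARRAY_BUFFER
        "accessors": [{"bufferView": 0,
                       "componentType": 5126,        # FLOAT
                       "count": len(points), "type": "VEC3",
                       "min": mins, "max": maxs}],
        "meshes": [{"primitives": [{"attributes": {"POSITION": 0},
                                    "mode": 0}]}],   # POINTS
        "nodes": [{"mesh": 0}],
        "scenes": [{"nodes": [0]}],
        "scene": 0,
    }

doc = graph_to_gltf([(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)])
assert doc["accessors"][0]["count"] == 2
print(json.dumps(doc)[:60])
```

Because glTF stores vertex data in GPU-ready binary buffers, a renderer can stream such a file straight into instanced or point-based draw calls without per-node parsing.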
A key area of investigation for the 3DVAL is multi-user/social virtual reality, a powerful tool for communication and collaboration.
The 3DVAL are developing customised pipelines and rendering tools that allow real-time, interactive exploration of dense data sets containing millions of data points.