A new AI-driven immersive visualisation system will help us address the new generation of extreme fires caused by global warming.

An immersive visualisation platform that virtually recreates the experience of being in a wildfire will help artists, designers, firefighters and scientists better understand and communicate the dynamics of these extreme events.

“Wildfires are a whole new generation of fires,” says lead researcher Scientia Professor Dennis Del Favero at UNSW Sydney. “We’re experiencing accelerating levels of global warming which are leading to fires of a scale, speed and violence never before seen in recorded human history.”

Unlike traditional bushfires that move relatively predictably, wildfires are fundamentally unpredictable, the ARC Laureate Fellow says. They can form their own weather systems generating lightning storms that can ignite new fires; this, in addition to their size and speed, makes their behaviour difficult to anticipate, he says.

“Situational awareness is critical in a wildfire... It’s a bit like being in a combat zone. You don't know where the dangers are. They can surround you and be above you,” he says.

“So we’re developing a way of visualising this type of dynamic by using artificial intelligence to drive the visualisation so that the fire behaves unpredictably according to its own logic, not according to our expectations.”

The sector-first platform will give users and researchers a visceral understanding of the dynamics of wildfires, at 1:1 scale and in real time, within a safe virtual environment, he says.

“It uses real-world data to visualise not only what they look like, but also what they feel [and sound] like. Sound is very important … [because] wildfires have a particular acoustic that is entirely unique.”

The platform will provide a tool for two distinct user groups. For fire scientists, firefighters and fire organisations around Australia and internationally, it will facilitate research and training in the dynamics of wildfire scenarios by enabling open-ended decision-making for a more agile and collaborative approach to fire planning, group training and fire management.

For artists, curators and designers, it will enable imaginative exploration of wildfire landscapes using a digital palette with a vast range of atmospheres, flora and topographies, to enhance public engagement with and understanding of these scenarios.

Users will be able to share and explore the environment across multiple locations and platforms, from mobile 360-degree 3D cinemas to more portable 3D projection screens, 3D head-mounted displays, laptops and tablets.

The five-year project, iFire, brings together global experts in fire research, including computer and fire scientists at UNSW such as Prof. Maurice Pagnucco and Prof. Jason Sharples, as well as CSIRO's Data61, the University of Melbourne, San José State University and more than 15 international industry and government partners, including the Australasian Fire and Emergency Service Authorities Council, Fire and Rescue NSW, CAL FIRE, the Pau Costa Foundation and the ARC Centre of Excellence for Climate Extremes.

It is funded by Prof. Del Favero’s ARC Laureate Fellowship and based at the UNSW iCinema Centre for Interactive Cinema Research. The project, like the Centre, is interdisciplinary in approach, working across art, design, computing and science.

It continues Prof. Del Favero’s research as Director of the iCinema Research Centre at UNSW. The philosopher-turned-artist uses artistic simulation to sensorially explore diverse risk-laden scenarios, directly addressing issues like global warming in visceral and compelling ways. 

An AI-driven immersive visualisation suite that recreates the experience of being in a wildfire will transform how we understand, respond to and prepare for the phenomenon. Image: Supplied. 

Arming fire-vulnerable areas 

“In this age of violent pyroconvection, with heat dramatically transforming Earth’s atmosphere, wildfires are becoming endemic,” Prof. Del Favero says. “We saw these [kinds of] fires in Australia in 2019, we saw them in 2021 in California, Canada and in Southern Europe.”

2019 was the hottest year on record in Australia, characterised by drought, heatwaves and devastating bushfires. More than 16 million hectares of land were burnt, thousands of properties were damaged and an estimated 1.5 billion animals died.

In addition to its uses in fire science and the arts, the iFire project will develop a geo-specific software application as part of its resource toolkit. The application will be downloadable in fire-vulnerable areas for use by fire researchers, first responders and the wider community.

“Local councils can apply it to their own geographic precinct to show people how wildfires could move into their community. It would become part of their portfolio of educational tools for fire preparedness,” he says.

The project will also develop a pipeline for sharing and integrating diverse data sets – of fire behaviours, management procedures and protocols, for example – collected by a range of agencies to facilitate research into wildfires, he says.

“It will set benchmarks for how we use this data to effectively visualise these events.”
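To give a concrete sense of what integrating such data sets can involve, the sketch below shows one way differently formatted agency records might be mapped onto a shared schema. It is purely illustrative: the agencies, field names and values are hypothetical placeholders, not the project's actual pipeline or data model.

```python
import pandas as pd

# Illustrative only: the agencies, column names and values below are hypothetical,
# not the iFire project's actual schema or pipeline.

SHARED_COLUMNS = ["timestamp", "latitude", "longitude", "rate_of_spread_kmh", "source"]

def normalise(records: pd.DataFrame, column_map: dict, source: str) -> pd.DataFrame:
    """Rename one agency's columns to the shared schema and tag where the data came from."""
    df = records.rename(columns=column_map)
    df["source"] = source
    return df[SHARED_COLUMNS]

# Two hypothetical agency exports with inconsistent column names.
agency_a = pd.DataFrame({"obs_time": ["2019-12-31T02:00"], "lat": [-36.5],
                         "lon": [149.8], "ros_kmh": [8.2]})
agency_b = pd.DataFrame({"time_utc": ["2019-12-31T02:10"], "latitude": [-36.4],
                         "longitude": [149.9], "spread_rate": [9.1]})

combined = pd.concat([
    normalise(agency_a, {"obs_time": "timestamp", "lat": "latitude",
                         "lon": "longitude", "ros_kmh": "rate_of_spread_kmh"}, "agency_a"),
    normalise(agency_b, {"time_utc": "timestamp", "spread_rate": "rate_of_spread_kmh"}, "agency_b"),
], ignore_index=True)
print(combined)
```

Once records share a common schema like this, they can feed research, simulation and visualisation tools regardless of which agency collected them.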

AI a powerful creative and research partner 

Harnessing artificial intelligence is integral to understanding these data sets.

“AI optimises our ability to experience the dynamics of fire in the landscape,” he says. “It can help us process all this complex data more rapidly and in more insightful ways than what we [as humans] can do.

“And we really need help at the moment as extreme events such as wildfires are existential, beyond our imagination in terms of effect and difficult to model.”

The project will also explore wildfire landscapes through a range of creative applications for film, museums and the contemporary arts.

“[AI-driven immersive visualisations] allow you to imagine whole new creative worlds that you wouldn’t otherwise be able to simply with human cognition alone,” he says.

The iFire platform will also be developed for more specialised industry needs, potentially commercial in nature. For example, Data61 will work with UNSW iCinema to create an immersive experience of its fire application, Spark, which models bushfire spread to help plan for and manage bushfires.
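Spark's own interfaces are not shown here, but the toy sketch below gives a feel for the grid-based fire-spread models that tools of this kind implement: each burning cell has some probability of igniting its neighbours, biased by wind. Every number in it, from the grid size to the spread probability, is a placeholder for illustration rather than Data61's actual model.

```python
import numpy as np

# Illustrative only: a toy stochastic cellular-automaton fire-spread model.
# This is NOT the Spark API; the grid, probabilities and wind bias are
# hypothetical values chosen for demonstration.

UNBURNT, BURNING, BURNT = 0, 1, 2

def step(grid, p_spread=0.35, wind=(0, 1), rng=np.random.default_rng()):
    """Advance the fire front by one time step on a 2D landscape grid."""
    new = grid.copy()
    for r, c in np.argwhere(grid == BURNING):
        # Try to ignite the four neighbours; bias ignition along the wind vector.
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            rr, cc = r + dr, c + dc
            if 0 <= rr < grid.shape[0] and 0 <= cc < grid.shape[1] and grid[rr, cc] == UNBURNT:
                p = p_spread * (1.5 if (dr, dc) == wind else 1.0)
                if rng.random() < p:
                    new[rr, cc] = BURNING
        new[r, c] = BURNT  # a burning cell burns out after one step
    return new

# Usage: ignite the centre of a 100 x 100 landscape and run 50 steps.
grid = np.zeros((100, 100), dtype=int)
grid[50, 50] = BURNING
for _ in range(50):
    grid = step(grid)
print("cells burnt:", int((grid == BURNT).sum()))
```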

Research into virtual environments can help us better prepare for unpredictable scenarios in the real world. Image: Supplied. 

Visual technologies beneficial across disciplines

These kinds of advanced art and technology frameworks are applicable to a diverse set of needs, Prof. Del Favero says.

The UNSW iCinema Centre's research spans interactive art scenarios, intelligent database systems, immersive design modelling and extreme event simulation. Previous projects have contributed to contemporary art, cultural heritage, defence memorialisation, digital museology and mining simulation.

The iCASTS project, for example, delivered a suite of virtual reality simulations for China’s leading research and training institute for mine safety, the Shenyang Research Institute of China Coal Technology & Engineering Group.

The project, later commercialised, created highly realistic simulations of an underground mine that allowed up to 30 trainees to simultaneously interact with hazard and technology scenarios. The immersive modules provided a highly effective alternative to training via lengthy manuals, training more than 30,000 miners and reducing fatalities and serious injury in the mining industries in China and Australia.

Artistic technologies that provide life-like experiences can help us better understand and address the unpredictable and turbulent scenarios that characterise the terrestrial changes we are experiencing, he says.

“I’m very interested in creating virtual worlds to enhance the way we engage with the physical world around us,” Prof. Del Favero says.

“Creating simulated worlds is a way of collaborating with an artificially intelligent twin to form a new type of partnership that integrates the speed and scale of AI in establishing patterns and predicting behaviours with the subtlety and adaptability of human situational understanding and decision making.”

Lead image: The platform will develop greater lateral and collaborative thinking in users working in firefighting, group training and fire planning. Image: Supplied. 

This article was originally published in 2022.

Written by Kay Harrison