iFire
AI fire visualisation system
The iFire program connects globally located researchers and 3D immersive systems in the world’s first AI environment able to visualise interaction with unpredictable extreme fire scenarios such as those of the Australian Black Summer of 2019/2020 and Los Angeles in 2025. The 3D systems are networked across a range of platforms (Fig. 1) using software that enables users to interact with the fire ground by sharing the same 3D setting in real time, whatever their smart screen platform. The platforms range from mobile 3D cinemas, 3D virtual production volumes, 3D LED walls and 3D head-mounted displays to 2D laptops and tablets, providing interaction for multiple distributed users at any one time. The environment is underpinned by an AI framework that analyses, learns from and responds to individual and group behaviour in real time. It provides specific applications for four distinct types of end users: researchers, creatives, responders and residents. Collaborators include UNSW iCinema Research Centre, UNSW Climate Change Research Centre, University of Melbourne Virtual Production Lab, the ABC, AFAC (Australasian Fire & Emergency Service Authorities Council), CSIRO/Data61, Düsseldorf/Cologne Open, Fire and Rescue NSW and Hawkesbury City Council.
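The shared-state model described above, in which every platform renders the same 3D setting at the same moment, can be sketched as follows. This is a minimal illustration only: the class names, fields and fan-out scheme are hypothetical assumptions, not iFire's actual software.

```python
import json

# Hypothetical sketch: one authoritative scene state broadcast to
# heterogeneous display clients. Names are illustrative, not iFire's API.

class SceneState:
    """Authoritative fire-ground state shared by all connected platforms."""
    def __init__(self):
        self.tick = 0
        self.fire_front = []   # list of (x, y) ignition points

    def update(self, new_points):
        self.tick += 1
        self.fire_front.extend(new_points)

    def snapshot(self):
        # Serialise once; every client, from LED wall to tablet,
        # renders the same state at the same tick.
        return json.dumps({"tick": self.tick, "fire_front": self.fire_front})

class Client:
    def __init__(self, platform):
        self.platform = platform   # e.g. "hmd", "led_wall", "tablet"
        self.last_seen = None

    def receive(self, payload):
        self.last_seen = json.loads(payload)

# One update fanned out to all clients keeps their views consistent.
state = SceneState()
clients = [Client("hmd"), Client("led_wall"), Client("tablet")]
state.update([(10, 12), (11, 12)])
payload = state.snapshot()
for c in clients:
    c.receive(payload)
```

The design point is that platform diversity is handled purely on the rendering side: all clients consume the same serialised state, so a head-mounted display and a 2D tablet stay in lockstep.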
iFire translates mathematical fire simulations of actual incidents into immersive cinematic scenarios using Unreal Engine, the film industry's standard real-time creation tool. This core artistic and scientific application is developed for use by creatives, responders and residents.
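This translation step can be illustrated with a minimal sketch: an arrival-time grid (one common output of fire-spread simulators) is mapped to per-cell flame-intensity values that a real-time engine could use to drive, say, a particle emitter or material parameter. The function names, the burn-duration model and the numbers are illustrative assumptions, not the iFire pipeline.

```python
# Hypothetical sketch of turning fire-spread simulation output into
# per-frame visual parameters. The arrival-time grid stands in for a
# physics-based simulator's output; the intensity mapping is illustrative.

def flame_intensity(arrival, now, burn_duration=3.0):
    """0 before ignition, peaks at ignition, decays as fuel burns out."""
    if now < arrival:
        return 0.0
    age = now - arrival
    if age >= burn_duration:
        return 0.0
    return 1.0 - age / burn_duration

def frame(arrival_grid, now):
    # One intensity value per cell, ready to drive a renderer.
    return [[round(flame_intensity(a, now), 2) for a in row]
            for row in arrival_grid]

# Arrival times (seconds) for a 2x3 cell grid: fire enters at the left.
arrival = [[0.0, 1.5, 3.0],
           [0.5, 2.0, 3.5]]
print(frame(arrival, now=2.0))
# → [[0.33, 0.83, 0.0], [0.5, 1.0, 0.0]]
```

Evaluating the same grid at successive values of `now` yields the sequence of frames an engine would render, which is what lets a precomputed simulation be replayed as a cinematic scenario.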
This approach is currently being explored through three case studies: an Australian pine plantation fire (Fig. 2, 3), an Australian grassland fire (Fig. 4, 5) and a US mountain forest fire (Fig. 6, 7).
iFire: Pine Plantation Case Study. 2025
The case studies are translated into a range of applications for emergency broadcast for smart phones (Fig. 8), responder training for cinematic theatres (Fig. 9) and creative visualisations for immersive exhibitions (Fig. 10).
These applications are focused on providing diverse audiences and stakeholders the immersive experience and practical understanding of unpredictable extreme fires that are now becoming increasingly intense and frequent. Depicting this new landscape demands the modelling of multiple wildfire kinetic and spatial processes which cannot be understood by human cognition alone. This requires the integration of the speed and scale of AI in establishing patterns and predicting fire behaviours with the subtlety and adaptability of human perception. This involves an intelligent aesthetic that evolves and grows by learning from human behaviour.
The program allows researchers and stakeholders to interact with unanticipated fire scenarios that operate independently of user expectations. By generating unforeseen behaviours, the program enables users to better understand and master the distributed dynamics of fire scenarios in a safe virtual environment. Assembling histories and expertise from diverse backgrounds, it integrates them into an intelligent database with a library of fire behaviours, management procedures and protocols. By offering dynamic life-like encounters where users can rehearse their response, it enhances resident, responder, creative and researcher risk perception, situational awareness and collaborative decision making.
The program assembles a repertoire of expertise ranging across AI, computer graphics, creative arts, database architecture, interaction design, fire management and immersive visualisation. The end result is a visualisation ecosystem that can be utilised by research labs, creative enterprises, emergency services and resident organisations in situ. The program is based on the award-winning iCASTS safety training simulation system, commercialised for the Australian and Chinese mining industry, which has trained over 30,000 personnel across six mine locations and reduced injuries by 67%, with no fatalities.
The iFire program is financially supported under the Australian Research Council’s Laureate funding scheme.
ARC Project Director: ARC Laureate Fellow Professor Dennis Del Favero
ARC Project Collaborators and Partners: see the Project Collaborators and Partners tab
ARC Project Title: Burning landscapes: reimagining unpredictable scenarios
Project Funding: ARC FL200100004
2021-2025
| Position | Name |
|---|---|
| Executive Director | ARC Laureate Fellow Dennis Del Favero |
| Co-Director | Prof. Michael J. Ostwald |
| Co-Director | ARC Future Fellow A/Prof. Yang Song |
| Office of National Intelligence Postdoctoral Fellow | Dr Baylee Brits |
| ARC Laureate Postdoctoral Fellow | Dr Susanne Thurow (Associate Director) |
| ARC Laureate Postdoctoral Fellow | Renhao Huang (Associate Director) |
| ARC Laureate Senior Programmer | Navin Brohier |
| ARC Laureate Programmer | Nora Perry |
| ARC Laureate Programmer | Dylan Shorten |
| ARC Laureate 3D Modeller | Scott Cotterell |
| ARC Laureate PhD | Mario Flores Gonzalez |
| ARC Laureate PhD | Frank Wu |
| CSIRO PhD | Nagida Helsby-Clark |
| ARC Laureate MA | Lara Clemente |
Australian Industry Advisory Committee
European Industry Advisory Committee
Research Committee