Researchers
Dr Kate Saunders, Monash University
At a glance
This case study reviews gaps in the real-time warning communication made available to the public during the 2022 floods and makes several data-driven recommendations to enhance future warnings.
Key search words
Universities, cyclone, failure, disruption of essential services, disruption of infrastructure, fire, flood, storm, storm tide, storm surge, collaboration, coordination, community engagement, continuous engagement, continuous improvement, governance, managing risk, planning, plans, resilience.
Introduction
The effectiveness of natural hazard warnings relies on transforming the available data into actionable knowledge for the public. However, gaps exist between established data science best practices and how data is being used to support natural hazard warnings and their communication. At present, retrospective evaluation of warning effectiveness and hazard response is often limited, with empirical evaluation of warning systems and their effect on human behaviour lagging (Saunders et al., 2025).
During flooding in Queensland in 2022, the public faced a deluge of digital warning information. People accessed multiple websites to piece together the information relevant to them (Saunders et al., 2025). This data synthesis happened ad hoc and across varying levels of digital literacy.
More research is needed to understand how data underpins warnings and their communication, and whether the current use of data to support warnings is effective. Importantly, this includes assessing whether the data and its visualisation are adequately supporting the public to make timely and well-informed decisions.
To address these gaps, an interdisciplinary perspective paper was written: "Data-driven recommendations for enhancing real-time natural hazard warnings" (Saunders et al., 2025).
Data-driven recommendations
Remove existing data barriers
- Data may exist, but it may not be in a usable form.
e.g. Many datasets cannot be easily used because the data is not machine readable or is not stored appropriately.
Recommendation: Improve the quality and interoperability of this data for warnings by implementing five-star open data standards and applying FAIR (Findable, Accessible, Interoperable, Reusable) data principles.
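As an illustrative sketch of this recommendation (the gauge readings, file names, and metadata fields below are hypothetical, not from the case study), publishing a dataset as plain CSV alongside a small machine-readable metadata record covers the "machine readable" requirement and the Findable/Reusable aspects of FAIR:

```python
import csv
import json

# Hypothetical river-gauge readings, previously locked inside a PDF report.
readings = [
    {"gauge_id": "540200", "timestamp": "2022-02-27T09:00:00+10:00", "level_m": 3.42},
    {"gauge_id": "540200", "timestamp": "2022-02-27T10:00:00+10:00", "level_m": 3.71},
]

# Three-star open data: a non-proprietary, machine-readable format (CSV).
with open("gauge_levels.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["gauge_id", "timestamp", "level_m"])
    writer.writeheader()
    writer.writerows(readings)

# FAIR metadata: an explicit identifier, licence, and schema make the data
# Findable and Reusable by other agencies' systems, not just by humans.
metadata = {
    "identifier": "example.org/datasets/gauge-levels-2022",  # hypothetical
    "title": "River gauge levels, February 2022 flood event",
    "licence": "CC-BY-4.0",
    "schema": {"gauge_id": "string", "timestamp": "ISO 8601", "level_m": "metres"},
}
with open("gauge_levels.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

The point of the metadata file is that a partner agency's software can discover the licence and column meanings without a person reading a report.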
- Data silos prevent seamless information sharing across organisations.
e.g. Border communities should not need to access warning information from different state-based apps, a problem observed again during Tropical Cyclone Alfred in 2025. Warning information should be shared seamlessly across platforms, such as the Queensland Hazard dashboard and NSW Hazard Watch.
Recommendation: Minimise the institutional, legal, and logistical barriers that contribute to data siloing by establishing formal data sharing agreements, including provisions for sharing data, code, and models. Ensure these data pipelines are functional. To ease this process, make the data open where possible.
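A hedged sketch of what "ensure these data pipelines are functional" can mean in practice (the feed format and field names are hypothetical): an automated check that a partner agency's warning feed still matches the agreed schema before it is ingested, so a silent contract break is caught rather than propagated.

```python
# Minimal schema check for a shared warning-data feed. In a real pipeline this
# would run on every ingest and alert both agencies when the contract is broken.
REQUIRED_COLUMNS = {"warning_id", "hazard_type", "issued_at", "area_geometry"}

def validate_feed(records: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the feed is usable."""
    problems = []
    for i, record in enumerate(records):
        missing = REQUIRED_COLUMNS - record.keys()
        if missing:
            problems.append(f"record {i}: missing fields {sorted(missing)}")
    return problems

# One conforming record, and one that silently dropped its geometry.
feed = [
    {"warning_id": "QLD-123", "hazard_type": "flood",
     "issued_at": "2022-02-27T09:00:00+10:00", "area_geometry": "POLYGON(...)"},
    {"warning_id": "NSW-456", "hazard_type": "flood",
     "issued_at": "2022-02-27T09:05:00+11:00"},
]
print(validate_feed(feed))  # flags the second record
```

A shared, versioned schema like this is the technical counterpart of the formal data sharing agreement: it makes the agreement testable.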
- Critical data disappears during and after events.
e.g. Screenshots of warning information had to be taken so that the information could be independently reviewed in hindsight. Any data on public warnings should be archived appropriately for later reference.
Recommendation: Identify where data is being lost and use gold standards in data curation and reproducible research to better preserve this data for effective post-event evaluation.

Adopt data visualisation best practices
- The visualisations provided to the public did not support their decision making.
e.g. Static maps were common, yet people increasingly expect interactive tools to support decision making.
Recommendation: Data visualisation best practices should always be used, including appropriate visual hierarchy and graphical principles. Interactive features, such as searching, zooming, pop-ups, and information layering, should be leveraged to position warnings in the user's personal context and reduce cognitive load. Upskilling and education in visualisation are also important for public institutions sharing warning information, given the highly variable expertise across the institutions that issue warnings.

Use novel data sources to improve warnings
- Traditional data sources are not enough to understand on-the-ground conditions.
e.g. Road closures in Google Maps were out of date during the 2022 Brisbane floods. In contrast, Waze, a crowd-sourced platform, provided more reliable information about road closures.
Recommendation: Identify opportunities where non-traditional data sources could fill knowledge gaps. Focus particularly on locations with limited official hazard and impact warning support. Leverage published data science methods for integrating these low-cost data sources, including satellite remote sensing, drones, social media analyses, web scraping, and crowd-sourced data.
- Despite ad hoc information sharing being a vital part of warning communications, such as on Facebook and X (formerly Twitter), there are no formal mechanisms for using crowd-sourced data in official warnings.
e.g. Healthy Land and Water showed how data can be effectively collected from the public by setting up an online platform to gather photos of flooding in real time, so that ecological impacts could be recorded.
Recommendation: Invest in developing mechanisms to strategically use novel data sources, such as crowd-sourced data, to support warning communication. This will need to include developing methods to quality control the data.

Embrace uncertainty
- Uncertainty is not currently well communicated or visualised.
e.g. There is uncertainty in predicted flood extents and in the 1% AEP level (formerly the 1-in-100-year return level).
Recommendation: Best-practice visualisation principles apply here too. Uncertainty visualisation also needs to be tailored to specific audiences: the visualisation needs of agencies coordinating the response may differ from those of the public.
- Often only one scenario or single-trajectory forecast is communicated. How uncertainty is propagated along the warning value chain (Hoffmann et al., 2023) needs to be carefully considered and factored into emergency planning and warning communication.
e.g. Uncertainty can support decision making if communicated well, giving communities vital additional time to prepare to act. This is already demonstrated by weather forecasts that communicate uncertainty.
Recommendation: Avoid relying on single-trajectory forecasts or summary statistics when an ensemble forecast is available. Where possible, propagate uncertainty through the warning value chain to assess the full range of possible outcomes (weather forecasts → hazard forecasts → impact forecasts → warnings).
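A minimal sketch of the difference this makes, using made-up ensemble numbers (the river levels and flood threshold below are illustrative, not from the case study): the single-trajectory summary sits below a minor-flood threshold, while the ensemble reveals a substantial probability of exceeding it.

```python
# Hypothetical 10-member ensemble forecast of peak river level (metres).
ensemble_peaks = [2.1, 2.4, 2.6, 2.8, 3.0, 3.1, 3.3, 3.6, 3.9, 4.2]
flood_threshold = 3.2  # hypothetical minor-flood level (metres)

# Single-trajectory summary: the ensemble mean hides the upper tail.
mean_peak = sum(ensemble_peaks) / len(ensemble_peaks)

# Propagated uncertainty: the fraction of members exceeding the threshold.
p_exceed = sum(p > flood_threshold for p in ensemble_peaks) / len(ensemble_peaks)

print(f"mean peak: {mean_peak:.1f} m")         # below the threshold
print(f"P(exceed threshold): {p_exceed:.0%}")  # yet 4 of 10 members exceed it
```

Communicating only the mean here would suggest no flood; the exceedance probability is the actionable quantity that gives communities time to prepare.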
Conclusion
The research reveals opportunities for using data science best practices to improve the effectiveness of natural hazard warnings. The work also demonstrates how the warning value chain can be used to understand where and how data is used, and how data interconnects to produce effective warnings.
Overall recommendations and policy implications
Post-disaster evaluation should include assessment from a data-driven perspective. This should include:
- assessment of how the data was used
- assessment of whether the data was fit for purpose
- a review of whether data was shared effectively between organisations
- identification of any missing or incomplete data.
Effectiveness of communicated hazard and impact warnings should also be reviewed in the context of data visualisation best practices, and whether that data and visualisation adequately supported decision-making.
There is also a clear need for cross-jurisdictional sharing of data. This requires:
- establishment of formalised data sharing agreements, particularly for border regions
- revisiting existing government frameworks for data curation, sharing and updating these for modern needs.
The research highlights significant potential for ongoing collaboration between data science and natural hazards communities. Future work should focus on operationalising the above recommendations through partnerships that combine technical expertise with deep understanding of community needs and emergency management constraints.
Acknowledgements
The ideas of this case study were first identified during a hackathon run in rapid response to the 2022 Brisbane floods. This hackathon aimed to capture the ephemeral nature of data during a natural disaster event and characterise and review the real-time response. Affectionately, the volunteers who attended the hackathon formed a ‘digital’ mud army, using their technical experience to its greatest societal benefit. This case study is therefore motivated by the authors’ own experiences during the Brisbane flooding and provides the unique dual perspective of both experienced data practitioner and firsthand witness.
References
Saunders, K. R., Forbes, O., Hopf, J. K., Patterson, C. R., Vollert, S. A., Brown, K., ... & Helmstedt, K. J. (2025). Data-driven recommendations for enhancing real-time natural hazard warnings. One Earth, 8(5).
Hoffmann, D., Ebert, E. E., Mooney, C., Golding, B., & Potter, S. (2023). Using value chain approaches to evaluate the end-to-end warning chain. Advances in Science and Research, 20, 73-79.


