Evaluating the Performance of Agreement Metrics in a Delphi Study on Chemical, Biological, Radiological and Nuclear Major Incidents Preparedness Using Classical and Machine Learning Approaches
Delphi studies in disaster medicine lack consensus on which metrics best capture expert agreement. This study compared candidate agreement metrics using a Delphi study on chemical, biological, radiological, and nuclear (CBRN) preparedness in the Middle East and North Africa (MENA) region. Forty international disaster medicine experts evaluated 133 items across the ten themes of the CBRN Preparedness Assessment Tool (CBRN PAT) on a 5-point Likert scale. Agreement was measured using Kendall's W, the intraclass correlation coefficient (ICC), and Cohen's kappa, and the metrics' performance was compared using both classical statistical and machine learning techniques. The overall mean agreement score was 4.91 ± 0.71, with 89.21% average agreement. Kappa emerged as the most sensitive metric in both the statistical and the machine learning analyses, with a feature importance score of 168.32, and it varied across CBRN PAT themes such as medical protocols, logistics, and infrastructure. The integrated statistical and machine learning approach offers a promising method for understanding expert consensus in disaster preparedness, with potential for future refinement by incorporating additional contextual factors.
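To make the three agreement metrics named in the abstract concrete, here is a minimal Python sketch (illustrative only, not the authors' code): it computes Kendall's W and a pairwise Cohen's kappa on a hypothetical raters × items matrix of 5-point Likert scores. The `ratings` matrix and the `kendalls_w` helper are assumptions for illustration, and the Kendall's W formula shown omits the tie correction that real Likert data would require.

```python
# Illustrative sketch, not the study's code: agreement metrics on a
# hypothetical raters x items matrix of 5-point Likert ratings.
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
# 40 experts x 133 items, mirroring the study design; values are random here.
ratings = rng.integers(1, 6, size=(40, 133))

def kendalls_w(x: np.ndarray) -> float:
    """Basic Kendall's W = 12*S / (m^2 * (n^3 - n)) for m raters, n items.

    Note: omits the tie correction, which matters for tied Likert ratings.
    """
    m, n = x.shape
    ranks = np.apply_along_axis(rankdata, 1, x)      # rank items within each rater
    rank_sums = ranks.sum(axis=0)                    # total rank per item
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()  # squared deviations from the mean
    return 12 * s / (m ** 2 * (n ** 3 - n))

print(f"Kendall's W: {kendalls_w(ratings):.3f}")
# Cohen's kappa is defined for pairs of raters, e.g. the first two experts:
print(f"Cohen's kappa (expert 0 vs 1): {cohen_kappa_score(ratings[0], ratings[1]):.3f}")
# The ICC could be obtained analogously, e.g. with pingouin.intraclass_corr on a
# long-format DataFrame (omitted here to keep the sketch dependency-light).
```

In a design like the study's, per-theme values of such metrics could then serve as features in a machine learning model, whose feature importances would indicate which metric is most informative, as the abstract reports for kappa.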
Item Type | Article
---|---
Additional information | © 2025 The Author(s). Journal of Contingencies and Crisis Management published by John Wiley & Sons Ltd. This is an open access article distributed under the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/
Keywords | disaster medicine, expert's opinion, agreement analysis, mena, delphi study, management information systems, management, monitoring, policy and law
Date Deposited | 10 Jun 2025 14:50
Last Modified | 10 Jun 2025 14:50