Assessing Usefulness of the Dashboard Instrument to Review Equity (DIRE) Checklist to Evaluate Equity in Public Health Dashboards: Reliability Study
Sosa P; Syailendra EA; Lehmann HP; Kharrazi H
JMIR Public Health Surveill 2025 Dec;11(?):e71094. PMID: 41343855
BACKGROUND: The COVID-19 pandemic was a critical time for public health, and although dashboards remained a source of critical health information for decision-makers, key gaps in equity-based decision support were revealed. The DIRE (Dashboard Instrument to Review Equity) Framework and Checklist was developed as a practical instrument for public health departments to use in evaluating equity-based decision support mechanisms in their dashboards.

OBJECTIVE: The objective of this agreement and reliability study was to validate the DIRE Checklist as a practical and reliable instrument for data practitioners to use in evaluating dashboards.

METHODS: The study was divided into 5 steps to conduct the necessary agreement and reliability analyses. In Step 1, development of the DIRE Checklist was completed in Qualtrics (Qualtrics International Inc). Step 2 defined the parameters for selecting the 26 US state-based dashboards. Step 3 was the user testing and assessment process, during which each reviewer applied the DIRE Checklist to each dashboard. In Step 4, multiple assessment methods were applied to calculate the comparative analysis, interrater agreement, intraclass correlation coefficients, and cosine similarity for the Qualtrics, reviewer, and categorical scores. Finally, Step 5 comprised qualitative assessment of the reviewers' notes.

RESULTS: A total of 26 dashboards were evaluated with the DIRE Checklist by 2 reviewers. The overall percentage comparison for the Qualtrics score was 31.7% (28.24/89) for Reviewer 1 and 41.8% (37.16/89) for Reviewer 2, resulting in a relative percent agreement of 72.7%. Additionally, the categorical scores showed substantial to high agreement across most categories based on percent agreement within each category. The intraclass correlation coefficient scores indicated varying levels of agreement across categories, with good agreement observed for the Qualtrics score.

CONCLUSIONS: The reliability and agreement results of the study confirmed strong performance of the DIRE Checklist. Both raters evaluated the calculated scores consistently and reliably, demonstrating the DIRE Checklist's ability to robustly evaluate different dashboards across a range of categories and parameters.
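The abstract does not give the exact formulas used for the interrater statistics. As an illustration only, the sketch below shows one plausible formulation of two of the reported measures, relative percent agreement between two reviewers' scores and cosine similarity between their score vectors, using made-up scores (`reviewer1`, `reviewer2`) that are not the study's data and a `relative_percent_agreement` definition that is an assumption, not necessarily the paper's.

```python
import math

# Hypothetical per-dashboard scores for two reviewers
# (illustrative values only; NOT the study's data).
reviewer1 = [3.0, 2.5, 4.0, 1.5, 3.5]
reviewer2 = [3.5, 3.0, 4.5, 2.0, 4.0]

def relative_percent_agreement(a, b):
    """One plausible definition: ratio of the smaller mean score
    to the larger, expressed as a percentage (100% = identical means)."""
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    return 100.0 * min(mean_a, mean_b) / max(mean_a, mean_b)

def cosine_similarity(a, b):
    """Cosine of the angle between the two score vectors
    (1.0 = proportionally identical rating patterns)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(f"relative agreement: {relative_percent_agreement(reviewer1, reviewer2):.1f}%")
print(f"cosine similarity:  {cosine_similarity(reviewer1, reviewer2):.3f}")
```

Cosine similarity captures whether the two reviewers rank dashboards similarly even when one rates systematically higher, which complements mean-level agreement; the intraclass correlation coefficients reported in the study would additionally account for absolute score differences.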