TY - JOUR
T1 - Improving provider adoption with adaptive clinical decision support surveillance: An observational study
AU - Khan, S.
AU - Richardson, S.
AU - Liu, A.
AU - Mechery, V.
AU - McCullagh, L.
AU - Schachter, A.
AU - Pardo, S.
AU - McGinn, T.
PY - 2019/1/1
Y1 - 2019/1/1
N2 - Background: Successful clinical decision support (CDS) tools can help apply evidence-based medicine to effectively improve patient outcomes. However, the impact of these tools has been limited by low provider adoption due to overtriggering, leading to alert fatigue. We developed a tracking mechanism for monitoring the trigger rate (percentage of total visits for which the tool triggers) and the adoption rate (percentage of completed tools) of a complex CDS tool based on the Wells criteria for pulmonary embolism (PE). Objective: We aimed to monitor and evaluate the adoption and trigger rates of the tool and to assess whether ongoing tool modifications would improve adoption rates. Methods: As part of a larger clinical trial, a CDS tool was developed using the Wells criteria to calculate pretest probability for PE in the emergency departments (EDs) of 2 tertiary centers. The tool had multiple triggers: any order for D-dimer, computed tomography (CT) of the chest with intravenous contrast, CT pulmonary angiography (CTPA), ventilation-perfusion scan, or lower extremity Doppler ultrasound. A tracking dashboard was developed using Tableau to monitor real-time trigger and adoption rates. Based on initially low provider adoption rates, we conducted small focus groups with key ED providers to elicit barriers to tool use. We identified overtriggering of the tool for non-PE-related evaluations and an inability to order CT testing for intermediate-risk patients. The tool was therefore modified to allow CT testing for the intermediate-risk group and to stop triggering on CT chest with intravenous contrast orders. A dialogue box, “Are you considering PE for this patient?”, was added before the tool triggered to account for CTPAs ordered for aortic dissection evaluation. Results: In the ED of tertiary center 1, 95,295 patients visited during the academic year. The tool triggered for an average of 509 patients per month (average trigger rate 2036/30,234, 6.73%) before the modifications, decreasing to 423 patients per month (average trigger rate 1629/31,361, 5.22%) afterward. In the ED of tertiary center 2, 88,956 patients visited during the academic year, with the tool triggering for about 473 patients per month (average trigger rate 1892/29,706, 6.37%) before the modifications and about 400 per month (average trigger rate 1534/30,006, 5.12%) afterward. The modifications resulted in significant 4.5-fold and 3-fold increases in provider adoption rates in tertiary centers 1 and 2, respectively, raising the average monthly adoption rate from 23.20/360 (6.5%) tools to 81.60/280.20 (29.3%) tools in center 1 and from 46.60/318.80 (14.7%) tools to 111.20/263.40 (42.6%) tools in center 2. Conclusions: Close postimplementation monitoring of CDS tools may help improve provider adoption. Adaptive modifications based on user feedback may make CDS more targeted, lowering trigger rates, reducing alert fatigue, and increasing provider adoption. Iterative improvements combined with a postimplementation monitoring dashboard can significantly improve adoption rates.
UR - https://academicworks.medicine.hofstra.edu/publications/5058
U2 - 10.2196/10245
DO - 10.2196/10245
M3 - Article
SN - 2292-9495
VL - 6
JO - JMIR Hum Factors
JF - JMIR Hum Factors
ER -