Lessons Learned from Implementing Service-Oriented Clinical Decision Support at Four Sites: A Qualitative Study
Objective
To identify challenges, lessons learned, and best practices for service-oriented clinical decision support, based on the results of the Clinical Decision Support Consortium, a multi-site study that developed, implemented, and evaluated clinical decision support services in a diverse range of electronic health records.
Methods
Ethnographic investigation using the rapid assessment process, a procedure for agile qualitative data collection and analysis, including clinical observation, system demonstrations and analysis, and 91 interviews.
Results
We identified challenges and lessons learned in eight dimensions: (1) hardware and software computing infrastructure, (2) clinical content, (3) human-computer interface, (4) people, (5) workflow and communication, (6) internal organizational policies, procedures, environment and culture, (7) external rules, regulations, and pressures and (8) system measurement and monitoring. Key challenges included performance issues (particularly related to data retrieval), differences in terminologies used across sites, workflow variability and the need for a legal framework.
Discussion
Based on the challenges and lessons learned, we identified eight best practices for developers and implementers of service-oriented clinical decision support: (1) optimize performance, or make asynchronous calls, (2) be liberal in what you accept (particularly for terminology), (3) foster clinical transparency, (4) develop a legal framework, (5) support a flexible front-end, (6) dedicate human resources, (7) support peer-to-peer communication, (8) improve standards.
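Best practice (2), being liberal in what you accept for terminology, can be sketched concretely: a service that normalizes site-local codes to a canonical concept before rule evaluation tolerates the cross-site terminology differences described above. The mapping table, codes, and rule below are illustrative assumptions, not the Consortium's actual implementation.

```python
# Hypothetical mapping from (site, local drug code) to a canonical concept.
# All site names and codes here are made up for illustration.
LOCAL_TO_CANONICAL = {
    ("site_a", "DRUG-00123"): "amiodarone",
    ("site_b", "AMIO_HCL"): "amiodarone",
    ("site_c", "RX-7745"): "amiodarone",
}

def normalize(site, local_code):
    # Be liberal in what you accept: tolerate case and whitespace
    # variation in incoming codes before looking them up.
    key = (site.strip().lower(), local_code.strip().upper())
    return LOCAL_TO_CANONICAL.get(key)

def needs_thyroid_monitoring(site, med_codes):
    # Fire a (hypothetical) thyroid-monitoring reminder if any medication
    # normalizes to amiodarone, regardless of the site-local code used.
    return any(normalize(site, code) == "amiodarone" for code in med_codes)
```

Because rules test the canonical concept rather than raw site codes, adding a new site only requires extending the mapping, not editing every rule.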
Conclusion
The Clinical Decision Support Consortium successfully developed a clinical decision support service and implemented it in four different electronic health records at four diverse clinical sites; however, the process was arduous. The lessons identified by the Consortium may be useful for other developers and implementers of clinical decision support services.
Analysis of clinical decision support system malfunctions: a case series and survey
Objective: To illustrate ways in which clinical decision support systems (CDSSs) malfunction and to identify patterns of such malfunctions. Materials and Methods: We identified and investigated several CDSS malfunctions at Brigham and Women’s Hospital and present them as a case series. We also conducted a preliminary survey of Chief Medical Information Officers to assess the frequency of such malfunctions. Results: We identified four CDSS malfunctions at Brigham and Women’s Hospital: (1) an alert for monitoring thyroid function in patients receiving amiodarone stopped working when an internal identifier for amiodarone was changed in another system; (2) an alert for lead screening in children stopped working when the rule was inadvertently edited; (3) an upgrade of the electronic health record software caused numerous spurious alerts to fire; and (4) a malfunction in an external drug classification system caused an alert to inappropriately suggest antiplatelet drugs, such as aspirin, for patients already taking one. We found that 93% of the Chief Medical Information Officers who responded to our survey had experienced at least one CDSS malfunction, and two-thirds experienced malfunctions at least annually. Discussion: CDSS malfunctions are widespread and often persist for long periods. The failure of alerts to fire is particularly difficult to detect. A range of causes commonly contributes to CDSS malfunctions, including changes in codes and fields, software upgrades, inadvertent disabling or editing of rules, and malfunctions of external systems; current approaches for preventing and detecting such malfunctions are inadequate. Conclusion: CDSS malfunctions occur commonly and often go undetected. Better methods are needed to prevent and detect them.
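The "alert silently stopped firing" failure mode described above is hard to spot precisely because nothing visible happens. One simple monitoring approach (a sketch of a plausible safeguard, not the authors' method) is to compare each rule's recent firing count against its historical daily baseline and flag sharp drops:

```python
# Flag alert rules whose firing rate has collapsed relative to baseline.
# Rule IDs and the drop_ratio threshold are illustrative assumptions.
from statistics import mean

def flag_silent_alerts(history, today, drop_ratio=0.2):
    """history: {rule_id: [historical daily firing counts]}
    today: {rule_id: today's firing count}
    Returns rule IDs firing below drop_ratio * historical mean."""
    flagged = []
    for rule_id, counts in history.items():
        baseline = mean(counts)
        if baseline > 0 and today.get(rule_id, 0) < drop_ratio * baseline:
            flagged.append(rule_id)
    return flagged

alerts = flag_silent_alerts(
    {"thyroid_amiodarone": [40, 38, 42, 41], "lead_screening": [10, 12, 11, 9]},
    {"thyroid_amiodarone": 0, "lead_screening": 11},
)
# "thyroid_amiodarone" is flagged: it fired ~40 times a day, then went silent.
```

A real deployment would need to account for weekday/weekend seasonality and rules that legitimately fire rarely, but even this crude check would have surfaced the amiodarone-identifier malfunction described in the case series.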
Problem list completeness in electronic health records: A multi-site study and assessment of success factors
OBJECTIVE: To assess problem list completeness using an objective measure across a range of sites, and to identify success factors for problem list completeness. METHODS: We conducted a retrospective analysis of electronic health record data and interviews at ten healthcare organizations in the United States, the United Kingdom, and Argentina that use a variety of electronic health record systems: four self-developed and six commercial. At each site, we assessed the proportion of patients with diabetes recorded on their problem list among all patients with a hemoglobin A1c elevation >= 7.0%, which is diagnostic of diabetes. We then interviewed informatics leaders at the four highest-performing sites to determine factors associated with success. Finally, we surveyed all the sites about practices common at the top-performing sites to determine whether there was an association between problem list management practices and problem list completeness. RESULTS: Problem list completeness across the ten sites ranged from 60.2% to 99.4%, with a mean of 78.2%. Financial incentives, problem-oriented charting, gap reporting, shared responsibility, links to billing codes, and organizational culture were identified as success factors at the four hospitals with problem list completeness at or near 90.0%. DISCUSSION: Incomplete problem lists represent a global data integrity problem that could compromise quality of care and put patients at risk. There was a wide range of problem list completeness across the healthcare facilities. Nevertheless, some facilities have achieved high levels of problem list completeness, and it is important to better understand the factors that contribute to this success in order to improve patient safety. CONCLUSION: Problem list completeness varies substantially across healthcare facilities.
In our review of EHR systems at ten healthcare facilities, we identified six success factors that may be useful for healthcare organizations seeking to improve the quality of their problem list documentation: financial incentives, problem-oriented charting, gap reporting, shared responsibility, links to billing codes, and organizational culture.
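The completeness measure used above reduces to a simple proportion: of all patients whose hemoglobin A1c meets the diabetes threshold, how many have diabetes on their problem list? A minimal sketch, assuming a hypothetical per-patient record shape (the field names are illustrative, not the study's data model):

```python
# Compute problem list completeness for diabetes, per the measure described:
# documented cases / all patients with HbA1c >= 7.0%.
def problem_list_completeness(patients, a1c_threshold=7.0):
    """patients: iterable of dicts like
    {"max_a1c": 8.1, "problems": {"diabetes", "hypertension"}}
    Returns the completeness proportion, or None if no patient qualifies."""
    eligible = [p for p in patients if p["max_a1c"] >= a1c_threshold]
    if not eligible:
        return None  # avoid division by zero when no patient meets the threshold
    documented = sum(1 for p in eligible if "diabetes" in p["problems"])
    return documented / len(eligible)
```

For example, with one documented diabetic, one undocumented patient at HbA1c 7.5%, and one patient below the threshold, the function returns 0.5, i.e. 50% completeness.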