18 research outputs found

    Clinical decision support improves the appropriateness of laboratory test ordering in primary care without increasing diagnostic error: the ELMO cluster randomized trial

    Background: Inappropriate laboratory test ordering imposes a substantial burden on healthcare. Clinical decision support systems (CDSS) have been cited as promising tools to improve laboratory test ordering behavior. The objective of this study was to evaluate the effects of an intervention that integrated a clinical decision support service into a computerized physician order entry (CPOE) system on the appropriateness and volume of laboratory test ordering, and on diagnostic error, in primary care. Methods: This study was a pragmatic, cluster randomized, open-label, controlled clinical trial. Setting: Two hundred eighty general practitioners (GPs) from 72 primary care practices in Belgium. Patients: Patients aged ≥ 18 years with a laboratory test order for at least one of 17 indications: cardiovascular disease management, hypertension, check-up, chronic kidney disease (CKD), thyroid disease, type 2 diabetes mellitus, fatigue, anemia, liver disease, gout, suspicion of acute coronary syndrome (ACS), suspicion of pulmonary embolism, rheumatoid arthritis, sexually transmitted infections (STI), acute diarrhea, chronic diarrhea, and follow-up of medication. Interventions: The CDSS was integrated into the CPOE in the form of evidence-based order sets that suggested appropriate tests based on the indication provided by the general practitioner. Measurements: The primary outcome of the ELMO study was the proportion of appropriate tests relative to the total number of ordered tests plus inappropriately not-requested tests. Secondary outcomes included diagnostic error, test volume, and cascade activities. Results: The CDSS increased the proportion of appropriate tests by 0.21 (95% CI 0.16-0.26, p < 0.0001) across all tests included in the study. GPs in the CDSS arm ordered on average 7 fewer tests per panel (difference 7.15, 95% CI 3.37-10.93, p = 0.0002). The CDSS did not increase diagnostic error: the absolute difference in proportions was a 0.66% decrease in possible diagnostic error (95% CI: 1.4% decrease to 0.05% increase). Conclusions: A CDSS in the form of order sets, integrated within the CPOE, improved the appropriateness and decreased the volume of laboratory test ordering without increasing diagnostic error. Trial registration: ClinicalTrials.gov Identifier NCT02950142, registered on October 25, 2016.
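
    The primary outcome can be read as a simple ratio: appropriate tests divided by all ordered tests plus the tests that should have been ordered but were not. The sketch below illustrates that calculation for a single order panel; the function name and the counts in the example are illustrative assumptions, not code or data from the ELMO trial.

        def appropriateness_proportion(appropriate_ordered, inappropriate_ordered,
                                       inappropriately_not_requested):
            """Proportion of appropriate tests over all ordered tests plus
            tests that were indicated but not requested (illustrative
            reconstruction of the ELMO primary outcome definition)."""
            denominator = (appropriate_ordered + inappropriate_ordered
                           + inappropriately_not_requested)
            if denominator == 0:
                raise ValueError("panel contains no evaluable tests")
            return appropriate_ordered / denominator

        # Hypothetical panel: 8 appropriate orders, 3 inappropriate orders,
        # and 1 indicated test that was not requested.
        print(appropriateness_proportion(8, 3, 1))  # 8 / 12 ≈ 0.67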

    Training professionals in clinical guideline development: Increasing quality by collaborating

    Background: To deliver high-quality clinical guidelines, there is a growing need for experts trained in the methods of guideline development. The Working Group on the Development of Primary Care Guidelines (WOREL) and the Belgian Centre for Evidence-Based Medicine (Cebam) collaborate to train healthcare professionals and academics in guideline development. Objective: To describe the Belgian approach to guideline development training. Methods: A team of senior guideline development methodologists determined the content and methods of the training module, which are based on the standard principles of guideline development. The didactic methods of the training respond to the needs of the participants. Results: Twice a year, a four-day training course on guideline development takes place. The target audience is an interdisciplinary group of healthcare professionals. The blended course consists of three online sessions, one face-to-face in-depth training day, and a self-study component. The first three days of the course are scheduled at intervals of three weeks to give participants the opportunity to practice the newly learned skills. After six months, a fourth training day addresses problems experienced during the participants' own guideline development process. The course takes place in both national languages. Discussion and conclusion: An added value of this training method is that the training is given in the native language by trainers who know the Belgian context. The blended format makes it possible to select the most suitable working method for each competence to be acquired. To anticipate the needs of future guideline developers, a continuous offering via e-learning is also being considered.

    Involving general practice trainees in clinical practice guideline adaptation

    Background: It is unclear whether it is feasible to involve residents in guideline development or adaptation. We designed a multifaceted training program that combines training sessions, a handbook, and a documentation tool to assist general practice (GP) trainees in the adaptation of clinical practice guidelines (CPGs). The aim of this study is to adapt a database of CPGs by involving GP trainees and to build evidence-based practice (EBP) learning capacity. Methods: We assessed each adaptation process and surveyed all GP trainees who enrolled in our training program on their views on the program. They were asked to give an overall rating for the training and to rate individual aspects of the training program (the training sessions, the handbook, and the documentation tool). Results: To date, 122 GP trainees have followed the training and have adapted 60 different CPGs. The overall quality of their work was good. Based on an assessment of the content of the documentation tool, 24 adapted CPGs (40%) were rated as good quality and 30 (50%) as moderate quality. Only 3 adapted CPGs (5%) were evaluated as being of poor quality. Fifty-one GP trainees (42%) completed the survey on user satisfaction. 98% (50) of the GP trainees found the training to be of good overall quality. 86% were satisfied with the handbook, but satisfaction was lowest for the documentation tool (47% satisfied). Conclusion: It is possible to engage GP trainees in CPG adaptation using a formal process when they are provided with training, feedback, and documentation tools.

    A systematic review of trials evaluating success factors of interventions with computerised clinical decision support

    Background: Computerised clinical decision support (CDS) can potentially better inform decisions and can help with the management of information overload. It is perceived to be a key component of a learning health care system. Despite its increasing implementation worldwide, it remains uncertain why the effect of CDS varies and which factors make CDS more effective. Objective: To examine which factors make CDS strategies more effective on a number of outcomes, including adherence to recommended practice, patient outcome measures, economic measures, provider or patient satisfaction, and medical decision quality. Methods: We identified randomised controlled trials, non-randomised trials, and controlled before-and-after studies that directly compared CDS implementation with a given factor to CDS without that factor by searching CENTRAL, MEDLINE, EMBASE, and CINAHL and checking reference lists of relevant studies. We considered CDS with any objective for any condition in any healthcare setting. We included CDS interventions that were either displayed on screen or provided on paper and that were directed at healthcare professionals or targeted at both professionals and patients. The reviewers screened the potentially relevant studies in duplicate. They extracted data and assessed risk of bias in independent pairs or individually, followed by a double check by another reviewer. We summarised results using medians and interquartile ranges and rated our certainty in the evidence using the GRADE system. Results: We identified 66 head-to-head trials, which we synthesised across 14 comparisons of CDS intervention factors. Providing CDS automatically versus on demand led to large improvements in adherence. Displaying CDS on-screen versus on paper led to moderate improvements, and making CDS more versus less patient-specific improved adherence modestly. When CDS interventions were combined with professional-oriented, patient-oriented, or staff-oriented strategies, adherence improved slightly. Providing CDS to patients slightly increased adherence compared with CDS aimed at the healthcare provider only. Making CDS advice more explicit and requiring users to respond to the advice made little or no difference. The CDS intervention factors made little or no difference to patient outcomes. The results for economic and satisfaction outcomes were sparse. Conclusion: Multiple factors may affect the success of CDS interventions. CDS may be more effective when the advice is provided automatically and displayed on-screen and when the suggestions are more patient-specific. CDS interventions combined with other strategies probably also improve adherence. Providing CDS directly to patients may also positively affect adherence. The certainty of the evidence was low to moderate for all factors. Trial registration: PROSPERO, CRD4201603373
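
    As a small illustration of the review's summary approach (medians with interquartile ranges across head-to-head trials), the sketch below computes those statistics for one comparison of CDS intervention factors. The comparison name and the effect values are hypothetical placeholders, not data extracted from the review.

        from statistics import median, quantiles

        def summarise_comparison(effects):
            """Summarise per-trial effects (e.g. absolute improvements in
            adherence, in percentage points) for one comparison of CDS
            intervention factors as a median with interquartile range."""
            q1, _, q3 = quantiles(effects, n=4)  # quartile cut points
            return {"median": median(effects), "iqr": (q1, q3), "trials": len(effects)}

        # Hypothetical adherence improvements (percentage points) from five trials
        # comparing CDS provided automatically versus on demand.
        automatic_vs_on_demand = [4.0, 9.5, 12.0, 16.5, 21.0]
        print(summarise_comparison(automatic_vs_on_demand))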