103 research outputs found

    Can computerized clinical decision support systems improve practitioners' diagnostic test ordering behavior? A decision-maker-researcher partnership systematic review

    Background: Underuse and overuse of diagnostic tests have important implications for health outcomes and costs. Decision support technology purports to optimize the use of diagnostic tests in clinical practice. The objective of this review was to assess whether computerized clinical decision support systems (CCDSSs) are effective at improving the ordering of tests for diagnosis, monitoring of disease, or monitoring of treatment. The outcome of interest was the effect on the diagnostic test-ordering behavior of practitioners. Methods: We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for eligible articles published up to January 2010. We included randomized controlled trials comparing the use of CCDSSs to usual practice or non-CCDSS controls in clinical care settings. Trials were eligible if at least one component of the CCDSS gave suggestions for ordering or performing a diagnostic procedure. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of test ordering outcomes. Results: Thirty-five studies were identified, with significantly higher methodological quality in those published after the year 2000 (p = 0.002). Thirty-three trials reported evaluable data on diagnostic test ordering, and 55% (18/33) of CCDSSs improved testing behavior overall, including 83% (5/6) for diagnosis, 63% (5/8) for treatment monitoring, 35% (6/17) for disease monitoring, and 100% (3/3) for other purposes. Four of the systems explicitly attempted to reduce test ordering rates, and all succeeded. Factors of particular interest to decision makers, such as costs, user satisfaction, and impact on workflow, were rarely investigated or reported. Conclusions: Some CCDSSs can modify practitioner test-ordering behavior. To better inform development and implementation efforts, studies should describe in more detail potentially important factors such as system design, user interface, local context, and implementation strategy, and should evaluate impact on user satisfaction, workflow, costs, and unintended consequences.
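    The review's classification rule is simple to operationalize. Below is a minimal, hypothetical Python sketch (not taken from the paper) of how a study could be labeled 'positive' under the stated threshold, given one significance flag per prespecified test-ordering outcome; the helper name and interface are assumptions for illustration only.

    ```python
    def classify_study(outcome_significant: list[bool]) -> str:
        """Label a study 'positive' if at least 50% of its test-ordering
        outcomes showed a statistically significant improvement.

        outcome_significant: one boolean per prespecified outcome, True if
        that outcome improved significantly. Hypothetical helper illustrating
        the review's stated threshold only.
        """
        if not outcome_significant:
            raise ValueError("study reported no evaluable outcomes")
        share_improved = sum(outcome_significant) / len(outcome_significant)
        return "positive" if share_improved >= 0.5 else "not positive"

    # Example: 2 of 3 outcomes significantly improved -> 'positive'
    print(classify_study([True, True, False]))
    ```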

    Preoperative idoxuridine and radiation for large soft tissue sarcomas: Clinical results with five-year follow-up

    Background: Local control remains an important issue in the management of large soft tissue sarcomas. Radiation is the main adjuvant to surgery for local therapy of sarcomas, but it requires relatively high doses, hitherto considered prohibitive in areas such as the retroperitoneum. We developed a preoperative treatment approach to large soft tissue sarcomas that would deliver a high total dose of radiation administered in conjunction with the halogenated pyrimidine radiosensitizer idoxuridine (IdUrd).

    Creating Creative Technologists: playing with(in) education

    Since the industrial revolution, the organization of knowledge into distinct scientific, technical or creative categories has resulted in educational systems designed to produce and validate particular occupations. The methods by which students are exposed to different kinds of knowledge are critical in creating and reproducing individual, professional or cultural identities. (“I am an Engineer. You are an Artist”). The emergence of more open, creative and socialised technologies generates challenges for discipline-based education. At the same time, the term “Creative Technologies” also suggests a new occupational category (“I am a Creative Technologist”). This chapter presents a case-study of an evolving ‘anti-disciplinary’ project-based degree that challenges traditional degree structures to stimulate new forms of connective, imaginative and explorative learning, and to equip students to respond to a changing world. Learning is conceived as an emergent process, self-managed by students through critique and open peer review. We focus on ‘playfulness’ as a methodology for achieving multi-modal learning across the boundaries of art, design, computer science, engineering, games and entrepreneurship. In this new cultural moment, playfulness also re-frames the institutional identities of teacher and learner in response to new expectations for learning.

    Computerized clinical decision support systems for drug prescribing and management: A decision-maker-researcher partnership systematic review

    Background: Computerized clinical decision support systems (CCDSSs) for drug therapy management are designed to promote safe and effective medication use. Evidence documenting the effectiveness of CCDSSs for improving drug therapy is necessary for informed adoption decisions. The objective of this review was to systematically review randomized controlled trials assessing the effects of CCDSSs for drug therapy management on process of care and patient outcomes. We also sought to identify system and study characteristics that predicted benefit. Methods: We conducted a decision-maker-researcher partnership systematic review. We updated our earlier reviews (1998, 2005) by searching MEDLINE, EMBASE, EBM Reviews, Inspec, and other databases, and by consulting reference lists through January 2010. Authors of 82% of included studies confirmed or supplemented extracted data. We included only randomized controlled trials that evaluated the effect on process of care or patient outcomes of a CCDSS for drug therapy management compared to care provided without a CCDSS. A study was considered to have a positive effect (i.e., the CCDSS showed improvement) if at least 50% of the relevant study outcomes were statistically significantly positive. Results: Sixty-five studies met our inclusion criteria, including 41 new studies since our previous review. Methodological quality was generally high and unchanged with time. CCDSSs improved process of care performance in 37 of the 59 studies assessing this type of outcome (64%, 57% of all studies). Twenty-nine trials assessed patient outcomes, of which six (21%, 9% of all trials) reported improvements. Conclusions: CCDSSs inconsistently improved process of care measures and seldom improved patient outcomes. The lack of clear patient benefit and the lack of data on harms and costs preclude a recommendation to adopt CCDSSs for drug therapy management.

    A Computational Approach to Analyze the Mechanism of Action of the Kinase Inhibitor Bafetinib

    Prediction of drug action in human cells is a major challenge in biomedical research. Additionally, there is strong interest in finding new applications for approved drugs and identifying potential side effects. We present a computational strategy to predict mechanisms, risks and potential new domains of drug treatment on the basis of target profiles acquired through chemical proteomics. Protein-protein interaction networks of proteins that share one biological function are constructed, and their crosstalk with the drug is scored with respect to function disruption. We apply this procedure to the target profile of the second-generation BCR-ABL inhibitor bafetinib, which is in development for the treatment of imatinib-resistant chronic myeloid leukemia. Besides the well-known effect on apoptosis, we propose potential treatment of lung cancer and IGF1R-expressing blast crisis.
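    The general scoring idea described here, grouping proteins by shared biological function and quantifying how strongly a drug's target profile intersects each group, can be illustrated with a small hypothetical sketch. The protein names, function groupings, and the simple overlap-fraction score below are assumptions for illustration, not the authors' actual algorithm or data.

    ```python
    # Hypothetical illustration of function-disruption scoring from a target profile.
    # Protein names, function groupings, and the overlap-fraction score are assumptions.

    drug_targets = {"ABL1", "LYN", "BTK", "IGF1R"}   # example chemical-proteomics hits

    functional_networks = {
        "apoptosis signaling": {"ABL1", "BCL2", "CASP3", "TP53"},
        "B-cell receptor signaling": {"LYN", "BTK", "SYK", "CD79A"},
        "IGF1R growth signaling": {"IGF1R", "IRS1", "PIK3CA", "AKT1"},
    }

    def disruption_score(targets: set[str], network: set[str]) -> float:
        """Fraction of a functional network directly hit by the drug's targets."""
        return len(targets & network) / len(network)

    for function, proteins in functional_networks.items():
        print(f"{function}: {disruption_score(drug_targets, proteins):.2f}")
    ```

    In practice such a score would be weighted by network topology and expression context rather than a raw overlap fraction; the sketch only shows the shape of the computation.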

    Synaptic Reorganization in the Adult Rat's Ventral Cochlear Nucleus following Its Total Sensory Deafferentation

    Ablation of a cochlea causes total sensory deafferentation of the cochlear nucleus in the brainstem, providing a model to investigate nervous degeneration and formation of new synaptic contacts in the adult brain. In a quantitative electron microscopical study on the plasticity of the central auditory system of the Wistar rat, we first determined what fraction of the total number of synaptic contact zones (SCZs) in the anteroventral cochlear nucleus (AVCN) is attributable to primary sensory innervation and how many synapses remain after total unilateral cochlear ablation. Second, we attempted to identify the potential for a deafferentation-dependent synaptogenesis. SCZs were ultrastructurally identified before and after deafferentation in tissue treated for ethanolic phosphotungstic acid (EPTA) staining. This was combined with pre-embedding immunocytochemistry for gephyrin identifying inhibitory SCZs, the growth-associated protein GAP-43, glutamate, and choline acetyltransferase. A stereological analysis of EPTA-stained sections revealed 1.11 ± 0.09 (S.E.M.) × 10⁹ SCZs per mm³ of AVCN tissue. Within 7 days of deafferentation, this number was down by 46%. Excitatory and inhibitory synapses were differentially affected on the side of deafferentation. Excitatory synapses were quickly reduced and then began to increase in number again, necessarily being complemented from sources other than cochlear neurons, while inhibitory synapses were reduced more slowly and continuously. The result was a transient rise of the relative fraction of inhibitory synapses with a decline below original levels thereafter. Synaptogenesis was inferred by the emergence of morphologically immature SCZs that were consistently associated with GAP-43 immunoreactivity. SCZs of this type were estimated to make up a fraction of close to 30% of the total synaptic population present by ten weeks after sensory deafferentation. In conclusion, there appears to be a substantial potential for network reorganization and synaptogenesis in the auditory brainstem after loss of hearing, even in the adult brain.
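    As a quick back-of-the-envelope check on the reported densities (a hypothetical sketch using only the figures quoted above, not values reported by the study), the 46% loss translates into an approximate absolute synapse density at day 7:

    ```python
    # Illustrative arithmetic based on the densities quoted in the abstract.
    # These calculations are assumptions for orientation, not study results.

    baseline_density = 1.11e9   # synaptic contact zones per mm^3 of AVCN tissue
    loss_day7 = 0.46            # 46% reduction within 7 days of deafferentation

    remaining_day7 = baseline_density * (1 - loss_day7)
    print(f"~{remaining_day7:.2e} SCZs per mm^3 remain at day 7")  # roughly 6.0e8
    ```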

    Host genetic signatures of susceptibility to fungal disease

    Our relative inability to predict the development of fungal disease and its clinical outcome raises fundamental questions about its actual pathogenesis. Several clinical risk factors have been described that predispose to fungal disease, particularly in immunocompromised and severely ill patients. However, these alone do not entirely explain why, under comparable clinical conditions, only some patients develop infection. Recent clinical and epidemiological studies have reported an expanding number of monogenic defects and common polymorphisms associated with fungal disease. By directly implicating genetic variation in the functional regulation of immune mediators and interacting pathways, these studies have provided critical insights into the human immunobiology of fungal disease. Most of the common genetic defects reported were described or suggested to impair fungal recognition by the innate immune system. Here, we review common genetic variation in pattern recognition receptors and its impact on the immune response against the two major fungal pathogens Candida albicans and Aspergillus fumigatus. In addition, we discuss potential strategies and opportunities for the clinical translation of genetic information in the field of medical mycology. These approaches are expected to transform current clinical practice by unleashing an unprecedented ability to personalize prophylaxis, therapy and monitoring for fungal disease. This work was supported by the Northern Portugal Regional Operational Programme (NORTE 2020), under the Portugal 2020 Partnership Agreement, through the European Regional Development Fund (FEDER) (NORTE-01-0145-FEDER-000013), the Fundação para a Ciência e Tecnologia (FCT) (IF/00735/2014 to AC, and SFRH/BPD/96176/2013 to CC), the Institut Mérieux (Mérieux Research Grant 2017 to CC), and the European Society of Clinical Microbiology and Infectious Diseases (ESCMID Research Grant 2017 to AC).

    Behavioural indicators of welfare in farmed fish

    Behaviour represents a reaction to the environment as fish perceive it and is therefore a key element of fish welfare. This review summarises the main findings on how behavioural changes have been used to assess welfare in farmed fish, using both functional and feeling-based approaches. Changes in foraging behaviour, ventilatory activity, aggression, individual and group swimming behaviour, and stereotypic and abnormal behaviour have been linked with acute and chronic stressors in aquaculture and can therefore be regarded as likely indicators of poor welfare. In contrast, measurements of exploratory behaviour, feed anticipatory activity and reward-related operant behaviour are beginning to be considered as indicators of positive emotions and welfare in fish. Despite the lack of scientific agreement about the existence of sentience in fish, the possibility that they are capable of both positive and negative emotions may contribute to the development of new strategies (e.g. environmental enrichment) to promote good welfare. Numerous studies that use behavioural indicators of welfare show that behavioural changes can be interpreted as either good or poor welfare depending on the fish species. It is therefore essential to understand the species-specific biology before drawing any conclusions in relation to welfare. In addition, different individuals within the same species may exhibit divergent coping strategies towards stressors, and what is tolerated by some individuals may be detrimental to others. Therefore, the assessment of welfare in a few individuals may not represent the average welfare of a group and vice versa. This underlines the need to develop on-farm, operational behavioural welfare indicators that can be easily used to assess not only individual welfare but also the welfare of the whole group (e.g. spatial distribution). With the ongoing development of video technology and image processing, the on-farm surveillance of behaviour may in the near future represent a low-cost, noninvasive tool to assess the welfare of farmed fish. Funding: Fundação para a Ciência e Tecnologia, Portugal [SFRH/BPD/42015/2007].

    Metamorphosis of Subarachnoid Hemorrhage Research: from Delayed Vasospasm to Early Brain Injury

    Delayed vasospasm that develops 3–7 days after aneurysmal subarachnoid hemorrhage (SAH) has traditionally been considered the most important determinant of delayed ischemic injury and poor outcome. Consequently, most therapies against delayed ischemic injury are directed towards reducing the incidence of vasospasm. Clinical trials based on this strategy, however, have so far achieved only limited success: the incidence of vasospasm is reduced without a reduction in delayed ischemic injury or an improvement in long-term outcome. This fact has shifted research interest to the early brain injury (first 72 h) evoked by SAH. In recent years, several pathological mechanisms that are activated within minutes after the initial bleed and lead to early brain injury have been identified. In addition, it has been found that many of these mechanisms evolve with time and participate in the pathogenesis of delayed ischemic injury and poor outcome. Therefore, a therapy or therapies focused on these early mechanisms may not only prevent early brain injury but may also help reduce the intensity of later-developing neurological complications. This manuscript reviews the pathological mechanisms of early brain injury after SAH and summarizes the status of current therapies.