
    RESKO: Repositioning drugs by using side effects and knowledge from ontologies

    The objective of drug repositioning is to apply existing drugs to diseases or medical conditions other than their original targets, thereby alleviating, to some extent, the time and cost expended in drug development. Our system RESKO (REpositioning drugs using Side Effects and Knowledge from Ontologies) identifies drugs with similar side effects as potential candidates for use elsewhere; the supposition is that similar side effects may be caused by drugs targeting similar proteins and pathways. RESKO integrates drug chemical data, protein-interaction data, and ontological knowledge. The novel aspect of our system is its high level of biological knowledge, achieved through the integration of pathways and biological ontologies; this provides an explanation facility lacking in most existing methods and improves the repositioning process. We evaluated the shared side effects of eight conventional Alzheimer's drugs, from which sixty-seven candidate drugs were identified on the basis of side-effect commonality. The top 25 drugs on the list were further investigated in depth for their suitability for repositioning; the literature revealed that many of the candidate drugs appear to have been trialed for Alzheimer's disease, verifying the accuracy of our system. We also compare our technique with several competing systems found in the literature.
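
    The abstract does not give RESKO's actual scoring function, but the side-effect-commonality step it describes can be illustrated with a simple set-similarity sketch. The Jaccard measure, drug names, and effect sets below are hypothetical assumptions for illustration; RESKO itself additionally integrates chemical, protein-interaction, and pathway knowledge not modeled here.

    # A minimal sketch of side-effect-commonality ranking, assuming each drug
    # is represented as a set of side-effect terms (e.g., from an ontology
    # such as MedDRA). All names and values here are hypothetical.

    def jaccard(a, b):
        """Jaccard similarity between two side-effect sets."""
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    def rank_candidates(reference_drugs, candidate_drugs, top_n=25):
        """Rank candidates by best side-effect overlap with any reference drug."""
        scores = {
            name: max(jaccard(effects, ref) for ref in reference_drugs.values())
            for name, effects in candidate_drugs.items()
            if name not in reference_drugs
        }
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

    # Hypothetical reference set (Alzheimer's drugs) and candidate pool.
    alzheimer = {"donepezil": {"nausea", "insomnia", "diarrhea"},
                 "galantamine": {"nausea", "dizziness", "headache"}}
    candidates = {"drug_a": {"nausea", "dizziness", "fatigue"},
                  "drug_b": {"rash", "cough"}}
    print(rank_candidates(alzheimer, candidates))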

    The Software Engineering Laboratory: An operational software experience factory

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The SEL's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

    Software process improvement in the NASA software engineering laboratory

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes, with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product, as well as goal-driven experimentation and analysis of process change within a production environment.

    Genotyping Analyses of Tuberculosis Cases in U.S.- and Foreign-Born Massachusetts Residents

    We used molecular genotyping to further understand the epidemiology and transmission patterns of tuberculosis (TB) in Massachusetts. The study population included 983 TB patients whose cases were verified by the Massachusetts Department of Public Health between July 1, 1996, and December 31, 2000, and for whom genotyping results and information on country of origin were available. Two hundred seventy-two (28%) of the TB patients were in genetic clusters, and isolates from U.S.-born patients were twice as likely to cluster as those from foreign-born patients (odds ratio [OR] 2.29, 95% confidence interval [CI] 1.69, 3.12). Our results suggest that restriction fragment length polymorphism analysis has limited capacity to differentiate TB strains when the isolate contains six or fewer copies of IS6110, even with spoligotyping. Clusters of TB patients whose isolates had more than six copies of IS6110 were more likely to have epidemiologic connections than were clusters of TB patients whose isolates had few copies of IS6110 (OR 8.01, 95% CI 3.45, 18.93).
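
    The odds ratios and confidence intervals reported above follow standard 2x2-table arithmetic; the sketch below shows the Woolf (log) method for a 95% CI. The cell counts are hypothetical placeholders, not the study's actual data.

    # OR and 95% CI for a 2x2 table via the Woolf (log) method.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a = exposed clustered,   b = exposed non-clustered
        c = unexposed clustered, d = unexposed non-clustered"""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts: clustering among U.S.-born vs. foreign-born patients.
    or_, lo, hi = odds_ratio_ci(a=90, b=150, c=182, d=561)
    print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")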

    Cost-effectiveness of HBV and HCV screening strategies: a systematic review of existing modelling techniques

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening interventions, outcomes, and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. The decision-analytic models were critically assessed according to the guidelines and framework developed for the assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimens and test methods and had better and more complete data on which to base their models. In addition to parameter selection and the associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g., HCV and HIV at the same time) might prove important for decision makers.
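
    Whatever their structure (static decision tree, Markov cohort, or dynamic transmission model), the models in this literature share a common comparison endpoint: the incremental cost-effectiveness ratio (ICER). The sketch below shows only that endpoint; all parameter values are hypothetical placeholders, not figures from any reviewed study.

    # Incremental cost per QALY gained of screening over no screening.
    def icer(cost_new, effect_new, cost_old, effect_old):
        """Incremental cost per unit of effect (e.g., per QALY gained)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # Hypothetical per-person expected lifetime costs (currency units) and QALYs.
    screening = {"cost": 1450.0, "qaly": 18.20}    # test + earlier treatment
    no_screening = {"cost": 900.0, "qaly": 18.05}  # late-presentation care only

    value = icer(screening["cost"], screening["qaly"],
                 no_screening["cost"], no_screening["qaly"])
    print(f"ICER: {value:.0f} per QALY gained")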

    Impact of Genotyping of Mycobacterium tuberculosis on Public Health Practice in Massachusetts

    Massachusetts was one of seven sentinel surveillance sites in the National Tuberculosis Genotyping and Surveillance Network. From 1996 through 2000, isolates from new patients with tuberculosis (TB) underwent genotyping. We describe the impact that genotyping had on public health practice in Massachusetts, as well as some limitations of the technique. Through genotyping, we explored the dynamics of TB outbreaks, investigated laboratory cross-contamination, and identified Mycobacterium tuberculosis strains, transmission sites, and accurate epidemiologic links. Genotyping should be used with epidemiologic follow-up to identify how resources can best be allocated to investigate genotypic findings.

    A simplified mesoscale 3D model for characterizing fibrinolysis under flow conditions

    One of the routine clinical treatments to eliminate ischemic stroke thrombi is to inject a biochemical product into the patient's bloodstream that breaks down the thrombi's fibrin fibers: intravenous or intravascular thrombolysis. However, this procedure is not without risk for the patient; in the worst circumstances it can cause a brain hemorrhage or embolism, either of which can be fatal. Improvements in patient management have drastically reduced these risks, and patients who receive thrombolysis soon after the onset of the stroke have a significantly better 3-month prognosis, but treatment success remains highly variable. The causes of this variability are unclear, and it is likely that some fundamental aspects still require thorough investigation. For that reason, we conducted in vitro flow-driven fibrinolysis experiments to study the breakdown of pure fibrin thrombi under controlled conditions and observed that the lysis front evolves non-linearly in time. To understand these results, we developed an analytical 1D lysis model in which the thrombus is treated as a porous medium. The lytic cascade is reduced to a second-order reaction involving fibrin and a surrogate pro-fibrinolytic agent. The model reproduces the observed lysis evolution under the assumptions of constant fluid velocity and lysis occurring only at the front. To add complexity, such as clot heterogeneity or complex flow conditions, we propose a 3-dimensional mesoscopic numerical model of blood flow and fibrinolysis, which validates the analytical model's results. Such a numerical model could help us better understand the spatial evolution of thrombus breakdown, identify the physiological parameters most relevant to lysis efficiency, and possibly explain failures of the clinical treatment. These findings suggest that even though real-world fibrinolysis is a complex biological process, a simplified model can recover the main features of lysis evolution.
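
    The 1D front-propagation picture described above can be sketched directly from its stated assumptions: lysis confined to the front and a second-order reaction dF/dt = -k * A * F between fibrin F and a surrogate agent A. The exponential decay of A with depth below is a crude, hypothetical stand-in for transport limitation, added only to show how a non-linear front can emerge; all parameter values are illustrative, not the paper's.

    # Minimal 1D lysis-front sketch under the stated assumptions.
    import math

    k = 2.0              # reaction rate constant (hypothetical units)
    A0, L = 1.0, 0.5     # agent concentration at inlet; penetration length scale
    F0 = 1.0             # initial fibrin content per cell
    dx, dt = 0.01, 1e-2  # spatial step and time step
    n_cells, threshold = 50, 0.05 * F0

    fibrin = [F0] * n_cells
    front, t, history = 0, 0.0, []
    while front < n_cells:
        depth = front * dx
        A = A0 * math.exp(-depth / L)                # assumed transport limitation
        fibrin[front] -= k * A * fibrin[front] * dt  # second-order depletion
        t += dt
        if fibrin[front] < threshold:                # cell lysed: front advances
            front += 1
            history.append((t, depth + dx))          # (time, front position)

    print(history[:3], "...", history[-1])

    With a constant agent concentration the simulated front would advance linearly; the assumed decay of A with depth is what produces the decelerating, non-linear front position recorded in history.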