118 research outputs found

    A characteristics framework for Semantic Information Systems Standards

    Semantic Information Systems (IS) Standards play a critical role in the development of the networked economy. While their importance is acknowledged by all stakeholders, such as businesses, policy makers, researchers, and developers, the current state of research leaves a number of questions unaddressed. Terminological confusion exists around the notions of "business semantics", "business-to-business interoperability", and "interoperability standards", amongst others. Moreover, a comprehensive understanding of the characteristics of Semantic IS Standards is missing. The paper addresses this gap in the literature by developing a characteristics framework for Semantic IS Standards. Two case studies are used to check the applicability of the framework in a "real-life" context. The framework lays the foundation for future research in an important field of the IS discipline and supports practitioners in their efforts to analyze, compare, and evaluate Semantic IS Standards.

    Process evaluation for complex interventions in primary care: understanding trials using the normalization process model

    Background: The Normalization Process Model is a conceptual tool intended to assist in understanding the factors that affect implementation processes in clinical trials and other evaluations of complex interventions. It focuses on the ways that the implementation of complex interventions is shaped by problems of workability and integration. Method: In this paper the model is applied to two different complex trials: (i) the delivery of problem-solving therapies for psychosocial distress, and (ii) the delivery of nurse-led clinics for heart failure treatment in primary care. Results: Application of the model shows how process evaluations need to focus on more than the immediate contexts in which trial outcomes are generated. Problems relating to intervention workability and integration also need to be understood. The model may be used effectively to explain the implementation process in trials of complex interventions. Conclusion: The model invites evaluators to attend equally to considering how a complex intervention interacts with existing patterns of service organization, professional practice, and professional-patient interaction. The justification for this may be found in the abundance of reports of clinical effectiveness for interventions that have little hope of being implemented in real healthcare settings.

    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic ray run in fall 2008. Measured noise levels are low, with the number of noisy channels well below 1%. Coordinate resolution was measured for all types of chambers and falls in the range of 47 to 243 microns. The efficiencies for local charged track triggers, and for hit and segment reconstruction, were measured and are above 99%. The timing resolution per layer is approximately 5 ns.

    Outcomes research in the development and evaluation of practice guidelines

    BACKGROUND: Practice guidelines have been developed in response to the observation that variations exist in clinical medicine that are not related to variations in the clinical presentation and severity of the disease. Despite their widespread use, however, practice guideline evaluation lacks a rigorous scientific methodology to support its development and application. DISCUSSION: Firstly, we review the major epidemiological foundations of practice guideline development. Secondly, we propose a chronic disease epidemiological model in which practice patterns are viewed as the exposure, and outcomes of interest, such as quality or cost, are viewed as the disease. Sources of selection, information, confounding, and temporal trend bias are identified and discussed. SUMMARY: The proposed methodological framework for outcomes research to evaluate practice guidelines reflects the selection, information, and confounding biases inherent in its observational nature, which must be accounted for in both the design and analysis phases of any outcomes research study.

    Highly-efficient Cas9-mediated transcriptional programming

    The RNA-guided nuclease Cas9 can be reengineered as a programmable transcription factor. However, modest levels of gene activation have limited potential applications. We describe an improved transcriptional regulator obtained through the rational design of a tripartite activator, VP64-p65-Rta (VPR), fused to nuclease-null Cas9. We demonstrate its utility in activating endogenous coding and noncoding genes, targeting several genes simultaneously and stimulating neuronal differentiation of human induced pluripotent stem cells (iPSCs).

    Funding: National Human Genome Research Institute (U.S.) (Grant P50 HG005550); United States. Dept. of Energy (Grant DE-FG02-02ER63445); Wyss Institute for Biologically Inspired Engineering; National Science Foundation (U.S.) Graduate Research Fellowship; Massachusetts Institute of Technology, Department of Biological Engineering; Harvard Medical School, Department of Genetics.

    CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Peer reviewed.

    Aligning the CMS Muon Chambers with the Muon Alignment System during an Extended Cosmic Ray Run

    Peer reviewed.

    Commissioning of the CMS high-level trigger with cosmic rays

    This is the pre-print version of the article. The official published version of the paper can be accessed from the link below. Copyright @ 2010 IOP.

    The CMS High-Level Trigger (HLT) is responsible for ensuring that data samples with potentially interesting events are recorded with high efficiency and good quality. This paper gives an overview of the HLT and focuses on its commissioning using cosmic rays. The selection of triggers that were deployed is presented and the online grouping of triggered events into streams and primary datasets is discussed. Tools for online and offline data quality monitoring for the HLT are described, and the operational performance of the muon HLT algorithms is reviewed. The average time taken for the HLT selection and its dependence on detector and operating conditions are presented. The HLT performed reliably and helped provide a large dataset. This dataset has proven to be invaluable for understanding the performance of the trigger and the CMS experiment as a whole.

    This work is supported by FMSR (Austria); FNRS and FWO (Belgium); CNPq, CAPES, FAPERJ, and FAPESP (Brazil); MES (Bulgaria); CERN; CAS, MoST, and NSFC (China); COLCIENCIAS (Colombia); MSES (Croatia); RPF (Cyprus); Academy of Sciences and NICPB (Estonia); Academy of Finland, ME, and HIP (Finland); CEA and CNRS/IN2P3 (France); BMBF, DFG, and HGF (Germany); GSRT (Greece); OTKA and NKTH (Hungary); DAE and DST (India); IPM (Iran); SFI (Ireland); INFN (Italy); NRF (Korea); LAS (Lithuania); CINVESTAV, CONACYT, SEP, and UASLP-FAI (Mexico); PAEC (Pakistan); SCSR (Poland); FCT (Portugal); JINR (Armenia, Belarus, Georgia, Ukraine, Uzbekistan); MST and MAE (Russia); MSTDS (Serbia); MICINN and CPAN (Spain); Swiss Funding Agencies (Switzerland); NSC (Taipei); TUBITAK and TAEK (Turkey); STFC (United Kingdom); DOE and NSF (USA).