
    A Lightweight Multilevel Markup Language for Connecting Software Requirements and Simulations

    [Context] Simulation is a powerful tool to validate specified requirements, especially for complex systems that constantly monitor and react to characteristics of their environment. The simulators for such systems are themselves complex, as they simulate multiple actors with multiple interacting functions across a number of different scenarios. To validate requirements in such simulations, the requirements must be related to the simulation runs. [Problem] In practice, engineers are reluctant to state their requirements in structured languages or models that would allow a straightforward relation of requirements to simulation runs. Instead, requirements are expressed as unstructured natural language text that is hard to assess across a set of complex simulation runs. Therefore, the feedback loop between requirements and simulation is very long or missing entirely. [Principal idea] We aim to close the gap between requirements specifications and simulation by proposing a lightweight markup language for requirements. Our markup language provides a set of annotations on different levels that can be applied to natural language requirements. The annotations are mapped to simulation events. As a result, meaningful information from a set of simulation runs is shown directly in the requirements specification. [Contribution] Instead of forcing the engineer to write requirements in a specific way just for the purpose of relating them to a simulator, the markup language allows annotating already specified requirements to whatever level of detail is of interest to the engineer. We evaluate our approach by analyzing 8 original requirements of an automotive system in a set of 100 simulation runs.
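
    A minimal sketch in Python of how such an annotation-to-simulation mapping could look (the annotation levels, event names, and the example requirement below are illustrative assumptions, not the paper's actual markup syntax): a fragment of a natural-language requirement carries an annotation, the annotation points to a simulation event, and each run is then checked for that event so the outcome can be reported next to the requirement text.

        # Hypothetical sketch: annotate a natural-language requirement and relate it to simulation runs.
        from dataclasses import dataclass

        @dataclass
        class Annotation:
            level: str      # e.g. "condition" or "expected_reaction" (illustrative level names)
            span: str       # annotated fragment of the requirement text
            event: str      # simulation event this fragment is mapped to

        requirement = "When an obstacle is detected, the vehicle shall brake within 300 ms."
        annotations = [
            Annotation("condition", "an obstacle is detected", "OBSTACLE_DETECTED"),
            Annotation("expected_reaction", "brake within 300 ms", "BRAKE_WITHIN_300MS"),
        ]

        # Each simulation run is reduced to the set of events it produced.
        runs = [
            {"OBSTACLE_DETECTED", "BRAKE_WITHIN_300MS"},
            {"OBSTACLE_DETECTED"},  # the expected reaction is missing in this run
        ]

        # Report, per annotated fragment, in how many runs the mapped event occurred.
        for ann in annotations:
            hits = sum(ann.event in run for run in runs)
            print(f"[{ann.level}] '{ann.span}': event seen in {hits}/{len(runs)} runs")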

    Nuclear Resonance Vibrational Spectroscopy of Iron Sulfur Proteins

    Nuclear inelastic scattering, in conjunction with density functional theory (DFT) calculations, has been applied to identify the vibrational modes of the high-spin ferric and the high-spin ferrous iron-sulfur center of a rubredoxin-type protein from the thermophilic bacterium Pyrococcus abyssi.

    Elucidating the structural composition of a Fe-N-C catalyst by nuclear and electron resonance techniques

    Fe–N–C catalysts are very promising materials for fuel cells and metal–air batteries. This work gives fundamental insights into the structural composition of an Fe–N–C catalyst and highlights the importance of an in-depth characterization. By nuclear- and electron-resonance techniques, we are able to show that even after mild pyrolysis and acid leaching, the catalyst contains considerable fractions of α-iron and, surprisingly, iron oxide. Our work calls into question to what extent FeN4 sites can be present in Fe–N–C catalysts prepared by pyrolysis at 900 °C and above. The simulation of the iron partial density of phonon states enables the identification of three FeN4 species in our catalyst, one of them exhibiting sixfold coordination with end-on bonded oxygen as one of the axial ligands.

    Violation of Bell's inequality using continuous variable measurements

    A Bell inequality is a fundamental test to rule out local hidden variable model descriptions of correlations between two physically separated systems. There have been a number of experiments in which a Bell inequality has been violated using discrete-variable systems. We demonstrate a violation of Bell's inequality using continuous variable quadrature measurements. By creating a four-mode entangled state with homodyne detection, we recorded a clear violation with a Bell value of B = 2.31 ± 0.02. This opens new possibilities for using continuous variable states for device-independent quantum protocols.
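
    For orientation, a brief sketch of what such a Bell value means, assuming a CHSH-type combination of correlators (the paper's specific observables and the binning of the quadrature outcomes are not restated here): any local hidden variable model bounds the combination by 2, so the reported value lies above that bound.

        \[
          B = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |B| \le 2 \ \text{for any local hidden variable model},
        \]
        \[
          \text{whereas the reported value is } B = 2.31 \pm 0.02 > 2.
        \]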

    Improving the use of research evidence in guideline development: 1. Guidelines for guidelines

    BACKGROUND: The World Health Organization (WHO), like many other organisations around the world, has recognised the need to use more rigorous processes to ensure that health care recommendations are informed by the best available research evidence. This is the first of a series of 16 reviews that have been prepared as background for advice from the WHO Advisory Committee on Health Research to WHO on how to achieve this.
    OBJECTIVES: We reviewed the literature on guidelines for the development of guidelines.
    METHODS: We searched PubMed and three databases of methodological studies for existing systematic reviews and relevant methodological research. We did not conduct systematic reviews ourselves. Our conclusions are based on the available evidence, consideration of what WHO and other organisations are doing, and logical arguments.
    KEY QUESTIONS AND ANSWERS: We found no experimental research that compared different formats of guidelines for guidelines or studies that compared different components of guidelines for guidelines. However, there are many examples, surveys and other observational studies that compared the impact of different guideline development documents on guideline quality.
    WHAT HAVE OTHER ORGANIZATIONS DONE TO DEVELOP GUIDELINES FOR GUIDELINES FROM WHICH WHO CAN LEARN?
    • Establish a credible, independent committee that evaluates existing methods for developing guidelines or that updates existing ones.
    • Obtain feedback and approval from various stakeholders during the development process of guidelines for guidelines.
    • Develop a detailed source document (manual) that guideline developers can use as reference material.
    WHAT SHOULD BE THE KEY COMPONENTS OF WHO GUIDELINES FOR GUIDELINES?
    • Guidelines for guidelines should include information and instructions about the following components: 1) Priority setting; 2) Group composition and consultations; 3) Declaration and avoidance of conflicts of interest; 4) Group processes; 5) Identification of important outcomes; 6) Explicit definition of the questions and eligibility criteria; 7) Type of study designs for different questions; 8) Identification of evidence; 9) Synthesis and presentation of evidence; 10) Specification and integration of values; 11) Making judgments about desirable and undesirable effects; 12) Taking account of equity; 13) Grading evidence and recommendations; 14) Taking account of costs; 15) Adaptation, applicability, transferability of guidelines; 16) Structure of reports; 17) Methods of peer review; 18) Planned methods of dissemination and implementation; 19) Evaluation of the guidelines.
    WHAT HAVE OTHER ORGANIZATIONS DONE TO IMPLEMENT GUIDELINES FOR GUIDELINES FROM WHICH WHO CAN LEARN?
    • Obtain buy-in from regions and country-level representatives for guidelines for guidelines before dissemination of a revised version.
    • Disseminate the guidelines for guidelines widely and make them available (e.g. on the Internet).
    • Develop examples of guidelines that guideline developers can use as models when applying the guidelines for guidelines.
    • Ensure training sessions for those responsible for developing guidelines.
    • Continue to monitor the methodological literature on guideline development.

    Urinary estrogen metabolites and prostate cancer: a case-control study and meta-analysis

    Objective: To investigate prostate cancer (Pca) risk in relation to estrogen metabolism, expressed as urinary 2-hydroxyestrone (2-OHE1), 16α-hydroxyestrone (16α-OHE1) and the 2-OHE1 to 16α-OHE1 ratio. Methods: We conducted a case-control study within the Western New York Health Cohort Study (WNYHCS) from 1996 to 2001. From January 2003 through September 2004, we completed the recall and follow-up of 1092 cohort participants. Cases (n = 26) and controls (n = 110) were matched on age, race and recruitment period according to a 1:4 ratio. We used unconditional logistic regression to compute crude and adjusted odds ratios (OR) and 95% confidence intervals (CI) of Pca in relation to 2-OHE1, 16α-OHE1 and the 2-OHE1 to 16α-OHE1 ratio by tertiles of urine concentrations (stored in a biorepository for an average of 4 years). We identified age, race, education and body mass index as covariates. We also conducted a systematic review of the literature, which revealed no additional studies, but we pooled the results from this study with those from a previously conducted case-control study using the DerSimonian-Laird random effects method. Results: We observed a non-significant risk reduction in the highest tertile of 2-OHE1 (OR 0.72, 95% CI 0.25-2.10). Conversely, the odds in the highest tertile of 16α-OHE1 showed a non-significant risk increase (OR 1.76, 95% CI 0.62-4.98). There was a suggestion of reduced Pca risk for men in the highest tertile of the 2-OHE1 to 16α-OHE1 ratio (OR 0.56, 95% CI 0.19-1.68). The pooled estimates confirmed the association between an increased Pca risk and higher urinary levels of 16α-OHE1 (third vs. first tertile: OR 1.82, 95% CI 1.09-3.05) and the protective effect of a higher 2-OHE1 to 16α-OHE1 ratio (third vs. first tertile: OR 0.53, 95% CI 0.31-0.90). Conclusion: Our study and the pooled results provide evidence for a differential role of the estrogen hydroxylation pathway in Pca development and encourage further study.
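
    As a sketch of the pooling step named above (DerSimonian-Laird random effects), the Python below combines study-level odds ratios on the log scale; the input numbers are placeholders, not the study's data.

        import math

        def dersimonian_laird(log_or, var):
            """Pool log odds ratios with DerSimonian-Laird random-effects weights."""
            k = len(log_or)
            w = [1.0 / v for v in var]                                   # fixed-effect weights
            y_fe = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)    # fixed-effect mean
            q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, log_or))  # heterogeneity statistic
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c)                           # between-study variance
            w_re = [1.0 / (v + tau2) for v in var]                       # random-effects weights
            pooled = sum(wi * yi for wi, yi in zip(w_re, log_or)) / sum(w_re)
            se = math.sqrt(1.0 / sum(w_re))
            return math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)

        # Placeholder inputs: log odds ratios and their variances from two studies.
        or_pooled, ci_lo, ci_hi = dersimonian_laird(log_or=[0.60, 0.55], var=[0.27, 0.08])
        print(f"pooled OR = {or_pooled:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")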

    Autler-Townes splitting in two-color photoassociation of 6Li

    We report on high-resolution two-color photoassociation spectroscopy in the triplet system of magneto-optically trapped 6Li. The absolute transition frequencies have been measured. Strong optical coupling of the bound molecular states has been observed as Autler-Townes splitting in the photoassociation signal. The spontaneous bound-bound transition rate is determined and the molecule formation rate is estimated. The observed lineshapes are in good agreement with the theoretical model.

    “AI’s gonna have an impact on everything in society, so it has to have an impact on public health”: a fundamental qualitative descriptive study of the implications of artificial intelligence for public health

    Background: Our objective was to determine the impacts of artificial intelligence (AI) on public health practice. Methods: We used a fundamental qualitative descriptive study design, enrolling 15 experts in public health and AI, working in North America and Asia, between June 2018 and July 2019. We conducted in-depth semi-structured interviews, iteratively coded the resulting transcripts, and analyzed the results thematically. Results: We developed 137 codes, from which nine themes emerged. The themes included opportunities such as leveraging big data and improving interventions; barriers to adoption such as confusion regarding AI’s applicability, limited capacity, and poor data quality; and risks such as propagation of bias, exacerbation of inequity, hype, and poor regulation. Conclusions: Experts are cautiously optimistic about AI’s impacts on public health practice, particularly for improving disease surveillance. However, they perceived substantial barriers, such as a lack of available expertise, and risks, including inadequate regulation. Therefore, investment and research into AI for public health practice would likely be beneficial. However, increased access to high-quality data, research and education regarding the limitations of AI, and development of rigorous regulation are necessary to realize these benefits.

    Saturation in heteronuclear photoassociation of 6Li7Li

    We report heteronuclear photoassociation spectroscopy in a mixture of magneto-optically trapped 6Li and 7Li. Hyperfine-resolved spectra of the vibrational level v=83 of the singlet state have been taken at intensities of up to 1000 W/cm^2. Saturation of the photoassociation rate has been observed for two hyperfine transitions, which can be shown to be due to saturation of the rate coefficient near the unitarity limit. Saturation intensities on the order of 40 W/cm^2 are determined.

    Improving the interpretation of quality of life evidence in meta-analyses: the application of minimal important difference units

    Systematic reviews of randomized trials that include measurements of health-related quality of life potentially provide critical information for patients and clinicians facing challenging health care decisions. When, as is most often the case, individual randomized trials use different measurement instruments for the same construct (such as physical or emotional function), authors typically report differences between intervention and control in standard deviation units (the so-called "standardized mean difference" or "effect size"). This approach has statistical limitations (it is influenced by the heterogeneity of the population) and is non-intuitive for decision makers. We suggest an alternative approach: reporting results in minimal important difference units (the smallest difference patients experience as important). This approach provides a potential solution to both the statistical and interpretational problems of existing methods.
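
    The contrast the authors draw can be written out directly (a sketch of the basic idea only; the paper's estimators and pooling details are not reproduced here): the standardized mean difference divides the between-group difference by a standard deviation, while the proposed approach divides it by the instrument's minimal important difference, so a value of 1 means the average effect equals the smallest difference patients experience as important.

        \[
          \text{SMD} = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{SD_{\text{pooled}}},
          \qquad
          \text{effect in MID units} = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{\text{MID}}
        \]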