
    Solving the subset-sum problem with a light-based device

    We propose a special computational device that uses light rays to solve the subset-sum problem. The device has a graph-like representation, and the light traverses it by following the routes given by the connections between nodes. The nodes are connected by arcs in a special way that lets us generate all possible subsets of the given set. To each arc we assign either a number from the given set or a predefined constant. When the light passes through an arc, it is delayed by the amount of time indicated by the number placed on that arc. At the destination node we check whether there is a ray whose total delay equals the target value of the subset-sum problem (plus some constants).
    Comment: 14 pages, 6 figures, Natural Computing, 200
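    As a rough illustration of the combinatorial idea behind the device, the following Python sketch replaces optical delays with accumulated sums: at each element a "ray" either takes the arc carrying that element's value or a zero-delay bypass (the paper uses a predefined constant rather than zero, which only shifts the target by a known offset). This is a software analogue of the principle, not the optical device itself.

```python
# Software analogue of the delay-based subset-sum idea (a sketch, not the
# authors' device): each "ray" accumulates a total delay, and at every element
# it either traverses the arc labelled with that element or a zero-delay
# bypass arc.

def subset_sum_by_delays(values, target):
    delays = {0}                      # delays of all rays reaching the current node
    for v in values:
        delays = {d for old in delays for d in (old, old + v)}
    return target in delays           # a ray with this delay exists iff some subset sums to target

if __name__ == "__main__":
    print(subset_sum_by_delays([3, 5, 8, 13], 16))   # True: 3 + 13
    print(subset_sum_by_delays([3, 5, 8, 13], 22))   # False
```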

    Measuring the effect of enhanced cleaning in a UK hospital: a prospective cross-over study

    Increasing hospital-acquired infections have generated much attention over the last decade. There is evidence that hygienic cleaning has a role in the control of hospital-acquired infections. This study aimed to evaluate the potential impact of one additional cleaner by using microbiological standards based on aerobic colony counts and the presence of Staphylococcus aureus, including meticillin-resistant S. aureus (MRSA). We introduced an additional cleaner into two matched wards from Monday to Friday, with each ward receiving enhanced cleaning for six months in a cross-over design. Ten hand-touch sites on both wards were screened weekly using standardised methods, and patients were monitored for MRSA infection throughout the year-long study. Patient and environmental MRSA isolates were characterised using molecular methods in order to investigate temporal and clonal relationships. Enhanced cleaning was associated with a 32.5% reduction in levels of microbial contamination at hand-touch sites (P < 0.0001; 95% CI 20.2%, 42.9%). Near-patient sites (lockers, overbed tables and beds) were more frequently contaminated with MRSA/S. aureus than sites further from the patient (P = 0.065). Genotyping identified indistinguishable strains from both hand-touch sites and patients. There was a 26.6% reduction in new MRSA infections on the wards receiving extra cleaning, despite higher MRSA patient-days and bed occupancy rates during enhanced cleaning periods (P = 0.032; 95% CI 7.7%, 92.3%). Adjusting for MRSA patient-days, and based upon the nine new MRSA infections seen during routine cleaning, we expected 13 new infections during enhanced cleaning periods rather than the four that actually occurred. Clusters of new MRSA infections were identified 2 to 4 weeks after the cleaner left both wards. Enhanced cleaning saved the hospital £30,000 to £70,000. Introducing one extra cleaner produced a measurable effect on the clinical environment, with apparent benefit to patients regarding MRSA infection. Molecular epidemiological methods supported the possibility that patients acquired MRSA from environmental sources. These findings suggest that additional research is warranted to further clarify the environmental, clinical and economic impact of enhanced hygienic cleaning as a component in the control of hospital-acquired infection.
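    The patient-day adjustment mentioned above can be illustrated with a short calculation. The MRSA patient-day figures below are hypothetical placeholders chosen only to reproduce the reported expectation of roughly 13 infections; the abstract does not give the underlying denominators, and only the nine routine-period infections and the four observed during enhanced cleaning come from the study.

```python
# Illustrative rate adjustment for MRSA acquisitions, following the logic in
# the abstract: observed infections are scaled by MRSA patient-days.
routine_infections = 9
routine_mrsa_patient_days = 900       # hypothetical placeholder
enhanced_mrsa_patient_days = 1300     # hypothetical placeholder (higher, as reported)

rate = routine_infections / routine_mrsa_patient_days         # infections per MRSA patient-day
expected_enhanced = rate * enhanced_mrsa_patient_days
print(f"Expected under enhanced cleaning: {expected_enhanced:.0f}")  # ~13 with these figures
print("Observed under enhanced cleaning: 4")
```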

    Mycobacterium tuberculosis osteomyelitis in a patient with human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS): a case report

    The incidence of tuberculosis is increasing in the United States. Extra-pulmonary involvement is more common in patients with HIV/AIDS. Tuberculous osteomyelitis requires a high degree of suspicion for accurate and timely diagnosis.

    Bayesian Methods for Exoplanet Science

    Exoplanet research is carried out at the limits of the capabilities of current telescopes and instruments. The studied signals are weak and often embedded in complex systematics from instrumental, telluric, and astrophysical sources. Combining repeated observations of periodic events, simultaneous observations with multiple telescopes, different observation techniques, and existing information from theory and prior research can help to disentangle the systematics from the planetary signals, and offers synergistic advantages over analysing observations separately. Bayesian inference provides a self-consistent statistical framework that addresses both the necessity for complex systematics models and the need to combine prior information and heterogeneous observations. This chapter offers a brief introduction to Bayesian inference in the context of exoplanet research, with a focus on time-series analysis, and finishes with an overview of a set of freely available programming libraries.
    Comment: Invited review
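    The workflow described above can be sketched in a few lines: a prior over a single parameter (a transit depth, say) is multiplied by likelihoods from two hypothetical data sets on a parameter grid. Real exoplanet analyses use the dedicated libraries surveyed in the chapter; this toy example only shows how prior information and heterogeneous observations combine in the posterior.

```python
import numpy as np

# Grid-based Bayesian combination of a prior with two hypothetical data sets
# measuring the same fractional transit depth.
depth = np.linspace(0.0, 0.02, 2001)                  # parameter grid
prior = np.exp(-0.5 * ((depth - 0.010) / 0.005)**2)   # weakly informative Gaussian prior

def gaussian_likelihood(measurements, sigma):
    # independent Gaussian measurements of the same depth
    return np.prod([np.exp(-0.5 * ((m - depth) / sigma)**2) for m in measurements], axis=0)

like_telescope_a = gaussian_likelihood([0.0115, 0.0108], sigma=0.0010)
like_telescope_b = gaussian_likelihood([0.0102], sigma=0.0020)

posterior = prior * like_telescope_a * like_telescope_b
posterior /= posterior.sum()                          # normalise on the grid

mean = (depth * posterior).sum()
print(f"Posterior mean transit depth: {mean:.4f}")
```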

    Submillimeter Studies of Prestellar Cores and Protostars: Probing the Initial Conditions for Protostellar Collapse

    Improving our understanding of the initial conditions and earliest stages of protostellar collapse is crucial to gain insight into the origin of stellar masses, multiple systems, and protoplanetary disks. Observationally, there are two complementary approaches to this problem: (1) studying the structure and kinematics of prestellar cores observed prior to protostar formation, and (2) studying the structure of young (e.g. Class 0) accreting protostars observed soon after point mass formation. We discuss recent advances made in this area thanks to (sub)millimeter mapping observations with large single-dish telescopes and interferometers. In particular, we argue that the beginning of protostellar collapse is much more violent in cluster-forming clouds than in regions of distributed star formation. Major breakthroughs are expected in this field from future large submillimeter instruments such as Herschel and ALMA.
    Comment: 12 pages, 9 figures, to appear in the proceedings of the conference "Chemistry as a Diagnostic of Star Formation" (C.L. Curry & M. Fich, eds.)

    Decision-Making in Research Tasks with Sequential Testing

    Background: In a recent controversial essay published by J. P. A. Ioannidis in PLoS Medicine, it was argued that in some research fields most published findings are false. Based on theoretical reasoning it can be shown that small effect sizes, error-prone tests, low priors of the tested hypotheses and biases in the evaluation and publication of research findings increase the fraction of false positives. These findings raise concerns about the reliability of research. However, they are based on a very simple scenario of scientific research, where single tests are used to evaluate independent hypotheses. Methodology/Principal Findings: In this study, we present computer simulations and experimental approaches for analyzing more realistic scenarios. In these scenarios, research tasks are solved sequentially, i.e. subsequent tests can be chosen depending on previous results. We investigate simple sequential testing and scenarios where only a selected subset of results can be published and used for future rounds of test choice. Results from computer simulations indicate that, for the tasks analyzed in this study, the fraction of false findings among the positive findings declines over several rounds of testing if the most informative tests are performed. Our experiments show that human subjects frequently perform the most informative tests, leading to a decline in false positives, as expected from the simulations. Conclusions/Significance: For the research tasks studied here, findings tend to become more reliable over time. We also find that performance was surprisingly inefficient in those experimental settings where not all of the performed tests could be published. Our results may help optimize existing procedures used in the practice of scientific research and provide guidance for the development of novel forms of scholarly communication.
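    A much simplified simulation in the spirit of this abstract (not the authors' exact research tasks) illustrates the claimed effect: hypotheses with a low prior probability of being true are screened with an error-prone test, the positives are re-tested in later rounds, and the fraction of false positives among the surviving positives declines round by round. The prior, power and error rate below are illustrative choices.

```python
import random

random.seed(1)

PRIOR_TRUE = 0.1      # prior probability that a hypothesis is true
POWER = 0.8           # P(test positive | hypothesis true)
ALPHA = 0.05          # P(test positive | hypothesis false)

hypotheses = [random.random() < PRIOR_TRUE for _ in range(100_000)]

def test(is_true):
    # error-prone test: positive with probability POWER if true, ALPHA if false
    p = POWER if is_true else ALPHA
    return random.random() < p

survivors = hypotheses
for round_no in range(1, 4):
    survivors = [h for h in survivors if test(h)]
    false_fraction = sum(not h for h in survivors) / len(survivors)
    print(f"round {round_no}: {len(survivors)} positives, "
          f"{false_fraction:.1%} of them false")
```

    With these settings the fraction of false positives falls from roughly a third after one round to well under one percent after three, mirroring the qualitative result reported above.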

    Performance of Proximity Loggers in Recording Intra- and Inter-Species Interactions: A Laboratory and Field-Based Validation Study

    Knowledge of the way in which animals interact through social networks can help to address questions surrounding the ecological and evolutionary consequences of social organisation, and to understand and manage the spread of infectious diseases. Automated proximity loggers are increasingly being used to record interactions between animals, but the accuracy and reliability of the collected data remain largely unassessed. Here we use laboratory and observational field data to assess the performance of these devices fitted to a herd of 32 beef cattle (Bos taurus) and nine groups of badgers (Meles meles, n = 77) living in the surrounding woods. The distances at which loggers detected each other were found to decrease over time, potentially related to diminishing battery power, which may itself be a function of temperature. Loggers were highly accurate in recording the identity of contacted conspecifics, but less reliable at determining contact duration. There was a tendency for extended interactions to be recorded as a series of shorter contacts. We show how the data can be manipulated to correct this discrepancy and accurately reflect observed interaction patterns by combining records between any two loggers that occur within a 1- to 2-minute amalgamation window, and then removing any remaining 1-second records. We make universally applicable recommendations for the effective use of proximity loggers, to improve the validity of data arising from future studies.
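    The amalgamation step described above can be sketched as follows. The record format (logger pair plus start and end seconds), the 60-second window (the lower end of the 1- to 2-minute range), and the example identifiers are assumptions for illustration, not the actual format produced by the loggers in the study.

```python
from collections import defaultdict

def amalgamate(records, window=60):
    """Merge contacts between the same pair of loggers that fall within the
    amalgamation window, then drop any remaining 1-second records."""
    by_pair = defaultdict(list)
    for a, b, start, end in records:
        by_pair[tuple(sorted((a, b)))].append((start, end))

    cleaned = []
    for pair, contacts in by_pair.items():
        contacts.sort()
        merged = [list(contacts[0])]
        for start, end in contacts[1:]:
            if start - merged[-1][1] <= window:        # gap within the amalgamation window
                merged[-1][1] = max(merged[-1][1], end)
            else:
                merged.append([start, end])
        for start, end in merged:
            if end - start > 1:                        # discard remaining 1-second contacts
                cleaned.append((*pair, start, end))
    return cleaned

records = [("cow07", "badger12", 0, 30),
           ("cow07", "badger12", 75, 200),
           ("cow07", "badger12", 500, 501)]
print(amalgamate(records))
# [('badger12', 'cow07', 0, 200)] -- the two close contacts merge; the 1 s contact is dropped
```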