
    Layered Concurrent Programs

    We present layered concurrent programs, a compact and expressive notation for specifying refinement proofs of concurrent programs. A layered concurrent program specifies a sequence of connected concurrent programs, from most concrete to most abstract, such that common parts of different programs are written exactly once. These programs are expressed in the ordinary syntax of imperative concurrent programs using gated atomic actions, sequencing, choice, and (recursive) procedure calls. Each concurrent program is automatically extracted from the layered program. We reduce refinement to the safety of a sequence of concurrent checker programs, one for each pair of consecutive programs, justifying the connection between them. These checker programs are also automatically extracted from the layered program. Layered concurrent programs have been implemented in the CIVL verifier, which has been successfully used for the verification of several complex concurrent programs.
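
    To make the extraction idea concrete, here is a minimal toy sketch (Python, not CIVL syntax or the paper's algorithm): each statement records the range of layers at which it is visible, and the program at layer i keeps exactly the statements whose range covers i. All names and the representation below are hypothetical.

        # Toy model of per-layer program extraction (illustrative only, not CIVL's representation).
        from dataclasses import dataclass

        @dataclass
        class Stmt:
            text: str   # statement in some concrete syntax
            lo: int     # first layer at which the statement is visible
            hi: int     # last layer at which the statement is visible (hidden above this)

        def extract(layered_body, layer):
            """Return the statements of a procedure body as they appear at the given layer."""
            return [s.text for s in layered_body if s.lo <= layer <= s.hi]

        # A procedure body written once and shared by layers 1..3:
        body = [
            Stmt("call log_entry()",      lo=1, hi=3),  # common part, written exactly once
            Stmt("acquire(lock)",         lo=1, hi=1),  # low-level locking, hidden above layer 1
            Stmt("t := x; x := t + 1",    lo=1, hi=1),  # concrete update
            Stmt("release(lock)",         lo=1, hi=1),
            Stmt("atomic { x := x + 1 }", lo=2, hi=3),  # gated atomic action summarizing the update
        ]

        for layer in (1, 2, 3):
            print(f"layer {layer}:", extract(body, layer))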

    Invariant Synthesis for Incomplete Verification Engines

    We propose a framework for synthesizing inductive invariants for incomplete verification engines, which soundly reduce logical problems in undecidable theories to decidable theories. Our framework is based on the counterexample-guided inductive synthesis (CEGIS) principle and allows verification engines to communicate non-provability information to guide invariant synthesis. We show precisely how the verification engine can compute such non-provability information and how to build effective learning algorithms when invariants are expressed as Boolean combinations of a fixed set of predicates. Moreover, we evaluate our framework in two verification settings, one in which verification engines need to handle quantified formulas and one in which verification engines have to reason about heap properties expressed in an expressive but undecidable separation logic. Our experiments show that our invariant synthesis framework based on non-provability information can both effectively synthesize inductive invariants and adequately strengthen contracts across a large suite of programs.
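
    As a rough illustration of the loop structure only (hypothetical interfaces, not the paper's algorithm): a learner proposes a conjunction over the fixed predicate set, and the possibly incomplete verification engine either accepts it, returns a counterexample state, or reports non-provability information, modeled here simply as a predicate that could not be proven and is dropped.

        # Illustrative CEGIS-style loop for invariant synthesis (hypothetical interfaces).

        def cegis(predicates, check):
            """
            predicates: fixed set of candidate atomic predicates (names).
            check(candidate): verification-engine stub returning one of
                ("ok", None)      -- candidate is an inductive invariant
                ("cex", state)    -- a reachable state violating the candidate
                ("unknown", pred) -- non-provability info: pred could not be proven inductive
            """
            candidate = set(predicates)  # start from the strongest conjunction
            while candidate:
                verdict, info = check(frozenset(candidate))
                if verdict == "ok":
                    return candidate
                if verdict == "cex":
                    # weaken: drop predicates falsified by the counterexample state
                    candidate = {p for p in candidate if info.get(p, True)}
                else:
                    # non-provability information: this predicate blocks the proof, drop it
                    candidate.discard(info)
            raise RuntimeError("no invariant expressible over the given predicates")

        # Tiny demo: pretend the engine can only prove {"x >= 0"} inductive.
        def toy_check(cand):
            if cand == frozenset({"x >= 0"}):
                return ("ok", None)
            if "x == 0" in cand:
                return ("cex", {"x == 0": False, "x >= 0": True})  # state with x = 3
            return ("unknown", "x <= 10")

        print(cegis({"x >= 0", "x == 0", "x <= 10"}, toy_check))   # -> {'x >= 0'}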

    Quantum nature of laser light

    All compositions of a mixed-state density operator are equivalent for the prediction of the probabilities of future outcomes of measurements. For retrodiction, however, this is not the case. The retrodictive formalism of quantum mechanics provides a criterion for deciding that some compositions are fictional. Fictional compositions do not contain preparation device operators, that is, operators corresponding to states that could have been prepared. We apply this to Molmer's controversial conjecture that optical coherences in laser light are a fiction and find agreement with his conjecture. We generalise Molmer's derivation of the interference between two lasers to avoid the use of any fictional states. We also examine another possible method for discriminating between coherent states and photon number states in laser light and find that it does not work, with the equivalence for prediction saved by entanglement.
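
    For context, the prediction-equivalence of compositions referred to above includes, in particular, the standard identity that a uniformly phase-averaged coherent state equals a Poissonian mixture of photon-number states (mean photon number \bar n = |\alpha|^2); the retrodictive criterion is what decides which of these compositions could correspond to states that were actually prepared:

        \rho \;=\; \frac{1}{2\pi}\int_0^{2\pi} \big| |\alpha| e^{i\phi} \big\rangle \big\langle |\alpha| e^{i\phi} \big| \, d\phi
             \;=\; \sum_{n=0}^{\infty} e^{-\bar n}\, \frac{\bar n^{\,n}}{n!}\, |n\rangle\langle n| , \qquad \bar n = |\alpha|^2 .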

    Suitability of Tilting Technology to the Tyne and Wear Metro System.

    This paper attempts to determine the suitability of tilting technology as applied to metro systems, taking the Tyne and Wear Metro as its base case study. This is done by designing and implementing several tests that characterise the current Metro situation and reveal the possible impacts on ride comfort and speed had tilting technology been implemented. The paper provides a brief background literature review on tilting technology, its different designs and types, control systems, customer satisfaction, and the history of the Tyne and Wear Metro system. Ride comfort evaluation methods, testing of the Metro fleet's comfort levels, and simulation modelling using the OpenTrack simulator software are also introduced. Results and findings include test accuracy and validation and suggest that although tilting technology could be beneficial with respect to speed (minimal improvements) and comfort, implementing it on the Tyne and Wear Metro would be an unwise decision owing to the immense amount of upgrades that would be needed to both the network and the Metro car fleet. Recommendations are therefore made on alternative systems which could achieve or surpass the levels of comfort achievable with tilting technology without the need for an outright overhaul of lines and trains.
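
    As a rough illustration of the speed benefit tilting is meant to deliver (not figures or methods from the paper): the lateral acceleration felt on a curve is approximately v^2/R reduced by the gravity component from track cant plus body tilt, so for a fixed comfort limit a larger tilt angle permits a higher curving speed. A short sketch with assumed, hypothetical values:

        import math

        G = 9.81  # m/s^2

        def max_curve_speed(radius_m, cant_plus_tilt_deg, comfort_limit_ms2=1.0):
            """Approximate maximum curving speed (m/s) for a given lateral-comfort limit.
            Felt lateral acceleration ~ v^2/R - g*sin(cant + tilt); solve for v at the limit."""
            theta = math.radians(cant_plus_tilt_deg)
            return math.sqrt(radius_m * (comfort_limit_ms2 + G * math.sin(theta)))

        R = 300.0  # hypothetical curve radius in metres
        for tilt in (0.0, 4.0, 8.0):   # extra body tilt on top of an assumed 6-degree cant
            v = max_curve_speed(R, 6.0 + tilt)
            print(f"tilt {tilt:>3.0f} deg: ~{v * 3.6:.0f} km/h")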

    Dopamine Beta Hydroxylase Genotype Identifies Individuals Less Susceptible to Bias in Computer-Assisted Decision Making

    Computerized aiding systems can assist human decision makers in complex tasks but can impair performance when they provide incorrect advice that humans erroneously follow, a phenomenon known as “automation bias.” The extent to which people exhibit automation bias varies significantly and may reflect inter-individual variation in the capacity of working memory and the efficiency of executive function, both of which are highly heritable and under dopaminergic and noradrenergic control in prefrontal cortex. The dopamine beta hydroxylase (DBH) gene is thought to regulate the differential availability of dopamine and norepinephrine in prefrontal cortex. We therefore examined decision-making performance under imperfect computer aiding in 100 participants performing a simulated command and control task. Based on two single nucleotide polymorphisms (SNPs) of the DBH gene, −1041 C/T (rs1611115) and 444 G/A (rs1108580), participants were divided into groups of low and high DBH enzyme activity, where low enzyme activity is associated with greater dopamine relative to norepinephrine levels in cortex. Compared to those in the high DBH enzyme activity group, individuals in the low DBH enzyme activity group were more accurate and faster in their decisions when incorrect advice was given and verified automation recommendations more frequently. These results indicate that a gene that regulates relative prefrontal cortex dopamine availability, DBH, can identify those individuals who are less susceptible to bias in using computerized decision-aiding systems.
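
    The grouping used in the study can be pictured as a genotype-to-group mapping followed by a between-group comparison. The allele-to-group rule and the data below are hypothetical placeholders, not the study's actual assignment or results:

        # Hypothetical illustration of genotype-based grouping and a between-group comparison.
        from statistics import mean

        def dbh_activity_group(rs1611115, rs1108580):
            """Assign a participant to a DBH enzyme-activity group from two SNP genotypes.
            The allele-to-group rule here is a placeholder, not the study's actual rule."""
            low_activity_alleles = rs1611115.count("T") + rs1108580.count("A")
            return "low" if low_activity_alleles >= 2 else "high"

        participants = [                                           # made-up data
            {"rs1611115": "CT", "rs1108580": "GA", "accuracy": 0.91},
            {"rs1611115": "CC", "rs1108580": "GG", "accuracy": 0.78},
            {"rs1611115": "TT", "rs1108580": "GA", "accuracy": 0.88},
        ]

        by_group = {"low": [], "high": []}
        for p in participants:
            by_group[dbh_activity_group(p["rs1611115"], p["rs1108580"])].append(p["accuracy"])

        for group, accs in by_group.items():
            if accs:
                print(group, "mean accuracy on incorrect-advice trials:", round(mean(accs), 2))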

    Synergistic roles of climate warming and human occupation in Patagonian megafaunal extinctions during the Last Deglaciation.

    The causes of Late Pleistocene megafaunal extinctions (60,000 to 11,650 years ago, hereafter 60 to 11.65 ka) remain contentious, with major phases coinciding with both human arrival and climate change around the world. The Americas provide a unique opportunity to disentangle these factors as human colonization took place over a narrow time frame (~15 to 14.6 ka) but during contrasting temperature trends across each continent. Unfortunately, limited data sets in South America have so far precluded detailed comparison. We analyze genetic and radiocarbon data from 89 and 71 Patagonian megafaunal bones, respectively, more than doubling the high-quality Pleistocene megafaunal radiocarbon data sets from the region. We identify a narrow megafaunal extinction phase 12,280 ± 110 years ago, some 1 to 3 thousand years after initial human presence in the area. Although humans arrived immediately prior to a cold phase, the Antarctic Cold Reversal stadial, megafaunal extinctions did not occur until the stadial finished and the subsequent warming phase commenced some 1 to 3 thousand years later. The increased resolution provided by the Patagonian material reveals that the sequence of climate and extinction events in North and South America were temporally inverted, but in both cases, megafaunal extinctions did not occur until human presence and climate warming coincided. Overall, metapopulation processes involving subpopulation connectivity on a continental scale appear to have been critical for megafaunal species survival of both climate change and human impacts.

    Comparing estimates of influenza-associated hospitalization and death among adults with congestive heart failure based on how influenza season is defined

    Background: There is little consensus about how the influenza season should be defined in studies that assess influenza-attributable risk. The objective of this study was to compare estimates of influenza-associated risk in a defined clinical population using four different methods of defining the influenza season. Methods: Using the Studies of Left Ventricular Dysfunction (SOLVD) clinical database and national influenza surveillance data from 1986–87 to 1990–91, four definitions were used to assess influenza-associated risk: (a) the three-week moving average of positive influenza isolates is at least 5%, (b) the three-week moving average of positive influenza isolates is at least 10%, (c) the first and last positive influenza isolates are identified, and (d) 5% of the total number of positive isolates for the season have been obtained. The clinical data were from adults aged 21 to 80 with physician-diagnosed congestive heart failure. All-cause hospitalization and all-cause mortality during the influenza seasons and non-influenza seasons were compared using the four definitions of the influenza season. Incidence analyses and Cox regression were used to assess the effect of exposure to influenza season on all-cause hospitalization and death under all four definitions. Results: There was a higher risk of hospitalization associated with the influenza season, regardless of how the start and end of the influenza season were defined. The adjusted risk of hospitalization was 8 to 10 percent higher during the influenza season compared to the non-influenza season across the different definitions. However, exposure to influenza was not consistently associated with a higher risk of death across all definitions. When the 5% moving average and first/last positive isolate definitions were used, exposure to influenza was associated with a higher risk of death compared to non-exposure in this clinical population (adjusted hazard ratio [HR], 1.16; 95% confidence interval [CI], 1.04 to 1.29 and adjusted HR, 1.19; 95% CI, 1.06 to 1.33, respectively). Conclusion: Estimates of influenza-attributable risk may vary depending on how the influenza season is defined and the outcome being assessed.
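
    The first two definitions can be made concrete as a simple moving-average rule over weekly surveillance data. The sketch below flags weeks belonging to the influenza season under definition (a) or (b); the series and threshold handling are illustrative, not the study's code or data:

        # A week is "in season" when the three-week moving average of the percentage of
        # positive influenza isolates reaches the threshold (5% for (a), 10% for (b)).

        def in_season(percent_positive, threshold=5.0, window=3):
            """Return a boolean flag per week using a trailing moving average of % positive."""
            flags = []
            for i in range(len(percent_positive)):
                lo = max(0, i - window + 1)
                avg = sum(percent_positive[lo:i + 1]) / (i + 1 - lo)
                flags.append(avg >= threshold)
            return flags

        weekly_pct_positive = [0, 1, 2, 6, 9, 14, 12, 7, 3, 1]   # made-up surveillance series
        print(in_season(weekly_pct_positive, threshold=5.0))     # definition (a)
        print(in_season(weekly_pct_positive, threshold=10.0))    # definition (b)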

    A Large Change in Temperature between Neighbouring Days Increases the Risk of Mortality

    Background: Previous studies have found high temperatures increase the risk of mortality in summer. However, little is known about whether a sharp decrease or increase in temperature between neighbouring days has any effect on mortality. Method: Poisson regression models were used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. The temperature change was calculated as the current day’s mean temperature minus the previous day’s mean. Results: In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.157 (95% confidence interval (CI): 1.024, 1.307) for total non-external mortality (NEM), 1.186 (95% CI: 1.002, 1.405) for NEM in females, and 1.442 (95% CI: 1.099, 1.892) for people aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.353 (95% CI: 1.033, 1.772) for cardiovascular mortality and 1.667 (95% CI: 1.146, 2.425) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with RRs of 1.133 (95% CI: 1.053, 1.219) for total NEM, 1.252 (95% CI: 1.131, 1.386) for cardiovascular mortality, and 1.254 (95% CI: 1.135, 1.385) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. Conclusion: A significant change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for the current temperature.
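
    The exposure variable described above is just the first difference of the daily mean temperature series. A brief sketch of how it and the >3 °C drop/rise indicators could be derived, with made-up data (not the study's code or data):

        # Day-to-day temperature change and the >3 °C drop / rise indicators used as exposures.

        daily_mean_temp = [24.1, 25.0, 21.4, 22.0, 26.3, 25.8]   # made-up daily means (°C)

        rows = []
        for today, yesterday in zip(daily_mean_temp[1:], daily_mean_temp[:-1]):
            change = today - yesterday              # current day's mean minus previous day's mean
            rows.append({
                "temp_change": round(change, 1),
                "drop_gt_3": change < -3.0,         # decrease of more than 3 °C
                "rise_gt_3": change > 3.0,          # increase of more than 3 °C
            })

        for r in rows:
            print(r)

        # These indicators (plus the current day's mean temperature) would then enter a
        # Poisson regression of daily death counts, e.g. a GLM with a Poisson family,
        # to estimate relative risks of the kind reported above.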