
    Admit your weakness: Verifying correctness on TSO architectures

    Linearizability has become the standard correctness criterion for fine-grained non-atomic concurrent algorithms; however, most approaches assume a sequentially consistent memory model, which is not always realised in practice. In this paper we study the correctness of concurrent algorithms on a weak memory model: the TSO (Total Store Order) memory model, which is commonly implemented by multicore architectures. Here, linearizability is often too strict, and hence we prove a weaker criterion, quiescent consistency, instead. Like linearizability, quiescent consistency is compositional, making it an ideal correctness criterion in a component-based context. We demonstrate how to model a typical concurrent algorithm, seqlock, and prove it quiescent consistent using a simulation-based approach. Previous approaches to proving correctness on TSO architectures have been based on linearizability, which makes it necessary to modify the algorithm's high-level requirements. Our approach is, to our knowledge, the first to prove correctness without the need for such a modification.
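    For orientation, the seqlock mentioned above guards shared data with a sequence counter: the writer makes the counter odd while it updates, and readers retry whenever they observe an odd or changed counter. The sketch below is a minimal single-writer Java illustration, not the paper's formal model; Java's volatile and atomic semantics stand in for the memory fences a native TSO implementation would place explicitly.

    import java.util.concurrent.atomic.AtomicInteger;

    // Seqlock sketch: one writer, readers that retry optimistically.
    final class SeqLockPair {
        private final AtomicInteger seq = new AtomicInteger(0); // even = stable
        private volatile int x, y;                              // protected data

        // Writer: the counter stays odd for the duration of the update.
        void write(int newX, int newY) {
            seq.incrementAndGet(); // odd: write in progress
            x = newX;
            y = newY;
            seq.incrementAndGet(); // even: data stable again
        }

        // Reader: read counter, read data, re-read counter; retry on conflict.
        int[] read() {
            while (true) {
                int s = seq.get();
                if ((s & 1) != 0) continue;        // writer active
                int a = x, b = y;
                if (seq.get() == s) return new int[] { a, b };
            }
        }
    }

    The interesting question on TSO, which the paper addresses, is what such an algorithm guarantees while writes sit in store buffers; quiescent consistency, roughly, only requires operations separated by a period of quiescence (no pending calls) to take effect in order.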

    LNCS

    Concurrent accesses to shared data structures must be synchronized to avoid data races. Coarse-grained synchronization, which locks the entire data structure, is easy to implement but does not scale. Fine-grained synchronization can scale well, but is often hard to reason about. Hand-over-hand locking, in which operations are pipelined as they traverse the data structure, combines fine-grained synchronization with ease of use. However, the traditional implementation suffers from inherent overheads. This paper introduces snapshot-based synchronization (SBS), a novel hand-over-hand locking mechanism. SBS decouples the synchronization state from the data, significantly improving cache utilization. Further, it relies on guarantees provided by pipelining to minimize synchronization that requires cross-thread communication. Snapshot-based synchronization thus scales much better than traditional hand-over-hand locking, while maintaining the same ease of use.
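    For readers unfamiliar with the baseline, the Java sketch below shows traditional hand-over-hand locking (lock coupling) on a sorted linked list; it illustrates the pipelining that SBS builds on, not SBS itself, and the sentinel-node layout is an illustrative assumption.

    import java.util.concurrent.locks.ReentrantLock;

    // Traditional hand-over-hand (lock-coupling) search in a sorted list:
    // a thread always holds the lock on its current node before taking the
    // next one, so threads pipeline down the list without overtaking.
    final class CoupledList {
        private static final class Node {
            final int key;
            Node next;
            final ReentrantLock lock = new ReentrantLock();
            Node(int key, Node next) { this.key = key; this.next = next; }
        }

        // Sentinel head/tail keep the traversal free of null checks.
        private final Node head =
            new Node(Integer.MIN_VALUE, new Node(Integer.MAX_VALUE, null));

        boolean contains(int key) {
            Node pred = head;
            pred.lock.lock();
            Node curr = pred.next;
            curr.lock.lock();
            try {
                while (curr.key < key) {
                    pred.lock.unlock();   // release behind before moving ahead
                    pred = curr;
                    curr = curr.next;
                    curr.lock.lock();
                }
                return curr.key == key;
            } finally {
                pred.lock.unlock();
                curr.lock.unlock();
            }
        }
    }

    The per-node lock words that every passing thread must write are exactly the cross-thread cache traffic the abstract refers to as "inherent overheads".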

    Novel approach to analysing large data sets of personal sun exposure measurements

    Personal sun exposure measurements provide important information to guide the development of sun awareness and disease prevention campaigns. We assess the scaling properties of personal ultraviolet radiation (pUVR) sun exposure measurements using wavelet transform (WT) spectral analysis to process long-range, high-frequency personal recordings collected by electronic UVR dosimeters designed to measure erythemal UVR exposure. We analysed the sun exposure recordings of school children, farmers, marathon runners and outdoor workers in South Africa, and of construction workers and work site supervisors in New Zealand. We found scaling behaviour in all the analysed pUVR data sets, with the observed scaling changing from uncorrelated to long-range correlated as the duration of sun exposure increased. Peaks in the WT spectra suggest the existence of characteristic times in sun exposure behaviour that were, to some extent, universal across our data sets. Our study also showed that WT measures enable group classification, as well as distinction between individual UVR exposures, otherwise unattainable by conventional statistical methods.
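    As a rough illustration of this kind of analysis (hypothetical code; the paper's exact wavelet, estimator and preprocessing are not given here), the Java sketch below computes a Haar wavelet spectrum, i.e. the log of the mean squared detail coefficient per dyadic scale. In such spectra, the slope over scales estimates the scaling exponent, and local peaks mark characteristic times.

    // Haar wavelet spectrum sketch: log2 mean detail energy per dyadic level.
    // Assumes the signal length is a power of two.
    final class HaarSpectrum {
        static double[] spectrum(double[] signal) {
            int levels = Integer.numberOfTrailingZeros(signal.length);
            double[] approx = signal.clone();
            double[] logEnergy = new double[levels];
            int n = signal.length;
            for (int j = 0; j < levels; j++) {
                double energy = 0.0;
                double[] next = new double[n / 2];
                for (int i = 0; i < n / 2; i++) {
                    double d = (approx[2 * i] - approx[2 * i + 1]) / Math.sqrt(2); // detail
                    next[i] = (approx[2 * i] + approx[2 * i + 1]) / Math.sqrt(2);  // approximation
                    energy += d * d;
                }
                logEnergy[j] = Math.log(energy / (n / 2)) / Math.log(2);
                approx = next;
                n /= 2;
            }
            return logEnergy;
        }
    }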

    Assessing Historical Fish Community Composition Using Surveys, Historical Collection Data, and Species Distribution Models

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey; seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna, and then to more accurately assess trends and develop hypotheses regarding the factors driving current fish community composition, to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems like Barton Creek typically have a relatively poor historical biodiversity inventory coupled with a long history of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of broader geographic scope. Broadly applied, we propose that this technique has the potential to overcome the limitations of popular bioassessment metrics (e.g., IBI) and become a versatile and robust management tool for determining the status of freshwater biotic communities.

    Provenancing Archaeological Wool Textiles from Medieval Northern Europe by Light Stable Isotope Analysis (δ13C, δ15N, δ2H)

    We investigate the origin of archaeological wool textiles preserved by anoxic waterlogging from seven medieval archaeological deposits in north-western Europe (c. 700-1600 AD), using geospatial patterning in the carbon (δ13C), nitrogen (δ15N) and non-exchangeable hydrogen (δ2H) composition of modern and ancient sheep proteins. δ13C, δ15N and δ2H values from archaeological wool keratin (n = 83) and bone collagen (n = 59) from four sites were interpreted with reference to the composition of modern sheep wool from the same regions. The isotopic composition of wool and bone collagen samples clustered strongly by settlement; inter-regional relationships were largely parallel in modern and ancient samples, though landscape change was also significant. Degradation in archaeological wool samples, examined by elemental and amino acid composition, was greater in samples from Iceland (Reykholt) than in samples from north-east England (York, Newcastle) or northern Germany (Hessens). A nominal assignment approach was used to classify textiles as local or non-local at each site, based on maximal estimates of isotopic variability in modern sheep wool. Light element stable isotope analysis provides new insights into the origins of wool textiles and demonstrates that isotopic provenancing of keratin preserved in anoxic waterlogged contexts is feasible. We also demonstrate the utility of δ2H analysis for locating the origin of archaeological protein samples.

    Can We Monitor All Multithreaded Programs?

    Runtime Verification (RV) is a lightweight formal method which consists of verifying that an execution of a program is correct with respect to a specification. The specification formalizes the expected correct behavior of the system as a set of properties. Programs are instrumented to extract the necessary information from the execution and feed it to monitors tasked with checking the properties. From the perspective of a monitor, the system is a black box; the trace is the only system information provided. Parallel programs generally introduce an added level of complexity in the program execution due to concurrency. A concurrent execution of a parallel program is best represented as a partial order. A large number of RV approaches generate monitors using formalisms that rely on a total order, while more recent approaches utilize formalisms that consider multiple traces. In this tutorial, we review some of the main RV approaches and tools that handle multithreaded Java programs. We discuss their assumptions, limitations, expressiveness, and suitability when tackling parallel programs such as producer-consumer and readers-writers. By analyzing the interplay between specification formalisms and concurrent executions of programs, we identify four questions RV practitioners may ask themselves to classify and determine the situations in which it is sound to use the existing tools and approaches.
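    As a minimal illustration of the monitor-over-trace setup (hypothetical code, not one of the surveyed tools), the Java sketch below checks the producer-consumer safety property "never consume from an empty buffer" against an event feed supplied by instrumentation.

    // Minimal RV monitor sketch: the instrumented program reports each
    // produce/consume event; the monitor checks consumed <= produced.
    final class BufferMonitor {
        private long produced = 0, consumed = 0;
        private volatile boolean violated = false;

        // Called by instrumentation for each observed event, in trace order.
        synchronized void onEvent(String event) {
            if ("produce".equals(event)) {
                produced++;
            } else if ("consume".equals(event)) {
                consumed++;
                if (consumed > produced) violated = true; // property violated
            }
        }

        boolean isViolated() { return violated; }
    }

    Note that the synchronized method serializes events into a single total order; whether that linear trace faithfully represents the partially ordered concurrent execution is precisely the soundness question this tutorial raises.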

    Perispinal Etanercept for Post-Stroke Neurological and Cognitive Dysfunction: Scientific Rationale and Current Evidence


    Revisiting Abstraction Functions For Reasoning About Concurrency
