
    A wide-spectrum language for verification of programs on weak memory models

    Modern processors deploy a variety of weak memory models, which for efficiency reasons may (appear to) execute instructions in an order different to that specified by the program text. The consequences of instruction reordering can be complex and subtle, and can compromise correctness. Previous work on the semantics of weak memory models has focussed on the behaviour of assembler-level programs. In this paper we utilise that work to extract some general principles underlying instruction reordering, and apply those principles to a wide-spectrum language encompassing abstract data types as well as low-level assembler code. The goal is to support reasoning about implementations of data structures for modern processors with respect to an abstract specification. Specifically, we define an operational semantics, from which we derive some properties of program refinement, and encode the semantics in the rewriting engine Maude as a model-checking tool. The tool is used to validate the semantics against the behaviour of a set of litmus tests (small assembler programs) run on hardware, and also to model-check implementations of data structures from the literature against their abstract specifications.
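    To make the litmus-test idea concrete, the following minimal Python sketch (illustrative only, not the paper's Maude encoding) enumerates the sequentially consistent interleavings of the classic store-buffering test and confirms that the outcome r0 = r1 = 0 is unreachable without reordering; it is exactly this outcome that weaker models such as TSO permit by buffering stores past later loads.

```python
from itertools import permutations

# Store-buffering (SB) litmus test: two threads, two shared locations.
# Thread 0: x = 1; r0 = y        Thread 1: y = 1; r1 = x
# Under sequential consistency r0 == r1 == 0 is unreachable; weak models
# (e.g. TSO) allow it because stores can be delayed in a write buffer.

THREADS = [
    [("store", "x", 1), ("load", "y", "r0")],   # thread 0
    [("store", "y", 1), ("load", "x", "r1")],   # thread 1
]

def interleavings(seqs):
    """Yield every sequentially consistent interleaving, preserving
    each thread's program order."""
    tags = [i for i, s in enumerate(seqs) for _ in s]
    seen = set()
    for order in permutations(tags):
        if order in seen:
            continue
        seen.add(order)
        idx = [0] * len(seqs)
        trace = []
        for t in order:
            trace.append(seqs[t][idx[t]])
            idx[t] += 1
        yield trace

def run(trace):
    mem, regs = {"x": 0, "y": 0}, {}
    for op, loc, arg in trace:
        if op == "store":
            mem[loc] = arg
        else:                       # load into register `arg`
            regs[arg] = mem[loc]
    return regs["r0"], regs["r1"]

outcomes = {run(t) for t in interleavings(THREADS)}
print(outcomes)                     # {(0, 1), (1, 0), (1, 1)}
assert (0, 0) not in outcomes       # (0, 0) requires a weaker model than SC
```

    A model checker for a weak memory model replaces the fixed program order above with the reorderings the model's semantics permits, then asks the same reachability question.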

    Raising argument strength using negative evidence: A constraint on models of induction

    Both intuitively and according to similarity-based theories of induction, relevant evidence raises argument strength when it is positive and lowers it when it is negative. In three experiments, we tested the hypothesis that argument strength can actually increase when negative evidence is introduced. Two kinds of argument were compared through forced choice or sequential evaluation: single positive arguments (e.g., “Shostakovich’s music causes alpha waves in the brain; therefore, Bach’s music causes alpha waves in the brain”) and double mixed arguments (e.g., “Shostakovich’s music causes alpha waves in the brain, X’s music DOES NOT; therefore, Bach’s music causes alpha waves in the brain”). Negative evidence in the second premise lowered credence when it applied to an item X from the same subcategory (e.g., Haydn) and raised it when it applied to a different subcategory (e.g., AC/DC). The results constitute a new constraint on models of induction.

    MRI in multiple myeloma : a pictorial review of diagnostic and post-treatment findings

    Magnetic resonance imaging (MRI) is increasingly being used in the diagnostic work-up of patients with multiple myeloma. Since 2014, MRI findings have been included in the new diagnostic criteria proposed by the International Myeloma Working Group. Patients with smouldering myeloma presenting with more than one unequivocal focal lesion in the bone marrow on MRI are considered to have symptomatic myeloma requiring treatment, regardless of the presence of lytic bone lesions. However, bone marrow evaluation with MRI offers more than morphological information on focal lesions in patients with MM. The overall performance of MRI is enhanced by applying dynamic contrast-enhanced MRI and diffusion weighted imaging sequences, which provide additional functional information on bone marrow vascularization and cellularity. This pictorial review provides an overview of the most important imaging findings in patients with monoclonal gammopathy of undetermined significance, smouldering myeloma and multiple myeloma, obtained by performing a 'total' MRI investigation, with implications for diagnosis, staging and response assessment. Main messages: • Conventional MRI diagnoses multiple myeloma by assessing the infiltration pattern. • Dynamic contrast-enhanced MRI diagnoses multiple myeloma by assessing vascularization and perfusion. • Diffusion weighted imaging evaluates bone marrow composition and cellularity in multiple myeloma. • Combined morphological and functional MRI provides optimal bone marrow assessment for staging. • Combined morphological and functional MRI is of considerable value in treatment follow-up.

    Development of an invasively monitored porcine model of acetaminophen-induced acute liver failure

    Background: The development of effective therapies for acute liver failure (ALF) is limited by our knowledge of the pathophysiology of this condition and by the lack of suitable large animal models of acetaminophen toxicity. Our aim was to develop a reproducible, invasively monitored porcine model of acetaminophen-induced ALF. Methods: Pigs (35 kg) were maintained under general anaesthesia and invasively monitored. Control pigs received a saline infusion, whereas ALF pigs received acetaminophen intravenously for 12 hours to maintain blood concentrations between 200 and 300 mg/l. Animals surviving 28 hours were euthanased. Results: Cytochrome P450 levels in phenobarbital pre-treated animals were significantly higher than in non-pre-treated animals (300 vs 100 pmol/mg protein). Control pigs (n=4) survived 28-hour anaesthesia without incident. Of nine pigs that received acetaminophen, four survived 20 hours and two survived 28 hours. Injured animals developed hypotension (mean arterial pressure 40.8±5.9 vs 59±2.0 mmHg), increased cardiac output (7.26±1.86 vs 3.30±0.40 l/min) and decreased systemic vascular resistance (8.48±2.75 vs 16.2±1.76 mPa/s/m3). Dyspnoea developed as liver injury progressed, and the increased pulmonary vascular resistance observed (636±95 vs 301±26.9 mPa/s/m3) may reflect the development of respiratory distress syndrome. Liver damage was confirmed by deterioration in pH (7.23±0.05 vs 7.45±0.02) and prothrombin time (36±2 vs 8.9±0.3 seconds) compared with controls. Factor V and factor VII levels were reduced to 9.3% and 15.5% of starting values in injured animals. A marked increase in serum AST (471.5±210 vs 42±8.14) coincided with a marked reduction in serum albumin (11.5±1.71 vs 25±1 g/dl) in injured animals. Animals displayed evidence of renal impairment (mean creatinine 280.2±36.5 vs 131.6±9.33 μmol/l). Liver histology revealed severe centrilobular coagulative necrosis, and marked renal tubular necrosis was also seen. Methaemoglobin levels did not rise above 5%. Intracranial hypertension was not seen on ICP monitoring, but there was biochemical evidence of encephalopathy in the reduction of Fischer's ratio from 5.6±1.1 to 0.45±0.06. Conclusion: We have developed a reproducible large animal model of acetaminophen-induced liver failure that allows in-depth investigation of the pathophysiological basis of this condition. Furthermore, it represents an important large animal model for testing artificial liver support systems.
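    For orientation, the reported haemodynamics hang together via the standard relation SVR = (MAP − CVP) / CO. A small sketch, not from the study (CVP is not given in the abstract and is taken as zero, which slightly overestimates SVR):

```python
# Quick consistency check on the reported haemodynamics (a sketch, not
# from the paper): SVR = (MAP - CVP) / CO.  CVP was not reported in the
# abstract, so it is assumed to be 0 here.

def svr_wood_units(map_mmhg: float, co_l_min: float, cvp_mmhg: float = 0.0) -> float:
    """Systemic vascular resistance in Wood units (mmHg.min/l)."""
    return (map_mmhg - cvp_mmhg) / co_l_min

control = svr_wood_units(map_mmhg=59.0, co_l_min=3.30)   # ~17.9
injured = svr_wood_units(map_mmhg=40.8, co_l_min=7.26)   # ~5.6

print(f"control SVR ~ {control:.1f} Wood units, injured ~ {injured:.1f}")
# The fall in SVR mirrors the direction of the reported drop (16.2 -> 8.48)
# and is the classic hyperdynamic circulation of acute liver failure.
```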

    Vitamin D in the general population of young adults with autism in the Faroe Islands

    Vitamin D deficiency has been proposed as a possible risk factor for developing autism spectrum disorder (ASD). 25-Hydroxyvitamin D3 (25(OH)D3) levels were examined in a cross-sectional population-based study in the Faroe Islands. The case group, a total population cohort of 40 individuals with ASD (aged 15–24 years), had significantly lower 25(OH)D3 levels than their 62 typically developing siblings and their 77 parents, and also significantly lower levels than 40 healthy age- and gender-matched comparison subjects. There was a trend for males to have lower 25(OH)D3 than females. Effects of age, month/season of birth, IQ, various subcategories of ASD and Autism Diagnostic Observation Schedule score were also investigated; however, no association was found. The very low 25(OH)D3 levels in the ASD group suggest some underlying pathogenic mechanism.

    Time-dependent ARMA modeling of genomic sequences

    Background: Over the past decade, many investigators have used sophisticated time series tools for the analysis of genomic sequences. Specifically, the correlation of the nucleotide chain has been studied by examining the properties of the power spectrum. The main limitation of the power spectrum is that it is restricted to stationary time series. However, it has been observed over the past decade that genomic sequences exhibit non-stationary statistical behavior. Standard statistical tests have been used to verify that genomic sequences are indeed not stationary. More recent analysis of genomic data has relied on time-varying power spectral methods to capture the statistical characteristics of genomic sequences. Techniques such as the evolutionary spectrum and evolutionary periodogram have been successful in extracting the time-varying correlation structure. The main difficulty in using time-varying spectral methods is that they are extremely unstable: large deviations in the correlation structure result from very minor perturbations in the genomic data and experimental procedure. A fundamentally new approach is needed to provide a stable platform for the non-stationary statistical analysis of genomic sequences. Results: In this paper, we propose to model non-stationary genomic sequences by a time-dependent autoregressive moving average (TD-ARMA) process. The model is based on a classical ARMA process whose coefficients are allowed to vary with time. A series expansion of the time-varying coefficients is used to form a generalized Yule-Walker-type system of equations, and a recursive least-squares algorithm is then used to estimate the time-dependent coefficients of the model. The estimated non-stationary parameters are used as a basis for statistical inference and biophysical interpretation of genomic data. In particular, we rely on the TD-ARMA model of genomic sequences to investigate the statistical properties of, and differentiate between, coding and non-coding regions in the nucleotide chain. Specifically, we define a quantitative measure of randomness to assess how far a process deviates from white noise. Our simulation results on various gene sequences show that both the coding and non-coding regions are non-random; however, coding sequences are "whiter" than non-coding sequences, as attested by a higher index of randomness. Conclusion: We demonstrate that the proposed TD-ARMA model provides a stable time series tool for the analysis of non-stationary genomic sequences. The estimated time-varying coefficients are used to define an index of randomness to assess the statistical correlations in coding and non-coding DNA sequences. It turns out that the statistical differences between coding and non-coding sequences are more subtle than previously thought using stationary analysis tools: both coding and non-coding sequences exhibit statistical correlations, with coding regions being "whiter" than non-coding regions. These results corroborate the evolutionary periodogram analysis of genomic sequences and refute the conclusion from stationary analysis that coding DNA behaves like a random sequence.
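    A minimal sketch of the estimation idea may help: expanding the time-varying AR coefficients in a fixed basis turns the problem into ordinary linear regression, which a recursive least-squares (RLS) update solves online. The code below is illustrative only; the MA part is omitted, and the polynomial basis, parameter names, and toy data are assumptions, not the authors' implementation.

```python
import numpy as np

# Time-dependent AR model (MA part omitted for brevity):
#   x[t] = sum_i a_i(t) * x[t-i] + e[t],   a_i(t) = sum_j c_ij * f_j(t),
# with a polynomial basis f_j(t) = t**j.  The constant weights c_ij are
# estimated by recursive least squares, in the spirit of a generalized
# Yule-Walker / RLS scheme.

def fit_td_ar(x, p=2, basis_order=1, lam=1.0):
    """Return c, shape (p, basis_order+1): basis weights of each AR coeff."""
    n_param = p * (basis_order + 1)
    theta = np.zeros(n_param)
    P = 1e3 * np.eye(n_param)              # inverse-covariance estimate
    T = len(x)
    for t in range(p, T):
        tau = t / T                        # normalized time in [0, 1]
        f = np.array([tau**j for j in range(basis_order + 1)])
        phi = np.concatenate([x[t - i] * f for i in range(1, p + 1)])
        err = x[t] - phi @ theta
        k = P @ phi / (lam + phi @ P @ phi)    # RLS gain
        theta += k * err
        P = (P - np.outer(k, phi @ P)) / lam   # lam = 1.0: no forgetting,
    return theta.reshape(p, basis_order + 1)   # the basis tracks the drift

# Toy use: a numeric encoding of a nucleotide chain (e.g. A,C,G,T -> 0..3,
# centered) would be passed in place of this synthetic AR(2) series whose
# first coefficient drifts linearly from 0.3 to ~0.7.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    a1 = 0.3 + 0.4 * (t / 500)
    x[t] = a1 * x[t-1] - 0.2 * x[t-2] + rng.standard_normal()
c = fit_td_ar(x, p=2, basis_order=1)
print("a1(t) ~ %.2f + %.2f t,  a2(t) ~ %.2f + %.2f t" % (*c[0], *c[1]))
```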

    Estimating Genetic Variability in Non-Model Taxa: A General Procedure for Discriminating Sequence Errors from Actual Variation

    Genetic variation is the driving force of evolution and as such is of central interest to biologists. However, inadequate discrimination of errors from true genetic variation can lead to incorrect estimates of gene copy number, population genetic parameters and phylogenetic relationships, and to the deposition in databases of gene and protein sequences that are not actually present in any organism. Misincorporation errors in multi-template PCR cloning methods, still commonly used for obtaining novel gene sequences in non-model species, are difficult to detect, as no prior information may be available about the number of expected copies of genes belonging to multi-gene families. However, studies employing these techniques rarely describe in any detail how errors arising in the amplification process were detected and accounted for. Here, using a single-copy mitochondrial gene from a single individual to minimise variation in the template DNA, we estimated the rate of base misincorporation of a widely-used PCR-cloning method as 1.62×10⁻³ errors per site, or 9.26×10⁻⁵ errors per site per duplication. The distribution of errors among sequences closely matched that predicted by a binomial distribution function. The empirically estimated error rate was then applied to data, obtained using the same methods, from the phospholipase A2 toxin family of the pitviper Ovophis monticola. The distribution of differences detected closely matched the expected distribution of errors, and we conclude that, when undertaking gene discovery or assessment of genetic diversity using this error-prone method, it is informative to empirically determine the rate of base misincorporation.
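    The binomial reasoning is easy to make concrete: with per-site error rate p and sequence length L, the number of misincorporations per clone should follow Binomial(L, p), so clones far out in the tail of that distribution are candidates for genuine variants. The sketch below uses the abstract's estimate of p; the sequence length and clone count are invented for illustration.

```python
from math import comb

# Expected distribution of PCR misincorporation errors per cloned sequence:
# with per-site error rate p and sequence length L, the error count per
# clone is Binomial(L, p).  p is the abstract's empirical estimate; the
# length L and clone count below are illustrative, not from the paper.

P_ERR = 1.62e-3          # errors per site (empirical estimate)
L = 700                  # sequence length in bp (assumed)
N_CLONES = 50            # clones sequenced (assumed)

def binom_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in range(5):
    expected = N_CLONES * binom_pmf(k, L, P_ERR)
    print(f"{k} differences from consensus: expect ~{expected:.1f} clones")

# Clones whose difference count sits far in the tail of this distribution
# are candidates for genuine variants (e.g. additional gene copies) rather
# than amplification artefacts.
```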

    A Genetic Screen for Attenuated Growth Identifies Genes Crucial for Intraerythrocytic Development of Plasmodium falciparum

    A majority of the Plasmodium falciparum genome codes for genes with unknown functions, which presents a major challenge to understanding the parasite's biology. Large-scale functional analysis of the parasite genome is essential to pave the way for novel therapeutic intervention strategies against the disease, yet difficulties in the genetic manipulation of this deadly human malaria parasite have been a major hindrance to functional analysis of its genome. Here, we used a forward functional genomic approach to study P. falciparum and identify genes important for optimal parasite development in the disease-causing intraerythrocytic stages. We analyzed 123 piggyBac insertion mutants of P. falciparum for in vitro proliferation efficiency in the intraerythrocytic stages. Almost 50% of the analyzed mutants showed significantly reduced proliferation efficiency, with 20% displaying severe defects. Functional categorization of genes in the severely attenuated mutants revealed significant enrichment for RNA-binding proteins, suggesting the significance of post-transcriptional gene regulation in parasite development and emphasizing its importance as an antimalarial target. This study demonstrates the feasibility of much-needed forward genetics approaches for P. falciparum to better characterize its genome and accelerate drug and vaccine development.
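    As a hedged illustration of what scoring proliferation efficiency can involve (this is not the study's pipeline), one can compare each mutant's per-cycle exponential growth rate against a wild-type control and flag attenuated lines. All counts and thresholds below are invented.

```python
import math

# Sketch of scoring insertional mutants for attenuated intraerythrocytic
# growth: fitness is the mutant's per-cycle growth rate relative to a
# wild-type control.  Counts and thresholds are illustrative only.

def growth_rate(counts, cycle_hours=48.0):
    """Exponential growth rate per intraerythrocytic cycle, from
    parasitemia measurements taken once per cycle."""
    cycles = len(counts) - 1
    return math.log(counts[-1] / counts[0]) / cycles

def classify(mutant_counts, wt_counts):
    rel = growth_rate(mutant_counts) / growth_rate(wt_counts)
    if rel < 0.5:
        return rel, "severely attenuated"
    if rel < 0.9:
        return rel, "attenuated"
    return rel, "near wild-type"

wt = [0.1, 0.8, 6.4]          # % parasitemia over two cycles (toy data)
mutant = [0.1, 0.3, 0.9]
rel, label = classify(mutant, wt)
print(f"relative fitness {rel:.2f}: {label}")   # ~0.53: attenuated
```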

    Computational Fluid Dynamics of Catalytic Reactors

    Today, the challenge in chemical and material synthesis is not only the development of new catalysts and supports to synthesize a desired product, but also the understanding of the interaction of the catalyst with the surrounding flow field. Computational Fluid Dynamics (CFD) is the analysis of fluid flow, heat and mass transfer, and chemical reactions by means of computer-based numerical simulation. CFD has matured into a powerful tool with a wide range of applications in industry and academia. From a reaction engineering perspective, its main advantages are the reduction of time and cost in reactor design and optimization, and the ability to study systems where experiments can hardly be performed, e.g., under hazardous conditions or beyond normal operating limits. However, simulation results always remain a reflection of the uncertainty in the underlying models and physicochemical parameters, so careful experimental validation is generally required. This chapter introduces the application of CFD simulations in heterogeneous catalysis. Catalytic reactors can be classified by the geometrical design of the catalyst material (e.g., monoliths, particles, pellets, washcoats), and approaches for modeling and numerically simulating the various catalyst types are presented. The focus is on the principal concepts for coupling the physical and chemical processes at different levels of detail, and on illustrative applications. Models for surface reaction kinetics and turbulence are described, and an overview of available numerical methods and computational tools is provided.
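    As a minimal illustration of the flow/surface-chemistry coupling discussed here (far simpler than a real CFD model), consider an isothermal plug-flow channel whose wall carries a first-order catalytic reaction; the species balance then reduces to a single ODE along the channel axis. All parameter values are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal flow/surface-chemistry coupling: an isothermal 1D plug-flow
# channel whose wall carries a first-order catalytic reaction A -> B.
# A full CFD treatment resolves the velocity and species fields; here
# radial transport is collapsed into an effective surface rate constant.

u = 1.0        # axial velocity, m/s (assumed)
d = 1e-3       # channel diameter, m (assumed)
k_s = 0.01     # surface rate constant, m/s (assumed)
a_v = 4.0 / d  # catalytic surface area per unit volume of a tube, 1/m

def dc_dz(z, c):
    # u * dc/dz = -k_s * a_v * c   (species balance along the channel)
    return -k_s * a_v * c / u

sol = solve_ivp(dc_dz, t_span=(0.0, 0.05), y0=[1.0],   # 5 cm channel
                t_eval=np.linspace(0.0, 0.05, 6))
for z, c in zip(sol.t, sol.y[0]):
    print(f"z = {z*100:4.1f} cm   c/c0 = {c:.3f}")
# Outlet conversion follows 1 - exp(-k_s * a_v * L / u); for these
# values, roughly 86% of A reacts over the 5 cm channel.
```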