
    Quiescent consistency: Defining and verifying relaxed linearizability

    Concurrent data structures like stacks, sets or queues need to be highly optimized to provide large degrees of parallelism with reduced contention. Linearizability, a key consistency condition for concurrent objects, sometimes limits the potential for optimization. Hence algorithm designers have started to build concurrent data structures that are not linearizable but only satisfy relaxed consistency requirements. In this paper, we study quiescent consistency as proposed by Shavit and Herlihy, which is one such relaxed condition. More precisely, we give the first formal definition of quiescent consistency, investigate its relationship with linearizability, and provide a proof technique for it based on (coupled) simulations. We demonstrate our proof technique by verifying quiescent consistency of a (non-linearizable) FIFO queue built using a diffraction tree. © 2014 Springer International Publishing Switzerland
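The diffracting-tree queue mentioned in the abstract spreads operations across several sub-queues through toggling "balancers". A minimal single-threaded sketch (an illustrative toy, not the paper's verified algorithm; all names are hypothetical) shows the structure:

```python
from collections import deque

class DiffractingQueue:
    """Toy 2-way diffracting queue: a toggle 'balancer' spreads
    enqueues and dequeues across two FIFO sub-queues.
    Illustrative sketch only, not the paper's verified algorithm."""

    def __init__(self):
        self.subs = [deque(), deque()]
        self.enq_toggle = 0   # which sub-queue receives the next enqueue
        self.deq_toggle = 0   # which sub-queue serves the next dequeue

    def enqueue(self, x):
        self.subs[self.enq_toggle].append(x)
        self.enq_toggle ^= 1

    def dequeue(self):
        sub = self.subs[self.deq_toggle]
        self.deq_toggle ^= 1
        return sub.popleft() if sub else None

q = DiffractingQueue()
for x in [1, 2, 3, 4]:
    q.enqueue(x)
print([q.dequeue() for _ in range(4)])  # [1, 2, 3, 4]
```

Sequential use stays FIFO, but when threads race on the toggles the global dequeue order can deviate from enqueue order, so the queue is not linearizable; quiescent consistency only demands that operations separated by a quiescent period (no pending operations) take effect in order.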

    Depression, School Performance, and the Veridicality of Perceived Grades and Causal Attributions

    An external criterion was assessed to test whether depressives have distorted perceptions of covariation information and whether their attributions are consistent with this information. Students’ actual and self-perceived grades, depression status, and attributions for failures were assessed. Furthermore, participants estimated average grades. Generally, self-perceived past grades were inflated. Depressed students and those with low grades distorted their own grades (but not the average grade) more in their favor than individuals low in depression and those with high grades. Depression went along with lower actual grades and with internal, stable, and global failure attributions. Mood differences in attributions were not due to differences in previous grades. Depressed individuals drew (unrealistically) more depressogenic causal inferences when they perceived average grades to be low than when average grades were perceived to be high. However, they (realistically) attributed failure more in a depressogenic fashion than did nondepressives when their own grade history was low.

    Operation Moshtarak and the manufacture of credible, “heroic” warfare

    Richard Lance Keeble argues that Fleet Street’s coverage of the Afghan conflict has served largely to promote the interests of the military/industrial/media complex – and to marginalise the views of the public, who have consistently appealed in polls for the troops to be brought back home.

    On the Properties of Plastic Ablators in Laser-Driven Material Dynamics Experiments

    Radiation hydrodynamics simulations were used to study the effect of plastic ablators in laser-driven shock experiments. The sensitivity to composition and equation of state was found to be 5-10% in ablation pressure. As was found for metals, a laser pulse of constant irradiance gave a pressure history which decreased by several percent per nanosecond. The pressure history could be made more constant by adjusting the irradiance history. The impedance mismatch with the sample gave an increase of order 100% in the pressure transmitted into the sample, for a reduction of several tens of percent in the duration of the peak load applied to the sample, and structured the release history by adding a release step to a pressure close to the ablation pressure. Algebraic relations were found between the laser pulse duration, the ablator thickness, and the duration of the peak pressure applied to the sample, involving quantities calculated from the equations of state of the ablator and sample using shock dynamics.

    Influence of Nanoparticle Size and Shape on Oligomer Formation of an Amyloidogenic Peptide

    Understanding the influence of macromolecular crowding and nanoparticles on the formation of in-register β-sheets, the primary structural component of amyloid fibrils, is a first step towards describing in vivo protein aggregation and interactions between synthetic materials and proteins. Using all-atom molecular simulations in implicit solvent, we illustrate the effects of nanoparticle size, shape, and volume fraction on oligomer formation of an amyloidogenic peptide from the transthyretin protein. Surprisingly, we find that inert spherical crowding particles destabilize in-register β-sheets formed by dimers while stabilizing β-sheets comprised of trimers and tetramers. As the radius of the nanoparticle increases, crowding effects decrease, implying smaller crowding particles have the largest influence on the earliest amyloid species. We explain these results using a theory based on the depletion effect. Finally, we show that spherocylindrical crowders destabilize the ordered β-sheet dimer to a greater extent than spherical crowders, which underscores the influence of nanoparticle shape on protein aggregation.

    Transient x-ray diffraction used to diagnose shock compressed Si crystals on the Nova laser

    Transient x-ray diffraction is used to record time-resolved information about the shock compression of materials. This technique has been applied on Nova shock experiments driven using a hohlraum x-ray drive. Data were recorded from the shock release at the free surface of a Si crystal, as well as from Si at an embedded ablator/Si interface. Modeling has been done to simulate the diffraction data incorporating the strained crystal rocking curves and Bragg diffraction efficiencies. Examples of the data and post-processed simulations are presented.

    Faster linearizability checking via P-compositionality

    Linearizability is a well-established consistency and correctness criterion for concurrent data types. An important feature of linearizability is Herlihy and Wing's locality principle, which says that a concurrent system is linearizable if and only if all of its constituent parts (so-called objects) are linearizable. This paper presents P-compositionality, which generalizes the idea behind the locality principle to operations on the same concurrent data type. We implement P-compositionality in a novel linearizability checker. Our experiments with over nine implementations of concurrent sets, including Intel's TBB library, show that our linearizability checker is one order of magnitude faster and/or more space efficient than the state-of-the-art algorithm.
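The locality principle can be sketched as a brute-force checker that splits a history by object and, per object, searches for a sequential witness that respects real-time order. The history encoding and the FIFO-queue semantics below are illustrative assumptions, not the paper's algorithm:

```python
from itertools import permutations

def linearizable_fifo(ops):
    """Brute-force check: does some permutation of the complete
    operations, respecting real-time order, form a legal sequential
    FIFO-queue history?  op = (start, end, kind, value)."""
    for perm in permutations(ops):
        # real-time order: if b finished before a started, b must precede a
        if not all(not (b[1] < a[0])
                   for i, a in enumerate(perm) for b in perm[i + 1:]):
            continue
        q, legal = [], True
        for _, _, kind, v in perm:
            if kind == "enq":
                q.append(v)
            elif not q or q.pop(0) != v:
                legal = False
                break
        if legal:
            return True
    return False

def linearizable(history):
    """Locality (Herlihy & Wing): a history is linearizable iff
    every per-object subhistory is; history = [(obj, op), ...]."""
    objs = {}
    for obj, op in history:
        objs.setdefault(obj, []).append(op)
    return all(linearizable_fifo(ops) for ops in objs.values())
```

Checking each object independently is what makes the locality principle useful: the per-object searches are exponentially smaller than a search over the whole history, and P-compositionality pushes the same idea further, partitioning operations within a single object.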

    Reviews

    Miscellany. Reviewed by George Colvin. Wilkie Collins: A Critical and Biographical Study. Dorothy L. Sayers, ed. E.R. Gregory. Reviewed by J. R. Christopher. Bloodhounds of Heaven: The Detective in English Fiction from Godwin to Doyle. Ian Ousby. Reviewed by J. R. Christopher. The Dark Tower and Other Stories. C.S. Lewis, ed. Walter Hooper. Reviewed by Nancy-Lou Patterson. The Mythology of Middle-earth. Ruth S. Noel. Reviewed by Nancy-Lou Patterson. Faeries. Brian Froud and Alan Lee. Reviewed by Robert S. Ellwood Jr. Eschatus. Bruce Pennington. Reviewed by Robert S. Ellwood Jr. The Lord of the Rings. Ralph Bakshi, director; Saul Zaentz, producer. Reviewed by Steven C. Walker. The Lord of the Rings. Ralph Bakshi, director; Saul Zaentz, producer. Reviewed by Dale Ziegler.

    A wide-spectrum language for verification of programs on weak memory models

    Modern processors deploy a variety of weak memory models, which for efficiency reasons may (appear to) execute instructions in an order different to that specified by the program text. The consequences of instruction reordering can be complex and subtle, and can impact on ensuring correctness. Previous work on the semantics of weak memory models has focussed on the behaviour of assembler-level programs. In this paper we utilise that work to extract some general principles underlying instruction reordering, and apply those principles to a wide-spectrum language encompassing abstract data types as well as low-level assembler code. The goal is to support reasoning about implementations of data structures for modern processors with respect to an abstract specification. Specifically, we define an operational semantics, from which we derive some properties of program refinement, and encode the semantics in the rewriting engine Maude as a model-checking tool. The tool is used to validate the semantics against the behaviour of a set of litmus tests (small assembler programs) run on hardware, and also to model check implementations of data structures from the literature against their abstract specifications.
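The reordering at issue can be seen in the classic store-buffering litmus test: under sequential consistency the outcome r0 = r1 = 0 is impossible, while store buffers on real hardware permit it. A small enumeration of sequentially consistent interleavings (an illustrative sketch, not the paper's Maude tool) confirms the SC side:

```python
def interleavings(a, b):
    """All merges of two instruction sequences that preserve
    each thread's program order."""
    if not a:
        yield b
        return
    if not b:
        yield a
        return
    for rest in interleavings(a[1:], b):
        yield [a[0]] + rest
    for rest in interleavings(a, b[1:]):
        yield [b[0]] + rest

def run(trace):
    """Execute one interleaving against a flat shared memory."""
    mem, regs = {"x": 0, "y": 0}, {}
    for kind, loc, arg in trace:
        if kind == "st":
            mem[loc] = arg          # store immediately visible under SC
        else:
            regs[arg] = mem[loc]    # load reads the latest store
    return (regs["r0"], regs["r1"])

# Store-buffering (SB) litmus test, one instruction list per thread.
T0 = [("st", "x", 1), ("ld", "y", "r0")]
T1 = [("st", "y", 1), ("ld", "x", "r1")]

outcomes = {run(t) for t in interleavings(T0, T1)}
print(sorted(outcomes))  # (0, 0) never appears under SC
```

A weak-memory semantics would add the buffered-store outcome (0, 0) to this set; validating a semantics against hardware runs of such litmus tests is exactly the check the abstract describes, at assembler level and with Maude rather than this toy enumerator.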
