469 research outputs found

    Admit your weakness: Verifying correctness on TSO architectures

    Get PDF
    “The final publication is available at http://link.springer.com/chapter/10.1007%2F978-3-319-15317-9_22”. Linearizability has become the standard correctness criterion for fine-grained non-atomic concurrent algorithms; however, most approaches assume a sequentially consistent memory model, which is not always realised in practice. In this paper we study the correctness of concurrent algorithms on a weak memory model: the TSO (Total Store Order) memory model, which is commonly implemented by multicore architectures. Here, linearizability is often too strict, and hence we prove a weaker criterion, quiescent consistency, instead. Like linearizability, quiescent consistency is compositional, making it an ideal correctness criterion in a component-based context. We demonstrate how to model a typical concurrent algorithm, seqlock, and prove it quiescent consistent using a simulation-based approach. Previous approaches to proving correctness on TSO architectures have been based on linearizability, which makes it necessary to modify the algorithm’s high-level requirements. Our approach is, to our knowledge, the first to prove correctness without the need for such a modification.
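
    The seqlock the paper models pairs its data with a version counter: a writer makes the counter odd while updating and even again when finished, and a reader retries whenever the counter was odd or changed during its read. The C++ sketch below is only a minimal single-writer illustration under our own naming; it uses sequentially consistent atomics and therefore side-steps exactly the TSO-level reasoning the paper contributes.

    #include <atomic>
    #include <cstdint>
    #include <utility>

    // Minimal single-writer seqlock sketch (illustrative only, not the paper's model).
    class SeqLock {
        std::atomic<uint64_t> version{0};  // odd while a write is in progress
        std::atomic<int> x{0}, y{0};       // the protected data

    public:
        void write(int newX, int newY) {   // single writer assumed
            version.fetch_add(1);          // counter becomes odd: write started
            x.store(newX);
            y.store(newY);
            version.fetch_add(1);          // counter becomes even: write finished
        }

        std::pair<int, int> read() const {
            for (;;) {
                uint64_t v1 = version.load();
                if (v1 & 1) continue;      // a write is in progress, retry
                int a = x.load();
                int b = y.load();
                if (version.load() == v1)  // unchanged: snapshot is consistent
                    return {a, b};
            }
        }
    };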

    On the nature of progress

    Get PDF
    15th International Conference, OPODIS 2011, Toulouse, France, December 13-16, 2011. Proceedings. We identify a simple relationship that unifies seemingly unrelated progress conditions, ranging from the deadlock-free and starvation-free properties common to lock-based systems to non-blocking conditions such as obstruction-freedom, lock-freedom, and wait-freedom. Properties can be classified along two dimensions based on the demands they make on the operating system scheduler. A gap in the classification reveals a new non-blocking progress condition, weaker than obstruction-freedom, which we call clash-freedom. The classification provides an intuitively appealing explanation of why programmers continue to devise data structures that mix both blocking and non-blocking progress conditions. It also explains why the wait-free property is a natural basis for the consensus hierarchy: a theory of shared-memory computation requires an independent progress condition, not one that makes demands of the operating system scheduler.
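
    The classification can be made concrete with three counters written against the usual shared-memory primitives: a lock-based counter (its progress depends on the scheduler letting the current lock holder run), a CAS-loop counter (lock-free: some thread always completes, though an individual thread may starve), and a fetch-and-add counter (wait-free: every thread completes in a bounded number of its own steps). The C++ sketch below is our own illustration, not taken from the paper.

    #include <atomic>
    #include <mutex>

    // Blocking: progress depends on the scheduler eventually running the
    // thread that currently holds the lock (deadlock-/starvation-freedom).
    class LockedCounter {
        std::mutex m;
        long value = 0;
    public:
        long increment() { std::lock_guard<std::mutex> g(m); return ++value; }
    };

    // Lock-free: at every step *some* thread's CAS succeeds, but any given
    // thread may retry indefinitely if it keeps losing the race.
    class LockFreeCounter {
        std::atomic<long> value{0};
    public:
        long increment() {
            long old = value.load();
            while (!value.compare_exchange_weak(old, old + 1)) { /* retry */ }
            return old + 1;
        }
    };

    // Wait-free: every thread finishes in a bounded number of its own steps,
    // independently of what the scheduler does to the other threads.
    class WaitFreeCounter {
        std::atomic<long> value{0};
    public:
        long increment() { return value.fetch_add(1) + 1; }
    };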

    Filtering Erroneous Soundings from Multibeam Survey Data

    Get PDF
    As part of its continuing efforts to improve data quality, the National Oceanic and Atmospheric Administration (NOAA) has recently implemented a "prefiltering" procedure designed to identify and remove erroneous or questionable soundings from multibeam sonar data collected in support of the United States Exclusive Economic Zone Bathymetric Mapping Programme. Since the start of the 1991 field season, a simple yet effective prefiltering algorithm has been incorporated into the standard post-processing software used aboard NOAA ships equipped with MicroVAX-based survey systems. The prefiltering routine is also being utilized as part of NOAA's current effort to convert its archive of older PDP-11 multibeam surveys to standard full-resolution "beam" format. The sounding verification criteria employed by the prefiltering algorithm are discussed in detail, and statistical results from the first season of its implementation are presented.
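
    The abstract does not reproduce the verification criteria themselves, so the sketch below is only a hypothetical illustration of the kind of test such a prefilter applies: flag any beam whose depth deviates from the median of its neighbours in the swath by more than a chosen tolerance. The function name, window size, and tolerance are all assumptions, not NOAA's published procedure.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Hypothetical neighbour-consistency check (illustration only): flag a
    // sounding whose depth differs from the median of the surrounding beams
    // by more than a set fraction of that median.
    std::vector<bool> flagSuspectBeams(const std::vector<double>& depths,
                                       int window = 5, double tolerance = 0.10) {
        const int n = static_cast<int>(depths.size());
        std::vector<bool> suspect(n, false);
        for (int i = 0; i < n; ++i) {
            int lo = std::max(0, i - window / 2);
            int hi = std::min(n, i + window / 2 + 1);
            std::vector<double> nbrs(depths.begin() + lo, depths.begin() + hi);
            std::nth_element(nbrs.begin(), nbrs.begin() + nbrs.size() / 2, nbrs.end());
            double median = nbrs[nbrs.size() / 2];
            if (std::fabs(depths[i] - median) > tolerance * std::fabs(median))
                suspect[i] = true;  // candidate for removal or manual review
        }
        return suspect;
    }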

    Supervision Experiences of School Counselors-in-Training: An Interpretative Phenomenological Study

    Get PDF
    School counselors-in-training receive university and site supervision during their field experiences. University supervision may be provided by a faculty member or doctoral student who lacks school counseling experience, and school counselors serving as site supervisors may not be trained to supervise. Further, the multiple systems involved may have differing expectations for supervisees. Interpretative Phenomenological Analysis was used to explore the lived supervision experiences of eight master’s-level school counselors-in-training. The four super-ordinate themes were: impact of the counselor education program, supervisor characteristics, significance of feedback, and characteristics of the supervisee. Findings suggested programmatic changes counselor educators can make to strengthen student preparation.

    Quiescent consistency: Defining and verifying relaxed linearizability

    Get PDF
    Concurrent data structures like stacks, sets, or queues need to be highly optimized to provide large degrees of parallelism with reduced contention. Linearizability, a key consistency condition for concurrent objects, sometimes limits the potential for optimization. Hence, algorithm designers have started to build concurrent data structures that are not linearizable but only satisfy relaxed consistency requirements. In this paper, we study quiescent consistency as proposed by Shavit and Herlihy, which is one such relaxed condition. More precisely, we give the first formal definition of quiescent consistency, investigate its relationship with linearizability, and provide a proof technique for it based on (coupled) simulations. We demonstrate our proof technique by verifying quiescent consistency of a (non-linearizable) FIFO queue built using a diffraction tree. © 2014 Springer International Publishing Switzerland
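
    As rough intuition for the kind of object verified here, the C++17 sketch below routes enqueues and dequeues alternately through a single toggle "balancer" into two internal FIFO lanes. Under contention the interleaving can violate strict FIFO order (so the object is not linearizable), but once it falls quiescent the completed operations can still be arranged into a legal FIFO history, which is the intuition behind quiescent consistency. This toy is our own simplification, not the diffraction-tree queue verified in the paper.

    #include <atomic>
    #include <deque>
    #include <mutex>
    #include <optional>

    // Toy "one-balancer" queue in the spirit of diffraction-tree designs.
    template <typename T>
    class DiffractedQueue {
        std::atomic<unsigned> enqToggle{0}, deqToggle{0};
        std::mutex locks[2];
        std::deque<T> lanes[2];

    public:
        void enqueue(const T& v) {
            unsigned lane = enqToggle.fetch_add(1) % 2;   // balancer step
            std::lock_guard<std::mutex> g(locks[lane]);
            lanes[lane].push_back(v);
        }

        std::optional<T> dequeue() {
            unsigned lane = deqToggle.fetch_add(1) % 2;   // balancer step
            std::lock_guard<std::mutex> g(locks[lane]);
            if (lanes[lane].empty()) return std::nullopt; // simplistic empty case
            T v = lanes[lane].front();
            lanes[lane].pop_front();
            return v;
        }
    };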

    Open Transactions on Shared Memory

    Full text link
    Transactional memory has arisen as a good way of solving many of the issues of lock-based programming. However, most implementations admit isolated transactions only, which are not adequate when we have to coordinate communicating processes. To this end, in this paper we present OCTM, a Haskell-like language with open transactions over shared transactional memory: processes can join transactions at runtime just by accessing shared variables. Thus a transaction can co-operate with the environment through shared variables, but if it is rolled back, all its effects on the environment are also retracted. To demonstrate the expressive power of OCTM, we give an implementation of TCCS, a CCS-like calculus with open transactions.
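
    OCTM itself is a Haskell-like language, so the C++ sketch below captures only the baseline behaviour against which it is defined: an isolated transaction whose writes are buffered privately and either published on commit or silently discarded on abort. It deliberately does not model OCTM's distinctive feature, transactions that other processes can join at runtime through shared variables; the names and structure here are ours.

    #include <functional>
    #include <map>
    #include <mutex>
    #include <string>

    // Toy isolated transaction: the body works on a private copy of the
    // shared store; the copy is published on commit and dropped on abort,
    // so an aborted transaction leaves no effects behind.
    class Store {
        std::map<std::string, int> shared;
        std::mutex m;

    public:
        // Runs body with read/write access; returns true iff it committed.
        bool atomically(const std::function<bool(std::map<std::string, int>&)>& body) {
            std::lock_guard<std::mutex> g(m);             // global lock = full isolation
            std::map<std::string, int> workingCopy = shared;
            if (!body(workingCopy)) return false;         // abort: copy is discarded
            shared = workingCopy;                         // commit: publish all effects
            return true;
        }
    };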

    The Impending Wave of Legal Malpractice Litigation - Predictions, Analysis, and Proposals for Change.

    Get PDF
    Attorneys tend to be viewed antithetically: at once greedy and manipulative, yet also respected and admired. Given this odd mixture of respect and disdain, attorneys are fortunate to have generally avoided being targets as potential defendants. Nevertheless, circumstances in Texas have changed, creating a new legal climate wherein attorneys may soon become defendants of choice. Attorneys in Texas are at a significantly greater risk of becoming the subject of a malpractice suit than they were in the past. Yet, simply because statistics indicate an increase in the number of malpractice claims, this does not mean more malpractice is being committed or that attorneys are less competent than in previous years. A variety of factors can explain the statistics, including the disappearance of the traditional congeniality of the bar and the willingness of lawyers to bring suit against each other. Furthermore, these figures show that plaintiffs’ claims today are more fact-specific and based on a myriad of legal theories spanning the entire spectrum of the attorney’s representation. Little argument can be made that the number of suits against attorneys will not increase dramatically in the next few years. For lawyers to continue to play the role of advocates in the justice system, establishing safeguards is crucial to prevent every unhappy outcome for a litigant from turning into a subsequent malpractice claim. Rather than reacting after the inundation of malpractice claims is underway, Texas and the Texas bar would be better served if proactive measures were taken. Such measures must be carefully drafted not only to provide attorneys with protection from unwarranted claims, but also to promote the public interest in ensuring that truly egregious malpractice claims are brought to the attention of the bar grievance committee.

    Cryptocurrency Competition and Market Concentration in the Presence of Network Effects

    Get PDF
    When network products and services become more valuable as their user base grows (network effects), this tendency can become a major determinant of how they compete with each other in the market and of how the market is structured. Network effects are traditionally linked to high market concentration, early-mover advantages, and entry barriers, and in the market they have also been used as a valuation tool. The recent resurgence of Bitcoin has been partly attributed to network effects, too. We study the existence of network effects in six cryptocurrencies from their inception to obtain a high-level overview of the role network effects play in the cryptocurrency market. We show that, contrary to the usual implications of network effects, they do not serve to concentrate the cryptocurrency market, nor do they accord any one cryptocurrency a definitive competitive advantage, nor are they consistent enough to be reliable valuation tools. Therefore, while network effects do occur in cryptocurrency networks, they are not (yet) a defining feature of the cryptocurrency market as a whole.
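
    For context on how network effects have been used as a valuation tool, the most common rule of thumb is Metcalfe's law, value proportional to n^2, which can be checked by fitting log(value) = a + b·log(users) and asking whether the slope b is near 2. The C++ sketch below, with made-up numbers, illustrates that fit only; it is not the methodology of this paper.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Illustrative Metcalfe-style check (not the paper's methodology): fit
    // log(value) = a + b * log(users) by ordinary least squares. Under a pure
    // n^2 network effect the estimated slope b should be close to 2.
    double metcalfeExponent(const std::vector<double>& users,
                            const std::vector<double>& value) {
        const std::size_t n = users.size();
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (std::size_t i = 0; i < n; ++i) {
            double x = std::log(users[i]), y = std::log(value[i]);
            sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
        return (n * sxy - sx * sy) / (n * sxx - sx * sx);  // OLS slope b
    }

    int main() {
        // Hypothetical user counts and market valuations, for illustration only.
        std::vector<double> users = {1e5, 3e5, 1e6, 3e6, 1e7};
        std::vector<double> value = {2e8, 1.8e9, 2e10, 1.8e11, 2e12};
        std::printf("estimated exponent b = %.2f\n", metcalfeExponent(users, value));
    }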

    Fabrication and Assessment of 3D Printed Anatomical Models of the Lower Limb for Anatomical Teaching and Femoral Vessel Access Training in Medicine

    Get PDF
    For centuries, cadaveric dissection has been the touchstone of anatomy education. It offers a medical student intimate access to his or her first patient. In contrast to idealized artisan anatomical models, it presents the natural variation of anatomy in fine detail. However, a new teaching construct has appeared recently in which artificial cadavers are manufactured through three-dimensional (3D) printing of patient-specific radiological data sets. In this article, a simple powder-based printer is made more versatile to manufacture hard bones, silicone muscles and perfusable blood vessels. The approach involves blending modern approaches (3D printing) with more ancient ones (casting and lost-wax techniques). These anatomically accurate models can shift the approach to anatomy teaching from dissection to synthesis of 3D-printed parts held together with embedded rare-earth magnets. Vascular simulation is possible through application of pumps and artificial blood. The resulting arteries and veins can be cannulated and imaged with Doppler ultrasound. In some respects, 3D-printed anatomy is superior to older teaching methods because the parts are cheap and scalable, can cover the entire age span, can be both dissected and reassembled, and the data files can be printed anywhere in the world and mass-produced. Anatomical diversity can be collated as a digital repository and reprinted rather than waiting for the rare variant to appear in the dissection room. It is predicted that 3D printing will revolutionize anatomy when poly-material printing is perfected in the early 21st century. (C) 2015 American Association of Anatomists