Linearizability with Ownership Transfer
Linearizability is a commonly accepted notion of correctness for libraries of
concurrent algorithms. Unfortunately, it assumes a complete isolation between a
library and its client, with interactions limited to passing values of a given
data type. This is inappropriate for common programming languages, where
libraries and their clients can communicate via the heap, transferring the
ownership of data structures, and can even run in a shared address space
without any memory protection. In this paper, we present the first definition
of linearizability that lifts this limitation and establish an Abstraction
Theorem: while proving a property of a client of a concurrent library, we can
soundly replace the library by its abstract implementation related to the
original one by our generalisation of linearizability. This allows abstracting
from the details of the library implementation while reasoning about the
client. We also prove that linearizability with ownership transfer can be
derived from the classical one if the library does not access some of the data
structures transferred to it by the client.
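To make the setting concrete, here is a minimal sketch (illustrative only, not
taken from the paper; the queue implementations, the buffer and the thread
names are our assumptions): a client transfers ownership of a heap-allocated
buffer to a concurrent library by enqueuing it, and the coarse-grained queue
plays the role of the abstract implementation that an Abstraction Theorem lets
the client be verified against.

```python
import threading

class Node:
    __slots__ = ("value", "next")
    def __init__(self, value=None):
        self.value, self.next = value, None

class TwoLockQueue:
    """Fine-grained queue in the style of Michael & Scott's two-lock
    algorithm: enqueue and dequeue may run concurrently because they
    acquire different locks."""
    def __init__(self):
        dummy = Node()
        self.head = self.tail = dummy
        self.head_lock, self.tail_lock = threading.Lock(), threading.Lock()

    def enqueue(self, value):
        node = Node(value)
        with self.tail_lock:
            self.tail.next = node
            self.tail = node

    def dequeue(self):
        with self.head_lock:
            first = self.head.next
            if first is None:
                return None
            self.head = first
            return first.value

class AtomicQueue:
    """Abstract implementation: every operation is one atomic step under
    a single lock.  Linearizability with ownership transfer is what makes
    reasoning about clients against this version sound."""
    def __init__(self):
        self.items, self.lock = [], threading.Lock()

    def enqueue(self, value):
        with self.lock:
            self.items.append(value)

    def dequeue(self):
        with self.lock:
            return self.items.pop(0) if self.items else None

def producer(q):
    buf = bytearray(b"sensor data")   # heap cell initially owned by the producer
    q.enqueue(buf)                    # ownership is transferred to the library
    # the producer must not touch `buf` after this point

def consumer(q):
    buf = None
    while buf is None:                # wait until the transferred cell arrives
        buf = q.dequeue()
    buf.reverse()                     # safe: the consumer is now the sole owner

for queue in (TwoLockQueue(), AtomicQueue()):   # client code is unchanged
    t1 = threading.Thread(target=producer, args=(queue,))
    t2 = threading.Thread(target=consumer, args=(queue,))
    t1.start(); t2.start(); t1.join(); t2.join()
```

The client code at the bottom is identical for both queues; the point of an
Abstraction Theorem is that reasoning done against the atomic version remains
sound for the fine-grained one.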
Specifying and Verifying Concurrent Algorithms with Histories and Subjectivity
We present a lightweight approach to Hoare-style specifications for
fine-grained concurrency, based on a notion of time-stamped histories that
abstractly capture atomic changes in the program state. Our key observation is
that histories form a partial commutative monoid, a structure fundamental for
representation of concurrent resources. This insight provides us with a
unifying mechanism that allows us to treat histories just like heaps in
separation logic. For example, both are subject to the same assertion logic and
inference rules (e.g., the frame rule). Moreover, the notion of ownership
transfer, which usually applies to heaps, has an equivalent in histories. It
can be used to formally represent helping---an important design pattern for
concurrent algorithms whereby one thread can execute code on behalf of another.
Specifications in terms of histories naturally abstract granularity, in the
sense that sophisticated fine-grained algorithms can be given the same
specifications as their simplified coarse-grained counterparts, making them
equally convenient for client-side reasoning. We illustrate our approach on a
number of examples and validate all of them in Coq.
Comment: 17 pages
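The observation that histories behave like heaps can be pictured with a small
sketch (the finite-map encoding below is ours, for illustration only, and not
the paper's Coq development): both heaps and timestamped histories are finite
maps, and their partial commutative monoid operation is disjoint union, which
is exactly what separation-logic rules such as the frame rule rely on.

```python
def join(m1, m2):
    """PCM operation shared by heaps and histories: disjoint union of
    finite maps, undefined (None) when the domains overlap."""
    if m1.keys() & m2.keys():
        return None                      # overlapping ownership: join undefined
    return {**m1, **m2}

# A heap maps locations to values ...
heap_mine  = {0x10: "a"}
heap_frame = {0x20: "b"}

# ... while a history maps timestamps to atomic state changes (old, new).
hist_mine  = {1: ("empty", "push 3"), 3: ("push 3", "pop 3")}
hist_other = {2: ("helping step",)}      # e.g. a step another thread did for us

assert join(heap_mine, heap_frame) == {0x10: "a", 0x20: "b"}
assert join(hist_mine, hist_other) is not None   # disjoint timestamps compose
assert join(hist_mine, hist_mine) is None        # no timestamp is owned twice
# join is commutative and associative with {} as the unit, so the same
# frame rule applies to heap assertions and to history assertions.
assert join(hist_mine, {}) == hist_mine
assert join(heap_mine, heap_frame) == join(heap_frame, heap_mine)
```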
Faster linearizability checking via P-compositionality
Linearizability is a well-established consistency and correctness criterion
for concurrent data types. An important feature of linearizability is Herlihy
and Wing's locality principle, which says that a concurrent system is
linearizable if and only if all of its constituent parts (so-called objects)
are linearizable. This paper presents P-compositionality, which generalizes
the idea behind the locality principle to operations on the same concurrent
data type. We implement P-compositionality in a novel linearizability
checker. Our experiments with over nine implementations of concurrent sets,
including Intel's TBB library, show that our linearizability checker is one
order of magnitude faster and/or more space efficient than the state-of-the-art
algorithm.
Comment: 15 pages, 2 figures
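A minimal sketch of the partitioning idea (our own toy example, not the
checker from the paper; the brute-force permutation check is only for
illustration): for a concurrent set, operations on different keys are
independent, so the complete history can be split per key and each
sub-history checked against the sequential set specification on its own,
which shrinks the number of interleavings to explore.

```python
from itertools import permutations

# An operation: (invocation time, response time, name, argument, observed result).

def legal_sequential_set(seq):
    """Replay a candidate linearization against a sequential set."""
    s = set()
    for (_, _, name, arg, res) in seq:
        if name == "add":
            ok = arg not in s; s.add(arg)
        elif name == "remove":
            ok = arg in s; s.discard(arg)
        else:  # "contains"
            ok = arg in s
        if ok != res:
            return False
    return True

def respects_real_time(seq):
    """If op a responded before op b was invoked, a must precede b."""
    return all(not (b[1] < a[0]) for i, a in enumerate(seq) for b in seq[i+1:])

def linearizable(history):
    """Brute force: some real-time-respecting permutation replays legally."""
    return any(respects_real_time(p) and legal_sequential_set(p)
               for p in permutations(history))

def linearizable_P(history, partition=lambda op: op[3]):
    """Partitioned check for a set: operations on distinct keys are
    independent, so each per-key sub-history is checked on its own."""
    parts = {}
    for op in history:
        parts.setdefault(partition(op), []).append(op)
    return all(linearizable(part) for part in parts.values())

# Two threads working on keys 1 and 2; times are (invocation, response).
history = [
    (0, 3, "add", 1, True), (1, 2, "add", 2, True),
    (4, 6, "contains", 1, True), (5, 7, "remove", 2, True),
]
assert linearizable_P(history)   # two sub-histories of size 2
assert linearizable(history)     # same verdict, but 4! permutations to try
```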
The discrimination capabilities of Micromegas detectors at low energy
The latest generation of Micromegas detectors shows good energy resolution,
spatial resolution and a low threshold, which makes them well suited for low
energy applications. Two micromegas detectors have been built for dark matter
experiments: CAST, which uses a dipole magnet to convert axions into detectable
x-ray photons, and MIMAC, which aims to reconstruct the tracks of low energy
nuclear recoils in a mixture of CF4 and CHF3. These readouts have been built
with the microbulk and bulk techniques respectively, which show different gain,
electron transmission and energy resolution. The detectors and their operation
conditions will be described in detail, and their discrimination capabilities
for low energy photons will be discussed.
Comment: To be published in the proceedings of the TIPP2011 conference
(Physics Procedia)
Lowering the background level and the energy threshold of Micromegas x-ray detectors for axion searches
Axion helioscopes search for solar axions through their conversion into x-rays
in the presence of high magnetic fields. The use of low background x-ray detectors
is an essential component contributing to the sensitivity of these searches. In
this work, we review the recent advances on Micromegas detectors used in the
CERN Axion Solar Telescope (CAST) and proposed for the future International
Axion Observatory (IAXO). The current setup in CAST has achieved background
levels below 10^-6 keV^-1 cm^-2 s^-1, a factor 100 lower than
the first generation of Micromegas detectors. This reduction is based on active
and passive shielding techniques, the selection of radiopure materials, offline
discrimination techniques and the high granularity of the readout. We describe
in detail the background model of the detector, based on its operation at the CAST
site and at the Canfranc Underground Laboratory (LSC), as well as on Geant4
simulations. The best levels currently achieved at LSC are lower than
10^-7 keV^-1 cm^-2 s^-1 and show good prospects for the application of
this technology in IAXO. Finally, we present some ideas and results for
reducing the energy threshold of these detectors below 1 keV, using
highly transparent windows, autotrigger electronics and studies of the cluster
shape at different energies. As a high flux of axion-like particles is expected
in this energy range, a sub-keV threshold detector could enlarge the physics
case of axion helioscopes.
Comment: Proceedings of the 3rd International Conference on Technology and
Instrumentation in Particle Physics (TIPP 2014)
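As a rough illustration of what a background level quoted in
keV^-1 cm^-2 s^-1 means in practice, the short calculation below converts it
into expected counts; the detector area, energy window and exposure are
assumed values chosen for the example, not numbers from the paper.

```python
# Illustrative numbers only: area, window and exposure are assumptions.
background  = 1e-6        # counts keV^-1 cm^-2 s^-1 (level quoted above)
area_cm2    = 30.0        # assumed active detector area
window_keV  = 7.0 - 2.0   # assumed 2-7 keV signal window
exposure_s  = 100 * 3600  # assumed 100 hours of tracking time

expected_counts = background * area_cm2 * window_keV * exposure_s
print(f"expected background counts: {expected_counts:.1f}")   # ~54 counts
```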
Malfunction diagnosis in industrial process systems using data mining for knowledge discovery
The determination of abnormal behavior in industrial process systems is attracting increasing interest, as strict regulations and highly competitive operating conditions are routinely imposed on process systems. A synergetic approach to exploring the behavior of industrial processes is proposed, targeting the discovery of patterns and the implementation of fault detection (malfunction) diagnosis. The patterns are based on highly correlated time series. The concept rests on the fact that, if independent time series are combined according to rules, scenarios of functional and non-functional situations can be extracted so as to monitor hazardous procedures occurring in workplaces. The selected methods are combined and applied to historically stored experimental data from a chemical pilot plant located at CERTH/CPERI. The implementation of the clustering and classification methods showed promising results, determining potential abnormal situations with high accuracy (97%).
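A minimal sketch of the kind of pipeline the abstract describes (the window
features and the nearest-centroid rule below are placeholders chosen for
illustration, not the methods actually used at CERTH/CPERI): windows of
correlated sensor time series are summarised by a few features, and new
windows are classified as normal or faulty by comparison with centroids
learned from historical data.

```python
from statistics import mean, pstdev

def features(window):
    """Per-window features from two correlated sensor series:
    level, spread and a crude co-movement (covariance) term."""
    a, b = window
    co = mean(x * y for x, y in zip(a, b)) - mean(a) * mean(b)
    return (mean(a), pstdev(a), mean(b), pstdev(b), co)

def centroid(rows):
    return tuple(mean(col) for col in zip(*rows))

def classify(window, centroids):
    """Nearest-centroid rule: label a window 'normal' or 'fault'."""
    f = features(window)
    return min(centroids, key=lambda lbl: sum((x - c) ** 2
               for x, c in zip(f, centroids[lbl])))

# Toy historical data: (temperature, pressure) windows from the plant.
normal = [([20, 21, 20, 22], [5, 5, 6, 5]), ([21, 20, 22, 21], [5, 6, 5, 5])]
faulty = [([20, 30, 45, 60], [5, 9, 14, 20]), ([22, 35, 50, 70], [6, 10, 16, 22])]
centroids = {"normal": centroid([features(w) for w in normal]),
             "fault":  centroid([features(w) for w in faulty])}

print(classify(([21, 22, 20, 21], [5, 5, 5, 6]), centroids))    # -> normal
print(classify(([23, 38, 55, 75], [6, 11, 17, 24]), centroids)) # -> fault
```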
CAST microbulk micromegas in the Canfranc Underground Laboratory
During the last data taking campaigns of the CAST experiment, the micromegas
detectors have achieved background levels of the order of
10^-6 keV^-1 cm^-2 s^-1 between 2 and 9 keV. This performance has
been possible thanks to the introduction of the microbulk technology, the
implementation of shielding and the development of discrimination algorithms.
This has motivated new studies towards a deeper understanding of the CAST
detectors' background. One of these working lines includes the construction of
a replica of the micromegas set-up used in CAST and its installation in the
Canfranc Underground Laboratory. By comparing the performance of the detectors
underground and at the surface, with shielding upgrades, etc., the different
contributions to the detector background have been evaluated.
In particular, an upper limit at the level of 10^-7 keV^-1 cm^-2 s^-1
for the intrinsic background of the detector has been obtained. This work constitutes
a first evaluation of the potential of the newest micromegas technology in an
underground laboratory, the most suitable environment for Rare Event Searches.
Comment: 6 pages, 8 figures. To appear in the proceedings of the 2nd
International Conference on Technology and Instrumentation for Particle
Physics (TIPP 2011)
Low Background Micromegas in CAST
Solar axions could be converted into x-rays inside the strong magnetic field
of an axion helioscope, triggering the detection of this elusive particle. Low
background x-ray detectors are an essential component for the sensitivity of
these searches. We report on the latest developments of the Micromegas
detectors for the CERN Axion Solar Telescope (CAST), including technological
pathfinder activities for the future International Axion Observatory (IAXO).
The use of low background techniques and the application of discrimination
algorithms based on the high granularity of the readout have led to background
levels below 10^-6 counts/keV/cm^2/s, more than a factor 100 lower than
the first generation of Micromegas detectors. The best levels achieved at the
Canfranc Underground Laboratory (LSC) are as low as 10^-7
counts/keV/cm^2/s, showing good prospects for the application of this
technology in IAXO. The current background model, based on underground and
surface measurements, is presented, as well as the strategies to further reduce
the background level. Finally, we will describe the R&D paths to achieve
sub-keV energy thresholds, which could broaden the physics case of axion
helioscopes.
Comment: 6 pages, 3 figures, Large TPC Conference 2014, Paris