Admit your weakness: Verifying correctness on TSO architectures
The final publication is available at http://link.springer.com/chapter/10.1007%2F978-3-319-15317-9_22

Linearizability has become the standard correctness criterion for fine-grained non-atomic concurrent algorithms. However, most approaches assume a sequentially consistent memory model, which is not always realised in practice. In this paper we study the correctness of concurrent algorithms on a weak memory model: the TSO (Total Store Order) memory model, which is commonly implemented by multicore architectures. Here, linearizability is often too strict, and hence we prove a weaker criterion, quiescent consistency, instead. Like linearizability, quiescent consistency is compositional, making it an ideal correctness criterion in a component-based context. We demonstrate how to model a typical concurrent algorithm, seqlock, and prove it quiescent consistent using a simulation-based approach. Previous approaches to proving correctness on TSO architectures have been based on linearizability, which makes it necessary to modify the algorithm's high-level requirements. Our approach is, to our knowledge, the first to prove correctness without the need for such a modification.
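The abstract does not include code; as a reminder of how a seqlock works, here is a minimal single-writer sketch in Python (illustrative only, not the paper's model — and under TSO the plain loads and stores below are precisely where store buffering makes correctness subtle):

```python
import threading

class SeqLock:
    """Minimal seqlock sketch. A writer publishes a pair of values under a
    sequence counter that is odd while a write is in progress; readers retry
    until they observe an even, unchanged counter around their reads."""

    def __init__(self):
        self.seq = 0
        self.x = 0
        self.y = 0
        self._wlock = threading.Lock()  # serialises writers

    def write(self, x, y):
        with self._wlock:
            self.seq += 1   # odd: write in progress
            self.x = x
            self.y = y
            self.seq += 1   # even: write complete

    def read(self):
        while True:
            s1 = self.seq
            if s1 % 2:              # writer active: retry
                continue
            x, y = self.x, self.y
            if self.seq == s1:      # no writer interleaved: snapshot is consistent
                return x, y
```

On a sequentially consistent machine the retry loop guarantees a consistent snapshot; on TSO, buffered stores can delay the counter updates relative to the data, which is the kind of behaviour the paper's quiescent-consistency proof has to account for.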
Using Intonationally-Marked Presuppositional Information in On-Line Language Processing: Evidence from Eye Movements to a Visual Model
This study evaluates the effect of presuppositional information associated with contrastive stress on on-line language processing. An eye-tracking methodology was used, in which eye movement latencies to real objects in a visual display are taken as a measure of on-line reference resolution. Results indicate that presupposed contrast sets are being computed on-line, and can be used to speed reference resolution by narrowing the referential domain of an utterance. In addition, presupposed contrast sets appear to play a role in managing attention in the processing of a discourse.
Using keystroke logging to understand writers’ processes on a reading-into-writing test
Background
Integrated reading-into-writing tasks are increasingly used in large-scale language proficiency tests. Such tasks are said to possess higher authenticity as they reflect real-life writing conditions better than independent, writing-only tasks. However, to effectively define the reading-into-writing construct, more empirical evidence regarding how writers compose from sources, both in real life and under test conditions, is urgently needed. Most previous process studies used think-aloud protocols or questionnaires to collect evidence. These methods rely on participants' perceptions of their processes, as well as their ability to report them.
Findings
This paper reports on a small-scale experimental study exploring writers' processes on a reading-into-writing test by employing keystroke logging. Two L2 postgraduates completed an argumentative essay on a computer. Their text production processes were captured by a keystroke logging programme. Students were also interviewed to provide additional information. Keystroke logging, like most computing tools, provides a range of measures. The study examined the students' reading-into-writing processes by analysing a selection of keystroke logging measures in conjunction with the students' final texts and interview protocols.
Conclusions
The results suggest that the nature of a writer's reading-into-writing processes might have a major influence on their final performance. Recommendations for future process studies are provided.
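To illustrate the kind of measures a keystroke logger yields (generic examples of our own; the study does not specify which programme or measures were used), consider computing inter-key intervals and pause statistics from a timestamped log:

```python
def pause_measures(log, threshold=2.0):
    """Given a keystroke log as (timestamp_seconds, key) tuples, compute
    common process measures: keystroke count, mean inter-key interval,
    and pauses at or above a threshold (2 s is a conventional cut-off,
    assumed here for illustration)."""
    times = [t for t, _ in log]
    intervals = [b - a for a, b in zip(times, times[1:])]
    pauses = [iv for iv in intervals if iv >= threshold]
    return {
        "keystrokes": len(log),
        "mean_iki": sum(intervals) / len(intervals) if intervals else 0.0,
        "pause_count": len(pauses),
        "pause_time": sum(pauses),
    }
```

Measures like these, aligned with the final text and interview protocols, are what allow a process study to distinguish, say, source-reading pauses from planning pauses.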
Quasars: a supermassive rotating toroidal black hole interpretation
A supermassive rotating toroidal black hole (TBH) is proposed as the fundamental structure of quasars and other jet-producing active galactic nuclei. Rotating protogalaxies gather matter from the central gaseous region, leading to the birth of massive toroidal stars whose internal nuclear reactions proceed very rapidly. Once the nuclear fuel is spent, gravitational collapse produces a slender ring-shaped TBH remnant. These events are typically the first supernovae of the host galaxies. Given time, the TBH mass increases through continued accretion by several orders of magnitude; the event horizon swells whilst the central aperture shrinks. The difference in angular velocities between the accreting matter and the TBH induces a magnetic field that is strongest in the region of the central aperture and innermost ergoregion. Due to the presence of negative energy states when such a gravitational vortex is immersed in an electromagnetic field, circumstances are near ideal for energy extraction via non-thermal radiation, including the Penrose process and superradiant scattering. This establishes a self-sustaining mechanism whereby the transport of angular momentum away from the quasar by relativistic bi-directional jets reinforces both the modulating magnetic field and the TBH/accretion-disk angular velocity differential. Quasar behaviour is extinguished once the BH topology becomes spheroidal. Similar mechanisms may be operating in microquasars, SNe and GRBs when neutron-density or BH tori arise. In certain circumstances, long-term TBH stability can be maintained by a negative cosmological constant; otherwise the classical topology theorems must somehow be circumvented. Preliminary evidence is presented that Planck-scale quantum effects may be responsible.

Comment: 26 pages, 14 figures; various corrections and enhancements; final version
Biliary Bicarbonate Secretion Constitutes a Protective Mechanism against Bile Acid-Induced Injury in Man
Background: Cholangiocytes display a striking resistance to bile acids: while other cell types, such as hepatocytes, are susceptible to bile acid-induced toxicity and apoptosis at concentrations as low as micromolar, cholangiocytes are continuously exposed to the millimolar concentrations present in bile. We present a hypothesis suggesting that biliary secretion of HCO3− in man serves to protect cholangiocytes against bile acid-induced damage by fostering the deprotonation of apolar bile acids to more polar bile salts. Here, we tested whether bile acid-induced toxicity is pH-dependent and whether anion exchanger 2 (AE2) protects against bile acid-induced damage. Methods: A human cholangiocyte cell line was exposed to chenodeoxycholate (CDC), or its glycine conjugate, at 0.5 mM to 2.0 mM and pH 7.4, 7.1, 6.7 or 6.4, or after knockdown of AE2. Cell viability and apoptosis were determined by WST and caspase-3/-7 assays, respectively. Results: Glycochenodeoxycholate (GCDC) uptake in cholangiocytes is pH-dependent. Furthermore, CDC and GCDC (pKa 4–5) induce cholangiocyte toxicity in a pH-dependent manner: 0.5 mM CDC and 1 mM GCDC had no effect on cell viability at pH 7.4, but at pH 6.4 decreased viability by >80% and increased caspase activity almost 10- and 30-fold, respectively. Acidification alone had no effect. AE2 knockdown led to 3- and 2-fold enhanced apoptosis induced by 0.75 mM CDC or 2 mM GCDC at pH 7.4. Discussion: These data support our hypothesis of a biliary HCO3− umbrella serving to protect human cholangiocytes against bile acid-induced injury. AE2 is a key contributor to this protective mechanism. The development and progression of cholangiopathies, such as primary biliary cirrhosis, may be a consequence of genetic and acquired functional defects of genes involved in maintaining the biliary HCO3− umbrella. Copyright © 2011 S. Karger AG, Basel
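The pH-dependence described here follows the Henderson–Hasselbalch relation. A small sketch (assuming pKa 5.0, the upper end of the quoted 4–5 range — the exact value is our assumption, not the paper's) shows how the apolar, protonated fraction of a bile acid grows as bile pH falls across the tested values:

```python
def protonated_fraction(pH, pKa=5.0):
    """Henderson-Hasselbalch: fraction of the acid present in its
    protonated (apolar, membrane-permeant) form HA at a given pH.
    pKa = 5.0 is an illustrative assumption."""
    return 1.0 / (1.0 + 10 ** (pH - pKa))

for pH in (7.4, 7.1, 6.7, 6.4):
    print(f"pH {pH}: {protonated_fraction(pH):.3%} protonated")
```

Dropping the pH from 7.4 to 6.4 raises the protonated fraction roughly tenfold, consistent with the observed increase in toxicity at acidic pH.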
Modulation of EGFR activity by molecularly imprinted polymer nanoparticles targeting intracellular epitopes
In recent years, molecularly imprinted polymer nanoparticles (nanoMIPs) have proven to be an attractive alternative to antibodies in diagnostic and therapeutic applications. However, several key questions remain: how suitable are intracellular epitopes as targets for nanoMIP binding? And to what extent can protein function be modulated via targeting specific epitopes? To investigate this, three extracellular and three intracellular epitopes of epidermal growth factor receptor (EGFR) were used as templates for the synthesis of nanoMIPs which were then used to treat cancer cells with different expression levels of EGFR. It was observed that nanoMIPs imprinted with epitopes from the intracellular kinase domain and the extracellular ligand binding domain of EGFR caused cells to form large foci of EGFR sequestered away from the cell surface, caused a reduction in autophosphorylation, and demonstrated effects on cell viability. Collectively, this suggests that intracellular domain-targeting nanoMIPs can be a potential new tool for cancer therapy.
Pointfree factorization of operation refinement
The standard operation refinement ordering is a kind of “meet of opposites”: non-determinism reduction suggests “smaller” behaviour while increase of definition suggests “larger” behaviour. Groves' factorization of this ordering into two simpler relations, one per refinement concern, makes it more mathematically tractable but is far from fully exploited in the literature. We present a pointfree theory for this factorization which is more agile and calculational than the standard set-theoretic approach. In particular, we show that factorization leads to a simple proof of structural refinement for arbitrary parametric types and exploit factor instantiation across different subclasses of (relational) operation. The prospect of generalizing the factorization to coalgebraic refinement is discussed.

Fundação para a Ciência e a Tecnologia (FCT) - PURE Project (Program Understanding and Re-engineering: Calculi and Applications), contract POSI/ICHS/44304/2002
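In conventional set-theoretic (pointwise) terms — our own rendering for illustration, not the paper's pointfree notation — the refinement ordering and its two Groves factors can be sketched over finite relations represented as sets of pairs:

```python
def dom(R):
    """Domain of a relation given as a set of (input, output) pairs."""
    return {a for (a, _) in R}

def refines(C, A):
    """Standard operation refinement: C is defined wherever A is, and on
    A's domain C's outcomes are among A's (non-determinism reduced)."""
    return dom(A) <= dom(C) and all(
        (a, b) in A for (a, b) in C if a in dom(A))

def reduces_nondet(C, M):
    """First factor: same domain, fewer outcomes (C is a subrelation of M)."""
    return dom(C) == dom(M) and C <= M

def extends_def(M, A):
    """Second factor: M extends A's definition, adding pairs only at
    inputs where A was undefined."""
    return A <= M and all(a not in dom(A) for (a, _) in M - A)
```

The factorization says `refines(C, A)` holds exactly when some intermediate relation `M` satisfies `extends_def(M, A)` and `reduces_nondet(C, M)` — one witness is `M = A | {(a, b) for (a, b) in C if a not in dom(A)}`.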
Principles of Stakes Fairness in Sport
Fairness in sport is not just about assigning the top prizes to the worthiest competitors. It is also about the way the prize structure itself is organised. For many sporting competitions, although it may be acceptable for winners to receive more than losers, it can seem unfair for winners to take everything and for losers to get nothing. Yet this insight leaves unanswered some difficult questions about what stakes fairness requires and which principles of stakes fairness are appropriate for particular competitions. In this article I specify a range of different principles of stakes fairness (ten in total) that could regulate sporting competitions. I also put forward a theoretical method for pairing up appropriate principles of stakes fairness with given sporting competitions. Specifically, I argue that the underlying rationales for holding sporting competitions can provide useful guides for identifying appropriate principles of stakes fairness. I then seek to clarify and work through some of the implications of this method for a sample of real world controversies over sporting prize structures. I also attempt to refine the method in response to two possible objections from indeterminacy and relativism. Finally, I compare and contrast my conclusions with more general philosophical debates about justice.