Ultrahigh Error Threshold for Surface Codes with Biased Noise
We show that a simple modification of the surface code can exhibit an
enormous gain in the error correction threshold for a noise model in which
Pauli Z errors occur more frequently than X or Y errors. Such biased noise,
where dephasing dominates, is ubiquitous in many quantum architectures. In the
limit of pure dephasing noise we find a threshold of 43.7(1)% using a tensor
network decoder proposed by Bravyi, Suchara and Vargo. The threshold remains
surprisingly large in the regime of realistic noise bias ratios, for example
28.2(2)% at a bias of 10. The performance is in fact at or near the hashing
bound for all values of the bias. The modified surface code still uses only
weight-4 stabilizers on a square lattice, but merely requires measuring
products of Y instead of Z around the faces, as this doubles the number of
useful syndrome bits associated with the dominant Z errors. Our results
demonstrate that large efficiency gains can be found by appropriately tailoring
codes and decoders to realistic noise models, even under the locality
constraints of topological codes.
Comment: 6 pages, 5 figures, comments welcome; v2 includes minor improvements to the numerical results, additional references, and an extended discussion; v3 published version (incorporating supplementary material into main body of paper).
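To make the syndrome-doubling argument concrete, here is a minimal sketch (a toy illustration of Pauli commutation, not the authors' decoder or simulation code) of why a Z error flags twice as many stabilizer types once the face stabilizers measure products of Y: Z anticommutes with both X and Y, but commutes with Z.

```python
# Toy check of single-qubit Pauli anticommutation, illustrating why swapping
# Z-type face stabilizers for Y-type ones doubles the syndrome bits a Z error
# triggers.

def anticommutes(p, q):
    """Two single-qubit Paulis anticommute iff they differ and neither is I."""
    return p != q and "I" not in (p, q)

# Each code has X-type vertex stabilizers plus face stabilizers of one type.
for code, face in [("standard surface code (Z faces)", "Z"),
                   ("modified surface code (Y faces)", "Y")]:
    triggered = sum(anticommutes("Z", s) for s in ("X", face))
    print(f"{code}: a Z error anticommutes with {triggered} of 2 stabilizer types")
```

A Z error flips only the X-type checks of the standard code, but both the X-type and Y-type checks of the modified code, which is the doubling of useful syndrome bits described above.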
Rate constants and Arrhenius parameters for the reactions of OH radicals and Cl atoms with CF3CH2OCHF2, CF3CHClOCHF2 and CF3CH2OCClF2, using the discharge-flow/resonance fluorescence method
Rate constants have been determined for the reactions of OH radicals and Cl atoms with the three partially halogenated methyl-ethyl ethers, CF3CH2OCHF2, CF3CHClOCHF2 and CF3CH2OCClF2, using discharge-flow techniques to generate the OH radicals and the Cl atoms and resonance fluorescence to observe changes in their relative concentrations in the presence of added ether. For each combination of radical and ether, experiments were carried out at three temperatures between 292 and 410 K, yielding the following Arrhenius expressions for the rate constants within this range of temperature:
OH + CF3CH2OCHF2: k = (2.0 ± 0.8) × 10 exp(−(2110 ± 150) K / T) cm³ molecule⁻¹ s⁻¹
OH + CF3CHClOCHF2: k = (4.5 ± 1.3) × 10 exp(−(940 ± 100) K / T) cm³ molecule⁻¹ s⁻¹
OH + CF3CH2OCClF2: k = (1.6 ± 0.6) × 10 exp(−(1100 ± 125) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CH2OCHF2: k = (6.1 ± 1.4) × 10 exp(−(1830 ± 90) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CHClOCHF2: k = (7.8 ± 2.6) × 10 exp(−(2980 ± 130) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CH2OCClF2: k = (2.2 ± 0.2) × 10 exp(−(2700 ± 40) K / T) cm³ molecule⁻¹ s⁻¹
The results are compared with those obtained previously for the same and related reactions of OH radicals and Cl atoms, and the atmospheric implications of the results are considered briefly.
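For readers who want to evaluate these expressions, a minimal sketch follows (the power-of-ten pre-exponential factors were lost from the abstract text above, so the value used below is a hypothetical placeholder):

```python
# Evaluate an Arrhenius expression k(T) = A * exp(-B/T), with the activation
# temperature B in kelvin, matching the form of the expressions above.
import math

def arrhenius(A, B_kelvin, T):
    """Rate constant in the units of A (here cm^3 molecule^-1 s^-1)."""
    return A * math.exp(-B_kelvin / T)

# Hypothetical prefactor A; B = 2110 K matches the OH + CF3CH2OCHF2 expression.
for T in (292, 350, 410):  # the measured temperature range
    k = arrhenius(A=2.0e-12, B_kelvin=2110, T=T)
    print(f"T = {T} K: k = {k:.2e} cm^3 molecule^-1 s^-1")
```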
Tailoring surface codes for highly biased noise
The surface code, with a simple modification, exhibits ultra-high error
correction thresholds when the noise is biased towards dephasing. Here, we
identify features of the surface code responsible for these ultra-high
thresholds. We provide strong evidence that the threshold error rate of the
surface code tracks the hashing bound exactly for all biases, and show how to
exploit these features to achieve significant improvement in logical failure
rate. First, we consider the infinite bias limit, meaning pure dephasing. We
prove that the error threshold of the modified surface code for pure dephasing
noise is 50%, i.e., that all qubits are fully dephased, and this threshold
can be achieved by a polynomial time decoding algorithm. We demonstrate that
the sub-threshold behavior of the code depends critically on the precise shape
and boundary conditions of the code. That is, for rectangular surface codes
with standard rough/smooth open boundaries, it is controlled by the parameter g = gcd(j, k), where j and k are dimensions of the surface code lattice. We
demonstrate a significant improvement in logical failure rate with pure
dephasing for co-prime codes that have g = 1, and closely-related rotated
codes, which have a modified boundary. The effect is dramatic: the same logical
failure rate achievable with a square surface code and n physical qubits can be obtained with a co-prime or rotated surface code using only O(√n) physical qubits. Finally, we use approximate maximum likelihood decoding to
demonstrate that this improvement persists for a general Pauli noise biased
towards dephasing. In particular, comparing with a square surface code, we
observe a significant improvement in logical failure rate against biased noise
using a rotated surface code with approximately half the number of physical
qubits.
Comment: 18+4 pages, 24 figures; v2 includes additional coauthor (ASD) and new results on the performance of surface codes in the finite-bias regime, obtained with beveled surface codes and an improved tensor network decoder; v3 published version.
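As a small illustration of the boundary-shape parameter discussed above (an assumed toy computation, not the authors' code), the following sketch computes g = gcd(j, k) for a few lattice shapes and flags the co-prime codes:

```python
# g = gcd(j, k) controls sub-threshold behaviour for rectangular surface codes
# with rough/smooth open boundaries; co-prime codes (g = 1) are the ones
# reported above as having markedly better logical failure rates under pure
# dephasing.
from math import gcd

for j, k in [(5, 5), (6, 4), (5, 4), (7, 5), (9, 6)]:
    g = gcd(j, k)
    print(f"lattice {j} x {k}: g = {g}" + (" (co-prime)" if g == 1 else ""))
```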
Stepping across the line: Information sharing, truth-telling and the role of the personal carer in the Australian Nursing Home
The author draws on an Australian study using multiple qualitative methods to investigate truth telling in aged care. Thematic analysis of data from five nursing homes involving 23 personal care assistants revealed participants’ role understanding as influencing their perceptions about truth telling in practice. Five themes emerged: role as the happy comfort carer, division of labor, division of disclosure, role tension and frustration, and managing the division of disclosure. Role emphasis on comfort and happiness and a dominant perception that telling the truth can cause harm mean that disclosure will be withheld, edited, or partial. Participants’ role understanding divides labor and disclosure responsibility between the personal carer and registered nurse. Personal carers’ strategies for managing the division of disclosure include game playing, obfuscation, lying (denial), and the use of nonverbals. These perceptions about personal carer role, information sharing, and truth telling are paramount for understanding and improving nursing home eldercare.
A Halomethane thermochemical network from iPEPICO experiments and quantum chemical calculations
Internal energy selected halomethane cations CH3Cl+, CH2Cl2+, CHCl3+, CH3F+, CH2F2+, CHClF2+ and CBrClF2+ were prepared by vacuum ultraviolet photoionization, and their lowest-energy dissociation channel was studied using imaging photoelectron photoion coincidence spectroscopy (iPEPICO). This channel involves hydrogen atom loss for CH3F+, CH2F2+ and CH3Cl+, chlorine atom loss for CH2Cl2+, CHCl3+ and CHClF2+, and bromine atom loss for CBrClF2+. Accurate 0 K appearance energies, in conjunction with ab initio isodesmic and halogen exchange reaction energies, establish a thermochemical network, which is optimized to update and confirm the enthalpies of formation of the sample molecules and their dissociative photoionization products. The ground electronic states of CHCl3+, CHClF2+ and CBrClF2+ do not conform to the deep well assumption, and the experimental breakdown curve deviates from the deep well model at low energies. Breakdown curve analysis of such shallow well systems supplies a satisfactorily succinct route to the adiabatic ionization energy of the parent molecule, particularly if the threshold photoelectron spectrum is not resolved and a purely computational route is unfeasible. The ionization energies have been found to be 11.47 ± 0.01 eV, 12.30 ± 0.02 eV and 11.23 ± 0.03 eV for CHCl3, CHClF2 and CBrClF2, respectively. The updated 0 K enthalpies of formation, ∆fH°0K(g), for the ions CH2F+, CHF2+, CHCl2+, CCl3+, CCl2F+ and CClF2+ have been derived to be 844.4 ± 2.1, 601.6 ± 2.7, 890.3 ± 2.2, 849.8 ± 3.2, 701.2 ± 3.3 and 552.2 ± 3.4 kJ mol–1, respectively. The ∆fH°0K(g) values for the neutrals CCl4, CBrClF2, CClF3, CCl2F2 and CCl3F have been determined to be –94.0 ± 3.2, –446.6 ± 2.7, –702.1 ± 3.5, –487.8 ± 3.4 and –285.2 ± 3.2 kJ mol–1, respectively.
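The optimization step described above, adjusting enthalpies of formation so that they best reproduce all measured and computed reaction energies at once, amounts to a least-squares solve over the network. A minimal sketch with hypothetical species and energies (not the paper's data):

```python
# Toy thermochemical network: rows of M are stoichiometric coefficients of
# reactions (products +, reactants -); b holds their reaction energies in
# kJ/mol; x holds the unknown 0 K enthalpies of formation. One anchor row
# pins a species whose enthalpy is independently well known.
import numpy as np

M = np.array([
    [-1.0, 1.0, 0.0],   # A -> B  (e.g. a dissociative photoionization step)
    [0.0, -1.0, 1.0],   # B -> C  (e.g. a halogen-exchange reaction)
    [1.0, 0.0, 0.0],    # anchor: enthalpy of A known independently
])
b = np.array([1050.0, -40.0, -80.0])  # hypothetical energies, kJ/mol

x, *_ = np.linalg.lstsq(M, b, rcond=None)
print(dict(zip("ABC", np.round(x, 1))))  # {'A': -80.0, 'B': 970.0, 'C': 930.0}
```

With more reactions than unknowns, the same solve distributes experimental and computational uncertainty across the whole network rather than propagating it along a single chain of reactions.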
Taspase1-dependent TFIIA cleavage coordinates head morphogenesis by limiting Cdkn2a locus transcription
Head morphogenesis requires complex signal relays to enable precisely coordinated proliferation, migration, and patterning. Here, we demonstrate that, during mouse head formation, taspase1-mediated (TASP1-mediated) cleavage of the general transcription factor TFIIA ensures proper coordination of rapid cell proliferation and morphogenesis by maintaining limited transcription of the negative cell cycle regulators p16Ink4a and p19Arf from the Cdkn2a locus. In mice, loss of TASP1 function led to catastrophic craniofacial malformations that were associated with inadequate cell proliferation. Compound deficiency of Cdkn2a, especially p16Ink4a deficiency, markedly reduced the craniofacial anomalies of TASP1-deficient mice. Furthermore, evaluation of mice expressing noncleavable TASP1 targets revealed that TFIIA is the principal TASP1 substrate that orchestrates craniofacial morphogenesis. ChIP analyses determined that noncleaved TFIIA accumulates at the p16Ink4a and p19Arf promoters to drive transcription of these negative regulators. In summary, our study elucidates a regulatory circuit comprising proteolysis, transcription, and proliferation that is pivotal for construction of the mammalian head.
A Phenomenological Critique of the Idea of Social Science
Social science is in crisis. The task of social science is to study “man in situation”: to understand the world as it is for “man”. This thesis charges that this crisis consists in a failure to properly address the philosophical anthropological question “What is man?”. The various social scientific methodologies that have as their object “man” suffer rampant disagreements because they presuppose, rather than consider, what is meant by “man”. It is our intention to show that the root of the crisis is that social science can provide no formal definition of “man”. In order to understand this we propose a phenomenological analysis into the essence of social science.
This phenomenological approach will give us reason to abandon the (sexist) word “man” and instead we will speak of wer: the beings which we are. That we have not used the more usual “human being” (or some equivalent) is due to the human prejudice which is one of the major constituents of this crisis we seek to analyse.
This thesis is divided into two Parts: normative and evaluative. In the normative Part we will seek a clarification of both “phenomenology” and “social science”. Due to the various ways in which “phenomenology” has been invented, we must secure a simpliciter definition of phenomenology as an approach to philosophical anthropology (Chapter 2). Importantly, we will show how the key instigators of the branches of phenomenology, Husserl, Scheler, Heidegger, and Sartre, were all engaged in this task. To clarify our phenomenology we will define the Phenomenological Movement according to various strictures by drawing on the work of Schutz and his notion of provinces of meaning (Chapter 3). This will then be carried forward to show how Schutz’s postulates of social science (with certain clarifications) constitute the eidetic structure of social science (Chapter 4).
The eidetic structures of social science identified will prompt several challenges that will be addressed in the evaluative Part. Here we engage in an imperial argument to sort proper science from pseudo-science. The first challenge is the mistaken assumption that universities and democratic states make science possible (Chapter 5). Contra this, we argue that science is predicated on “spare time” and that much institutional “science” is not in fact science. The second challenge is the “humanist challenge”: there is no such thing as nonpractical knowledge (Chapter 6). Dealing with this will require a reconsideration of the epistemic status that science has and lead to the claim of epistemic inferiority.
Having cut away pseudo-science we will be able to focus on the “social” of social science through a consideration of intersubjectivity (Chapter 7). Drawing on the above phenomenologists we will focus on how an Other is recognised as Other. Emphasising Sartre’s radical re-conception of “subject” and “object” we will argue that there can be no formal criteria for how this recognition occurs. By consequence we must begin to move away from the assumption of one life-world to various life-worlds, each constituted by different conceptions of wer.
The role of conviction and narrative in decision-making under radical uncertainty
We propose conviction narrative theory (CNT) to broaden decision-making theory so that it can better understand and analyse how subjectively means-end rational actors cope in contexts in which the traditional assumptions of decision-making models fail to hold. Conviction narratives enable actors to draw on their beliefs, causal models and rules of thumb to identify opportunities worth acting on, to simulate the future outcome of their actions and to feel sufficiently convinced to act. The framework focuses on how narrative and emotion combine to allow actors to deliberate and to select actions that they think will produce the outcomes they desire. It specifies connections between particular emotions and deliberative thought, hypothesizing that approach and avoidance emotions evoked during narrative simulation play a crucial role. Two mental states, Divided and Integrated, in which narratives can be formed or updated, are introduced and used to explain some familiar problems that traditional models cannot.
Sexual Harassment in Our Schools; and How to Stop Sexual Harassment in Our Schools [book reviews]
Book reviews of: Sexual Harassment in Our Schools: What Parents and Teachers Need to Know to Spot It and Stop It, by Robert J. Shoop and Jack W. Hayhow, Jr.; and How to Stop Sexual Harassment in Our Schools: A Handbook and Curriculum Guide for Administrators and Teachers, by Robert J. Shoop and Debra J. Edwards.
News and narratives in financial systems: Exploiting big data for systemic risk assessment
This paper applies algorithmic analysis to financial market text-based data to assess how narratives and sentiment might drive financial system developments. We find changes in emotional content in narratives are highly correlated across data sources and show the formation (and subsequent collapse) of exuberance prior to the global financial crisis. Our metrics also have predictive power for other commonly used indicators of sentiment and appear to influence economic variables. A novel machine learning application also points towards increasing consensus around the strongly positive narrative prior to the crisis. Together, our metrics might help to warn about impending financial system distress.
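As a pointer to how text-based emotion metrics of this kind are typically built, here is a minimal lexicon-counting sketch (a toy illustration with an invented word list, not the paper's algorithm or data):

```python
# Toy relative-sentiment score: (excitement - anxiety) / (excitement + anxiety),
# the style of emotional-content metric tracked over time in this literature.
EXCITEMENT = {"boom", "surge", "confident", "record", "exuberant"}
ANXIETY = {"fear", "crisis", "collapse", "default", "panic"}

def emotion_score(text: str) -> float:
    words = text.lower().split()
    ex = sum(w in EXCITEMENT for w in words)
    ax = sum(w in ANXIETY for w in words)
    return (ex - ax) / (ex + ax) if ex + ax else 0.0

print(emotion_score("Record surge as confident investors shrug off fear"))  # 0.5
```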