Ultrahigh Error Threshold for Surface Codes with Biased Noise
We show that a simple modification of the surface code can exhibit an
enormous gain in the error correction threshold for a noise model in which
Pauli Z errors occur more frequently than X or Y errors. Such biased noise,
where dephasing dominates, is ubiquitous in many quantum architectures. In the
limit of pure dephasing noise we find a threshold of 43.7(1)% using a tensor
network decoder proposed by Bravyi, Suchara and Vargo. The threshold remains
surprisingly large in the regime of realistic noise bias ratios, for example
28.2(2)% at a bias of 10. The performance is in fact at or near the hashing
bound for all values of the bias. The modified surface code still uses only
weight-4 stabilizers on a square lattice, but merely requires measuring
products of Y instead of Z around the faces, as this doubles the number of
useful syndrome bits associated with the dominant Z errors. Our results
demonstrate that large efficiency gains can be found by appropriately tailoring
codes and decoders to realistic noise models, even under the locality
constraints of topological codes.
Comment: 6 pages, 5 figures, comments welcome; v2 includes minor improvements
to the numerical results, additional references, and an extended discussion;
v3 published version (incorporating supplementary material into main body of
paper).
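The hashing-bound comparison in the abstract can be checked numerically: for a given bias, the hashing bound is the total error rate at which the entropy of the Pauli error distribution reaches one bit per qubit. The sketch below is not the authors' code; it assumes the common convention that the bias η is the ratio p_Z/(p_X + p_Y) with p_X = p_Y.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits, ignoring zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hashing_bound(eta):
    """Total error rate p at which H(p_I, p_X, p_Y, p_Z) = 1 bit,
    for bias eta = p_Z / (p_X + p_Y) with p_X = p_Y."""
    def excess(p):
        p_z = p * eta / (eta + 1)
        p_x = p_y = p / (2 * (eta + 1))
        return entropy_bits([1 - p, p_x, p_y, p_z]) - 1.0
    lo, hi = 1e-9, 0.5
    for _ in range(100):  # bisection: excess(p) rises monotonically on (0, 0.5)
        mid = (lo + hi) / 2
        if excess(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(hashing_bound(10), 3))  # close to the 28.2% quoted at bias 10
```

In the infinite-bias (pure dephasing) limit the bound approaches 50%, consistent with the claim that the measured thresholds track the hashing bound across all biases.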
A Phenomenological Critique of the Idea of Social Science
Social science is in crisis. The task of social science is to study “man in situation”: to understand the world as it is for “man”. This thesis charges that this crisis consists in a failure to properly address the philosophical anthropological question “What is man?”. The various social scientific methodologies that take “man” as their object suffer rampant disagreements because they presuppose, rather than consider, what is meant by “man”. It is our intention to show that the root of the crisis is that social science can provide no formal definition of “man”. In order to understand this we propose a phenomenological analysis of the essence of social science.
This phenomenological approach will give us reason to abandon the (sexist) word “man” and instead we will speak of wer: the beings which we are. That we have not used the more usual “human being” (or some equivalent) is due to the human prejudice which is one of the major constituents of this crisis we seek to analyse.
This thesis is divided into two Parts: normative and evaluative. In the normative Part we will seek a clarification of both “phenomenology” and “social science”. Due to the various ways in which “phenomenology” has been invented we must secure a simpliciter definition of phenomenology as an approach to philosophical anthropology (Chapter 2). Importantly, we will show how the key instigators of the branches of phenomenology, Husserl, Scheler, Heidegger, and Sartre, were all engaged in this task. To clarify our phenomenology we will define the Phenomenological Movement according to various strictures by drawing on the work of Schutz and his notion of provinces of meaning (Chapter 3). This will then be carried forward to show how Schutz’s postulates of social science (with certain clarifications) constitute the eidetic structure of social science (Chapter 4).
The eidetic structures of social science identified will prompt several challenges that will be addressed in the evaluative Part. Here we engage in an imperial argument to sort proper science from pseudo-science. The first challenge is the mistaken assumption that universities and democratic states make science possible (Chapter 5). Contra this, we argue that science is predicated on “spare time” and that much institutional “science” is not in fact science. The second challenge is the “humanist challenge”: there is no such thing as nonpractical knowledge (Chapter 6). Dealing with this will require a reconsideration of the epistemic status that science has and lead to the claim of epistemic inferiority.
Having cut away pseudo-science we will be able to focus on the “social” of social science through a consideration of intersubjectivity (Chapter 7). Drawing on the above phenomenologists we will focus on how an Other is recognised as Other. Emphasising Sartre’s radical re-conception of “subject” and “object” we will argue that there can be no formal criteria for how this recognition occurs. By consequence we must begin to move away from the assumption of one life-world to various life-worlds, each constituted by different conceptions of wer.
Rate constants and Arrhenius parameters for the reactions of OH radicals and Cl atoms with CF3CH2OCHF2, CF3CHClOCHF2 and CF3CH2OCClF2, using the discharge-flow/resonance fluorescence method
Rate constants have been determined for the reactions of OH radicals and Cl atoms with the three partially halogenated methyl-ethyl ethers, CF3CH2OCHF2, CF3CHClOCHF2 and CF3CH2OCClF2, using discharge-flow techniques to generate the OH radicals and the Cl atoms and resonance fluorescence to observe changes in their relative concentrations in the presence of added ether. For each combination of radical and ether, experiments were carried out at three temperatures between 292 and 410 K, yielding the following Arrhenius expressions for the rate constants within this range of temperature:
OH + CF3CH2OCHF2: k = (2.0 ± 0.8) × 10 exp(−(2110 ± 150) K / T) cm³ molecule⁻¹ s⁻¹
OH + CF3CHClOCHF2: k = (4.5 ± 1.3) × 10 exp(−(940 ± 100) K / T) cm³ molecule⁻¹ s⁻¹
OH + CF3CH2OCClF2: k = (1.6 ± 0.6) × 10 exp(−(1100 ± 125) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CH2OCHF2: k = (6.1 ± 1.4) × 10 exp(−(1830 ± 90) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CHClOCHF2: k = (7.8 ± 2.6) × 10 exp(−(2980 ± 130) K / T) cm³ molecule⁻¹ s⁻¹
Cl + CF3CH2OCClF2: k = (2.2 ± 0.2) × 10 exp(−(2700 ± 40) K / T) cm³ molecule⁻¹ s⁻¹
The results are compared with those obtained previously for the same and related reactions of OH radicals and Cl atoms, and the atmospheric implications of the results are considered briefly.
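The temperature dependence implied by these Arrhenius expressions can be illustrated without the pre-exponential factor, since the ratio of rate constants at two temperatures depends only on the activation temperature B in k = A exp(−B K/T). A minimal sketch (not from the paper; it uses only the B = 2110 K value quoted above for OH + CF3CH2OCHF2):

```python
import math

def arrhenius_ratio(b_kelvin, t1, t2):
    """Ratio k(t2)/k(t1) for k = A * exp(-b_kelvin / T);
    the pre-exponential factor A cancels in the ratio."""
    return math.exp(-b_kelvin * (1.0 / t2 - 1.0 / t1))

# OH + CF3CH2OCHF2, B = 2110 K, across the measured range 292-410 K:
ratio = arrhenius_ratio(2110.0, 292.0, 410.0)
print(f"k(410 K)/k(292 K) = {ratio:.1f}")  # roughly an 8-fold increase
```

The reaction with the smallest activation temperature (OH + CF3CHClOCHF2, B = 940 K) would show the weakest temperature dependence over the same range.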
News and narratives in financial systems: Exploiting big data for systemic risk assessment
This paper applies algorithmic analysis to financial market text-based data to assess how narratives and sentiment might drive financial system developments. We find that changes in emotional content in narratives are highly correlated across data sources and show the formation (and subsequent collapse) of exuberance prior to the global financial crisis. Our metrics also have predictive power for other commonly used indicators of sentiment and appear to influence economic variables. A novel machine learning application also points towards increasing consensus around the strongly positive narrative prior to the crisis. Together, our metrics might help to warn about impending financial system distress.
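Text-based sentiment metrics of the kind described here typically start from lexicon counts over a news corpus. A toy sketch of that building block (the word lists and headlines below are invented for illustration; the paper's actual method is not reproduced):

```python
# Toy dictionary-based sentiment scoring over short text snippets.
# POSITIVE/NEGATIVE word lists are illustrative stand-ins for the
# curated financial lexicons used in real studies.
POSITIVE = {"growth", "boom", "confidence", "gains", "record"}
NEGATIVE = {"crisis", "collapse", "losses", "panic", "default"}

def sentiment_score(text):
    """Net sentiment: (positive - negative word count) / total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,") in POSITIVE for w in words)
    neg = sum(w.strip(".,") in NEGATIVE for w in words)
    return (pos - neg) / len(words)

headlines = [
    "Record gains fuel market confidence",
    "Banks face losses as panic spreads",
]
for h in headlines:
    print(f"{sentiment_score(h):+.2f}  {h}")
```

Aggregating such scores by date yields the kind of sentiment time series whose cross-source correlations and pre-crisis dynamics the paper studies.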
Sexual Harassment in Our Schools; and How to Stop Sexual Harassment in Our Schools [book reviews]
Book reviews of: Sexual Harassment in Our Schools: What Parents and Teachers Need to Know to Spot It and Stop It, by Robert J. Shoop and Jack W. Hayhow, Jr.; and How to Stop Sexual Harassment in Our Schools: A Handbook and Curriculum Guide for Administrators and Teachers, by Robert J. Shoop and Debra J. Edwards
Tailoring surface codes for highly biased noise
The surface code, with a simple modification, exhibits ultra-high error
correction thresholds when the noise is biased towards dephasing. Here, we
identify features of the surface code responsible for these ultra-high
thresholds. We provide strong evidence that the threshold error rate of the
surface code tracks the hashing bound exactly for all biases, and show how to
exploit these features to achieve significant improvement in logical failure
rate. First, we consider the infinite bias limit, meaning pure dephasing. We
prove that the error threshold of the modified surface code for pure dephasing
noise is 50%, i.e., that all qubits are fully dephased, and this threshold
can be achieved by a polynomial time decoding algorithm. We demonstrate that
the sub-threshold behavior of the code depends critically on the precise shape
and boundary conditions of the code. That is, for rectangular surface codes
with standard rough/smooth open boundaries, it is controlled by the parameter
g = gcd(j, k), where j and k are dimensions of the surface code lattice. We
demonstrate a significant improvement in logical failure rate with pure
dephasing for co-prime codes that have g = 1, and closely-related rotated
codes, which have a modified boundary. The effect is dramatic: the same logical
failure rate achievable with a square surface code and n physical qubits can
be obtained with a co-prime or rotated surface code using only O(√n)
physical qubits. Finally, we use approximate maximum likelihood decoding to
demonstrate that this improvement persists for a general Pauli noise biased
towards dephasing. In particular, comparing with a square surface code, we
observe a significant improvement in logical failure rate against biased noise
using a rotated surface code with approximately half the number of physical
qubits.
Comment: 18+4 pages, 24 figures; v2 includes additional coauthor (ASD) and new
results on the performance of surface codes in the finite-bias regime,
obtained with beveled surface codes and an improved tensor network decoder;
v3 published version
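The closing qubit-count comparison can be made concrete with the standard counting formulas: a distance-d surface code with rough/smooth open boundaries uses d² + (d−1)² data qubits, while the rotated variant uses d². A minimal sketch (textbook formulas, not code from the paper; ancilla qubits are not counted):

```python
def square_code_qubits(d):
    """Data qubits in a distance-d surface code with standard
    rough/smooth open boundaries: d^2 + (d-1)^2."""
    return d * d + (d - 1) * (d - 1)

def rotated_code_qubits(d):
    """Data qubits in the distance-d rotated surface code: d^2."""
    return d * d

for d in (5, 11, 21):
    n_sq, n_rot = square_code_qubits(d), rotated_code_qubits(d)
    print(d, n_sq, n_rot, round(n_rot / n_sq, 3))
# the ratio approaches 1/2 as d grows
```

This is the sense in which the rotated code matches a given distance with roughly half the physical qubits; the abstract's stronger O(√n) saving comes from the improved sub-threshold scaling of co-prime and rotated codes under biased noise, not from this constant factor alone.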
Judgments in the Sharing Economy: The Effect of User-Generated Trust and Reputation Information on Decision-Making Accuracy and Bias
The growing ecosystem of peer-to-peer enterprise – the Sharing Economy (SE) – has
brought with it a substantial change in how we access and provide goods and services.
Within the SE, individuals make decisions based mainly on user-generated trust and
reputation information (TRI). Recent research indicates that the use of such information
tends to produce a positivity bias in the perceived trustworthiness of fellow users.
Across two experimental studies performed on an artificial SE accommodation platform,
we test whether users’ judgments can be accurate when presented with diagnostic
information relating to the quality of the profiles they see or if these overly positive
perceptions persist. In study 1, we find that users are quite accurate overall (70%)
at determining the quality of a profile, both when presented with full profiles and with
profiles where they selected three TRI elements they considered useful for their
decision-making. However, users tended to exhibit an “upward quality bias” when making errors.
In study 2, we leveraged patterns of frequently vs. infrequently selected TRI elements
to understand whether users have insights into which are more diagnostic and find
that presenting frequently selected TRI elements improved users’ accuracy. Overall, our
studies demonstrate that – positivity bias notwithstanding – users can be remarkably
accurate in their online SE judgments.
Threshold photoelectron photoion coincidence study of the fragmentation of valence states of CF3–CH3+ and CHF2–CH2F+ in the range 12–24 eV
- …