13,343 research outputs found
Thermal correlators of anyons in two dimensions
The anyon fields have a trivial deformed commutator when the statistics parameter is not an integer. For integer values of the parameter the commutators become temperature-dependent operator-valued distributions. The n-point functions do not factorize as they do for quasifree states. Comment: 14 pages, LaTeX (misprints corrected, a reference added)
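For orientation, a minimal reconstruction of the deformed commutator in question, writing \lambda for the statistics parameter (the original symbol was lost in extraction) and \epsilon for the sign function:

\[ [\psi(x_1)\,\psi(x_2)]_{\lambda} \;\equiv\; \psi(x_1)\,\psi(x_2) \;-\; e^{\,i\pi\lambda\,\epsilon(x_1-x_2)}\,\psi(x_2)\,\psi(x_1) \;=\; 0, \qquad \lambda \notin \mathbb{Z}. \]

At integer \lambda the exchange phase degenerates to \pm 1, the deformed commutator reduces to the ordinary commutator or anticommutator, and, per the abstract, these acquire a temperature dependence.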
Assumptions that imply quantum dynamics is linear
A basic linearity of quantum dynamics, that density matrices are mapped
linearly to density matrices, is proved very simply for a system that does not
interact with anything else. It is assumed that at each time the physical
quantities and states are described by the usual linear structures of quantum
mechanics. Beyond that, the proof assumes only that the dynamics does not
depend on anything outside the system but must allow the system to be described
as part of a larger system. The basic linearity is linked with previously
established results to complete a simple derivation of the linear Schrodinger
equation. For this it is assumed that density matrices are mapped one-to-one
onto density matrices. An alternative is to assume that pure states are mapped
one-to-one onto pure states and that entropy does not decrease. Comment: 10 pages. Added references. Improved discussion of equations of motion for mean values. Expanded Introduction.
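In modern notation (ours, not the paper's), the basic linearity proved here is the statement that the dynamical map \mathcal{L} on density matrices preserves convex mixtures:

\[ \mathcal{L}\Big(\sum_i p_i\,\rho_i\Big) \;=\; \sum_i p_i\,\mathcal{L}(\rho_i), \qquad p_i \ge 0, \quad \sum_i p_i = 1. \]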
Exploring the Referral and Usage of Science Fiction in HCI Literature
Research on science fiction (sci-fi) in scientific publications has indicated
the usage of sci-fi stories, movies or shows to inspire novel Human-Computer
Interaction (HCI) research. Yet, to date, no study has analysed sci-fi in a
top-ranked computer science conference. For that reason, we examine the CHI
main track for the presence and nature of sci-fi referrals in relation to
HCI research. We search for six sci-fi terms in a dataset of 5812 CHI main
proceedings and code the context of 175 sci-fi referrals in 83 papers indexed
in the CHI main track. In our results, we categorize these papers into five
contemporary HCI research themes wherein sci-fi and HCI interconnect: 1)
Theoretical Design Research; 2) New Interactions; 3) Human-Body Modification or
Extension; 4) Human-Robot Interaction and Artificial Intelligence; and 5)
Visions of Computing and HCI. We conclude by discussing the results and their implications for the promising arena of sci-fi and HCI research. Comment: v1: 20 pages, 4 figures, 3 tables, HCI International 2018 accepted submission; v2: 20 pages, 4 figures, 3 tables, added link/doi for Springer proceedings
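As a rough illustration of the kind of term search described, the sketch below scans a corpus of paper texts for a fixed term list and extracts surrounding context for manual coding. The term list and record fields are placeholders; the paper's actual six search terms and pipeline are not given in this abstract.

    import re

    # Placeholder terms; the paper's actual six sci-fi search terms are
    # not listed in the abstract.
    SCIFI_TERMS = ["science fiction", "sci-fi", "scifi",
                   "star trek", "star wars", "cyberpunk"]

    def find_referrals(papers):
        """Scan {'id': ..., 'text': ...} records for the terms and return
        (paper id, term, surrounding context) tuples for manual coding."""
        patterns = {t: re.compile(re.escape(t), re.IGNORECASE)
                    for t in SCIFI_TERMS}
        hits = []
        for paper in papers:
            for term, pattern in patterns.items():
                for match in pattern.finditer(paper["text"]):
                    lo = max(0, match.start() - 80)
                    hi = match.end() + 80
                    hits.append((paper["id"], term, paper["text"][lo:hi]))
        return hits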
Precision frequency measurements with interferometric weak values
We demonstrate an experiment that uses a Sagnac interferometer to measure
changes in optical frequency with a sensitivity of 129 kHz per root Hz using
only 2 mW of continuous-wave, single-mode input power. We describe the measurement of a weak
value and show how even higher frequency sensitivities may be obtained over a
bandwidth of several nanometers. This technique has many possible applications,
such as precision relative frequency measurements and laser locking without the
use of atomic lines. Comment: 4 pages, 3 figures, published in PR
Ultrasensitive Beam Deflection Measurement via Interferometric Weak Value Amplification
We report on the use of an interferometric weak value technique to amplify
very small transverse deflections of an optical beam. By entangling the beam's
transverse degrees of freedom with the which-path states of a Sagnac
interferometer, it is possible to realize an optical amplifier for polarization
independent deflections. The theory for the interferometric weak value
amplification method is presented along with the experimental results, which
are in good agreement. Of particular interest, we measured the angular
deflection of a mirror down to 560 femtoradians and the linear travel of a
piezo actuator down to 20 femtometers.
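For context, the weak value underlying this amplification is the standard expression (notation ours): for an observable \hat{A} with preselected state |\psi_i\rangle and postselected state |\psi_f\rangle,

\[ A_w \;=\; \frac{\langle \psi_f | \hat{A} | \psi_i \rangle}{\langle \psi_f | \psi_i \rangle}, \]

which can lie far outside the eigenvalue spectrum of \hat{A} when the postselection is nearly orthogonal to the preselection, i.e. when \langle \psi_f | \psi_i \rangle is small; this is the sense in which the deflection is amplified.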
Optimizing the Signal to Noise Ratio of a Beam Deflection Measurement with Interferometric Weak Values
The amplification obtained using weak values is quantified through a detailed
investigation of the signal to noise ratio for an optical beam deflection
measurement. We show that for a given deflection, input power and beam radius,
the use of interferometric weak values allows one to obtain the optimum signal
to noise ratio using a coherent beam. This method has the advantage of reduced
technical noise and allows for the use of detectors with a low saturation
intensity. We report on an experiment that improves the signal to noise ratio
for a beam deflection measurement by a factor of 54 compared with a
measurement using the same beam size and a quantum-limited detector.
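For orientation, the quantum-limited signal-to-noise ratio for detecting a transverse deflection d of a Gaussian beam of radius \sigma with N detected photons scales as (order-unity prefactor omitted; notation ours)

\[ \mathrm{SNR} \;\propto\; \sqrt{N}\,\frac{d}{\sigma}; \]

the abstract's claim is that the weak value scheme attains this optimum while, roughly speaking, discarding most of the optical power before the detector, which is what relaxes the saturation and technical noise requirements.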
Enhancing Small Group Teaching in Plant Sciences: A Research and Development Project in Higher Education
The Department of Plant Sciences at the University of Cambridge uses a range of learning and teaching environments, including lectures, practical laboratories and small group 'tutorials'. Under the auspices of the Cambridge-MIT Institute's Pedagogy Programme, a two-year research and development project concerned with the development of small-group teaching is being undertaken. The research element of this project endeavours to illuminate current practice and identify areas in which evidence-based development might take place. The development element will include professional development activities and the production of curriculum resources, including appropriate online material. This is a multi-method study including a series of student questionnaires; focus groups of students; semi-structured interviews with staff members; and the collection of video of small group teaching. In this paper we report selected findings from the 'student data' of the first year of this project.

The questionnaire, conducted with two cohorts of students (2nd and 3rd year undergraduates), used a double-scale instrument in which students were asked to report both on the prevalence of a range of teaching and learning practices and on how valuable these were in supporting their learning. This type of questionnaire instrument is particularly appropriate because the data it generates are suggestive of areas for changes in practice. The gaps between 'practices' and 'values' (across both cohorts) suggested that students valued activities which improved their understanding of how elements of the course were interrelated; which related course content to 'authentic' examples; and those in which teachers made explicit the characteristics of 'high quality' student work. Small group teaching, in the view of most students, was best used to extend and explore concepts introduced in lectures rather than simply reinforcing them or assessing student understanding.

Data gathered through focus group activities illuminated the questionnaire data, providing detailed accounts of how students managed their own learning, and the roles played in this by lectures, small group teaching and other resources. Students identified the processes of planning and writing essays as key learning activities during which they integrated diverse course content and reflected on problematic knowledge. Questionnaire and focus group data suggested that students had less clear views regarding the value of collaborative learning, peer-assessment or activities such as making presentations to other students. When students talked in positive terms about these activities, they often referred to the learning benefits of preparation for the tasks rather than of the collaborative activities themselves. These views may provide indications of potential barriers to changes in learning and teaching environments, and suggest that any such changes may have to be carefully justified to students in terms of benefits to their own learning. Many of our findings are broadly in accord with other work on teaching and learning in Higher Education settings (such as the 'Oxford Learning Context Project' and the 'Enhancing Teaching-Learning Environments in Undergraduate Courses' Project) in that 'deep learning' and 'authenticity' in learning activities are valued by students, and that the introduction of specific formative practices (such as sharing notions of 'quality') would be welcomed.

At the same time, amongst the students in our sample, a view of learning as an individual process of 'learning-as-acquisition' predominates over a view of it as a social process of 'learning-as-participation', and this will inform the planning of the 'development' aspect of the project. We conclude with a discussion of how the approach we have used might be more widely applied both within and beyond the Cambridge-MIT partnership. We also identify potential affordances of, and barriers to, the development of research-informed teaching in Higher Education.
(Never) Mind your p's and q's: Von Neumann versus Jordan on the Foundations of Quantum Theory
In two papers entitled "On a new foundation [Neue Begr\"undung] of quantum
mechanics," Pascual Jordan (1927b,g) presented his version of what came to be
known as the Dirac-Jordan statistical transformation theory. As an alternative
that avoids the mathematical difficulties facing the approach of Jordan and
Paul A. M. Dirac (1927), John von Neumann (1927a) developed the modern Hilbert
space formalism of quantum mechanics. In this paper, we focus on Jordan and von
Neumann. Central to the formalisms of both are expressions for conditional
probabilities of finding some value for one quantity given the value of
another. Beyond that Jordan and von Neumann had very different views about the
appropriate formulation of problems in quantum mechanics. For Jordan, unable to
let go of the analogy to classical mechanics, the solution of such problems
required the identification of sets of canonically conjugate variables, i.e., p's
and q's. For von Neumann, not constrained by the analogy to classical
mechanics, it required only the identification of a maximal set of commuting
operators with simultaneous eigenstates. He had no need for p's and q's. Jordan
and von Neumann also stated the characteristic new rules for probabilities in
quantum mechanics somewhat differently. Jordan (1927b) was the first to state
those rules in full generality. Von Neumann (1927a) rephrased them and, in a
subsequent paper (von Neumann, 1927b), sought to derive them from more basic
considerations. In this paper we reconstruct the central arguments of these
1927 papers by Jordan and von Neumann and of a paper on Jordan's approach by
Hilbert, von Neumann, and Nordheim (1928). We highlight those elements in these
papers that bring out the gradual loosening of the ties between the new quantum
formalism and classical mechanics. Comment: New version. The main difference from the old version is that the
introduction has been rewritten. Sec. 1 (pp. 2-12) in the old version has
been replaced by Secs. 1.1-1.4 (pp. 2-31) in the new version. The paper has
been accepted for publication in European Physical Journal
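In modern notation (ours), the conditional probabilities central to both formalisms take the Born-rule form: the probability of finding value a for one quantity, given value b for another, is

\[ \Pr(a \mid b) \;=\; \left| \langle a \mid b \rangle \right|^{2}, \]

where |a\rangle and |b\rangle are the corresponding eigenstates.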
Slope Instability of the Earthen Levee in Boston, UK: Numerical Simulation and Sensor Data Analysis
The paper presents a slope stability analysis for a heterogeneous earthen
levee in Boston, UK, which is prone to occasional slope failures under tidal
loads. Dynamic behavior of the levee under tidal fluctuations was simulated
using a finite element model of variably saturated linear elastic perfectly
plastic soil. Hydraulic conductivities of the soil strata were calibrated
against piezometer readings in order to obtain the correct range of
hydraulic loads under tidal conditions. The finite element simulation was
complemented with a series of limit equilibrium analyses. Stability analyses have shown that slope
failure occurs with the development of a circular slip surface located in the
soft clay layer. Both models (FEM and LEM) confirm that the least stable
hydraulic condition is the combination of the minimum river levels at low tide
with the maximal saturation of the soil layers. FEM results indicate that in
wintertime the levee is almost at its limit state, at the margin of safety (strength
reduction factor values are 1.03 and 1.04 for the low-tide and high-tide
phases, respectively); these results agree with real-life observations. The
stability analyses have been implemented as real-time components integrated
into the UrbanFlood early warning system for flood protection.
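As a minimal sketch of the limit equilibrium idea invoked here (the ordinary, or Fellenius, method of slices for a circular slip surface), not the authors' calibrated FEM or LEM models, and with purely illustrative inputs:

    from math import cos, sin, tan, radians

    def fellenius_fs(slices, c, phi_deg):
        """Ordinary (Fellenius) method of slices for a circular slip surface.

        slices: iterable of (W, alpha_deg, l, u) per slice, where W is the
        slice weight [kN/m], alpha_deg the base inclination [deg], l the
        base length [m], and u the pore pressure on the base [kPa].
        c, phi_deg: Mohr-Coulomb cohesion [kPa] and friction angle [deg].
        Returns the factor of safety (resisting over driving moments).
        """
        phi = radians(phi_deg)
        resisting = driving = 0.0
        for W, alpha_deg, l, u in slices:
            a = radians(alpha_deg)
            resisting += c * l + (W * cos(a) - u * l) * tan(phi)
            driving += W * sin(a)
        return resisting / driving

    # Illustrative numbers only (not the Boston levee data):
    slices = [(120.0, 10.0, 2.0, 15.0), (150.0, 20.0, 2.1, 18.0),
              (130.0, 32.0, 2.3, 12.0)]
    print(f"FS = {fellenius_fs(slices, c=8.0, phi_deg=22.0):.2f}")

A factor of safety near 1.0, like the strength reduction factors of 1.03 and 1.04 reported above, indicates a slope at the margin of stability.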
Revisiting two-step Forbush decreases
Interplanetary coronal mass ejections (ICMEs) and their shocks can sweep out galactic cosmic rays (GCRs), thus creating Forbush decreases (FDs). The traditional model of FDs predicts that an ICME and its shock decrease the GCR intensity in a two-step profile. This model, however, has undergone little testing. Our goal is therefore to discover whether a passing ICME and its shock inevitably lead to a two-step FD, as the model predicts. We use cosmic ray data from 14 neutron monitors and, when possible, high time resolution GCR data from the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) spacecraft. We analyze 233 ICMEs that should have created two-step FDs. Of these, only 80 created FDs, and only 13 created two-step FDs. FDs are thus less common than the model predicts. The majority of events indicates that FD profiles are more complicated than the model predicts, particularly within the ICME sheath. We conclude that the traditional model of FDs as having one or two steps should be discarded. We also conclude that generally ignored small-scale interplanetary magnetic field structure can contribute to the observed variety of FD profiles.
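As a crude sketch of how one might flag an FD and count downward steps in an hourly neutron monitor series; the thresholds and the step heuristic below are assumptions for illustration, not the criteria used in the paper:

    import numpy as np

    def forbush_steps(counts, baseline_hours=24, drop_threshold=2.0,
                      step_threshold=0.5):
        """Crude FD profile check on an hourly neutron monitor series.

        counts: 1-D array of hourly count rates. The pre-event baseline
        is the mean of the first `baseline_hours` samples; an FD is
        flagged if the percent decrease exceeds `drop_threshold`, and
        each hour whose additional percent drop exceeds `step_threshold`
        counts as a step. All thresholds are illustrative.
        """
        counts = np.asarray(counts, dtype=float)
        base = counts[:baseline_hours].mean()
        pct = 100.0 * (base - counts) / base      # percent decrease
        if pct.max() < drop_threshold:
            return 0                              # no Forbush decrease
        hourly_drop = np.diff(pct)                # extra drop per hour
        return int((hourly_drop > step_threshold).sum())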