
    Thermal correlators of anyons in two dimensions

    Full text link
    The anyon fields have a trivial α-commutator for α not an integer. For integer α the commutators become temperature-dependent operator-valued distributions. The n-point functions do not factorize as they do for quasifree states. Comment: 14 pages, LaTeX (misprints corrected, a reference added)

    Assumptions that imply quantum dynamics is linear

    Full text link
    A basic linearity of quantum dynamics, that density matrices are mapped linearly to density matrices, is proved very simply for a system that does not interact with anything else. It is assumed that at each time the physical quantities and states are described by the usual linear structures of quantum mechanics. Beyond that, the proof assumes only that the dynamics does not depend on anything outside the system but must allow the system to be described as part of a larger system. The basic linearity is linked with previously established results to complete a simple derivation of the linear Schrödinger equation. For this it is assumed that density matrices are mapped one-to-one onto density matrices. An alternative is to assume that pure states are mapped one-to-one onto pure states and that entropy does not decrease. Comment: 10 pages. Added references. Improved discussion of equations of motion for mean values. Expanded Introduction.
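    The linearity claimed in the abstract can be checked numerically for the simplest case of unitary dynamics, ρ → UρU†: the map acts linearly on mixtures and sends density matrices to density matrices. A minimal sketch (the Hamiltonian and states are illustrative, not from the paper):

    ```python
    # Unitary dynamics rho -> U rho U^dagger: linear, trace- and positivity-preserving.
    import numpy as np

    rng = np.random.default_rng(0)

    def random_density_matrix(n=2):
        """Random density matrix: rho = A A^dagger / tr(A A^dagger)."""
        a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        rho = a @ a.conj().T
        return rho / np.trace(rho).real

    def evolve(rho, u):
        return u @ rho @ u.conj().T

    # Unitary built from a (hypothetical) Hermitian generator: U = exp(-iH).
    h = np.array([[1.0, 0.5], [0.5, -1.0]])
    w, v = np.linalg.eigh(h)
    u = v @ np.diag(np.exp(-1j * w)) @ v.conj().T

    rho1, rho2 = random_density_matrix(), random_density_matrix()
    p = 0.3

    # Linearity: evolving a mixture equals mixing the evolved states.
    lhs = evolve(p * rho1 + (1 - p) * rho2, u)
    rhs = p * evolve(rho1, u) + (1 - p) * evolve(rho2, u)
    print(np.allclose(lhs, rhs))  # True

    # Density matrices map to density matrices: unit trace, no negative eigenvalues.
    out = evolve(rho1, u)
    print(np.isclose(np.trace(out).real, 1.0))  # True
    ```

    The paper's point runs in the other direction: linearity is derived from physical assumptions rather than assumed, but the sketch shows concretely what the derived property states.
    
    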

    Exploring the Referral and Usage of Science Fiction in HCI Literature

    Full text link
    Research on science fiction (sci-fi) in scientific publications has indicated the usage of sci-fi stories, movies or shows to inspire novel Human-Computer Interaction (HCI) research. Yet no studies to date have analysed sci-fi in a top-ranked computer science conference. For that reason, we examine the CHI main track for the presence and nature of sci-fi referrals in relationship to HCI research. We search for six sci-fi terms in a dataset of 5812 CHI main proceedings and code the context of 175 sci-fi referrals in 83 papers indexed in the CHI main track. In our results, we categorize these papers into five contemporary HCI research themes wherein sci-fi and HCI interconnect: 1) Theoretical Design Research; 2) New Interactions; 3) Human-Body Modification or Extension; 4) Human-Robot Interaction and Artificial Intelligence; and 5) Visions of Computing and HCI. In conclusion, we discuss results and implications located in the promising arena of sci-fi and HCI research. Comment: v1: 20 pages, 4 figures, 3 tables, HCI International 2018 accepted submission; v2: 20 pages, 4 figures, 3 tables, added link/doi for Springer proceedings

    Precision frequency measurements with interferometric weak values

    Get PDF
    We demonstrate an experiment which utilizes a Sagnac interferometer to measure a change in optical frequency of 129 kHz per root Hz with only 2 mW of continuous wave, single mode input power. We describe the measurement of a weak value and show how even higher frequency sensitivities may be obtained over a bandwidth of several nanometers. This technique has many possible applications, such as precision relative frequency measurements and laser locking without the use of atomic lines. Comment: 4 pages, 3 figures, published in PR

    Ultrasensitive Beam Deflection Measurement via Interferometric Weak Value Amplification

    Get PDF
    We report on the use of an interferometric weak value technique to amplify very small transverse deflections of an optical beam. By entangling the beam's transverse degrees of freedom with the which-path states of a Sagnac interferometer, it is possible to realize an optical amplifier for polarization independent deflections. The theory for the interferometric weak value amplification method is presented along with the experimental results, which are in good agreement. Of particular interest, we measured the angular deflection of a mirror down to 560 femtoradians and the linear travel of a piezo actuator down to 20 femtometers.
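    The quantity behind these amplification schemes is the weak value A_w = ⟨f|A|i⟩/⟨f|i⟩, which can lie far outside the eigenvalue range of A when the pre- and post-selected states are nearly orthogonal. A minimal numeric sketch (the states and angle below are chosen purely for illustration, not the papers' settings):

    ```python
    # Weak value of a which-path operator with nearly orthogonal pre/post-selection.
    import numpy as np

    A = np.diag([1.0, -1.0])                     # which-path operator (sigma_z), eigenvalues ±1
    i_state = np.array([1.0, 1.0]) / np.sqrt(2)  # preselected state
    eps = 0.05                                   # small post-selection angle (illustrative)
    theta = np.pi / 4 + eps
    f_state = np.array([np.cos(theta), -np.sin(theta)])  # nearly orthogonal to i_state

    A_w = (f_state.conj() @ A @ i_state) / (f_state.conj() @ i_state)
    print(abs(A_w))  # ~ cot(eps) ≈ 20, far outside the eigenvalue range [-1, 1]
    ```

    The small overlap ⟨f|i⟩ in the denominator is what trades detected power for an amplified pointer shift in the Sagnac experiments.
    
    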

    Optimizing the Signal to Noise Ratio of a Beam Deflection Measurement with Interferometric Weak Values

    Get PDF
    The amplification obtained using weak values is quantified through a detailed investigation of the signal to noise ratio for an optical beam deflection measurement. We show that for a given deflection, input power and beam radius, the use of interferometric weak values allows one to obtain the optimum signal to noise ratio using a coherent beam. This method has the advantage of reduced technical noise and allows for the use of detectors with a low saturation intensity. We report on an experiment which improves the signal to noise ratio for a beam deflection measurement by a factor of 54 when compared to a measurement using the same beam size and a quantum limited detector.

    (Never) Mind your p's and q's: Von Neumann versus Jordan on the Foundations of Quantum Theory

    Get PDF
    In two papers entitled "On a new foundation [Neue Begründung] of quantum mechanics," Pascual Jordan (1927b,g) presented his version of what came to be known as the Dirac-Jordan statistical transformation theory. As an alternative that avoids the mathematical difficulties facing the approach of Jordan and Paul A. M. Dirac (1927), John von Neumann (1927a) developed the modern Hilbert space formalism of quantum mechanics. In this paper, we focus on Jordan and von Neumann. Central to the formalisms of both are expressions for conditional probabilities of finding some value for one quantity given the value of another. Beyond that, Jordan and von Neumann had very different views about the appropriate formulation of problems in quantum mechanics. For Jordan, unable to let go of the analogy to classical mechanics, the solution of such problems required the identification of sets of canonically conjugate variables, i.e., p's and q's. For von Neumann, not constrained by the analogy to classical mechanics, it required only the identification of a maximal set of commuting operators with simultaneous eigenstates. He had no need for p's and q's. Jordan and von Neumann also stated the characteristic new rules for probabilities in quantum mechanics somewhat differently. Jordan (1927b) was the first to state those rules in full generality. Von Neumann (1927a) rephrased them and, in a subsequent paper (von Neumann, 1927b), sought to derive them from more basic considerations. In this paper we reconstruct the central arguments of these 1927 papers by Jordan and von Neumann and of a paper on Jordan's approach by Hilbert, von Neumann, and Nordheim (1928). We highlight those elements in these papers that bring out the gradual loosening of the ties between the new quantum formalism and classical mechanics. Comment: New version. The main difference with the old version is that the introduction has been rewritten. Sec. 1 (pp. 2-12) in the old version has been replaced by Secs. 1.1-1.4 (pp. 2-31) in the new version. The paper has been accepted for publication in European Physical Journal

    Slope Instability of the Earthen Levee in Boston, UK: Numerical Simulation and Sensor Data Analysis

    Full text link
    The paper presents a slope stability analysis for a heterogeneous earthen levee in Boston, UK, which is prone to occasional slope failures under tidal loads. Dynamic behavior of the levee under tidal fluctuations was simulated using a finite element model of variably saturated linear elastic perfectly plastic soil. Hydraulic conductivities of the soil strata were calibrated against piezometer readings in order to obtain the correct range of hydraulic loads in tidal mode. The finite element simulation was complemented with a series of limit equilibrium analyses. The stability analyses have shown that slope failure occurs with the development of a circular slip surface located in the soft clay layer. Both models (FEM and LEM) confirm that the least stable hydraulic condition is the combination of the minimum river levels at low tide with the maximal saturation of soil layers. FEM results indicate that in wintertime the levee is almost at its limit state, at the margin of safety (strength reduction factor values are 1.03 and 1.04 for the low-tide and high-tide phases, respectively); these results agree with real-life observations. The stability analyses have been implemented as real-time components integrated into the UrbanFlood early warning system for flood protection.
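    The destabilizing role of pore pressure described here can be illustrated with the classical infinite-slope Mohr-Coulomb factor of safety, a much simpler relative of the paper's FEM/LEM analyses. All parameter values below are hypothetical, not the Boston levee's:

    ```python
    # Infinite-slope factor of safety: FS = (c' + (sigma_n - u) tan(phi')) / tau.
    # Higher pore pressure u (e.g. a saturated levee at low tide) lowers FS.
    import math

    def factor_of_safety(c, phi_deg, gamma, z, beta_deg, u):
        """c: cohesion [kPa], phi_deg: friction angle, gamma: unit weight [kN/m^3],
        z: slip-surface depth [m], beta_deg: slope angle, u: pore pressure [kPa]."""
        beta = math.radians(beta_deg)
        phi = math.radians(phi_deg)
        sigma_n = gamma * z * math.cos(beta) ** 2          # normal stress on slip plane
        tau = gamma * z * math.sin(beta) * math.cos(beta)  # driving shear stress
        return (c + (sigma_n - u) * math.tan(phi)) / tau

    # Same hypothetical slope, dry versus with elevated pore pressure.
    dry = factor_of_safety(c=5.0, phi_deg=25.0, gamma=18.0, z=3.0, beta_deg=20.0, u=0.0)
    wet = factor_of_safety(c=5.0, phi_deg=25.0, gamma=18.0, z=3.0, beta_deg=20.0, u=20.0)
    print(round(dry, 2), round(wet, 2))  # pore pressure pushes FS toward 1 (limit state)
    ```

    With these illustrative numbers the saturated case drops close to FS = 1, the same qualitative picture as the paper's strength reduction factors of 1.03-1.04.
    
    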

    Revisiting two-step Forbush decreases

    Get PDF
    Interplanetary coronal mass ejections (ICMEs) and their shocks can sweep out galactic cosmic rays (GCRs), thus creating Forbush decreases (FDs). The traditional model of FDs predicts that an ICME and its shock decrease the GCR intensity in a two-step profile. This model, however, has been the focus of little testing. Thus, our goal is to discover whether a passing ICME and its shock inevitably lead to a two-step FD, as predicted by the model. We use cosmic ray data from 14 neutron monitors and, when possible, high time resolution GCR data from the spacecraft International Gamma Ray Astrophysical Laboratory (INTEGRAL). We analyze 233 ICMEs that should have created two-step FDs. Of these, only 80 created FDs, and only 13 created two-step FDs. FDs are thus less common than predicted by the model. The majority of events indicate that FD profiles are more complicated, particularly within the ICME sheath, than predicted by the model. We conclude that the traditional model of FDs as having one or two steps should be discarded. We also conclude that generally ignored small-scale interplanetary magnetic field structure can contribute to the observed variety of FD profiles.
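    The FD size discussed in studies like this one is conventionally the percentage drop of the neutron-monitor count rate from its quiet-time baseline. A minimal sketch with synthetic hourly counts (not real monitor data):

    ```python
    # FD magnitude: percent decrease from the pre-event baseline to the minimum.
    import numpy as np

    # 24 h of quiet-time counts followed by a synthetic decrease and recovery.
    counts = np.array([100.0] * 24 + [99, 97, 95, 94, 94.5, 95, 96, 97])

    baseline = counts[:24].mean()  # quiet-time level before shock/ICME arrival
    fd_magnitude = 100 * (baseline - counts.min()) / baseline
    print(fd_magnitude)  # percent decrease at the FD minimum
    ```

    Whether such a profile shows one step, two steps, or a more complicated shape within the sheath is exactly the structure the paper tests against the traditional model.
    
    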