
    Deep Cover HCI

    The growing popularity of methodologies that turn "to the wild" for real-world data creates new ethical issues for the HCI community. For investigations of interactions in public or transient spaces, crowd interaction, or natural behaviour, experiences uncontrolled and uninfluenced by the experimenter represent the ideal evaluation environment. We argue that covert research can be conducted rigorously and ethically to expand our knowledge of ubiquitous technologies. Our approach, which we call Deep Cover HCI, utilises technology-supported observation in public spaces to stage completely undisturbed experiences for evaluation. We complete studies without informed consent and without intervention from an experimenter in order to gain new insights into how people use technology in public settings. We argue there is clear value in this approach, reflect on the ethical issues of such investigations, and describe our ethical guidelines for completing Deep Cover HCI research.

    Random Variables Recorded under Mutually Exclusive Conditions: Contextuality-by-Default

    We present general principles underlying the analysis of the dependence of random variables (outputs) on deterministic conditions (inputs). Random outputs recorded under mutually exclusive input values are labeled by these values and considered stochastically unrelated, possessing no joint distribution. An input that does not directly influence an output creates a context for the latter. Any constraint imposed on the dependence of random outputs on inputs can be characterized by considering all possible couplings (joint distributions) imposed on stochastically unrelated outputs. The target application of these principles is a quantum mechanical system of entangled particles, with the directions of spin measurements chosen for each particle being the inputs and the recorded spins being the outputs. The sphere of applicability, however, spans systems across the physical, biological, and behavioral sciences. Comment: In H. Liljenström (Ed.), Advances in Cognitive Neurodynamics IV (pp. 405-410) (2015).
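
    A toy illustration of the central idea, as a minimal Python sketch (not code from the paper): two outputs recorded under mutually exclusive conditions have no empirical joint distribution, and a coupling imposes one. With only the marginals fixed, the couplings form a one-parameter family whose scan shows which joint quantities (here E[XY]) remain unconstrained.

        import numpy as np

        # Toy sketch: two +-1-valued outputs X, Y recorded under mutually
        # exclusive conditions. A coupling assigns them a joint distribution
        # consistent with the marginals p = P(X=+1), q = P(Y=+1); it is
        # parameterized by r = P(X=+1, Y=+1), which we scan.
        def coupling_range(p, q, steps=1001):
            lo = max(0.0, p + q - 1.0)          # Frechet-Hoeffding bounds on r
            hi = min(p, q)
            r = np.linspace(lo, hi, steps)
            exy = 1.0 - 2.0*p - 2.0*q + 4.0*r   # E[XY] for +-1-valued X, Y
            return exy.min(), exy.max()

        # Equal marginals leave E[XY] completely unconstrained: any value in
        # [-1, 1] can be imposed by an appropriate choice of coupling.
        print(coupling_range(0.5, 0.5))         # (-1.0, 1.0)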

    A Quantitative Occam's Razor

    This paper derives an objective Bayesian "prior" based on considerations of entropy/information. By this means, it produces a quantitative measure of goodness of fit (the "H-statistic") that balances higher likelihood against the number of fitting parameters employed. The method is intended for phenomenological applications where the underlying theory is uncertain or unknown. For example, it can help decide whether the large-angle anomalies in the CMB data should be taken seriously. I am therefore posting it now, even though it was published before the arXiv existed. Comment: plainTeX, 16 pages, no figures. The most current version is available at http://www.physics.syr.edu/~sorkin/some.papers/ (or wherever my home page may be).
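
    The abstract does not give the H-statistic itself, so the following Python sketch uses a BIC-style penalty purely as a stand-in to illustrate the trade-off being quantified: higher likelihood from extra fitting parameters versus a penalty for employing them.

        import numpy as np

        # Generic illustration (not the paper's H-statistic): fit polynomials
        # of increasing degree and penalize the parameter count, BIC-style.
        rng = np.random.default_rng(1)
        x = np.linspace(0.0, 1.0, 40)
        y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)  # truth: degree 1
        n = x.size

        for deg in range(6):
            coeffs = np.polyfit(x, y, deg)
            resid = y - np.polyval(coeffs, x)
            sigma2 = np.mean(resid**2)                       # ML noise variance
            loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
            k = deg + 2                                      # coefficients + noise scale
            score = k * np.log(n) - 2.0 * loglik             # smaller is better
            print(f"degree {deg}: loglik {loglik:8.2f}, penalized score {score:8.2f}")

    The penalized score typically bottoms out at the true degree: beyond it, the gain in likelihood no longer pays for the extra parameter.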

    Quantum Locality?

    Robert Griffiths has recently addressed, within the framework of the 'consistent quantum theory' that he has developed, the issue of whether, as is often claimed, quantum mechanics entails a need for faster-than-light transfers of information over long distances. He argues that the putative proofs of this property that involve hidden variables include in their premises some essentially classical-physics-type assumptions that are fundamentally incompatible with the precepts of quantum physics. One cannot logically prove properties of a system by establishing, instead, properties of a system modified by adding properties alien to the original system. Hence Griffiths' rejection of hidden-variable-based proofs is logically warranted. Griffiths mentions the existence of a certain alternative proof that does not involve hidden variables and that uses only macroscopically described observable properties. He notes that he had examined in his book proofs of this general kind and concluded that they provide no evidence for nonlocal influences. But he did not examine the particular proof that he cites. An examination of that particular proof by the method specified by his 'consistent quantum theory' shows that the cited proof is valid within that restrictive version of quantum theory. An added section responds to Griffiths' reply, which cites general possibilities of ambiguities that make what is to be proved ill-defined, and hence render the pertinent 'consistent framework' ill-defined. But the vagaries that he cites do not upset the proof in question, which, both by its physical formulation and by explicit identification, specifies the framework to be used. Griffiths confirms the validity of the proof insofar as that framework is used. The section also shows, in response to Griffiths' challenge, why a putative proof of locality that he has described is flawed. Comment: This version adds a response to Griffiths' reply to my original. It notes that Griffiths confirms the validity of my argument if one uses the framework that I use. Griffiths' objection that other frameworks exist is not germane, because I use the unique one that satisfies the explicitly stated conditions that the choices be macroscopic choices of experiments and outcomes in a specified order.

    Quantum analogues of Hardy's nonlocality paradox

    Hardy's nonlocality is a "nonlocality proof without inequalities": it exemplifies that quantum correlations can be qualitatively stronger than classical correlations. This paper introduces variants of Hardy's nonlocality in the CHSH scenario which are realized by the PR-box, but not by quantum correlations. Hence this new kind of Hardy-type nonlocality is a proof without inequalities showing that superquantum correlations can be qualitatively stronger than quantum correlations. Comment: minor fixes.
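
    The "proof without inequalities" mechanism is easy to check by brute force. The Python sketch below uses the standard Hardy conditions (the paper's variants are stronger and not reproduced in the abstract): the PR-box satisfies all four conditions, while an exhaustive scan over deterministic local strategies, and hence by convexity over all local hidden-variable models, finds none that does.

        from itertools import product

        # PR-box: P(a,b|x,y) = 1/2 if a XOR b == x AND y, else 0.
        def pr_box(a, b, x, y):
            return 0.5 if (a ^ b) == (x & y) else 0.0

        # Standard Hardy conditions in the CHSH scenario:
        #   P(0,0|0,0) > 0,  P(0,1|0,1) = 0,  P(1,0|1,0) = 0,  P(0,0|1,1) = 0.
        def satisfies_hardy(P):
            return (P(0, 0, 0, 0) > 0 and P(0, 1, 0, 1) == 0
                    and P(1, 0, 1, 0) == 0 and P(0, 0, 1, 1) == 0)

        print("PR-box satisfies Hardy conditions:", satisfies_hardy(pr_box))  # True

        # Scan all deterministic local strategies a = f(x), b = g(y).
        def some_local_strategy_works():
            for f in product((0, 1), repeat=2):       # f[x] = Alice's output
                for g in product((0, 1), repeat=2):   # g[y] = Bob's output
                    P = lambda a, b, x, y: 1.0 if (a, b) == (f[x], g[y]) else 0.0
                    if satisfies_hardy(P):
                        return True
            return False

        print("a local strategy works:", some_local_strategy_works())  # False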

    Numerical Simulation of Vortex Crystals and Merging in N-Point Vortex Systems with Circular Boundary

    In two-dimensional (2D) inviscid incompressible flow, a low background vorticity distribution accelerates intense vortices (clumps) to merge with each other and to arrange in a symmetric pattern called a ``vortex crystal''; these are observed in experiments on pure electron plasma and in simulations of Euler fluids. Vortex merger is thought to be a result of the negative ``temperature'' introduced by L. Onsager; a slight difference in the initial distribution from this leads to ``vortex crystals''. We study these phenomena by examining N-point vortex systems governed by the Hamilton equations of motion. First, we study a three-point vortex system without background distribution. It is known that an N-point vortex system with boundary exhibits chaotic behavior for N ≥ 3. In order to investigate the phase space structure of this three-point vortex system with circular boundary, we examine its Poincaré plot. We show that the topology of the Poincaré plot changes drastically when the parameters concerned with the sign of the ``temperature'' are varied. Next, we introduce a formula for the energy spectrum of an N-point vortex system with circular boundary. Further, carrying out numerical computation, we reproduce a vortex crystal and a vortex merger in a system of a few hundred point vortices. We confirm that the energy of the vortices is transferred from the clumps to the background in the course of vortex crystallization. In the vortex merging process, we numerically calculate the energy spectrum introduced above and confirm that it behaves as k^{-α} (α ≈ 2.2-2.8) in the region 10^0 < k < 10^1 after the merging. Comment: 30 pages, 11 figures; to be published in Journal of the Physical Society of Japan, Vol. 74, No.
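
    A minimal Python sketch of the underlying dynamics, assuming the standard point-vortex model in a circular domain (this is the textbook method of images, not the authors' code): a vortex of circulation Γ_k at z_k acquires an image of circulation -Γ_k at the inverse point R²/conj(z_k), and the resulting Hamiltonian equations of motion are integrated with RK4.

        import numpy as np

        def velocities(z, gamma, R=1.0):
            """dz/dt for N point vortices (complex positions z, z != 0)
            inside a circular boundary of radius R, via image vortices."""
            v = np.zeros_like(z)
            for j in range(len(z)):
                s = 0j
                for k in range(len(z)):
                    if k != j:
                        s += gamma[k] / (z[j] - z[k])              # vortex-vortex
                    s -= gamma[k] / (z[j] - R**2 / np.conj(z[k]))  # boundary image
                v[j] = np.conj(s / (2j * np.pi))                   # dz*/dt -> dz/dt
            return v

        def rk4_step(z, gamma, dt, R=1.0):
            """One fourth-order Runge-Kutta step of the Hamiltonian dynamics."""
            k1 = velocities(z, gamma, R)
            k2 = velocities(z + 0.5 * dt * k1, gamma, R)
            k3 = velocities(z + 0.5 * dt * k2, gamma, R)
            k4 = velocities(z + dt * k3, gamma, R)
            return z + dt * (k1 + 2*k2 + 2*k3 + k4) / 6.0

        # Three like-signed vortices in the unit disk (chaotic regime, N >= 3).
        z = np.array([0.3 + 0.0j, -0.2 + 0.25j, 0.1 - 0.4j])
        gamma = np.ones(3)
        for _ in range(1000):
            z = rk4_step(z, gamma, dt=1e-3)

    Sampling the positions at fixed phases of this flow is what produces the Poincaré plots discussed above; scaling the same loop to a few hundred vortices reproduces the crystal and merger regimes.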

    Bell's theorem as a signature of nonlocality: a classical counterexample

    For a system composed of two particles, Bell's theorem asserts that averages of physical quantities determined from local variables must conform to a family of inequalities. In this work we show that a classical model containing a local probabilistic interaction in the measurement process can lead to a violation of the Bell inequalities. We first introduce two-particle phase-space distributions in classical mechanics constructed to be the analogs of quantum mechanical angular momentum eigenstates. These distributions are then employed in four schemes characterized by different types of detectors measuring the angular momenta. When the model includes an interaction between the detector and the measured particle leading to ensemble dependencies, the relevant Bell inequalities are violated if total angular momentum is required to be conserved. The violation is explained by identifying assumptions made in the derivation of Bell's theorem that are not fulfilled by the model. These assumptions are argued to be too restrictive for a violation of the Bell inequalities to be a faithful signature of nonlocality. Comment: Extended manuscript. Significant changes.
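
    The general mechanism, detection that depends on the local setting producing ensemble dependencies, can be demonstrated with the well-known Gisin-Gisin detection-loophole model; the Python Monte Carlo below uses it as a stand-in for the paper's phase-space construction, which the abstract does not specify in full.

        import numpy as np

        # Local classical model violating CHSH via setting-dependent detection
        # (Gisin-Gisin): Alice's detector fires with probability |lam . a|, so
        # the post-selected ensemble depends on her setting.
        rng = np.random.default_rng(0)
        n = 1_000_000
        lam = rng.normal(size=(n, 3))                      # shared hidden variable:
        lam /= np.linalg.norm(lam, axis=1, keepdims=True)  # unit vectors on the sphere

        def setting(theta):                                # analyzer in the x-z plane
            return np.array([np.sin(theta), 0.0, np.cos(theta)])

        def corr(a, b):
            la, lb = lam @ a, lam @ b
            A, B = np.sign(la), -np.sign(lb)               # local +-1 outcomes
            detected = rng.random(n) < np.abs(la)          # Bob always detects
            return np.mean(A[detected] * B[detected])      # coincidences only

        a0, a1 = setting(0.0), setting(np.pi / 2)
        b0, b1 = setting(np.pi / 4), setting(-np.pi / 4)
        S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
        print(abs(S))  # about 2*sqrt(2) > 2: singlet correlations from local rules

    The post-selected coincidences reproduce the singlet correlation E(a,b) = -a·b exactly, so CHSH is violated even though every rule in the model is local, illustrating how an ensemble-dependence assumption enters the derivation of Bell's theorem.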