
    The economics of broiler, grain, and trout production as a risk diversification strategy

    A comprehensive farm-level stochastic and dynamic capital budgeting simulation model (AQUASIM) is used to evaluate the economic benefits of incorporating a small-scale trout enterprise into a grain and broiler farm. The simulation results indicate that combining aquaculture production with traditional agriculture increased expected income and reduced risk substantially. The use of external debt capital improved the after-tax net present values and internal rates of return but lowered net cash farm income. This study shows the importance of enterprise diversification in stabilizing variability in expected income.
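
    The risk-reduction claim rests on standard diversification arithmetic: when enterprise incomes are imperfectly correlated, the relative variability of the combined income falls. A minimal Monte Carlo sketch with hypothetical income figures (not the paper's AQUASIM parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # simulated farm-years

# Hypothetical annual net incomes in $1000s (illustrative, not AQUASIM inputs);
# the trout enterprise is assumed independent of the grain/broiler base,
# which is what drives the diversification gain.
base = rng.normal(loc=60, scale=25, size=n)   # grain + broiler income
trout = rng.normal(loc=20, scale=10, size=n)  # small-scale trout income
combined = base + trout

for name, income in [("grain+broiler only", base), ("with trout", combined)]:
    mean, sd = income.mean(), income.std()
    print(f"{name}: mean={mean:6.1f}  sd={sd:5.1f}  CV={sd/mean:.2f}")
```

    The coefficient of variation falls (roughly 0.42 to 0.34 here) even though absolute variability rises; the full model layers dynamics, taxes and debt financing on top of this basic effect.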

    Revisiting consistency conditions for quantum states of systems on closed timelike curves: an epistemic perspective

    There has been considerable recent interest in the consequences of closed timelike curves (CTCs) for the dynamics of quantum mechanical systems. The vast majority of research in this area makes use of the dynamical equations developed by Deutsch, which rest on a consistency condition that assumes mixed quantum states uniquely describe the physical state of a system. We criticise this choice of consistency condition from an epistemic perspective, i.e., one in which the quantum state represents a state of knowledge about a system. We demonstrate that directly applying Deutsch's condition when mixed states are treated as representing an observer's knowledge of a system can conceal time travel paradoxes from the observer, rather than resolving them. To shed further light on the appropriate dynamics for quantum systems traversing CTCs, we make use of a toy epistemic theory with a strictly classical ontology due to Spekkens and show that, in contrast to the results of Deutsch, many of the traditional paradoxical effects of time travel are present. Comment: 10 pages, 6 figures, comments welcome; v2 added references and clarified some points; v3 published version.
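
    For context, Deutsch's consistency condition requires the CTC system's density matrix to be a fixed point of its own induced evolution, rho_ctc = Tr_sys[U (rho_in ⊗ rho_ctc) U†]. A minimal numerical sketch of the fixed-point iteration; the CNOT interaction and |+> input are illustrative choices, not taken from the paper:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],   # CR qubit controls a NOT on the CTC qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def deutsch_map(rho_ctc, rho_in, U):
    joint = U @ np.kron(rho_in, rho_ctc) @ U.conj().T
    # partial trace over the chronology-respecting (first) qubit
    return joint.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

rho_in = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)  # CR qubit in |+><+|
rho = np.array([[1, 0], [0, 0]], dtype=complex)           # initial guess |0><0|
for _ in range(50):
    rho = deutsch_map(rho, rho_in, CNOT)

print(np.round(rho, 6))  # a self-consistent CTC state (maximally mixed here)
print("residual:", np.linalg.norm(rho - deutsch_map(rho, rho_in, CNOT)))
```

    The iteration converges to a state satisfying Deutsch's condition exactly; the paper's epistemic critique concerns how such mixed fixed points should be interpreted, not how they are computed.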

    Perfect state distinguishability and computational speedups with postselected closed timelike curves

    Bennett and Schumacher's postselected quantum teleportation is a model of closed timelike curves (CTCs) that leads to results physically different from Deutsch's model. We show that even a single qubit passing through a postselected CTC (P-CTC) is sufficient to do any postselected quantum measurement, and we discuss an important difference between "Deutschian" CTCs (D-CTCs) and P-CTCs, in which the future existence of a P-CTC might affect the present outcome of an experiment. Then, based on a suggestion of Bennett and Smith, we explicitly show how a party assisted by P-CTCs can distinguish a set of linearly independent quantum states, and we prove that it is not possible for such a party to distinguish a set of linearly dependent states. The power of P-CTCs is thus weaker than that of D-CTCs, because the Holevo bound still applies to circuits using them, regardless of their ability to conspire in violating the uncertainty principle. We then discuss how different notions of a quantum mixture that are indistinguishable in linear quantum mechanics lead to dramatically differing conclusions in a nonlinear quantum mechanics involving P-CTCs. Finally, we give explicit circuit constructions that can efficiently factor integers, efficiently solve any decision problem in the intersection of NP and coNP, and probabilistically solve any decision problem in NP. These circuits accomplish these tasks with just one qubit traveling back in time, and they exploit the ability of postselected closed timelike curves to create grandfather paradoxes for invalid answers. Comment: 15 pages, 4 figures; Foundations of Physics (2011).
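
    The P-CTC model has a compact effective description: the chronology-respecting system evolves as rho -> C rho C† / Tr(C rho C†) with C = Tr_CTC[U], a renormalized and generally non-unitary map. A small sketch of the "paradox filtering" this implies; the specific CNOT circuit is an illustrative choice of ours, not one of the paper's constructions:

```python
import numpy as np

# CR qubit controls a NOT on the CTC (loop) qubit, so any CR amplitude
# on |1> would flip the loop qubit and create a paradox.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def pctc_operator(U):
    # partial trace of U over the CTC qubit (second tensor factor)
    return U.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

C = pctc_operator(CNOT)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # CR qubit in |+>
out = C @ psi
out /= np.linalg.norm(out)
print(out)  # -> |0>: the paradoxical |1> branch is postselected away
```

    Amplitudes that would generate a grandfather paradox are annihilated by the postselection; this is the same mechanism the paper's factoring and NP circuits exploit to eliminate invalid answers.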

    Cosmological parameters from SDSS and WMAP

    We measure cosmological parameters using the three-dimensional power spectrum P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in combination with WMAP and other data. Our results are consistent with a "vanilla" flat adiabatic Lambda-CDM model without tilt (n=1), running tilt, tensor modes or massive neutrinos. Adding SDSS information more than halves the WMAP-only error bars on some parameters, tightening 1 sigma constraints on the Hubble parameter from h~0.74+0.18-0.07 to h~0.70+0.04-0.03, on the matter density from Omega_m~0.25+/-0.10 to Omega_m~0.30+/-0.04 (1 sigma) and on neutrino masses from <11 eV to <0.6 eV (95%). SDSS helps even more when dropping prior assumptions about curvature, neutrinos, tensor modes and the equation of state. Our results are in substantial agreement with the joint analysis of WMAP and the 2dF Galaxy Redshift Survey, which is an impressive consistency check with independent redshift survey data and analysis techniques. In this paper, we place particular emphasis on clarifying the physical origin of the constraints, i.e., what we do and do not know when using different data sets and prior assumptions. For instance, dropping the assumption that space is perfectly flat, the WMAP-only constraint on the measured age of the Universe tightens from t0~16.3+2.3-1.8 Gyr to t0~14.1+1.0-0.9 Gyr by adding SDSS and SN Ia data. Including tensors, running tilt, neutrino mass and equation of state in the list of free parameters, many constraints are still quite weak, but future cosmological measurements from SDSS and other sources should allow these to be substantially tightened. Comment: Minor revisions to match accepted PRD version. SDSS data and ppt figures available at http://www.hep.upenn.edu/~max/sdsspars.htm
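
    As a rough illustration of why adding SDSS tightens constraints: for independent, approximately Gaussian measurements, inverse variances add. Using the symmetric Omega_m numbers quoted above (the real analysis is a joint multi-parameter likelihood with degeneracies, so this is only a toy check):

```python
# Toy Gaussian combination; not a substitute for the paper's joint likelihood.
sigma_wmap = 0.10   # WMAP-only sigma(Omega_m) quoted in the abstract
sigma_joint = 0.04  # WMAP+SDSS sigma(Omega_m) quoted in the abstract

# Effective standalone constraint SDSS would contribute in this toy model:
sigma_sdss = (1 / sigma_joint**2 - 1 / sigma_wmap**2) ** -0.5
print(f"implied SDSS-only sigma(Omega_m) ~ {sigma_sdss:.3f}")

# Recombining the two recovers the joint error bar:
combined = (1 / sigma_wmap**2 + 1 / sigma_sdss**2) ** -0.5
print(f"combined sigma ~ {combined:.3f}")  # ~0.04
```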

    Three-Dimensional Mapping of the Dark Matter

    We study the prospects for three-dimensional mapping of the dark matter to high redshift through the shearing of faint galaxy images at multiple distances by gravitational lensing. Such maps could provide invaluable information on the nature of the dark energy and dark matter. While in principle well-posed, mapping by direct inversion introduces exceedingly large, but usefully correlated, noise into the reconstruction. By carefully propagating the noise covariance, we show that lensing contains substantial information, both direct and statistical, on the large-scale radial evolution of the density field. This information can be efficiently distilled into low-order signal-to-noise eigenmodes which may be used to compress the data by over an order of magnitude. Such compression will be useful for the statistical analysis of future large data sets. The reconstructed map also contains useful information on the localization of individual massive dark matter halos, and hence the dark energy from halo number counts, but its extraction depends strongly on prior assumptions. We outline a procedure for maximum entropy and point-source regularization of the maps that can identify alternate reconstructions. Comment: 11 pages, 5 figures, submitted to PR
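
    The signal-to-noise eigenmode compression mentioned here is a Karhunen-Loeve construction: given a signal covariance S and a noise covariance N for the reconstructed map, solve the generalized eigenproblem S v = lambda N v and keep the modes with lambda of order unity or larger. A schematic sketch with synthetic covariances (the paper's S and N come from the lensing inversion itself):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
npix = 50
A = rng.normal(size=(npix, 5))
S = A @ A.T                               # low-rank "signal" covariance (synthetic)
N = np.diag(rng.uniform(0.5, 2.0, npix))  # toy noise; lensing noise is correlated

lam, V = eigh(S, N)   # generalized eigenproblem, eigenvalues ascending
keep = lam > 1.0      # retain modes with signal-to-noise above unity
print(f"kept {keep.sum()} of {npix} modes")

# Compress a data vector into the retained eigenmode amplitudes:
d = rng.normal(size=npix)
amps = V[:, keep].T @ d
```

    Because most modes are noise dominated, the retained amplitudes carry nearly all the signal at a fraction of the original dimensionality, which is the order-of-magnitude compression the abstract refers to.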

    D* Production in Deep Inelastic Scattering at HERA

    This paper presents measurements of D^{*±} production in deep inelastic scattering from collisions between 27.5 GeV positrons and 820 GeV protons. The data have been taken with the ZEUS detector at HERA. The decay channel D^{*+} → (D^0 → K^- π^+) π^+ (+ c.c.) has been used in the study. The e^+p cross section for inclusive D^{*±} production with 5 < Q^2 < 100 GeV^2 and y < 0.7 is 5.3 ± 1.0 ± 0.8 nb in the kinematic region 1.3 < p_T(D^{*±}) < 9.0 GeV and |η(D^{*±})| < 1.5. Differential cross sections as functions of p_T(D^{*±}), η(D^{*±}), W and Q^2 are compared with next-to-leading order QCD calculations based on the photon-gluon fusion production mechanism. After an extrapolation of the cross section to the full kinematic region in p_T(D^{*±}) and η(D^{*±}), the charm contribution F_2^{cc̄}(x, Q^2) to the proton structure function is determined for Bjorken x between 2·10^{-4} and 5·10^{-3}. Comment: 17 pages including 4 figures.
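
    For readers unfamiliar with the bookkeeping behind such numbers, a measured yield is converted to a visible cross section via sigma = N / (L · eff · BR). A sketch with hypothetical inputs (illustrative only, not the ZEUS values):

```python
# Generic cross-section arithmetic (hypothetical numbers, not ZEUS values):
#   sigma = N_signal / (L * eff * BR)
N_signal = 1200.0  # background-subtracted D* yield (hypothetical)
L = 3.0e3          # integrated luminosity in nb^-1 (hypothetical)
eff = 0.35         # acceptance x reconstruction efficiency (hypothetical)
BR = 0.026         # approx. BR(D*+ -> D0 pi+) x BR(D0 -> K- pi+)

sigma = N_signal / (L * eff * BR)
print(f"sigma ~ {sigma:.1f} nb")
```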

    Observation of Scaling Violations in Scaled Momentum Distributions at HERA

    Charged particle production has been measured in deep inelastic scattering (DIS) events over a large range of x and Q^2 using the ZEUS detector. The evolution of the scaled momentum, x_p, with Q^2, in the range 10 to 1280 GeV^2, has been investigated in the current fragmentation region of the Breit frame. The results show clear evidence, in a single experiment, for scaling violations in scaled momenta as a function of Q^2. Comment: 21 pages including 4 figures, to be published in Physics Letters B. Two references added.
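
    For reference, the scaled momentum is built from the particle momentum p in the current region of the Breit frame; this is the standard definition in such analyses, which the abstract itself does not spell out:

```latex
x_p = \frac{2\,p^{\mathrm{Breit}}}{Q}
```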

    Quantum walks: a comprehensive review

    Quantum walks, the quantum mechanical counterpart of classical random walks, are an advanced tool for building quantum algorithms and have recently been shown to constitute a universal model of quantum computation. Quantum walks are now a solid field of research in quantum computation, full of exciting open problems for physicists, computer scientists, mathematicians and engineers. In this paper we review theoretical advances on the foundations of both discrete- and continuous-time quantum walks, together with the role that randomness plays in quantum walks, the connections between the mathematical models of coined discrete quantum walks and continuous quantum walks, the quantumness of quantum walks, a summary of papers published on discrete quantum walks and entanglement, as well as a succinct review of experimental proposals and realizations of discrete-time quantum walks. Furthermore, we review several algorithms based on both discrete- and continuous-time quantum walks, as well as a most important result: the computational universality of both continuous- and discrete-time quantum walks. Comment: Paper accepted for publication in Quantum Information Processing Journal.
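
    As a concrete anchor for the discrete-time case, here is a minimal coined walk on the line with a Hadamard coin and standard conditional shift; a generic textbook construction rather than any specific algorithm from the review:

```python
import numpy as np

steps = 100
npos = 2 * steps + 1                      # positions -steps .. +steps
psi = np.zeros((npos, 2), dtype=complex)  # amplitudes[position, coin]
psi[steps, 0] = 1.0                       # walker starts at the origin, coin |0>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard coin

for _ in range(steps):
    psi = psi @ H.T                       # apply the coin at every site
    shifted = np.zeros_like(psi)
    shifted[1:, 0] = psi[:-1, 0]          # coin |0>: step right
    shifted[:-1, 1] = psi[1:, 1]          # coin |1>: step left
    psi = shifted

prob = (np.abs(psi) ** 2).sum(axis=1)
x = np.arange(-steps, steps + 1)
print("spread (std dev):", (x**2 * prob).sum() ** 0.5)
```

    The printed spread grows linearly with the number of steps, the ballistic signature separating quantum walks from their diffusive classical counterparts, whose spread grows only as the square root of the number of steps.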

    Materiality, health informatics and the limits of knowledge production

    Contemporary societies increasingly rely on complex and sophisticated information systems for a wide variety of tasks and, ultimately, for knowledge about the world in which we live. Those systems are central to the kinds of problems our systems and sub-systems face, such as health and medical diagnosis, treatment and care. While health information systems represent a continuously expanding field of knowledge production, we suggest that they carry forward significant limitations, particularly in their claims to represent human beings as living creatures and in their capacity to critically reflect on the social, cultural and political origins of many forms of data 'representation'. In this paper we take these ideas and explore them in relation to the way we see healthcare information systems currently functioning. We offer some examples from our own experience in healthcare settings to illustrate how unexamined ideas about individuals, groups and social categories of people continue to influence health information systems and practices, as well as their resulting knowledge production. We suggest some ideas for better understanding how and why this still happens and look to a future where the reflexivity of healthcare administration, the healthcare professions and the information sciences might better engage with these issues. There is no denying the role of health informatics in contemporary healthcare systems, but their capacity to represent people in those datascapes has a long way to go if the categories they use to describe and analyse human beings are to produce meaningful knowledge about the social world and not simply replicate past ideologies of those same categories.