
    Applying an Ecobiodevelopmental Framework to Food Insecurity: More Than Simply Food for Thought

    Dramatic advances in developmental sciences are beginning to reveal the biological mechanisms underlying well-established associations between early childhood adversity and lifelong measures of limited productivity and poor health. The case studies by Chilton and Rabinowich provide poignant and compelling qualitative data that support an ecobiodevelopmental approach towards understanding and addressing both the complex causes and intergenerational consequences of food insecurity.

    Positronium beam production and scattering from gaseous targets

    Quasi-monoenergetic, energy-tunable beams of positronium (Ps) atoms can be produced by neutralising a positron beam in a gaseous target. Investigations into the efficiency of Ps beam production from several gases (He, Ar and H2) have been carried out across a range of energies and gas pressures, and in each case the optimal Ps beam production conditions have been deduced. The total cross-sections for Ps scattering from He, Ar, H2 and O2 have also been determined at intermediate energies. These studies have shown that, of the gases studied, H2 is the most efficient positron-to-Ps beam conversion gas in the range 10 to 90 eV, by a factor of up to three over Ar or He; at 120 eV, however, Ar has been found to be more efficient than H2 by approximately 40%. Ps-gas total cross-sections, σT, have been measured for Ps energies between 10 and 110 eV, across several different Ps flight lengths (from 0.2 to 0.6 m) and hence solid angles (1.3×10⁻³ to 10×10⁻³ sr). In all cases the cross-section initially rises rapidly with incident energy, reaching a broad maximum at approximately 20-40 eV, followed by a slow decrease at higher energies. Of the gases studied, Ar possesses the highest cross-section, with a peak value of σT ≈ 20×10⁻²⁰ m², followed by H2 (σT ≈ 9×10⁻²⁰ m²) and He (σT ≈ 5.3×10⁻²⁰ m²). Sample measurements for Ps-O2 scattering suggest a cross-section of similar magnitude to that of Ar. The data are compared to available calculations and to other projectiles, and future work is suggested.
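
    The abstract does not say how σT is extracted; in beam-transmission experiments of this kind the total cross-section conventionally follows from Beer-Lambert attenuation through the gas cell. A minimal sketch, assuming a scattering cell of length L containing gas at number density n:

    \[ I(E) = I_0(E)\, e^{-n \sigma_T(E) L} \quad\Longrightarrow\quad \sigma_T(E) = \frac{1}{nL} \ln\frac{I_0(E)}{I(E)} \]

    Here I_0 and I are the Ps beam intensities measured with the cell evacuated and filled, respectively. Varying the flight length, and hence the detection solid angle, matters because a finite solid angle counts some forward-scattered Ps as unscattered, biasing σT low.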

    Using quantum theory to reduce the complexity of input-output processes

    All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems -- algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency -- storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them. Comment: 10 pages, 5 figures.
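
    As a point of reference (the abstract gives no formulas), in the computational-mechanics framework this work builds on, the memory cost of a classical model is the Shannon entropy of its internal (causal) states, while a quantum model encodes those states in a density operator and pays only its von Neumann entropy. A sketch, assuming a steady-state distribution π(s) over internal states s:

    \[ C_{\mu} = -\sum_{s} \pi(s) \log_2 \pi(s), \qquad C_q = S(\rho) = -\mathrm{Tr}\,\rho \log_2 \rho \;\le\; C_{\mu} \]

    The gap is strict whenever distinct internal states receive non-orthogonal quantum encodings, which is the sense in which classical models are forced to store past information that is unnecessary for correct future simulation.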

    Fair Value Accounting: Fact or Fancy?

    Prior to FAS 115 and FAS 157, accounting methods used historical costs. For financial intermediaries in particular, fair value accounting (FVA) has replaced verifiable historical costs with market valuations that, for illiquid assets, rely on assumptions and are not a priori verifiable. The effect of these relatively new financial accounting standards has been to convert the valuation basis from historical cost accounting to fair value accounting. The recent literature indicates that the current guidance on fair value accounting may be appropriate in certain cases, but in many cases it does not appear to be. Nevertheless, the Financial Accounting Standards Board (FASB) and the Securities and Exchange Commission (SEC) are apparently maintaining their current directives for accounting valuation. The Enron case clearly showed that FVA aided the firm in misstating its income statements and balance sheets. Given the accounting literature on the subject, the use of FVA also appears to have contributed to the liquidity crisis of 2008 in two ways: (1) FVA combined with mandatory capital adequacy requirements introduced a negative feedback mechanism that caused asset prices to fall further than they otherwise would have, and (2) FVA appears to have undermined confidence in the valuations reported in banks' financial reports. This paper examines the problems inherent in the replacement of historical cost accounting with fair value accounting, with particular focus on the veracity and verifiability of FVA numbers. Our results indicate that accounting methods cannot be held responsible for the various valuation models, particularly with respect to certain derivative contracts, such as energy swaps and credit default swaps, which cannot be replicated in practice.

    All Are Welcome? Southern Hospitality and the Politics of Belonging

    The elective affinity between southern hospitality and Christian hospitality functions as a moral framework for negotiating change and debating identity politics. An ethnographic community study conducted in Rockdale County, Georgia between 2014 and 2017 focused on the discourse and practice of southern hospitality. Illustrations by Andrew Harvard.

    Guaranteed energy-efficient bit reset in finite time

    Landauer's principle states that it costs at least kT ln 2 of work to reset one bit in the presence of a heat bath at temperature T. This bound is achieved only in the unphysical infinite-time limit. Here we ask what is possible if one is restricted to finite-time protocols. We prove analytically that it is possible to reset a bit with a work cost close to kT ln 2 in a finite time. We construct an explicit protocol that achieves this, which involves changing the system's Hamiltonian while avoiding quantum coherences, and thermalising. Using concepts and techniques pertaining to single-shot statistical mechanics, we further develop the limit on the work cost, proving that the heat dissipated is close to the minimal possible not just on average, but guaranteed with high confidence in every run. Moreover, we exploit the protocol to design a quantum heat engine that works near the Carnot efficiency in finite time. Comment: 5 pages + 5-page technical appendix, 5 figures. Author accepted version.
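
    For concreteness, the two quantities the abstract invokes are the Landauer bound on the work to reset one bit, and the Carnot efficiency for an engine running between hot and cold baths:

    \[ W_{\mathrm{reset}} \ge kT \ln 2, \qquad \eta_{\mathrm{C}} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} \]

    At room temperature (T ≈ 300 K), kT ln 2 ≈ 2.9×10⁻²¹ J, which sets the energy scale the finite-time reset protocol must approach.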

    The classical-quantum divergence of complexity in modelling spin chains

    The minimal memory required to model a given stochastic process - known as the statistical complexity - is a widely adopted quantifier of structure in complexity science. Here, we ask if quantum mechanics can fundamentally change the qualitative behaviour of this measure. We study this question in the context of the classical Ising spin chain. In this system, the statistical complexity is known to grow monotonically with temperature. We evaluate the spin chain's quantum mechanical statistical complexity by explicitly constructing its provably simplest quantum model, and demonstrate that this measure exhibits drastically different behaviour: it rises to a maximum at some finite temperature, then tends back towards zero at higher temperatures. This demonstrates how complexity, as captured by the amount of memory required to model a process, can exhibit radically different behaviour when quantum processing is allowed. Comment: 9 pages, 3 figures, comments are welcome.
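
    The construction referred to (spelled out in the paper, not the abstract) encodes each classical causal state s into a generally non-orthogonal quantum state |σ_s⟩, so that, with π(s) the stationary distribution,

    \[ \rho = \sum_{s} \pi(s)\, |\sigma_s\rangle\langle\sigma_s|, \qquad C_q = S(\rho) \le C_{\mu} = H[\pi] \]

    with equality only for mutually orthogonal encodings. At high temperature the chain's conditional spin statistics converge, the encoding overlaps approach unity, and S(ρ) falls back towards zero, consistent with the rise-then-decay behaviour described above.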

    Maximum one-shot dissipated work from Renyi divergences

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Renyi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics. Comment: 8 pages. Close to published version.
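
    To make the order-infinity statement concrete (the arguments are schematic here; the paper fixes the precise distributions), the Renyi divergence and its α → ∞ limit are

    \[ D_{\alpha}(P\|Q) = \frac{1}{\alpha - 1} \log_2 \sum_{x} \frac{P(x)^{\alpha}}{Q(x)^{\alpha - 1}}, \qquad D_{\infty}(P\|Q) = \log_2 \max_{x} \frac{P(x)}{Q(x)} \]

    so that, schematically, the average dissipated work scales with D_1 (the relative entropy, recovered as α → 1), while the maximum possible dissipated work scales with D_∞.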