On the uniqueness of paths for spin-0 and spin-1 quantum mechanics
The uniqueness of the Bohmian particle interpretation of the Kemmer equation,
which describes massive spin-0 and spin-1 particles, is discussed. Recently the
same problem for spin-1/2 was dealt with by Holland. It appears that the
uniqueness of boson paths can be enforced under well-determined conditions.
This in turn fixes the nonrelativistic particle equations of the
nonrelativistic Schrödinger theory, which appear to correspond to the
original definitions given by de Broglie and Bohm only in the spin-0 case.
As in the spin-1/2 case, an additional spin-dependent term appears in the
guidance equation in the spin-1 case. We also discuss the ambiguity
associated with the introduction of an electromagnetic coupling in the Kemmer
theory. We argue that when the minimal coupling is correctly introduced, the
current constructed from the energy-momentum tensor is no longer conserved.
Hence this current cannot serve as a particle probability four-vector.

Comment: 19 pages, no figures, LaTeX, shortened version for Phys. Lett.
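For context on the guidance equations discussed above: in the nonrelativistic spin-0 case the de Broglie-Bohm particle velocity is fixed by the phase of the Schrödinger wavefunction. Writing the wavefunction in polar form, $\psi = R\,e^{iS/\hbar}$ with $m$ the particle mass, the standard baseline (to which the spin-1/2 and spin-1 cases add an extra spin-dependent term) is:

```latex
% de Broglie--Bohm guidance equation for a spinless nonrelativistic
% particle, with \psi = R e^{iS/\hbar} the polar form of the wavefunction:
\frac{\mathrm{d}\mathbf{x}}{\mathrm{d}t}
  = \frac{\nabla S}{m}
  = \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi}{\psi}
```

The paper's point is that for spin 1, as for spin 1/2, the velocity picks up an additional spin-dependent contribution beyond this spin-0 form.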
A non-local, Lorentz-invariant, hidden-variable interpretation of relativistic quantum mechanics based on particle trajectories
We demonstrate how to construct a Lorentz-invariant, hidden-variable
interpretation of relativistic quantum mechanics based on particle
trajectories. The covariant theory that we propose employs a multi-time
formalism and a Lorentz-invariant rule for the coordination of the space-time
points on the individual particle trajectories. In this way we show that there
is no contradiction between nonlocality and Lorentz invariance in quantum
mechanics. The approach is illustrated for relativistic bosons, using a simple
model to discuss the individual non-locally correlated particle motion which
ensues when the wavefunction is entangled. A simple example of measurement is
described.

Comment: 12 pages, 2 figures
The Costs and Cost-Effectiveness of Mass Treatment for Intestinal Nematode Worm Infections Using Different Treatment Thresholds
Almost one in every two people in the developing world is infected with one or more types of intestinal nematode worms. When fewer than 50% of people are infected, most carry only a few worms; but when more than 50% are infected, the number carrying moderate to heavy worm burdens increases markedly, as does the risk of disease. The WHO recommends annual mass deworming of children when 20% or more are infected, and twice a year if 50% or more are infected. We estimated the cost of this strategy per treated child carrying 10+ worms, an arbitrary definition of moderate to heavy infection. We concluded that it is not cost-effective to mass treat children when fewer than 40% are infected, because the majority are uninfected and few are likely to be diseased. We propose annual treatment when 40% or more children are infected, twice a year at 60%, and three times a year at 80% or more. This would cost USD 224 million annually to treat all children aged 2–14 years in 107 developing countries, compared with USD 276 million using current WHO guidelines. The new three-tier guidelines also treat a larger proportion of infected children and treat children with moderate to heavy worm burdens more often.
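The proposed three-tier schedule reduces to simple threshold arithmetic. The sketch below encodes it; the thresholds (40/60/80%) and rounds per year (1/2/3) come from the abstract, while the function name and the prevalence-as-fraction convention are our own:

```python
def deworming_rounds_per_year(prevalence: float) -> int:
    """Mass-treatment rounds per year under the proposed three-tier
    guideline: below 40% prevalence mass treatment is judged not
    cost-effective; treat annually at >= 40%, twice a year at >= 60%,
    and three times a year at >= 80%."""
    if prevalence >= 0.80:
        return 3
    if prevalence >= 0.60:
        return 2
    if prevalence >= 0.40:
        return 1
    return 0

# Example: a district where 65% of children are infected is treated twice a year.
print(deworming_rounds_per_year(0.65))  # → 2
```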
ORCSim: a generalized Organic Rankine cycle simulation tool
An increasing interest in organic Rankine cycle (ORC) technology has led to numerous simulation and
optimization studies. Different modeling approaches can be found in the open
literature, but general software tools available to the academic and industrial
community are limited. A generalized ORC simulation tool, named ORCSim, is
proposed in this paper. The framework is developed using object-oriented
programming, which readily allows improvements and future extensions.
Currently, two cycle configurations are
implemented, i.e. a basic ORC and an ORC with liquid-flooded expansion. The software architecture,
the thermo-physical property wrappers, the component library and the solution algorithm are discussed
with particular emphasis on the ORC with liquid-flooded expansion. A thorough
validation at both the component and cycle levels is presented for the
aforementioned cycle architectures.
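The object-oriented decomposition described above (a component library assembled into cycle configurations by a cycle-level solver) can be illustrated with a minimal sketch. All class and method names below are illustrative assumptions, not ORCSim's actual API, and real components would query a thermo-physical property backend instead of using toy state updates:

```python
from abc import ABC, abstractmethod

class Component(ABC):
    """Base class for cycle components (pump, evaporator, expander, ...)."""
    @abstractmethod
    def solve(self, state: dict) -> dict:
        """Advance the working-fluid state across this component."""

class Pump(Component):
    def __init__(self, pressure_rise: float):
        self.pressure_rise = pressure_rise  # Pa
    def solve(self, state):
        return {**state, "p": state["p"] + self.pressure_rise}

class Evaporator(Component):
    def __init__(self, heat_input: float):
        self.heat_input = heat_input  # J/kg
    def solve(self, state):
        return {**state, "h": state["h"] + self.heat_input}

class Cycle:
    """A cycle is an ordered list of components; a new configuration
    (e.g. liquid-flooded expansion) plugs in extra components."""
    def __init__(self, components):
        self.components = components
    def run(self, inlet_state):
        state = inlet_state
        for component in self.components:
            state = component.solve(state)
        return state

basic_orc = Cycle([Pump(2.0e6), Evaporator(4.0e5)])
print(basic_orc.run({"p": 1.0e5, "h": 2.5e5}))
```

The design choice this illustrates is the one the abstract highlights: extending the tool to a new cycle architecture amounts to composing a different component list, not rewriting the solver.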
Temperature and humidity based projections of a rapid rise in global heat stress exposure during the 21st century
As a result of global increases in both temperature and specific humidity, heat stress is projected to intensify throughout the 21st century. Some of the regions most susceptible to dangerous heat and humidity combinations are also among the most densely populated. Consequently, there is the potential for widespread exposure to wet bulb temperatures that approach and in some cases exceed postulated theoretical limits of human tolerance by mid- to late-century. We project that by 2080 the relative frequency of present-day extreme wet bulb temperature events could rise by a factor of 100–250 (approximately double the frequency change projected for temperature alone) in the tropics and parts of the mid-latitudes, areas which are projected to contain approximately half the world's population. In addition, population exposure to wet bulb temperatures that exceed recent deadly heat waves may increase by a factor of five to ten, with 150–750 million person-days of exposure to wet bulb temperatures above those seen in today's most severe heat waves by 2070–2080. Under RCP 8.5, exposure to wet bulb temperatures above 35 °C—the theoretical limit for human tolerance—could exceed a million person-days per year by 2080. Limiting emissions to follow RCP 4.5 entirely eliminates exposure to that extreme threshold. Some of the most affected regions, especially Northeast India and coastal West Africa, currently have scarce cooling infrastructure, relatively low adaptive capacity, and rapidly growing populations. In the coming decades heat stress may prove to be one of the most widely experienced and directly dangerous aspects of climate change, posing a severe threat to human health, energy infrastructure, and outdoor activities ranging from agricultural production to military training.
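The heat-stress metric above is the wet bulb temperature, which folds temperature and humidity into one number. The study's own calculation method is not given here; as an illustration only, the sketch below uses Stull's (2011) empirical fit, which is reasonable near sea-level pressure for relative humidities of roughly 5–99%:

```python
import math

def wet_bulb_stull(t_celsius: float, rh_percent: float) -> float:
    """Approximate wet bulb temperature (deg C) from air temperature
    (deg C) and relative humidity (%) using Stull's 2011 empirical fit."""
    T, RH = t_celsius, rh_percent
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH)
            - math.atan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
            - 4.686035)

# At 39 deg C and 80% relative humidity the wet bulb temperature already
# sits near the ~35 deg C tolerance limit discussed above.
print(round(wet_bulb_stull(39.0, 80.0), 1))
```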
On beta-Plurality Points in Spatial Voting Games
Let $V$ be a set of $n$ points in $\mathbb{R}^d$, called voters. A point
$p \in \mathbb{R}^d$ is a plurality point for $V$ when the following holds: for
every $q \in \mathbb{R}^d$ the number of voters closer to $p$ than to $q$ is at
least the number of voters closer to $q$ than to $p$. Thus, in a vote where
each $v \in V$ votes for the nearest proposal (and voters for which the
proposals are at equal distance abstain), proposal $p$ will not lose against
any alternative proposal $q$. For most voter sets $V$ a plurality point does
not exist. We therefore introduce the concept of $\beta$-plurality points,
which are defined similarly to regular plurality points except that the
distance of each voter to $p$ (but not to $q$) is scaled by a factor $\beta$,
for some constant $0 < \beta \le 1$. We investigate the existence and
computation of $\beta$-plurality points, and obtain the following results.
* Define $\beta^*_d := \sup \{ \beta : \text{any finite multiset } V \text{ in } \mathbb{R}^d \text{ admits a } \beta\text{-plurality point} \}$. We prove that $\beta^*_2 = \sqrt{3}/2$, and that $1/\sqrt{d} \le \beta^*_d \le \sqrt{3}/2$ for all $d \ge 3$.
* Define $\beta(p, V) := \sup \{ \beta : p \text{ is a } \beta\text{-plurality point for } V \}$. Given a voter set $V$ in $\mathbb{R}^2$, we provide an
algorithm that runs in $O(n \log n)$ time and computes a point $p$ such that
$\beta(p, V) \ge \beta^*_2$. Moreover, for any $d$ we can compute a point $p$
with $\beta(p, V) \ge 1/\sqrt{d}$ in $O(n)$ time.
* Define $\beta(V) := \sup \{ \beta : V \text{ admits a } \beta\text{-plurality point} \}$. We present an algorithm that, given a voter set $V$ in
$\mathbb{R}^d$, computes a $(1-\varepsilon) \cdot \beta(V)$-plurality point in
time $O\big(\tfrac{n^2}{\varepsilon^{3d-2}} \cdot \log \tfrac{n}{\varepsilon^{d-1}} \cdot \log^2 \tfrac{1}{\varepsilon}\big)$.

Comment: 21 pages, 10 figures, SoCG'20
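The definition of a $\beta$-plurality point translates directly into code. The sketch below is not one of the paper's algorithms; it merely implements the definition, checking a candidate point $p$ against a finite list of competitor proposals $q$ (a true test would have to consider every $q$ in the plane):

```python
import math

def beats_or_ties(p, q, voters, beta):
    """True iff proposal p (with its distances scaled by beta) does not
    lose to proposal q: voters with beta*d(v,p) < d(v,q) vote for p,
    voters with d(v,q) < beta*d(v,p) vote for q, and ties abstain."""
    for_p = for_q = 0
    for v in voters:
        dp = beta * math.dist(v, p)
        dq = math.dist(v, q)
        if dp < dq:
            for_p += 1
        elif dq < dp:
            for_q += 1
    return for_p >= for_q

# Three voters at the vertices of an equilateral triangle; the centroid
# is checked with beta = sqrt(3)/2, the tight two-dimensional bound.
voters = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
centroid = (0.5, math.sqrt(3) / 6)
competitors = [(0.1, 0.1), (1.0, 1.0), (0.25, 0.4)]
print(all(beats_or_ties(centroid, q, voters, math.sqrt(3) / 2)
          for q in competitors))  # → True
```

Scaling $p$'s distances by $\beta < 1$ handicaps the competitors, which is why a $\beta$-plurality point can exist even when an ordinary ($\beta = 1$) plurality point does not.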
Holocene drainage systems of the English Fenland: roddons and their environmental significance
The roddons of the English Fenlands are fossilised silt- and sand-filled tidal creek systems of mid- to late-Holocene age, incised into contemporaneous clay deposits. However, anthropogenic change (drainage and agriculture) has caused the former channels to become positive topographical features. Three stratigraphically discrete generations of roddon have been discriminated. They all show well-developed dendritic meander patterns, but there is little or no evidence of sand/silt infill during meandering; thus, unlike modern tidal creeks and rivers, they typically lack laterally stacked point-bar deposits, suggesting rapid infill. Major “trunk” roddons are rich in fine sands and there is little change in grain size from roddon mouth to the upper reaches, suggesting highly effective sand transport mechanisms and uniform conditions of deposition. Tributaries are silt-rich, while minor tributaries also have a significant clay component. During infill, active drainage networks appear to have been choked by sediment, converting mudflat/salt-marsh environments into widespread peat-forming freshwater reed swamps.
Covariant many-fingered time Bohmian interpretation of quantum field theory
The Bohmian interpretation of the many-fingered time (MFT) Tomonaga-Schwinger
formulation of quantum field theory (QFT) describes the dynamics of MFT
fields, providing a covariant Bohmian interpretation of QFT without
introducing a preferred foliation of spacetime.

Comment: 7 pages, significantly revised
Integrating evidence, politics and society: a methodology for the science–policy interface
There is currently intense debate over expertise, evidence and ‘post-truth’ politics, and how this is influencing policy formulation and implementation. In this article, we put forward a methodology for evidence-based policy making intended as a way of helping navigate this web of complexity. Starting from the premise of why it is so crucial that policies to meet major global challenges use scientific evidence, we discuss the socio-political difficulties and complexities that hinder this process. We discuss the necessity of embracing a broader view of what constitutes evidence—science and the evaluation of scientific evidence cannot be divorced from the political, cultural and social debate that inevitably and justifiably surrounds these major issues. As a prerequisite for effective policy making, we propose a methodology that fully integrates scientific investigation with political debate and social discourse. We describe a rigorous process of mapping, analysis, visualisation and sharing of evidence, constructed from integrating science and social science data. This would then be followed by transparent evidence evaluation, combining independent assessment to test the validity and completeness of the evidence with deliberation to discover how the evidence is perceived, misunderstood or ignored. We outline the opportunities and the problems derived from the use of digital communications, including social media, in this methodology, and emphasise the power of creative and innovative evidence visualisation and sharing in shaping policy.