External Inversion, Internal Inversion, and Reflection Invariance
Bearing in mind that physical systems have different levels of structure, we
develop the concepts of external, internal, and total improper Lorentz
transformations (space inversion and time reversal). A particle obtained from
the ordinary one by applying internal space inversion or time reversal is in
general a different particle. From this point of view, the intrinsic parity of
a nuclear particle (`elementary particle') is in fact the external intrinsic
parity once the internal structure of the particle is taken into account.
particle. We show that non-conservation of the external parity does not
necessarily imply non-invariance of nature under space inversion. The
conventional theory of beta decay can be made invariant under total space
inversion, though not under the external one, by including the internal
degrees of freedom.

Comment: 15 pages. An early proposal of "mirror matter", published in 1974.
This is an exact copy of the published paper, posted here because of the
increasing interest in "exact parity models" and their experimental
consequences.
Weak randomness completely trounces the security of QKD
In the usual security proofs of quantum protocols, the adversary (Eve) is
assumed to have full control over any quantum communication between the
communicating parties (Alice and Bob). Eve is also assumed to have full access
to an authenticated classical channel between Alice and Bob. Unconditional
security against any attack by Eve can be proved even in the realistic setting
of device and channel imperfections. In this Letter we show that the security
of QKD protocols is ruined if Eve is allowed even very limited access to the
random sources used by Alice. Such knowledge should always be expected in
realistic experimental conditions, via various side channels.
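The effect of weak randomness can be illustrated with a toy calculation (an
illustration of the general idea, not the specific attack of the Letter): if
Eve can guess each of Alice's basis bits independently with probability
p > 1/2, the min-entropy of an n-bit basis string drops from n bits to
n * (-log2 p), shrinking Eve's effective search space exponentially.

```python
import math

def basis_min_entropy(n_bits, p_guess):
    """Min-entropy (in bits) of Alice's n-bit basis string when Eve
    can guess each bit independently with probability p_guess."""
    return n_bits * -math.log2(p_guess)

# Perfect randomness: Eve guesses each basis bit with p = 0.5.
print(basis_min_entropy(100, 0.5))   # 100.0 bits, full entropy

# Slightly weak randomness: Eve guesses each bit with p = 0.75.
print(basis_min_entropy(100, 0.75))  # ~41.5 bits, more than half lost
```

Even a modest per-bit bias therefore translates into a dramatic loss of
unpredictability for the string as a whole.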
Validation of Geant4-based Radioactive Decay Simulation
Radioactive decays are of concern in a wide variety of applications using
Monte-Carlo simulations. In order to properly estimate the quality of such
simulations, knowledge of the accuracy of the decay simulation is required. We
present a validation of the original Geant4 Radioactive Decay Module, which
uses a per-decay sampling approach, and of an extended package for Geant4-based
simulation of radioactive decays, which, in addition to being able to use a
refactored per-decay sampling, is capable of using a statistical sampling
approach. The validation is based on measurements of calibration isotope
sources using a high-purity germanium (HPGe) detector; no calibration of the
simulation is performed. For the considered validation experiment, equivalent
simulation accuracy can be achieved with per-decay and statistical sampling.
Radioactive Decays in Geant4
The simulation of radioactive decays is a common task in Monte-Carlo systems
such as Geant4. Usually, a system uses either an approach that simulates
every individual decay or one that simulates a large number of decays with a
focus on correct overall statistics. The radioactive
decay package presented in this work permits, for the first time, the use of
both methods within the same simulation framework - Geant4. The accuracy of the
statistical approach in our new package, RDM-extended, and that of the existing
Geant4 per-decay implementation (original RDM), which has also been refactored,
are verified against the ENSDF database. The new verified package is beneficial
for a wide range of experimental scenarios, as it enables researchers to choose
the most appropriate approach for their Geant4-based application.
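The two sampling strategies can be contrasted in a minimal sketch (an
illustration of the general idea, not the actual RDM code): per-decay sampling
draws an explicit decay time for every atom, while statistical sampling fills
time bins directly from the analytic decay law N(t) = N0 * exp(-lambda * t).
The half-life and binning below are arbitrary illustrative values.

```python
import math
import random

random.seed(1)
HALF_LIFE = 5.0                       # arbitrary units, illustrative isotope
LAMBDA = math.log(2) / HALF_LIFE
T_MAX, N_BINS, N_ATOMS = 20.0, 4, 100_000

# Per-decay sampling: draw an explicit decay time for every atom.
per_decay = [0] * N_BINS
for _ in range(N_ATOMS):
    t = random.expovariate(LAMBDA)
    if t < T_MAX:
        per_decay[int(t / T_MAX * N_BINS)] += 1

# Statistical sampling: expected decays per time bin from the decay law,
# with no per-atom loop required.
edges = [T_MAX * i / N_BINS for i in range(N_BINS + 1)]
statistical = [N_ATOMS * (math.exp(-LAMBDA * a) - math.exp(-LAMBDA * b))
               for a, b in zip(edges, edges[1:])]

for i, (mc, an) in enumerate(zip(per_decay, statistical)):
    print(f"bin {i}: per-decay {mc:6d}  statistical {an:9.1f}")
```

The two columns agree within statistical fluctuations; the per-decay approach
additionally yields correlated per-event information (e.g. coincident
emissions), which is what makes having both options in one framework useful.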
The Peculiar Atmospheric Chemistry of KELT-9b
The atmospheric temperatures of the ultra-hot Jupiter KELT-9b straddle the
transition between gas giants and stars, and therefore between two
traditionally distinct regimes of atmospheric chemistry. Previous theoretical
studies assume the atmosphere of KELT-9b to be in chemical equilibrium. Despite
the high ultraviolet flux from KELT-9, we show using photochemical kinetics
calculations that the observable atmosphere of KELT-9b is predicted to be close
to chemical equilibrium, which greatly simplifies any theoretical
interpretation of its spectra. It also makes the atmosphere of KELT-9b, which
is expected to be cloud-free, a tightly constrained chemical system that lends
itself to a clean set of theoretical predictions. Due to the lower pressures
probed in transmission (compared to emission) spectroscopy, we predict the
abundance of water to vary by several orders of magnitude across the
atmospheric limb depending on temperature, which makes water a sensitive
thermometer. Carbon monoxide is predicted to be the dominant molecule under a
wide range of scenarios, rendering it a robust diagnostic of the metallicity
when analyzed in tandem with water. All of the other usual suspects (acetylene,
ammonia, carbon dioxide, hydrogen cyanide, methane) are predicted to be
subdominant at solar metallicity, while atomic oxygen, iron and magnesium are
predicted to have relative abundances as high as 1 part in 10,000. Neutral
atomic iron is predicted to be seen through a forest of optical and
near-infrared lines, which makes KELT-9b suitable for high-resolution
ground-based spectroscopy with HARPS-N or CARMENES. We summarize future
observational prospects for characterizing the atmosphere of KELT-9b.

Comment: Accepted by ApJ. 9 pages, 6 figures. Corrected minor errors in
Figures 1a and 1b (some line styles were switched by accident); text and
conclusions unchanged. These minor changes will be updated in the final ApJ
proofs.
Adaptive Resolution Simulation of Liquid Water
We present a multiscale simulation of liquid water where a spatially adaptive
molecular resolution procedure allows for changing on-the-fly from a
coarse-grained to an all-atom representation. We show that this approach leads
to the correct description of all essential thermodynamic and structural
properties of liquid water.

Comment: 4 pages, 3 figures; changed figure
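The on-the-fly change of resolution can be sketched as a smooth weight w(x)
that is 1 in the atomistic region, 0 in the coarse-grained region, and
interpolates across a hybrid layer; pair forces are then blended as
w_i * w_j * F_atomistic + (1 - w_i * w_j) * F_cg. The cos^2 form and the zone
widths below are illustrative assumptions, not the published parameterization.

```python
import math

def resolution_weight(x, x_at=1.0, x_hy=0.5):
    """Smooth switching weight: 1 in the atomistic zone (|x| < x_at),
    0 in the coarse-grained zone (|x| > x_at + x_hy), and a cos^2
    interpolation in the hybrid layer in between."""
    d = abs(x)
    if d < x_at:
        return 1.0
    if d > x_at + x_hy:
        return 0.0
    return math.cos(math.pi * (d - x_at) / (2.0 * x_hy)) ** 2

def pair_force(f_atomistic, f_cg, x_i, x_j):
    """Blend the atomistic and coarse-grained forces for a molecule pair
    according to where the two molecules sit."""
    w = resolution_weight(x_i) * resolution_weight(x_j)
    return w * f_atomistic + (1.0 - w) * f_cg
```

Because w varies smoothly, a molecule crossing the hybrid layer gains or loses
atomistic detail gradually rather than discontinuously.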
Computer Simulation of Ion Beam Analysis: Possibilities and Limitations
Quantitative application of ion beam analysis methods, such as Rutherford backscattering, elastic recoil detection analysis, and nuclear reaction analysis, requires the use of computer simulation codes. The different types of available codes are presented, and their advantages and weaknesses with respect to the underlying physics and computing time requirements are discussed. Differences between codes of the same type are about one order of magnitude smaller than the uncertainty of the basic input data, especially stopping power and cross-section data. Even very complex sample structures, with elemental concentrations varying with depth or laterally varying structures, can be simulated quantitatively. Laterally inhomogeneous samples generally result in an ambiguity with depth profiles. The optimization of ion beam analysis measurements is discussed, and available tools are presented.
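As an example of the kinematics underlying such codes, the energy ratio of an
elastically backscattered projectile follows from two-body kinematics alone.
The function below evaluates the standard kinematic factor; the masses and
angle are just an illustrative He-on-Si geometry.

```python
import math

def kinematic_factor(m1, m2, theta_deg):
    """Elastic scattering kinematic factor K = E1/E0 for a projectile of
    mass m1 scattered by theta_deg off a target atom of mass m2 (m2 > m1
    assumed, so the square root is real for all angles)."""
    th = math.radians(theta_deg)
    root = math.sqrt(m2**2 - (m1 * math.sin(th))**2)
    return ((m1 * math.cos(th) + root) / (m1 + m2)) ** 2

# 4He backscattered from Si at 170 degrees, a typical RBS geometry:
print(round(kinematic_factor(4.0, 28.0855, 170.0), 3))  # 0.566
```

Heavier target atoms give K closer to 1, which is why RBS separates elements
by the energy of their backscattered edge.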
A local-global principle for linear dependence of noncommutative polynomials
A set of polynomials in noncommuting variables is called locally linearly
dependent if their evaluations at tuples of matrices are always linearly
dependent. By a theorem of Camino, Helton, Skelton and Ye, a finite locally
linearly dependent set of polynomials is linearly dependent. In this short note
an alternative proof based on the theory of polynomial identities is given. The
method of the proof yields generalizations to directional local linear
dependence and evaluations in general algebras over fields of arbitrary
characteristic. A main feature of the proof is that it makes it possible to
deduce bounds on the size of the matrices where the (directional) local linear
dependence needs to be tested in order to establish linear dependence.

Comment: 8 pages
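The role of matrix size in testing local linear dependence can be seen in a
small numerical sketch (an illustration of the phenomenon, not the note's
proof): the set {x, y, xy} is trivially locally linearly dependent on 1x1
matrices, where all evaluations lie in a one-dimensional space, but on 2x2
matrices its evaluations are already linearly independent.

```python
from fractions import Fraction

def matmul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def rank(rows):
    """Exact rank via Gaussian elimination over the rationals."""
    rows = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c]:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# On 1x1 matrices the evaluations of x, y and xy are three scalars,
# hence always linearly dependent (rank <= 1 < 3):
x1, y1 = [[2]], [[3]]
print(rank([x1[0], y1[0], matmul(x1, y1)[0]]))  # 1

# On 2x2 matrices the flattened evaluations already span a 3-dimensional
# space, so the set is not linearly dependent; larger matrices reveal this.
x2, y2 = [[1, 2], [3, 4]], [[0, 1], [1, 0]]
evals = [x2, y2, matmul(x2, y2)]
flat = [[m[0][0], m[0][1], m[1][0], m[1][1]] for m in evals]
print(rank(flat))  # 3
```

This is exactly why bounds on the matrix size at which (directional) local
linear dependence must be tested are the useful output of the proof.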
Greenhouse gas implications of mobilizing agricultural biomass for energy: a reassessment of global potentials in 2050 under different food-system pathways
Global bioenergy potentials have been the subject of extensive research and continued controversy. Due to vast uncertainties regarding future yields, diets and other influencing parameters, estimates of future agricultural biomass potentials vary widely. Most scenarios compatible with ambitious climate targets foresee a large expansion of bioenergy, mainly from energy crops, which needs to be kept consistent with projections of agriculture and food production. Using the global biomass balance model BioBaM, we here present an assessment of agricultural bioenergy potentials compatible with the Food and Agriculture Organization's (2018) 'Alternative pathways to 2050' projections. Mobilizing biomass at larger scales may be associated with systemic feedbacks causing greenhouse gas (GHG) emissions, e.g. crop residue removal resulting in loss of soil carbon stocks and increased emissions from fertilization. To assess these effects, we derive 'GHG cost supply curves', i.e. integrated representations of biomass potentials and their systemic GHG costs. Livestock manure is the most favourable source in terms of GHG costs, as anaerobic digestion reduces GHG emissions from manure management. Global potentials from intensive livestock systems are about 5 EJ/yr. Crop residues can provide up to 20 EJ/yr at moderate GHG costs. For energy crops, we find that the medium range of literature estimates (~40 to 90 EJ/yr) is only compatible with FAO yield and human diet projections if energy plantations expand into grazing areas (~4–5 million km2) and grazing land is intensified globally. Direct carbon stock changes associated with perennial energy crops are beneficial for climate mitigation, yet there are sometimes considerable 'opportunity GHG costs' if one accounts for the foregone opportunity of afforestation. Our results indicate that the large potentials of energy crops foreseen in many energy scenarios are not freely and unconditionally available. Disregarding systemic effects in agriculture can result in misjudgement of GHG saving potentials and flawed climate mitigation strategies.
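A GHG cost supply curve of this kind orders biomass sources by rising systemic
GHG cost and accumulates their potentials. In the sketch below, the potentials
(EJ/yr) follow the ranges quoted above, while the GHG cost values are purely
hypothetical placeholders, not figures from the study.

```python
# (source, potential in EJ/yr, hypothetical systemic GHG cost;
#  negative cost = avoided emissions, e.g. from anaerobic digestion)
sources = [
    ("livestock manure", 5.0, -10.0),
    ("crop residues",   20.0,  15.0),
    ("energy crops",    65.0,  40.0),
]

def supply_curve(sources):
    """Cumulative biomass potential ordered by rising GHG cost."""
    curve, total = [], 0.0
    for name, potential, ghg_cost in sorted(sources, key=lambda s: s[2]):
        total += potential
        curve.append((name, total, ghg_cost))
    return curve

for name, cumulative, cost in supply_curve(sources):
    print(f"{name:18s} cumulative {cumulative:5.1f} EJ/yr at GHG cost {cost:6.1f}")
```

Reading the curve left to right shows how much biomass is available before the
systemic GHG cost of the next increment erodes the intended climate benefit.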