4,000 research outputs found
From Sun to Interplanetary Space: What is the Pathlength of Solar Energetic Particles?
Solar energetic particles (SEPs), accelerated during solar eruptions, propagate in turbulent solar wind before being
observed with in situ instruments. To interpret their origin through comparison
with remote-sensing observations of the solar eruption, we must therefore
deconvolve the transport effects of the turbulent magnetic fields from the SEP
observations. Recent research suggests that SEP propagation is guided by the turbulent
meandering of the magnetic fieldlines across the mean magnetic field. However, the lengthening of the distance the
SEPs travel, due to the fieldline meandering, has so far not been included in SEP event analysis. This omission can
cause significant errors in estimation of the release times of SEPs at the Sun. We investigate the distance traveled
by the SEPs by considering them to propagate along fieldlines that meander around closed magnetic islands that
are inherent in turbulent plasma. We introduce a fieldline random walk model which takes into account the
physical scales associated to the magnetic islands. Our method remedies the problem of the diffusion equation
resulting in unrealistically short pathlengths, and the fractal dependence of the pathlength of random walk on the
length of the random-walk step. We find that the pathlength from the Sun to 1au can be below the nominal Parker
spiral length for SEP events taking place at solar longitudes 45E to 60W, whereas the western and behind-the-limb
particles can experience pathlengths longer than 2au due to fieldline meandering
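The "fractal dependence of the pathlength on the length of the random-walk step" can be illustrated with a toy simulation (a hedged sketch of the generic effect, not the authors' model): for an isotropic 2D random walk with step ℓ, the number of steps needed to first reach distance R scales as (R/ℓ)², so the total pathlength scales as R²/ℓ and grows without bound as ℓ shrinks — which is why a physically motivated step scale is needed.

```python
import math
import random

def mean_pathlength(step, R=1.0, trials=200, seed=1):
    """Average pathlength of a 2D isotropic random walk started at the
    origin, followed until it first exits a circle of radius R."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = y = length = 0.0
        while x * x + y * y < R * R:
            theta = rng.uniform(0.0, 2.0 * math.pi)
            x += step * math.cos(theta)
            y += step * math.sin(theta)
            length += step
        total += length
    return total / trials

# Halving the step roughly doubles the pathlength (~ R**2 / step),
# even though the straight-line distance covered is always R:
for step in (0.1, 0.05, 0.025):
    print(step, round(mean_pathlength(step), 2))
```

The model described in the abstract fixes the step at the scale of the magnetic islands, removing this arbitrariness.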
Stars were born in significantly denser regions in the early Universe
The density of the warm ionized gas in high-redshift galaxies is known to be
higher than what is typical in local galaxies on similar scales. At the same
time, the mean global properties of the high- and low-redshift galaxies are
quite different. Here, we present a detailed differential analysis of the
ionization parameters of 14 star-forming galaxies at redshift 2.6-3.4, compiled
from the literature. For each of those high-redshift galaxies, we construct a
comparison sample of low-redshift galaxies closely matched in specific star
formation rate (sSFR) and stellar mass, thus ensuring that their global
physical conditions are similar to the high-redshift galaxy. We find that the
median log [OIII]λ5007/[OII]λ3727 line ratio of the high-redshift galaxies is
0.5 dex higher than their local counterparts. We construct a new calibration
between the [OIII]λ5007/[OII]λ3727 emission line ratio and ionization
parameter to estimate the difference between the ionization parameters in the
high and low-redshift samples. Using this, we show that the typical density of
the warm ionized gas in star-forming regions decreases by a median factor of
from z ~ 3.3 to z ~ 0 at fixed mass and sSFR. We show that
metallicity differences cannot explain the observed density differences.
Because the high- and low-redshift samples are comparable in size, we infer
that the relationship between star formation rate density and gas density must
have been significantly less efficient at z ~ 2-3 than what is observed in
nearby galaxies with similar levels of star formation activity. Comment: 16 pages, 6 figures, accepted for publication in ApJ
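For readers unused to dex notation: an offset of d dex is a multiplicative factor of 10^d, so the 0.5 dex line-ratio offset quoted above corresponds to a linear factor of about 3.2. A minimal sketch (the function name is ours, for illustration only):

```python
def dex_to_factor(dex):
    """Convert a logarithmic offset in dex to a linear multiplicative factor."""
    return 10.0 ** dex

# The 0.5 dex offset in log([OIII]/[OII]) is a linear factor of ~3.16:
print(round(dex_to_factor(0.5), 2))  # → 3.16
```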
Vortex-strings in N=2 quiver X U(1) theories
We study half-BPS vortex-strings in four dimensional N=2 supersymmetric
quiver theories with gauge group SU(N)^n X U(1). The matter content of the
quiver can be represented by what we call a tetris diagram, which simplifies
the analysis of the Higgs vacua and the corresponding strings. We classify the
vacua of these theories in the presence of a Fayet-Iliopoulos term, and study
strings above fully-Higgsed vacua. The strings are studied using classical
zero-mode analysis, supersymmetric localization and, in some cases, also S-duality.
We analyze the conditions for bulk-string decoupling at low energies. When the
conditions are satisfied, the low energy theory living on the string's
worldsheet is some 2d N=(2,2) supersymmetric non-linear sigma model. We analyze
the conditions for weak to weak 2d-4d map of parameters, and identify the
worldsheet theory in all the cases where the map is weak to weak. For some
SU(2) quivers, S-duality can be used to map weakly coupled worldsheet theories
to strongly coupled ones. In these cases, we are able to identify the
worldsheet theories also when the 2d-4d map of parameters is weak to strong. Comment: 61 pages, 10 figures
Merger Policies and Trade Liberalization
This paper is about the interactions between what is traditionally considered trade policy and a narrow but important aspect of competition policy, namely merger policy. We focus on links between merger policies and trade liberalization. We put special emphasis on the topical issue of the role that international agreements such as the GATT play when merger policies are nationally chosen. Of particular concern is the possibility that liberalization of international trade will induce countries to increasingly use competition policies to promote national interests at the expense of others. We examine the incentives for a welfare maximizing government to make such a substitution. Interpreting merger policy as a choice of degree of industrial concentration, we investigate how the merger policy that is optimal from the point of view of an individual country is affected by restrictions on the use of tariffs and export subsidies.
Severity as a Priority Setting Criterion: Setting a Challenging Research Agenda
Priority setting in health care is ubiquitous and health authorities are increasingly
recognising the need for priority setting guidelines to ensure efficient, fair, and
equitable resource allocation. While cost-effectiveness concerns seem to dominate
many policies, the tension between utilitarian and deontological concerns is salient
to many, and various severity criteria appear to fill this gap. Severity, then, must be
subjected to rigorous ethical and philosophical analysis. Here we first give a brief
history of the path to today’s severity criteria in Norway and Sweden. The
Scandinavian perspective may be instructive for the international discussion,
given severity's long-standing use there as a priority setting criterion, even
though the two countries have so far reached rather different conclusions. We
then argue that severity can be viewed as a
multidimensional concept, drawing on accounts of need, urgency, fairness, duty to
save lives, and human dignity. Such concerns will often be relative to local mores,
and the weighting placed on the various dimensions cannot be expected to be fixed.
Thirdly, we present what we think are the most pertinent questions to answer about
severity in order to facilitate decision making in the coming years of increased scarcity,
and to further the understanding of underlying assumptions and values that go
into these decisions. We conclude that severity is poorly understood, and that the
topic needs substantial further inquiry; thus we hope this article may set a
challenging and important research agenda.
Toward the Semiclassical Theory of the High Energy Heavy Ion Collisions
Sudden deposition of energy at the early stage of high energy heavy ion
collisions makes virtual gluon fields real.
The same is true for virtual vacuum fields under the topological barrier,
excited to real states at or near the barrier: gluomagnetic clusters of a
particular structure, related to the sphalerons of the electroweak theory.
Semiclassically, these states play the role of the {\em ``turning points''}.
After being produced they explode into a spherical shell of coherent field,
which then turns into several outgoing gluons. Furthermore, these explosions
promptly produce quark pairs, as seen from an explicit solution of the Dirac
equation.
The masses of such clusters depend on their size, and are expected to peak at
a characteristic value. After briefly reviewing these concepts in a non-technical
manner, we discuss what observable consequences the production of such clusters
would make in the context of heavy ion collisions, especially at the RHIC
energies. We discuss entropy and especially quark production, event-by-event
fluctuations in collective effects like radial and elliptic flows and
suppression. Coherent fields and their geometry increase the jet quenching, and
we also point out the existence of an ``explosive edge'' which jump-starts
collective effects and may affect unusual phenomena seen at RHIC at large
transverse momenta. Comment: Third version, substantially changed, adding new
sections and eliminating a large part on jet quenching, which branched into a
separate paper
Components of the gravitational force in the field of a gravitational wave
Gravitational waves bring about the relative motion of free test masses. The
detailed knowledge of this motion is important conceptually and practically,
because the mirrors of laser interferometric detectors of gravitational waves
are essentially free test masses. There exists an analogy between the motion of
free masses in the field of a gravitational wave and the motion of free charges
in the field of an electromagnetic wave. In particular, a gravitational wave
drives the masses in the plane of the wave-front and also, to a smaller extent,
back and forth in the direction of the wave's propagation. To describe this
motion, we introduce the notion of `electric' and `magnetic' components of the
gravitational force. This analogy is not perfect, but it reflects some
important features of the phenomenon. Using different methods, we demonstrate
the presence and importance of what we call the `magnetic' component of motion
of free masses. It contributes to the variation of distance between a pair of
particles. We explicitly derive the full response function of a 2-arm laser
interferometer to a gravitational wave of arbitrary polarization. We give a
convenient description of the response function in terms of the spin-weighted
spherical harmonics. We show that the previously ignored `magnetic' component
may provide a correction of up to 10 %, or so, to the usual `electric'
component of the response function. The `magnetic' contribution must be taken
into account in the data analysis, if the parameters of the radiating system
are not to be mis-estimated. Comment: prints to 29 pages including 9 figures;
new title, additional explanations and references in response to referee's
comments; to be published in Class. Quant. Grav.
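As a rough scale estimate (our assumption, not a formula from the paper): the `magnetic' contribution is a finite-wavelength propagation effect, so its relative size should be of order the arm length divided by the reduced gravitational wavelength, ε ~ 2πfL/c. A sketch with purely illustrative detector parameters:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def magnetic_fraction(freq_hz, arm_m):
    """Order-of-magnitude relative size of the 'magnetic' correction,
    estimated (our assumption) as arm length over reduced GW wavelength:
    eps = 2*pi*f*L/c."""
    return 2.0 * math.pi * freq_hz * arm_m / C

# Illustrative numbers, not taken from the paper:
print(f"{magnetic_fraction(100.0, 4000.0):.4f}")  # km-scale arms, ~100 Hz
print(f"{magnetic_fraction(0.01, 2.5e9):.4f}")    # Gm-scale arms, ~10 mHz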
Regulation, institutions and commitment : the Jamaican telecommunications sector
The Jamaican telecommunications sector today is much more dynamic than it was before and provides much better service. There is widespread skepticism about the current regulatory framework, which is criticized for encouraging a tight telecommunications monopoly, little administrative discretion, and continous price adjustments to satisfy what many see as a high rate of return requirement. But the authors suggest that the regulatory framework is a"second-best"alternative, a pragmatic response to The Jamaican's institutional realities. The authors analyze why the reforms of the late 1980s took the form they did, and whether they could have been better. They find that the changing nature of regulatory institutions, ownership arrangements, and sector performance in the past 50 years is traceable to intense contracting problems between firms or interest groups and the government. Attempts to resolves these contracting problems have continuously constrained the government's (and firms) ability to implement efficient pricing schemes. In the abstract, The Jamaican's regulatory structure looks inefficient. In the context of The Jamaican's political system, politics, judiciary, bureaucracy, and interest groups, the regulatory framework developed in the late 1980s emerges as a fairly pragmatic, welfare-improving set of policies. Perhaps it could have been better, but its current design reflects basic commitment problems the government has with public utilities.Public Sector Economics&Finance,Economic Theory&Research,National Governance,Environmental Economics&Policies,ICT Policy and Strategies
- …