31,204 research outputs found
The impact of the ATLAS zero-lepton, jets and missing momentum search on a CMSSM fit
Recent ATLAS data significantly extend the exclusion limits for
supersymmetric particles. We examine the impact of such data on global fits of
the constrained minimal supersymmetric standard model (CMSSM) to indirect and
cosmological data. We calculate the likelihood map of the ATLAS search, taking
into account systematic errors on the signal and on the background. We validate
our calculation against the ATLAS determination of 95% confidence level
exclusion contours. A previous CMSSM global fit is then re-weighted by the
likelihood map, which takes a bite out of the high probability density region of
the global fit, pushing scalar and gaugino masses up.
Comment: 16 pages, 7 figures. v2 has bigger figures and fixed typos. v3 has a
clarified explanation of our handling of signal systematics.
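As an illustration of the re-weighting step described in this abstract, here is a minimal sketch (my own, not the authors' code; the array names are placeholders) of multiplying the weights of an existing set of fit points by the likelihood of a new search:

```python
# Schematic re-weighting of global-fit samples by a new search likelihood.
# "prior_weights" and "atlas_loglike" are placeholder names, one entry per
# sampled CMSSM parameter point.
import numpy as np

def reweight(prior_weights, atlas_loglike):
    logw = np.log(prior_weights) + atlas_loglike   # combine in log space
    logw -= logw.max()                             # stabilise the exponential
    w = np.exp(logw)
    return w / w.sum()                             # normalised posterior weights

# Points excluded by the zero-lepton search receive a very negative
# log-likelihood, so their weight collapses and the re-weighted posterior
# shifts toward heavier scalar and gaugino masses, as described above.
```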
Coherent states for compact Lie groups and their large-N limits
The first two parts of this article survey results related to the
heat-kernel coherent states for a compact Lie group K. I begin by reviewing the
definition of the coherent states, their resolution of the identity, and the
associated Segal-Bargmann transform. I then describe related results including
connections to geometric quantization and (1+1)-dimensional Yang--Mills theory,
the associated coherent states on spheres, and applications to quantum gravity.
The third part of this article summarizes recent work of mine with Driver and
Kemp on the large-N limit of the Segal--Bargmann transform for the unitary
group U(N). A key result is the identification of the leading-order large-N
behavior of the Laplacian on "trace polynomials."
Comment: Submitted to the proceedings of the CIRM conference, "Coherent states
and their applications: A contemporary panorama."
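For orientation, the heat-kernel Segal--Bargmann transform surveyed here can be stated schematically as follows (standard form for a compact Lie group K with heat kernel ρ_t and complexification K_C; the notation is mine):

```latex
(B_t f)(g) \;=\; \int_K \rho_t\!\left(g x^{-1}\right) f(x)\, dx ,
\qquad f \in L^2(K),
```

with the right-hand side continued analytically in g to K_C; B_t is then unitary from L^2(K) onto a Hilbert space of holomorphic functions on K_C, and the coherent states are the kernels x ↦ ρ_t(g x^{-1}) for g in K_C.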
Effect of venting range hood flow rate on size-resolved ultrafine particle concentrations from gas stove cooking
Cooking is the main source of ultrafine particles (UFP) in homes. This study investigated the effect of venting range hood flow rate on size-resolved UFP concentrations from gas stove cooking. The same cooking protocol was conducted 60 times using three venting range hoods operated at six flow rates in twin research houses. Size-resolved particle (10–420 nm) concentrations were monitored using a NanoScan scanning mobility particle sizer (SMPS) from 15 min before cooking to 3 h after the cooking had stopped. Cooking increased the background total UFP number concentrations to 1.3 × 10³ particles/cm³ on average, with a mean exposure-relevant source strength of 1.8 × 10¹² particles/min. Total particle peak reductions ranged from 25% at the lowest fan flow rate of 36 L/s to 98% at the highest rate of 146 L/s. During the operation of a venting range hood, particle removal by deposition was less significant than removal by the increased air exchange rate driven by exhaust ventilation. Exposure to total particles due to cooking varied from 0.9 to 5.8 × 10⁴ particles/cm³·h over the 3 h after cooking ended. Compared to the 36 L/s range hood, higher flow rates of 120 and 146 L/s reduced the first-hour post-cooking exposure by 76% and 85%, respectively. © 2018 Crown Copyright. Published with license by Taylor & Francis Group, LLC
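A minimal well-mixed single-zone sketch (my illustration under simplifying assumptions, not the study's model) of the air-exchange-versus-deposition point: once cooking stops, the concentration decays at the combined rate of hood-driven air exchange and deposition, so a higher hood flow rate cuts the integrated post-cooking exposure:

```python
# Well-mixed decay after the source stops: C(t) = C0 * exp(-(AER + k_dep) * t).
# All parameter values below are illustrative, not measurements from the study.
import numpy as np

def post_cooking_exposure(c0, aer_per_h, dep_per_h, hours=3.0, dt_h=0.01):
    """Integrated exposure (particles/cm^3 * h) over `hours` after cooking stops."""
    t = np.arange(0.0, hours + dt_h, dt_h)
    c = c0 * np.exp(-(aer_per_h + dep_per_h) * t)
    return np.trapz(c, t)

low_fan  = post_cooking_exposure(c0=1.0e5, aer_per_h=0.9, dep_per_h=0.5)
high_fan = post_cooking_exposure(c0=1.0e5, aer_per_h=4.0, dep_per_h=0.5)
print(low_fan, high_fan)   # the higher air exchange rate dominates the reduction
```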
R-parity violating resonant stop production at the Large Hadron Collider
We have investigated the resonant production of a stop at the Large Hadron
Collider, driven by baryon number violating interactions in supersymmetry. We
work in the framework of minimal supergravity models with the lightest
neutralino being the lightest supersymmetric particle which decays within the
detector. We look at various dilepton and trilepton final states, with or
without b-tags. A detailed background simulation is performed, and all possible
decay modes of the lighter stop are taken into account. We find that higher
stop masses are sometimes easier to probe, through the decay of the stop into
the third or fourth neutralino and their subsequent cascades. We also comment
on the detectability of such signals during the 7 TeV run, where, as expected,
only relatively light stops can be probed. Our conclusion is that the resonant
process may be probed, at both 10 and 14 TeV, with the R-parity violating
coupling \lambda''_{312} as low as 0.05, for a stop mass of about 1 TeV. The
possibility of distinguishing between resonant stop production and
pair-production is also discussed.
Comment: 20 pages, 4 figures, 6 tables; Version accepted by JHEP
Tractable Pathfinding for the Stochastic On-Time Arrival Problem
We present a new and more efficient technique for computing the route that
maximizes the probability of on-time arrival in stochastic networks, also known
as the path-based stochastic on-time arrival (SOTA) problem. Our primary
contribution is a pathfinding algorithm that uses the solution to the
policy-based SOTA problem---which is of pseudo-polynomial-time complexity in
the time budget of the journey---as a search heuristic for the optimal path. In
particular, we show that this heuristic can be exceptionally efficient in
practice, effectively making it possible to solve the path-based SOTA problem
as quickly as the policy-based SOTA problem. Our secondary contribution is the
extension of policy-based preprocessing to path-based preprocessing for the
SOTA problem. In the process, we also introduce Arc-Potentials, a more
efficient generalization of Stochastic Arc-Flags that can be used for both
policy- and path-based SOTA. After developing the pathfinding and preprocessing
algorithms, we evaluate their performance on two different real-world networks.
To the best of our knowledge, these techniques provide the most efficient
computation strategy for the path-based SOTA problem for general probability
distributions, both with and without preprocessing.
Comment: Submission accepted by the International Symposium on Experimental
Algorithms 2016 and published by Springer in the Lecture Notes in Computer
Science series on June 1, 2016. Includes typographical corrections and
modifications to pre-processing made after the initial submission to SODA'15
(July 7, 2014).
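To make the policy-based recursion concrete, here is a small sketch (my own, under a discretized-time assumption; it is not the paper's implementation) of the dynamic program whose solution the paper reuses as a search heuristic for the optimal path:

```python
# Policy-based SOTA on a discretized time grid: u[i][t] is the maximum
# probability of reaching `dest` from node i within t time steps when the
# next edge can be re-chosen at every node.
import numpy as np

def policy_sota(succ, edge_pmf, dest, budget_steps):
    """succ[i]: successor nodes of i.
    edge_pmf[(i, j)][w]: probability that edge (i, j) takes w >= 1 time steps."""
    nodes = set(succ) | {dest} | {j for js in succ.values() for j in js}
    u = {i: np.zeros(budget_steps + 1) for i in nodes}
    u[dest][:] = 1.0                       # already at the destination
    # Every edge takes at least one step, so u[i][t] only needs u[j][t'] with
    # t' < t, and the table can be filled in order of increasing budget t.
    for t in range(1, budget_steps + 1):
        for i in nodes:
            if i == dest:
                continue
            best = 0.0
            for j in succ.get(i, []):
                pmf = edge_pmf[(i, j)]
                wmax = min(len(pmf) - 1, t)
                conv = sum(pmf[w] * u[j][t - w] for w in range(1, wmax + 1))
                best = max(best, conv)
            u[i][t] = best
    return u
```

Because the policy is allowed to change its mind en route, u[i][t] upper-bounds the on-time probability of any fixed path from i, which is what makes it usable as an admissible heuristic in the path search described above.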
Bayesian Conditioning, the Reflection Principle, and Quantum Decoherence
The probabilities a Bayesian agent assigns to a set of events typically
change with time, for instance when the agent updates them in the light of new
data. In this paper we address the question of how an agent's probabilities at
different times are constrained by Dutch-book coherence. We review and attempt
to clarify the argument that, although an agent is not forced by coherence to
use the usual Bayesian conditioning rule to update his probabilities, coherence
does require the agent's probabilities to satisfy van Fraassen's [1984]
reflection principle (which entails a related constraint pointed out by
Goldstein [1983]). We then exhibit the specialized assumption needed to recover
Bayesian conditioning from an analogous reflection-style consideration.
Bringing the argument to the context of quantum measurement theory, we show
that "quantum decoherence" can be understood in purely personalist
terms---quantum decoherence (as supposed in a von Neumann chain) is not a
physical process at all, but an application of the reflection principle. From
this point of view, the decoherence theory of Zeh, Zurek, and others as a story
of quantum measurement has the plot turned exactly backward.
Comment: 14 pages, written in memory of Itamar Pitowsky
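Schematically (notation mine), writing P_0 for the agent's probabilities now and P_1 for the probabilities at the later time, the two updating constraints discussed in this abstract read:

```latex
\text{Reflection:}\quad P_0\!\left(E \,\middle|\, P_1(E)=q\right) = q ,
\qquad\qquad
\text{Conditioning:}\quad P_1(E) = P_0(E \mid D),
```

where D is the data acquired in between; the point above is that Dutch-book coherence forces the first, but yields the second only under an additional, specialized assumption.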
Effects of invisible particle emission on global inclusive variables at hadron colliders
We examine the effects of invisible particle emission in conjunction with QCD
initial state radiation (ISR) on quantities designed to probe the mass scale of
new physics at hadron colliders, which involve longitudinal as well as
transverse final-state momenta. This is an extension of our previous treatment,
arXiv:0903.2013, of the effects of ISR on global inclusive variables. We
present resummed results on the visible invariant mass distribution and compare
them to parton-level Monte Carlo results for top quark and gluino
pair-production at the LHC. There is good agreement as long as the visible
pseudorapidity interval is large enough (eta ~ 3). The effect of invisible
particle emission is small in the case of top pair production but substantial
for gluino pair production. This is due mainly to the larger mass of the
intermediate particles in gluino decay (squarks rather than W-bosons). We also
show Monte Carlo modelling of the effects of hadronization and the underlying
event. The effect of the underlying event is large but may be approximately
universal.
Comment: 22 pages, expanded sections and other minor modifications. Version
published in JHEP.
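For reference, the global inclusive variable referred to above is the invariant mass of the visible final state within a pseudorapidity window (a schematic definition; η_max plays the role of the "eta ~ 3" cut quoted in the abstract):

```latex
M_{\mathrm{vis}}^{2} \;=\; \Bigl(\;\sum_{i\,\in\,\mathrm{visible},\; |\eta_i| < \eta_{\max}} p_i^{\mu}\Bigr)^{2},
```

so ISR entering the window and invisible particles escaping detection both shift this distribution relative to the underlying hard process, which is the effect being quantified here.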
Sixteen years of bathymetry and waves at San Diego beaches.
Sustained, quantitative observations of nearshore waves and sand levels are essential for testing beach evolution models, but comprehensive datasets are relatively rare. We document beach profiles and concurrent waves monitored at three southern California beaches during 2001-2016. The beaches include offshore reefs, lagoon mouths, hard substrates, and cobble and sandy (medium-grained) sediments. The data span two energetic El Niño winters and four beach nourishments. Quarterly surveys of 165 total cross-shore transects (all sites) at 100 m alongshore spacing were made from the backbeach to 8 m depth. Monthly surveys of the subaerial beach were obtained at alongshore-oriented transects. The resulting dataset consists of (1) raw sand elevation data, (2) gridded elevations, (3) interpolated elevation maps with error estimates, (4) beach widths, subaerial and total sand volumes, (5) locations of hard substrate and beach nourishments, (6) water levels from a NOAA tide gauge, (7) wave conditions from a buoy-driven regional wave model, and (8) time periods and reaches with alongshore-uniform bathymetry, suitable for testing 1-dimensional beach profile change models.
Generalized Totalizer Encoding for Pseudo-Boolean Constraints
Pseudo-Boolean constraints, also known as 0-1 Integer Linear Constraints, are
used to model many real-world problems. A common approach to solve these
constraints is to encode them into a SAT formula. The runtime of the SAT solver
on such a formula is sensitive to the manner in which the given pseudo-Boolean
constraints are encoded. In this paper, we propose generalized Totalizer
encoding (GTE), which is an arc-consistency preserving extension of the
Totalizer encoding to pseudo-Boolean constraints. Unlike some other encodings,
the number of auxiliary variables required for GTE does not depend on the
magnitudes of the coefficients. Instead, it depends on the number of distinct
combinations of these coefficients. We show the superiority of GTE with respect
to other encodings when large pseudo-Boolean constraints have a small number of
distinct coefficients. Our experimental results also show that GTE remains
competitive even when the pseudo-Boolean constraints do not have this
characteristic.
Comment: 10 pages, 2 figures, 2 tables. To be published in the 21st International
Conference on Principles and Practice of Constraint Programming, 2015.
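A small illustration (my own, not the paper's encoder) of the counting point made in this abstract: the auxiliary variables GTE needs track the distinct sums the coefficients can form, so their number is driven by how many distinct coefficient values occur rather than by their magnitudes:

```python
# Distinct non-empty subset sums of the coefficients of a pseudo-Boolean
# constraint; the abstract ties GTE's auxiliary-variable count to the number of
# distinct combinations (sums) of the coefficients, illustrated here at the
# whole-constraint level.
def distinct_subset_sums(coeffs):
    sums = {0}
    for c in coeffs:
        sums |= {s + c for s in sums}
    sums.discard(0)
    return sums

# Huge coefficients but only two distinct values -> only 15 distinct sums.
print(len(distinct_subset_sums([10**6] * 5 + [2 * 10**6] * 5)))   # 15
# Ten small but all-different coefficients -> 55 distinct sums.
print(len(distinct_subset_sums(range(1, 11))))                     # 55
```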