Top Quark Pair Production at the Tevatron
The measurement of the top quark pair production cross section in
proton-antiproton collisions at 1.96 TeV is a test of quantum chromodynamics
and is potentially sensitive to new physics beyond the standard model. I
report on the latest t-tbar cross section results from the CDF and DZero
experiments in various final state topologies which arise from decays of top
quark pairs.
Comment: 4 pages, 3 figures; to appear in the proceedings of the XXXXth
Rencontres de Moriond: QCD and High Energy Hadronic Interactions, La Thuile,
Italy, March 12-19, 2005
Analytic Confidence Level Calculations using the Likelihood Ratio and Fourier Transform
The interpretation of new particle search results involves a confidence level
calculation on either the discovery hypothesis or the background-only ("null")
hypothesis. A typical approach uses toy Monte Carlo experiments to build an
expected experiment estimator distribution against which an observed
experiment's estimator may be compared. In this note, a new approach is
presented which calculates analytically the experiment estimator distribution
via a Fourier transform, using the likelihood ratio as an ordering estimator.
The analytic approach enjoys an enormous speed advantage over the toy Monte
Carlo method, making it possible to quickly and precisely calculate confidence
level results.
Comment: 11 pages, 2 figures
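The convolution idea in this abstract can be sketched numerically. For independent Poisson counting channels, the likelihood ratio is monotone in a weighted event count, so each channel's estimator distribution can be tabulated on a grid and the channels combined with a single FFT product instead of running toy Monte Carlo experiments. The channel yields, grid, and observed counts below are hypothetical placeholders, and this is a minimal illustration of the technique rather than the note's actual implementation:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Hypothetical channels: (expected signal s_i, expected background b_i).
channels = [(3.0, 2.0), (1.5, 4.0), (2.0, 1.0)]
# For Poisson counting channels the likelihood ratio is monotone in
# t = sum_i n_i * w_i with per-channel event weight w_i = ln(1 + s_i/b_i).
weights = [np.log1p(s / b) for s, b in channels]

# Discretize each channel's background-only PDF of t on a common grid.
nbins, tmax = 8192, 40.0
dt = tmax / nbins

def channel_pdf(b, w, nmax=100):
    pdf = np.zeros(nbins)
    for n in range(nmax):
        pdf[min(int(round(n * w / dt)), nbins - 1)] += poisson.pmf(n, b)
    return pdf

# Combined PDF = convolution of the independent channels, via FFT product.
fft_prod = np.ones(nbins, dtype=complex)
for (s, b), w in zip(channels, weights):
    fft_prod *= np.fft.fft(channel_pdf(b, w))
combined = np.real(np.fft.ifft(fft_prod))

# Background-only p-value for a hypothetical observation: analytic vs toy MC.
n_obs = [4, 5, 3]
t_obs = sum(n * w for n, w in zip(n_obs, weights))
p_analytic = combined[int(t_obs / dt):].sum()

toys = sum(rng.poisson(b, 100_000) * w for (s, b), w in zip(channels, weights))
p_toys = np.mean(toys >= t_obs)
```

The single FFT product replaces the per-toy event generation, which is where the speed advantage described above comes from.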
Fundamentals of LHC Experiments
Experiments on the Large Hadron Collider at CERN represent our furthest
excursion yet along the energy frontier of particle physics. The goal of
probing physical processes at the TeV energy scale puts strict requirements on
the performance of accelerator and experiment, dictating the awe-inspiring
dimensions of both. These notes, based on a set of five lectures given at the
2010 Theoretical Advanced Studies Institute in Boulder, Colorado, not only
review the physics considered as part of the accelerator and experiment design,
but also introduce algorithms and tools used to interpret experimental results
in terms of theoretical models. The search for new physics beyond the Standard
Model presents many new challenges, a few of which are addressed in specific
examples.
Comment: 23 pages; lecture notes from the 2010 Theoretical Advanced Studies
Institute in Boulder, Colorado
Weak measurement and control of entanglement generation
In this paper we show how weak joint measurement and local feedback can be
used to control entanglement generation between two qubits. To do this, we make
use of a decoherence free subspace (DFS). Weak measurement and feedback can be
used to drive the system into this subspace rapidly. Once within the subspace,
feedback can generate entanglement rapidly, or turn off entanglement generation
dynamically. We also consider, in the context of weak measurement, some of the
differences between purification and entanglement generation.
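A minimal numerical illustration of the decoherence-free-subspace idea: under collective dephasing, where both qubits acquire the same phase, any state in span{|01>, |10>} is left exactly invariant, while superpositions outside that subspace lose fidelity. The noise model and phase value below are illustrative assumptions, not the measurement-and-feedback scheme of the paper:

```python
import numpy as np

# Collective dephasing: both qubits acquire the same phase,
# U(theta) = exp(-i * theta * (Z x I + I x Z) / 2).
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)
Jz = np.kron(Z, I2) + np.kron(I2, Z)

def collective_dephase(theta):
    # Jz is diagonal, so U is a diagonal phase matrix.
    return np.diag(np.exp(-1j * theta * np.diag(Jz) / 2))

# The DFS span{|01>, |10>} has total Jz eigenvalue 0, so any state in it,
# entangled or not, is untouched for every theta.
psi_dfs = np.zeros(4, complex); psi_dfs[1] = psi_dfs[2] = 1 / np.sqrt(2)  # (|01>+|10>)/sqrt(2)
psi_out = np.zeros(4, complex); psi_out[0] = psi_out[3] = 1 / np.sqrt(2)  # (|00>+|11>)/sqrt(2)

theta = 0.7  # arbitrary collective phase kick
fid_dfs = abs(np.vdot(psi_dfs, collective_dephase(theta) @ psi_dfs)) ** 2
fid_out = abs(np.vdot(psi_out, collective_dephase(theta) @ psi_out)) ** 2
print(fid_dfs, fid_out)  # 1 inside the DFS, below 1 outside it
```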
Evaluating Information Assurance Control Effectiveness on an Air Force Supervisory Control and Data Acquisition (SCADA) System
Supervisory Control and Data Acquisition (SCADA) systems are increasingly being connected to corporate networks, which has dramatically expanded their attack surface for remote cyber attack. Adversaries are targeting these systems with increasing frequency and sophistication. This thesis seeks to answer the research question of which Information Assurance (IA) controls are most significant for network defenders and SCADA system managers/operators to focus on in order to increase the security of critical infrastructure systems against a Stuxnet-like cyber attack. This research applies the National Institute of Standards and Technology (NIST) IA controls to an attack tree modeled on a remote Stuxnet-like cyber attack against the WPAFB fuels operation. The probability of adversary success for specific attack scenarios is developed via the attack tree. An impact assessment is then obtained via a survey of WPAFB fuels operation subject matter experts (SMEs). The probabilities of adversary success and the impact analysis are used to create a Risk Level matrix, which is analyzed to identify recommended IA controls. The culmination of this research identified 14 IA controls associated with preventing an adversary from gaining remote access and deploying an exploit as the most influential for SCADA managers, operators and network defenders to focus on in order to maximize system security against a Stuxnet-like remote cyber attack.
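The attack-tree computation of adversary success probability can be sketched in a few lines: OR nodes succeed if any independent child path succeeds, AND nodes require every step to succeed. The tree structure and leaf probabilities below are invented for illustration, not the thesis's actual WPAFB model:

```python
# Minimal AND/OR attack-tree evaluator. Children are treated as independent.
def p_success(node):
    if node["type"] == "leaf":
        return node["p"]
    ps = [p_success(c) for c in node["children"]]
    if node["type"] == "AND":          # every step must succeed
        out = 1.0
        for p in ps:
            out *= p
        return out
    out = 1.0                          # OR: any one path suffices
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

tree = {"type": "AND", "children": [
    {"type": "OR", "children": [       # gain remote access (hypothetical)
        {"type": "leaf", "p": 0.30},   # e.g. spear phishing
        {"type": "leaf", "p": 0.10},   # e.g. infected removable media
    ]},
    {"type": "leaf", "p": 0.50},       # deploy exploit on the target host
]}

print(p_success(tree))  # ≈ 0.185 = (1 - 0.7*0.9) * 0.5
```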
Thermochemistry of Alane Complexes for Hydrogen Storage: A Theoretical and Experimental Investigation.
Knowledge of the relative stabilities of alane (AlH(3)) complexes with electron donors is essential for identifying hydrogen storage materials for vehicular applications that can be regenerated by off-board methods; however, almost no thermodynamic data are available to make this assessment. To fill this gap, we employed the G4(MP2) method to determine heats of formation, entropies, and Gibbs free energies of formation for 38 alane complexes with NH(3-n)R(n) (R = Me, Et; n = 0-3), pyridine, pyrazine, triethylenediamine (TEDA), quinuclidine, OH(2-n)R(n) (R = Me, Et; n = 0-2), dioxane, and tetrahydrofuran (THF). Monomer, bis, and selected dimer complex geometries were considered. Using these data, we computed the thermodynamics of the key formation and dehydrogenation reactions that would occur during hydrogen delivery and alane regeneration, from which trends in complex stability were identified. These predictions were tested by synthesizing six amine-alane complexes involving trimethylamine, triethylamine, dimethylethylamine, TEDA, quinuclidine, and hexamine and obtaining upper limits of ΔG° for their formation from metallic aluminum. Combining these computational and experimental results, we establish a criterion for complex stability relevant to hydrogen storage that can be used to assess potential ligands prior to attempting synthesis of the alane complex. On the basis of this, we conclude that only a subset of the tertiary amine complexes considered and none of the ether complexes can be successfully formed by direct reaction with aluminum and regenerated in an alane-based hydrogen storage system.
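The stability screen described above reduces to the sign of the Gibbs free energy of formation, ΔG° = ΔH° − TΔS°, at the operating temperature. A toy version under invented ΔH° and ΔS° values (placeholders, not the paper's G4(MP2) results, though the amine-vs-ether contrast mirrors its qualitative conclusion):

```python
# Screen sketch: a donor-alane complex is a viable direct-synthesis target
# only if its formation from Al(s) + 3/2 H2 + ligand has dG < 0.
T = 298.15  # K

def gibbs(dH_kJ, dS_J):
    """dG in kJ/mol from dH (kJ/mol) and dS (J/mol/K) at temperature T."""
    return dH_kJ - T * dS_J / 1000.0

# Hypothetical thermochemistry, chosen only to illustrate the criterion:
candidates = {
    "tertiary amine (hypothetical)": (-60.0, -150.0),
    "ether (hypothetical)":          (-30.0, -150.0),
}
for name, (dH, dS) in candidates.items():
    dG = gibbs(dH, dS)
    print(f"{name}: dG = {dG:.1f} kJ/mol ->", "viable" if dG < 0 else "not viable")
```

The unfavourable −TΔS° term (gas-phase H2 is consumed) is what a candidate's ΔH° must overcome, which is why weaker ether donors fail the screen.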
The Gemini Planet Imager Exoplanet Survey: Dynamical Mass of the Exoplanet β Pictoris b from Combined Direct Imaging and Astrometry
We present new observations of the planet β Pictoris b from 2018 with the Gemini Planet Imager (GPI), the first GPI observations following conjunction. Based on these new measurements, we perform a joint orbit fit to the available relative astrometry from ground-based imaging, the Hipparcos Intermediate Astrometric Data (IAD), and the Gaia DR2 position, and demonstrate how to incorporate the IAD into direct imaging orbit fits. We find a mass consistent with predictions of hot-start evolutionary models and previous works following similar methods, though with larger uncertainties: 12.8^(+5.3)_(−3.2) M_(Jup). Our eccentricity determination of 0.12^(+0.04)_(-0.03) disfavors circular orbits. We consider orbit fits to several different imaging data sets, and find generally similar posteriors on the mass for each combination of imaging data. Our analysis underscores the importance of performing joint fits to the absolute and relative astrometry simultaneously, given the strong covariance between orbital elements. Time of conjunction is well-constrained within 2.8 days of 2017 September 13, with the star behind the planet's Hill sphere between 2017 April 11 and 2018 February 16 (±18 days). Following the recent radial velocity detection of a second planet in the system, β Pic c, we perform additional two-planet fits combining relative astrometry, absolute astrometry, and stellar radial velocities. These joint fits find a significantly smaller mass (8.0 ± 2.6 M_(Jup)) for the imaged planet β Pic b, in a somewhat more circular orbit. We expect future ground-based observations to further constrain the visual orbit and mass of the planet in advance of the release of Gaia DR4.
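Relative astrometry from imagers such as GPI is reported as a separation and a position angle measured east of north, and a standard first step in any orbit fit is converting each epoch to RA/Dec offsets of the planet from the star. A sketch with a hypothetical epoch (the numbers are illustrative, not an actual β Pic b measurement):

```python
import math

# Convert (separation rho, position angle PA east of north) to offsets
# dRA (positive east) and dDec (positive north):
#   dRA = rho * sin(PA),  dDec = rho * cos(PA)
def pa_sep_to_radec(rho_mas, pa_deg):
    pa = math.radians(pa_deg)
    return rho_mas * math.sin(pa), rho_mas * math.cos(pa)

# Hypothetical epoch for illustration only:
dra, ddec = pa_sep_to_radec(330.0, 212.0)
print(f"dRA = {dra:.1f} mas, dDec = {ddec:.1f} mas")  # southwest of the star
```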
Superoperator Analysis of Entanglement in a Four-Qubit Cluster State
In this paper we utilize superoperator formalism to explore the entanglement
evolution of four-qubit cluster states in a number of decohering environments.
A four-qubit cluster state is a resource for the performance of an arbitrary
single logical qubit rotation via measurement based cluster state quantum
computation. We are specifically interested in the relationship between
entanglement evolution and the fidelity with which the arbitrary single logical
qubit rotation can be implemented in the presence of decoherence as this will
have important experimental ramifications. We also note the exhibition of
entanglement sudden death (ESD) and ask how severely its onset affects the
utilization of the cluster state as a means of implementing an arbitrary single
logical qubit rotation.
Comment: 9 pages, 9 composite figures; presentation of results completely
rewritten
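The object of study can be constructed directly: a four-qubit linear cluster state is |+>^⊗4 followed by controlled-Z gates on neighbouring pairs, and decoherence can then be applied in Kraus (superoperator) form to the density matrix. A sketch, with a single-qubit dephasing channel standing in for the paper's decohering environments (the flip probability is an illustrative assumption):

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, pos, n=4):
    """Embed a single-qubit operator at position pos among n qubits."""
    out = np.eye(1)
    for i in range(n):
        out = np.kron(out, single if i == pos else I2)
    return out

def cz(i, j, n=4):
    """Controlled-Z between qubits i and j: flips the sign of |..1..1..>."""
    Id = np.eye(2 ** n)
    return Id - 0.5 * (Id - op(Z, i, n)) @ (Id - op(Z, j, n))

# Linear cluster state: |+>^4 then CZ on each nearest-neighbour pair.
psi0 = np.full(16, 0.25)
cluster = cz(0, 1) @ cz(1, 2) @ cz(2, 3) @ psi0

# Check a stabilizer of the chain, K = Z_0 X_1 Z_2.
K = op(Z, 0) @ op(X, 1) @ op(Z, 2)
print(np.allclose(K @ cluster, cluster))  # True: K fixes the cluster state

# One step of a dephasing superoperator on qubit 1, in Kraus form.
p = 0.2                                   # hypothetical flip probability
rho = np.outer(cluster, cluster)
rho_p = (1 - p) * rho + p * op(Z, 1) @ rho @ op(Z, 1)
print(np.trace(K @ rho_p).real)           # expectation drops to 1 - 2p
```

Tracking stabilizer expectations under such channels is one simple proxy for the fidelity-versus-entanglement questions the abstract raises.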
Exactly how has income inequality changed? Patterns of distributional change in core societies
The recent resurgence of income inequality in some of the core societies has spawned a wide-ranging debate as to the culprits. Progress in this debate has been complicated by the fact that many of the theories that have been developed to account for the inequality upswing imply radically different patterns of distributional change, while predicting the same outcome in terms of the behavior of standard summary measures (e.g., a rise in the Gini coefficient or in Theil's inequality). Handcock and Morris (1999) have developed methods that allow the analyst to precisely identify patterns of distributional change and a set of summary measures to characterize such changes. These are based on the relative distribution, defined for our purposes as the ratio of the fraction of households in the baseline year to the fraction of households in the comparison year in each decile of the distribution of income. We use the available high-quality data from the Luxembourg Income Study to explore the evolution of household income inequality in sixteen core societies. We describe exactly how inequality has grown in some core societies since the late 1960s and discuss the extent to which patterns of distributional change were homogeneous or heterogeneous across the core. We find that 1) rising inequality is generally associated with polarization, rather than upgrading or downgrading alone, 2) among those societies experiencing the largest increases in inequality, upgrading typically takes precedence over downgrading in the course of such polarization, and 3) declining inequality, where it occurs, has been the result of convergence, with the magnitude of the shift from the lower tail to the middle exceeding that of the shift from the upper tail to the middle.
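The relative-distribution machinery can be sketched in a few lines: take decile cut points from the baseline year and tabulate the share of comparison-year households falling in each (dividing by 0.10 gives the relative decile density). Equal shares mean no distributional change; extra mass in the bottom and top deciles is the polarization signature described above. Simulated lognormal incomes stand in for the LIS data here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated incomes: same median, but the comparison year is more dispersed.
baseline   = rng.lognormal(mean=10.0, sigma=0.5, size=50_000)
comparison = rng.lognormal(mean=10.0, sigma=0.7, size=50_000)

# Decile cut points from the baseline year...
cuts = np.quantile(baseline, np.linspace(0, 1, 11))
cuts[0], cuts[-1] = -np.inf, np.inf   # catch comparison values outside range

# ...and the share of comparison-year households landing in each decile.
shares = np.histogram(comparison, bins=cuts)[0] / comparison.size
print(np.round(shares, 3))  # U-shaped: mass moves to both tails (polarization)
```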