Limited phase deviation frequency multiplier phase modulator, Appendix E final report
Limited phase deviation frequency multiplier phase modulator - application to radio frequency test console equipment
Pure xenon hexafluoride prepared for thermal properties studies
Preparation of a xenon hexafluoride-sodium fluoride salt yields a sample of the highest possible purity for use in thermal measurements. The desired hexafluoride can easily be freed from the common contaminants xenon tetrafluoride, xenon difluoride, and xenon oxide tetrafluoride, because none of these compounds reacts with sodium fluoride.
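The purification scheme implied here can be written schematically as a complexation/decomposition cycle; the 1:1 stoichiometry and the heptafluoroxenate product below are illustrative assumptions, since the abstract does not specify them:

```latex
% Complex formation traps XeF6 (XeF4, XeF2, and XeOF4 do not react with NaF):
\mathrm{XeF_6} + \mathrm{NaF} \longrightarrow \mathrm{Na[XeF_7]}
% Heating the salt then releases the purified hexafluoride:
\mathrm{Na[XeF_7]} \xrightarrow{\;\Delta\;} \mathrm{NaF} + \mathrm{XeF_6}\uparrow
```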
Quantum Metropolis Sampling
The original motivation to build a quantum computer came from Feynman who
envisaged a machine capable of simulating generic quantum mechanical systems, a
task that is believed to be intractable for classical computers. Such a machine
would have a wide range of applications in the simulation of many-body quantum
physics, including condensed matter physics, chemistry, and high energy
physics. Part of Feynman's challenge was met by Lloyd who showed how to
approximately decompose the time-evolution operator of interacting quantum
particles into a short sequence of elementary gates, suitable for operation on
a quantum computer. However, this left open the problem of how to simulate the
equilibrium and static properties of quantum systems. This requires the
preparation of ground and Gibbs states on a quantum computer. For classical
systems, this problem is solved by the ubiquitous Metropolis algorithm, a
method that basically acquired a monopoly for the simulation of interacting
particles. Here, we demonstrate how to implement a quantum version of the
Metropolis algorithm on a quantum computer. This algorithm makes it possible
to sample directly from the eigenstates of the Hamiltonian and thus evades the
sign problem present in classical simulations. A small-scale implementation of
this algorithm can already be achieved with today's technology.
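The quantum algorithm itself is beyond a short listing, but for contrast, here is a minimal sketch of the classical Metropolis rule that the abstract says holds a near-monopoly on simulating interacting particles; the energy function and proposal move below are toy assumptions, not anything from the paper.

```python
import math
import random

def metropolis_sample(energy, propose, x0, beta, n_steps):
    """Classical Metropolis sampling from the Gibbs distribution
    proportional to exp(-beta * energy(x))."""
    x, e = x0, energy(x0)
    samples = []
    for _ in range(n_steps):
        x_new = propose(x)
        e_new = energy(x_new)
        # Accept with probability min(1, exp(-beta * (e_new - e))).
        if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return samples

# Toy usage: a random walker in a quadratic potential.
chain = metropolis_sample(
    energy=lambda x: x * x,
    propose=lambda x: x + random.choice([-1, 1]),
    x0=0, beta=1.0, n_steps=10_000,
)
```

Roughly speaking, the quantum version replaces the configuration x with an eigenstate of the Hamiltonian, reads out energies via phase estimation, and implements the accept/reject step with a reversible rewind so that rejected moves can be undone without collapsing the state.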
Saharan dust and biomass burning aerosols during ex-hurricane Ophelia: Observations from the new UK lidar and sun-photometer network
On 15-16 October 2017, ex-hurricane Ophelia passed to the west of the British Isles, bringing dust from the Sahara and smoke from Portuguese forest fires that was observable to the naked eye and reported in the UK's national press. We report here detailed observations of this event using the UK operational lidar and sun-photometer network, established for the early detection of aviation hazards, including volcanic ash. We also use ECMWF ERA5 wind field data and MODIS imagery to examine the aerosol transport. The observations, taken continuously over a period of 30 h, show a complex picture, dominated by several different aerosol layers at different times and clearly correlated with the passage of different air masses associated with the intense cyclonic system. A similar evolution was observed at several sites, with a time delay between them explained by their different locations with respect to the storm and its associated meteorological features. The event commenced with a shallow dust layer at 1-2 km in altitude and culminated in a deep and complex structure that lasted ∼12 h at each site over the UK, correlated with the storm's warm sector. For most of the time, the aerosol detected was dominated by mineral dust mixtures, as highlighted by depolarisation measurements, but an intense biomass burning aerosol (BBA) layer was observed towards the end of the event, lasting around 3 h at each site. The aerosol optical depth at 355 nm (AOD355) during the whole event ranged from 0.2 to 2.9, with the larger AOD correlated with the intense BBA layer. Such a large AOD is unprecedented in the UK according to AERONET records for the last 20 years. The Raman lidars permitted the measurement of the aerosol extinction coefficient at 355 nm, the particle linear depolarisation ratio (PLDR), and the lidar ratio (LR), and made it possible to separate the dust (depolarising) aerosol from other aerosol types. A specific extinction has also been computed to provide an estimate of the atmospheric concentration of each aerosol type separately, which peaked at 420 ± 200 μg m^{-3} for the dust and 558 ± 232 μg m^{-3} for the biomass burning aerosols. Back trajectories computed using the Numerical Atmospheric-dispersion Modelling Environment (NAME) were used to identify the sources and strengthen the conclusions drawn from the observations. The UK network represents a significant expansion of the observing capability in northern Europe, with instruments evenly distributed across Great Britain, from Camborne in Cornwall to Lerwick in the Shetland Islands, and this study represents the first attempt to demonstrate its capability and validate the methods in use. Its ultimate purpose will be the detection and quantification of volcanic plumes, but the present study clearly demonstrates the advanced capabilities of the network.
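As an aside on how mass concentrations like those quoted above are obtained: dividing the lidar-derived extinction coefficient by a specific extinction (mass extinction efficiency) for the aerosol type gives a mass concentration. A minimal sketch with made-up input values (the study's actual specific extinctions are not given in this abstract):

```python
def mass_concentration(extinction, specific_extinction):
    """Aerosol mass concentration in ug m^-3 from an extinction
    coefficient (m^-1) and a specific extinction (m^2 g^-1)."""
    return extinction / specific_extinction * 1e6  # g m^-3 -> ug m^-3

# Hypothetical dust layer: extinction 2.5e-4 m^-1, specific extinction 0.6 m^2/g.
print(mass_concentration(2.5e-4, 0.6))  # ~417 ug m^-3, the order of the peak dust value
```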
General Monogamy Inequality for Bipartite Qubit Entanglement
We consider multipartite states of qubits and prove that their bipartite
quantum entanglement, as quantified by the concurrence, satisfies a monogamy
inequality conjectured by Coffman, Kundu, and Wootters. We relate this monogamy
inequality to the concept of frustration of correlations in quantum spin
systems.
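For reference, the inequality proved here can be written out as follows, with C the concurrence between qubit A and each other qubit individually, and the right-hand side the concurrence across the bipartite cut separating A from the rest:

```latex
% General monogamy inequality (Coffman-Kundu-Wootters, proved in this work)
% for an n-qubit state on qubits A, B_1, ..., B_{n-1}:
C^{2}_{A B_1} + C^{2}_{A B_2} + \cdots + C^{2}_{A B_{n-1}}
  \;\le\; C^{2}_{A \mid (B_1 B_2 \cdots B_{n-1})}
```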
Causal structure of the entanglement renormalization ansatz
We show that the multiscale entanglement renormalization ansatz (MERA) can be
reformulated in terms of a causality constraint on discrete quantum dynamics.
This causal structure is that of de Sitter space with a flat spacelike
boundary, where the volume of a spacetime region corresponds to the number of
variational parameters it contains. This result clarifies the nature of the
ansatz, and suggests a generalization to quantum field theory. It also
constitutes an independent justification of the connection between MERA and
hyperbolic geometry which was proposed as a concrete implementation of the
AdS/CFT correspondence.
The SSS phase of RS Ophiuchi observed with Chandra and XMM-Newton I.: Data and preliminary Modeling
The phase of Super-Soft-Source (SSS) emission of the sixth recorded outburst
of the recurrent nova RS Oph was observed twice with Chandra and once with
XMM-Newton. The observations were taken on days 39.7, 54.0, and 66.9 after
outburst. We confirm a 35 s period on day 54.0 and find that it originates
from the SSS emission and not from the shock. We discuss the bound-free
absorption by neutral elements in the line of sight, resonance absorption lines
plus self-absorbed emission line components, collisionally excited emission
lines from the shock, He-like intersystem lines, and spectral changes during an
episode of high-amplitude variability. We find a decrease of the oxygen K-shell
absorption edge that can be explained by photoionization of oxygen. The
absorption component has average velocities of -1286 ± 267 km/s on day 39.7 and
of -771 ± 65 km/s on day 66.9. The wavelengths of the emission line components
are consistent with their rest wavelengths as confirmed by measurements of
non-self absorbed He-like intersystem lines. We have evidence that these lines
originate from the shock rather than the outer layers of the outflow and may be
photoexcited in addition to being collisionally excited. We found collisionally
excited emission lines, fading at wavelengths shorter than 15 Å, that
originate from the radiatively cooling shock. On day 39.5 we find a systematic
blue shift of -526 ± 114 km/s in these lines. We found anomalous He-like f/i
ratios, which indicate either high densities or significant UV radiation near
the plasma where the emission lines are formed. During the phase of strong
variability, the spectral hardness light curve overlies the total light curve
when shifted by 1000 s. This can be explained by photoionization of neutral
oxygen in the line of sight if the densities are of order 10^{10}-10^{11} cm^{-3}.
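As an aside on how line-of-sight velocities such as those quoted above follow from measured line positions, a minimal non-relativistic Doppler conversion; the line and wavelengths below are illustrative placeholders, not measurements from the paper.

```python
C_KM_S = 299_792.458  # speed of light, km/s

def doppler_velocity(lambda_obs, lambda_rest):
    """Line-of-sight velocity from a line centroid shift; negative
    values are blue shifts (motion toward the observer)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

# Illustrative: an O VIII Ly-alpha line (rest 18.967 A) measured at 18.885 A.
print(doppler_velocity(18.885, 18.967))  # about -1296 km/s, a blue shift
```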
The Complexity of Admissibility in Omega-Regular Games
Iterated admissibility is a well-known and important concept in classical
game theory, e.g. to determine rational behaviors in multi-player matrix games.
As recently shown by Berwanger, this concept can be soundly extended to
infinite games played on graphs with omega-regular objectives. In this paper,
we study the algorithmic properties of this concept for such games. We settle
the exact complexity of natural decision problems on the set of strategies that
survive iterated elimination of dominated strategies. As a byproduct of our
construction, we obtain automata which recognize all the possible outcomes of
such strategies.
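To make the classical starting point concrete: in a finite two-player matrix game, iterated admissibility repeatedly removes weakly dominated strategies. A minimal sketch with invented payoffs follows (the paper's setting, infinite games on graphs with omega-regular objectives, is of course far richer):

```python
def weakly_dominates(t, s):
    """t weakly dominates s: at least as good everywhere, better somewhere."""
    return all(a >= b for a, b in zip(t, s)) and any(a > b for a, b in zip(t, s))

def iterated_admissibility(p1, p2):
    """Iterated elimination of weakly dominated rows (player 1) and
    columns (player 2) in a finite matrix game. p1[r][c] and p2[r][c]
    are the players' payoffs. Returns the surviving index lists."""
    rows, cols = list(range(len(p1))), list(range(len(p1[0])))
    changed = True
    while changed:
        changed = False
        for r in rows[:]:
            if any(weakly_dominates([p1[t][c] for c in cols],
                                    [p1[r][c] for c in cols])
                   for t in rows if t != r):
                rows.remove(r); changed = True
        for c in cols[:]:
            if any(weakly_dominates([p2[r][t] for r in rows],
                                    [p2[r][c] for r in rows])
                   for t in cols if t != c):
                cols.remove(c); changed = True
    return rows, cols

# Toy example: "defect" (index 1) survives for both players
# in a prisoner's-dilemma-style game.
p1 = [[3, 0], [5, 1]]
p2 = [[3, 5], [0, 1]]
print(iterated_admissibility(p1, p2))  # ([1], [1])
```

Note that with weak dominance, unlike strict dominance, the surviving set can depend on the elimination order, one of the subtleties the theory has to handle.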
Exploring scholarly data with Rexplore.
Despite the large number and variety of tools and services available today for exploring scholarly data, current support is still very limited in the context of sensemaking tasks, which go beyond standard search and ranking of authors and publications and focus instead on i) understanding the dynamics of research areas, ii) relating authors ‘semantically’ (e.g., in terms of common interests or shared academic trajectories), or iii) performing fine-grained academic expert search along multiple dimensions. To address this gap we have developed a novel tool, Rexplore, which integrates statistical analysis, semantic technologies, and visual analytics to provide effective support for exploring and making sense of scholarly data. Here, we describe the main innovative elements of the tool and present the results of a task-centric empirical evaluation, which shows that Rexplore is highly effective at supporting the aforementioned sensemaking tasks. These results are robust both to the background of the users (i.e., expert analysts vs. ‘ordinary’ users) and to whether the tasks are selected by the evaluators or proposed by the users themselves.
Space tug propulsion system failure mode, effects and criticality analysis
For the purposes of the study, the propulsion system was considered as consisting of the following: (1) main engine system, (2) auxiliary propulsion system, (3) pneumatic system, (4) hydrogen feed, fill, drain, and vent system, (5) oxygen feed, fill, drain, and vent system, and (6) helium reentry purge system. Each component was critically examined to identify possible failure modes and the subsequent effect on mission success. Each space tug mission consists of three phases: launch to separation from the shuttle, separation to redocking, and redocking to landing. The analysis considered the results of failure of a component during each phase of the mission. After the failure modes of each component were tabulated, those components whose failure would result in possible or certain loss of the mission, or in inability to return the tug to the ground, were identified as critical components, and a criticality number was determined for each. The criticality number of a component denotes the number of mission failures in one million missions due to the loss of that component. A total of 68 components were identified as critical, with criticality numbers ranging from 1 to 2990.
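To illustrate the criticality-number bookkeeping described above: since the number counts expected mission failures per one million missions, it can be derived from a per-mission failure probability. The input values below are invented for illustration; the report's actual failure rates are not given in this abstract.

```python
def criticality_number(p_failure_per_mission, p_loss_given_failure=1.0):
    """Criticality number: expected mission failures per one million
    missions attributable to this component."""
    return p_failure_per_mission * p_loss_given_failure * 1_000_000

# Hypothetical component that fails in 0.299% of missions and whose
# failure always causes mission loss.
print(criticality_number(0.00299))  # 2990.0, the top of the reported 1-2990 range
```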