Rationale and design of the Clinical Evaluation of Magnetic Resonance Imaging in Coronary heart disease 2 trial (CE-MARC 2): a prospective, multicenter, randomized trial of diagnostic strategies in suspected coronary heart disease
Background:
A number of investigative strategies exist for the diagnosis of coronary heart disease (CHD). Despite the widespread availability of noninvasive imaging, invasive angiography is commonly used early in the diagnostic pathway. Consequently, approximately 60% of angiograms reveal no evidence of obstructive coronary disease. Reducing unnecessary angiography has potential financial savings and avoids exposing the patient to unnecessary risk. There are no large-scale comparative effectiveness trials of the different diagnostic strategies recommended in international guidelines and none that have evaluated the safety and efficacy of cardiovascular magnetic resonance.
Trial Design:
CE-MARC 2 is a prospective, multicenter, 3-arm parallel group, randomized controlled trial of patients with suspected CHD (pretest likelihood 10%-90%) requiring further investigation. A total of 1,200 patients will be randomized on a 2:2:1 basis to receive 3.0-T cardiovascular magnetic resonance–guided care, single-photon emission computed tomography–guided care (according to American College of Cardiology/American Heart Association appropriate-use criteria), or National Institute for Health and Care Excellence guidelines–based management. The primary (efficacy) end point is the occurrence of unnecessary angiography, defined by a normal (>0.8) invasive fractional flow reserve. Safety of each strategy will be assessed by 3-year major adverse cardiovascular event rates, and cost-effectiveness and health-related quality-of-life assessments will also be performed.
Conclusions:
The CE-MARC 2 trial will provide comparative efficacy and safety evidence for 3 different strategies of investigating patients with suspected CHD, with the intention of reducing unnecessary invasive angiography rates. Evaluation of these management strategies has the potential to improve patient care, health-related quality of life, and the cost-effectiveness of CHD investigation.
Automated reduction of submillimetre single-dish heterodyne data from the James Clerk Maxwell Telescope using ORAC-DR
With the advent of modern multi-detector heterodyne instruments that can
result in observations generating thousands of spectra per minute it is no
longer feasible to reduce these data as individual spectra. We describe the
automated data reduction procedure used to generate baselined data cubes from
heterodyne data obtained at the James Clerk Maxwell Telescope. The system can
automatically detect baseline regions in spectra and automatically determine
regridding parameters, all without input from a user. Additionally, it can
detect and remove spectra suffering from transient interference effects or
anomalous baselines. The pipeline is written as a set of recipes using the
ORAC-DR pipeline environment with the algorithmic code using Starlink software
packages and infrastructure. The algorithms presented here can be applied to
other heterodyne array instruments and have been applied to data from
historical JCMT heterodyne instrumentation.
Comment: 18 pages, 13 figures, submitted to Monthly Notices of the Royal Astronomical Society
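The abstract does not spell out how baseline regions are detected automatically; as a rough illustration of one standard approach (iterative sigma-clipped polynomial fitting, with all function and parameter names invented here, not taken from ORAC-DR), a single-spectrum version might look like:

```python
import numpy as np

def subtract_baseline(spectrum, poly_order=1, clip_sigma=3.0, n_iter=5):
    """Estimate and remove a polynomial baseline from one spectrum.

    Emission-free ("baseline") channels are found by iterative sigma
    clipping: channels far from the current fit are excluded and the
    polynomial is refit on the remaining channels until the mask is stable.
    """
    x = np.arange(spectrum.size)
    mask = np.ones(spectrum.size, dtype=bool)  # start with all channels
    for _ in range(n_iter):
        coeffs = np.polyfit(x[mask], spectrum[mask], poly_order)
        resid = spectrum - np.polyval(coeffs, x)
        sigma = np.std(resid[mask])
        new_mask = np.abs(resid) < clip_sigma * sigma
        if np.array_equal(new_mask, mask):
            break  # mask converged
        mask = new_mask
    return spectrum - np.polyval(coeffs, x), mask
```

A pipeline such as the one described would apply this kind of step per detector and per spectrum before regridding into a cube; the real implementation uses Starlink algorithms rather than this sketch.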
Longmeyer Exposes or Creates Uncertainty about the Duty to Inform Remainder Beneficiaries of a Revocable Trust
This article discusses the surprising Longmeyer decision, handed down by the Supreme Court of Kentucky earlier this year, in which a predecessor trustee was held to have a duty to give certain notifications to former remainder beneficiaries of a revocable trust. The authors then examine how Longmeyer might have been decided in other states and under other statutory schemes. The article concludes with observations concerning when certain notices to trust beneficiaries may be conducive to effective trust administration and suggestions to those who administer trusts on how best to comply with beneficiary notice requirements.
Pando: Personal Volunteer Computing in Browsers
The large penetration and continued growth in ownership of personal
electronic devices represents a freely available and largely untapped source of
computing power. To leverage this capacity, we present Pando, a new volunteer computing
tool based on a declarative concurrent programming model and implemented using
JavaScript, WebRTC, and WebSockets. This tool enables a dynamically varying
number of failure-prone personal devices contributed by volunteers to
parallelize the application of a function on a stream of values, by using the
devices' browsers. We show that Pando can provide throughput improvements
compared to a single personal device, on a variety of compute-bound
applications including animation rendering and image processing. We also show
the flexibility of our approach by deploying Pando on personal devices
connected over a local network, on Grid5000, a French-wide computing grid in a
virtual private network, and seven PlanetLab nodes distributed in a wide area
network over Europe.
Comment: 14 pages, 12 figures, 2 tables
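Pando itself is implemented in JavaScript over WebRTC and WebSockets; purely to illustrate the underlying pattern (fault-tolerant parallel application of a function to a stream of values, where any worker may fail mid-task), here is a hedged Python sketch using a thread pool, with all names hypothetical:

```python
import concurrent.futures as cf
import random

def parallel_map_fault_tolerant(fn, values, max_workers=4, max_retries=3):
    """Apply fn to every value using a pool of possibly failing workers.

    Failed applications are resubmitted (to any worker) until they
    succeed or the retry budget runs out, loosely mirroring how a
    volunteer pool must tolerate devices that disappear mid-computation.
    """
    results = {}
    pending = {i: 0 for i in range(len(values))}  # index -> retry count
    with cf.ThreadPoolExecutor(max_workers=max_workers) as pool:
        while pending:
            futures = {pool.submit(fn, values[i]): i for i in pending}
            failed = {}
            for fut in cf.as_completed(futures):
                i = futures[fut]
                try:
                    results[i] = fut.result()
                except Exception:
                    if pending[i] + 1 >= max_retries:
                        raise  # retry budget exhausted for this value
                    failed[i] = pending[i] + 1
            pending = failed
    return [results[i] for i in range(len(values))]

# Illustration: a task failing ~30% of the time, standing in for
# volunteer devices dropping out mid-computation.
random.seed(1)
def flaky_square(x):
    if random.random() < 0.3:
        raise RuntimeError("volunteer dropped out")
    return x * x

squares = parallel_map_fault_tolerant(flaky_square, list(range(10)), max_retries=10)
```

The real system differs in important ways (browser sandboxing, WebRTC transport, dynamic volunteer arrival), but the retry-until-complete stream semantics are the same idea.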
Vortex Formation by Interference of Multiple Trapped Bose-Einstein Condensates
We report observations of vortex formation as a result of merging together
multiple Rb Bose-Einstein condensates (BECs) in a confining potential.
In this experiment, a trapping potential is partitioned into three sections by
a barrier, enabling the simultaneous formation of three independent,
uncorrelated condensates. The three condensates then merge together into one
BEC, either by removal of the barrier, or during the final stages of
evaporative cooling if the barrier energy is low enough; both processes can
naturally produce vortices within the trapped BEC. We interpret the vortex
formation mechanism as originating in interference between the initially
independent condensates, with indeterminate relative phases between the three
initial condensates and the condensate merging rate playing critical roles in
the probability of observing vortices in the final, single BEC.
Comment: 5 pages, 3 figures
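As a toy illustration of the interference mechanism (not a calculation from the paper): if the three condensates are assigned independent, uniformly random phases, the net phase winding around the central merge point can be sampled directly, and the fraction of trials with nonzero winding comes out near 1/4 in this simplified model.

```python
import numpy as np

def vortex_probability(n_trials=200_000, seed=0):
    """Toy Monte Carlo: draw three independent condensate phases and
    count how often the pairwise phase steps, each wrapped to (-pi, pi],
    accumulate to a full +/-2pi winding around the merge point."""
    rng = np.random.default_rng(seed)
    th = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, 3))
    # Phase steps 1->2, 2->3, 3->1 (close the loop), each wrapped
    d = np.diff(np.concatenate([th, th[:, :1]], axis=1), axis=1)
    wrapped = (d + np.pi) % (2.0 * np.pi) - np.pi
    winding = np.round(wrapped.sum(axis=1) / (2.0 * np.pi))
    return np.mean(winding != 0)
```

This ignores the merging dynamics entirely (which the abstract says also play a critical role); it only captures the statistics of the indeterminate relative phases.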
Experimental Implementation of the Quantum Baker's Map
This paper reports on the experimental implementation of the quantum baker's
map via a three bit nuclear magnetic resonance (NMR) quantum information
processor. The experiments tested the sensitivity of the quantum chaotic map to
perturbations. In the first experiment, the map was iterated forward and then
backwards to provide benchmarks for intrinsic errors and decoherence. In the
second set of experiments, the least significant qubit was perturbed in between
the iterations to test the sensitivity of the quantum chaotic map to applied
perturbations. These experiments are used to investigate previously predicted
properties of quantum chaotic dynamics.
Comment: submitted to PR
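For concreteness (and not taken from the paper itself, whose NMR conventions may differ), a standard Balazs-Voros quantization of the baker's map on three qubits can be written down directly; the forward-then-backward benchmark then corresponds to applying B followed by its inverse:

```python
import numpy as np

def dft(n):
    """Unitary discrete Fourier transform matrix of size n."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

def baker_map(n_qubits=3):
    """Balazs-Voros quantum baker's map on n_qubits:
    B = F_N^{-1} (I_2 kron F_{N/2}), with N = 2**n_qubits."""
    N = 2 ** n_qubits
    return dft(N).conj().T @ np.kron(np.eye(2), dft(N // 2))

B = baker_map(3)
# Iterating forward then backward recovers the initial state exactly;
# in the experiment, deviations from this identity benchmark the
# intrinsic errors and decoherence.
state = np.zeros(8); state[0] = 1.0
roundtrip = B.conj().T @ (B @ state)
```

The perturbation experiments described above would insert a small extra rotation on the least significant qubit between applications of B.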
Phase changes in 38 atom Lennard-Jones clusters: I. A parallel tempering study in the canonical ensemble
The heat capacity and isomer distributions of the 38 atom Lennard-Jones cluster have been calculated in the canonical ensemble using parallel tempering Monte Carlo methods. A distinct region of temperature is identified that corresponds to equilibrium between the global minimum structure and the icosahedral basin of structures. This region of temperatures occurs below the melting peak of the heat capacity and is accompanied by a peak in the derivative of the heat capacity with temperature. Parallel tempering is shown to introduce correlations between results at different temperatures. A discussion is given that compares parallel tempering with other related approaches that ensure ergodic simulations
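A minimal sketch of the parallel tempering idea used here, on a 1-D double-well stand-in for the cluster potential energy surface (all names and parameters are illustrative, not the paper's setup): each temperature runs its own Metropolis walker, and neighbouring replicas attempt configuration swaps accepted with probability min(1, exp(Δβ·ΔE)).

```python
import math, random

def parallel_tempering(energy, temps, n_sweeps=2000, step=0.5, seed=0):
    """Minimal 1-D parallel tempering sketch: one Metropolis walker per
    temperature, with neighbouring replicas attempting swaps accepted
    with probability min(1, exp((beta_i - beta_j) * (E_i - E_j)))."""
    rng = random.Random(seed)
    betas = [1.0 / t for t in temps]
    x = [0.0] * len(temps)
    e = [energy(xi) for xi in x]
    for _ in range(n_sweeps):
        for i in range(len(temps)):          # local Metropolis moves
            xn = x[i] + rng.uniform(-step, step)
            en = energy(xn)
            if en <= e[i] or rng.random() < math.exp(-betas[i] * (en - e[i])):
                x[i], e[i] = xn, en
        for i in range(len(temps) - 1):      # neighbour swap attempts
            d = (betas[i] - betas[i + 1]) * (e[i] - e[i + 1])
            if d >= 0 or rng.random() < math.exp(d):
                x[i], x[i + 1] = x[i + 1], x[i]
                e[i], e[i + 1] = e[i + 1], e[i]
    return x

# Usage on a double-well surrogate: the hot replicas cross the barrier
# freely and feed decorrelated configurations down to the cold one,
# which settles into a well.
double_well = lambda x: (x * x - 1.0) ** 2
replicas = parallel_tempering(double_well, temps=[0.05, 0.2, 1.0])
```

The swap moves are what restore ergodicity at low temperature; they are also the source of the inter-temperature correlations the abstract warns about, since neighbouring replicas repeatedly exchange configurations.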
Testing general relativity using golden black-hole binaries
The coalescences of stellar-mass black-hole binaries through their inspiral,
merger, and ringdown are among the most promising sources for ground-based
gravitational-wave (GW) detectors. If a GW signal is observed with sufficient
signal-to-noise ratio, the masses and spins of the black holes can be estimated
from just the inspiral part of the signal. Using these estimates of the initial
parameters of the binary, the mass and spin of the final black hole can be
uniquely predicted making use of general-relativistic numerical simulations. In
addition, the mass and spin of the final black hole can be independently
estimated from the merger--ringdown part of the signal. If the binary black
hole dynamics is correctly described by general relativity (GR), these
independent estimates have to be consistent with each other. We present a
Bayesian implementation of such a test of general relativity, which allows us
to combine the constraints from multiple observations. Using kludge modified GR
waveforms, we demonstrate that this test can detect sufficiently large
deviations from GR, and outline the expected constraints from upcoming GW
observations using the second generation of ground-based GW detectors.
Comment: 5 pages, 2 figures
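The paper's test combines full Bayesian posteriors; as a deliberately crude illustration of the underlying consistency check (Gaussian-approximated estimates, with all numbers hypothetical), one can ask how many standard deviations separate the inspiral-predicted final mass or spin from the merger-ringdown measurement:

```python
import math

def consistency_sigma(pred, pred_err, meas, meas_err):
    """Gaussian-approximation sketch of the inspiral vs merger-ringdown
    check: number of standard deviations between the final-mass (or
    final-spin) value predicted from the inspiral parameters and the
    one measured independently from the ringdown."""
    return abs(pred - meas) / math.hypot(pred_err, meas_err)

# Hypothetical numbers purely for illustration: an inspiral-based
# prediction of the final mass versus a ringdown measurement.
sigma = consistency_sigma(62.0, 3.0, 64.0, 4.0)
consistent_with_gr = sigma < 3.0  # e.g. flag tension above 3 sigma
```

The actual test marginalizes over correlated mass-spin posteriors and multiplies evidence across events rather than reducing each estimate to a single number with an error bar.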
Signed zeros of Gaussian vector fields: density, correlation functions and curvature
We calculate correlation functions of the (signed) density of zeros of
Gaussian distributed vector fields. We are able to express correlation
functions of arbitrary order through the curvature tensor of a certain abstract
Riemann-Cartan or Riemannian manifold. As an application, we discuss one- and
two-point functions. The zeros of a two-dimensional Gaussian vector field model
the distribution of topological defects in the high-temperature phase of
two-dimensional systems with orientational degrees of freedom, such as
superfluid films, thin superconductors and liquid crystals.
Comment: 14 pages, 1 figure, uses iopart.cls, improved presentation, to appear in J. Phys.