
    Optimally choosing small ensemble members to produce robust climate simulations

    This study examines the size of a climate model ensemble subset required to reproduce certain statistical characteristics of the full ensemble. The ensemble characteristics examined are the root mean square error, the ensemble mean, and the standard deviation. Subset ensembles are created using measures that consider simulation performance alone or that also include a measure of simulation independence relative to other ensemble members. It is found that the independence measure identifies subset ensembles that retain the desired full-ensemble characteristics while being smaller than those selected by either of the performance-based measures. It is suggested that model independence be considered when choosing ensemble subsets or creating new ensembles. © 2013 IOP Publishing Ltd
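
    The subset-selection idea above lends itself to a simple greedy procedure. Below is a minimal Python sketch, not taken from the paper: the scoring rule, the alpha trade-off parameter, and the synthetic fields are illustrative assumptions about how a performance-plus-independence criterion might be implemented.

```python
# Hypothetical sketch: greedy subset selection combining performance and
# independence, loosely following the idea described in the abstract.
import numpy as np

def rmse(field, reference):
    """Root mean square error of a model field against a reference."""
    return np.sqrt(np.mean((field - reference) ** 2))

def select_subset(models, reference, k, alpha=0.5):
    """Greedily pick k members, trading off skill (low RMSE vs. reference)
    against independence (large error distance to already chosen members)."""
    chosen = []
    remaining = list(range(len(models)))
    while len(chosen) < k and remaining:
        scores = []
        for i in remaining:
            skill = -rmse(models[i], reference)          # higher is better
            if chosen:
                indep = np.mean([rmse(models[i], models[j]) for j in chosen])
            else:
                indep = 0.0
            scores.append(skill + alpha * indep)
        best = remaining[int(np.argmax(scores))]
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Example with synthetic data: 20 "models", each a 50x50 field.
rng = np.random.default_rng(0)
reference = rng.normal(size=(50, 50))
models = [reference + rng.normal(scale=0.5 + 0.05 * i, size=(50, 50))
          for i in range(20)]
print(select_subset(models, reference, k=5))
```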

    Accounting for Skill in Trend, Variability, and Autocorrelation Facilitates Better Multi-Model Projections: Application to the AMOC and Temperature Time Series

    We present a novel quasi-Bayesian method to weight multiple dynamical models by their skill at capturing both potentially non-linear trends and first-order autocorrelated variability of the underlying process, and to make weighted probabilistic projections. We validate the method using a suite of one-at-a-time cross-validation experiments involving the Atlantic meridional overturning circulation (AMOC), its temperature-based index, and Korean summer mean maximum temperature. In these experiments the method tends to exhibit superior skill over a trend-only Bayesian model averaging weighting method in terms of weight assignment and probabilistic forecasts. Specifically, mean credible interval width and mean absolute error of the projections tend to improve. We apply the method to the problem of projecting summer mean maximum temperature change over Korea by the end of the 21st century using a multi-model ensemble. Compared to the trend-only method, the new method appreciably sharpens the probability distribution function (pdf) and increases the most likely, median, and mean future warming in Korea. The method is flexible, with the potential to improve forecasts in geosciences and other fields.
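
    As a rough illustration of the kind of skill weighting described above (not the paper's actual quasi-Bayesian scheme), the Python sketch below scores each model hindcast by its mismatch in linear trend and lag-1 autocorrelation against observations and forms a weighted projection; the Gaussian score widths and the synthetic data are assumptions.

```python
# Hypothetical sketch of skill-based model weighting: each model's weight
# combines how well it matches an observed linear trend and an observed
# lag-1 autocorrelation, then projections are averaged with those weights.
import numpy as np

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def trend(x):
    t = np.arange(len(x))
    return np.polyfit(t, x, 1)[0]

def weights(obs, hindcasts, sigma_trend=0.01, sigma_rho=0.1):
    """Unnormalised Gaussian 'skill' in trend and autocorrelation,
    normalised to sum to one."""
    w = []
    for h in hindcasts:
        d_trend = trend(h) - trend(obs)
        d_rho = lag1_autocorr(h) - lag1_autocorr(obs)
        w.append(np.exp(-0.5 * ((d_trend / sigma_trend) ** 2 +
                                (d_rho / sigma_rho) ** 2)))
    w = np.array(w)
    return w / w.sum()

# Synthetic example: 5 model hindcasts of a warming series, plus projections.
rng = np.random.default_rng(1)
obs = 0.02 * np.arange(100) + rng.normal(scale=0.3, size=100)
hindcasts = [0.02 * (1 + 0.2 * rng.normal()) * np.arange(100)
             + rng.normal(scale=0.3, size=100) for _ in range(5)]
projections = rng.normal(loc=3.0, scale=0.5, size=5)  # end-of-century warming
w = weights(obs, hindcasts)
print("weights:", np.round(w, 3), "weighted projection:", np.dot(w, projections))
```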

    Higgs Messengers

    We explore the consequences of the Higgs fields acting as messengers of supersymmetry breaking. The hidden-sector paradigm in the gauge mediation framework is relaxed by allowing two types of gauge-invariant, renormalizable operators that are typically discarded: direct couplings between the Higgses and supersymmetry-breaking singlets, and Higgs-messenger mixing terms. The most important phenomenological consequence is a flavor-dependent shift in sfermion masses. This shift arises from a one-loop contribution, which we compute for a general set of weak-doublet messengers. We also study a couple of explicit models in detail, finding that precision electroweak constraints can be satisfied with a spectrum significantly different from that of gauge mediation. Comment: 20 pages, 5 figures

    The Bolocam Galactic Plane Survey. XII. Distance Catalog Expansion Using Kinematic Isolation of Dense Molecular Cloud Structures With 13CO(1-0)

    We present an expanded distance catalog for 1,710 molecular cloud structures identified in the Bolocam Galactic Plane Survey (BGPS) version 2, representing a nearly threefold increase over the previous BGPS distance catalog. We additionally present a new method for incorporating extant data sets into our Bayesian distance probability density function (DPDF) methodology. To augment the dense-gas tracers (e.g., HCO+(3-2), NH3(1,1)) used to derive line-of-sight velocities for kinematic distances, we utilize the Galactic Ring Survey (GRS) 13CO(1-0) data to morphologically extract velocities for BGPS sources. The outline of a BGPS source is used to select a region of the GRS 13CO data, along with a reference region to subtract enveloping diffuse emission, to produce a 13CO line profile matched to the BGPS source. For objects with an HCO+(3-2) velocity, \approx 95% of the new 13CO(1-0) velocities agree with those of the dense gas. A new prior DPDF for kinematic distance ambiguity (KDA) resolution, based on a validated formalism for associating molecular cloud structures with known objects from the literature, is presented. We demonstrate this prior using catalogs of masers with trigonometric parallaxes and HII regions with robust KDA resolutions. The distance catalog presented here contains well-constrained distance estimates for 20% of BGPS V2 sources, with typical distance uncertainties \lesssim 0.5 kpc. Approximately 75% of the well-constrained sources lie within 6 kpc of the Sun, concentrated in the Scutum-Centaurus arm. Galactocentric positions of objects additionally trace out portions of the Sagittarius, Perseus, and Outer arms in the first and second Galactic quadrants, and we also find evidence for significant regions of interarm dense gas. Comment: 28 pages, 19 figures. Accepted for publication in ApJ. Distance-Omnibus code available at https://github.com/BGPS/distance-omnibu
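
    The morphological extraction step lends itself to a compact illustration. The following Python sketch is a simplified stand-in for the procedure described above, not the Distance-Omnibus implementation: the masks, the toy data cube, and the peak-finding rule are illustrative assumptions.

```python
# Hypothetical sketch of the morphological spectrum extraction described
# above: average the 13CO cube over the source footprint, subtract a
# reference spectrum from a surrounding region, and read off the peak velocity.
import numpy as np

def source_velocity(cube, source_mask, reference_mask, velocities):
    """cube: (n_v, n_y, n_x) brightness; masks: boolean (n_y, n_x)."""
    spec_src = cube[:, source_mask].mean(axis=1)
    spec_ref = cube[:, reference_mask].mean(axis=1)
    profile = spec_src - spec_ref          # remove enveloping diffuse emission
    return velocities[np.argmax(profile)]

# Toy example: a Gaussian line at +45 km/s inside the source footprint.
n_v, n_y, n_x = 200, 40, 40
velocities = np.linspace(-50, 150, n_v)
cube = 0.2 * np.ones((n_v, n_y, n_x))      # uniform diffuse background
yy, xx = np.mgrid[:n_y, :n_x]
source_mask = (yy - 20) ** 2 + (xx - 20) ** 2 < 5 ** 2
reference_mask = ((yy - 20) ** 2 + (xx - 20) ** 2 < 12 ** 2) & ~source_mask
line = np.exp(-0.5 * ((velocities - 45.0) / 3.0) ** 2)
cube[:, source_mask] += line[:, None]
print(source_velocity(cube, source_mask, reference_mask, velocities))  # ~45 km/s
```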

    Optical/Near-Infrared Imaging of Infrared-Excess Palomar-Green QSOs

    Ground-based high-spatial-resolution (FWHM < 0.3-0.8") optical and near-infrared imaging (0.4-2.2 um) is presented for a complete sample of optically selected Palomar-Green QSOs with far-infrared excesses at least as great as those of "warm" AGN-like ultraluminous infrared galaxies (ULIGs; L_ir/L_big-blue-bump > 0.46). In all cases, the host galaxies of the QSOs were detected, and most have discernible two-dimensional structure. The QSO host galaxies and the QSO nuclei are similar in magnitude at H-band. H-band luminosities of the hosts range from 0.5-7.5 L* with a mean of 2.3 L*, and are consistent with those found in ULIGs. Both the QSO nuclei and the host galaxies have near-infrared excesses, which may be the result of dust associated with the nucleus and of recent dusty star formation in the host. These results suggest that some, but not all, optically selected QSOs may have evolved from an infrared-active state triggered by the merger of two similarly sized L* galaxies, in a manner similar to that of the ultraluminous infrared galaxies. Comment: AASTeX format, 38 pages, 4 tables, 10 figures. Higher-quality figures are available in JPG format

    The Bolocam Galactic Plane Survey. XIII. Physical Properties and Mass Functions of Dense Molecular Cloud Structures

    We use the distance probability density function (DPDF) formalism of Ellsworth-Bowers et al. (2013, 2015) to derive physical properties for the collection of 1,710 Bolocam Galactic Plane Survey (BGPS) version 2 sources with well-constrained distance estimates. To account for Malmquist bias, we estimate that the present sample of BGPS sources is 90% complete above 400 M_\odot and 50% complete above 70 M_\odot. The mass distributions for the entire sample and astrophysically motivated subsets are generally fitted well by a lognormal function, with approximately power-law distributions at high mass. Power-law behavior emerges more clearly when the sample population is narrowed in heliocentric distance (power-law index \alpha = 2.0 \pm 0.1 for sources nearer than 6.5 kpc and \alpha = 1.9 \pm 0.1 for objects between 2 kpc and 10 kpc). The high-mass power-law indices are generally 1.85 \leq \alpha \leq 2.05 for various subsamples of sources, intermediate between that of giant molecular clouds and the stellar initial mass function. The fit to the entire sample yields a high-mass power-law index \hat{\alpha} = 1.94_{-0.10}^{+0.34}. Physical properties of BGPS sources are consistent with large molecular cloud clumps or small molecular clouds, but the fractal nature of the dense interstellar medium makes it difficult to map observational categories onto the dominant physical processes driving the observed structure. The face-on map of the Galactic disk's mass surface density based on BGPS dense molecular cloud structures reveals the high-mass star-forming regions W43, W49, and W51 as prominent mass concentrations in the first quadrant. Furthermore, we present a 0.25-kpc resolution map of the dense gas mass fraction across the Galactic disk that peaks around 5%. Comment: Accepted for publication in ApJ; 32 pages, 21 figures
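
    As a toy illustration of fitting such a mass distribution (not the paper's actual fitting machinery), the Python sketch below fits a lognormal to synthetic clump masses and estimates a high-mass power-law index with a simple maximum-likelihood estimator; the mass cut and the sample itself are assumptions.

```python
# Hypothetical sketch: fit a lognormal to a clump mass sample and estimate
# the high-mass power-law index with a simple maximum-likelihood (Pareto-type)
# estimator above a chosen mass cut. Numbers are synthetic, not BGPS data.
import numpy as np

def lognormal_fit(masses):
    """Return the median mass and logarithmic width of a lognormal fit."""
    logm = np.log(masses)
    return np.exp(logm.mean()), logm.std()

def powerlaw_index(masses, m_min):
    """MLE for dN/dM ~ M^-alpha above m_min (requires alpha > 1)."""
    m = masses[masses >= m_min]
    return 1.0 + len(m) / np.sum(np.log(m / m_min))

rng = np.random.default_rng(2)
masses = rng.lognormal(mean=np.log(300.0), sigma=1.0, size=5000)  # solar masses
median_mass, width = lognormal_fit(masses)
print("lognormal median, width:", round(median_mass, 1), round(width, 2))
print("high-mass index:", round(powerlaw_index(masses, m_min=2000.0), 2))
```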

    The Bolocam Galactic Plane Survey IX: Data Release 2 and Outer Galaxy Extension

    We present a re-reduction and expansion of the Bolocam Galactic Plane Survey (BGPS), first presented by Aguirre et al. (2011) and Rosolowsky et al. (2010). The BGPS is a 1.1 mm survey of dust emission in the northern Galactic plane, covering longitudes -10 < \ell < 90 and latitudes |b| < 0.5 with a typical 1-\sigma RMS sensitivity of 30-100 mJy in a 33" beam. Version 2 of the survey includes an additional 20 square degrees of coverage in the 3rd and 4th quadrants and 2 square degrees in the 1st quadrant. The new data release has improved angular recovery, with complete recovery out to 80" and partial recovery to 300", and reduced negative bowls around bright sources resulting from the atmospheric subtraction process. We resolve the factor of 1.5 flux calibration offset between the v1.0 data release and other data sets and determine that there is no offset between v2.0 and other data sets. The v2.0 pointing accuracy is tested against other surveys and demonstrated to be accurate and an improvement over v1.0. We present simulations and tests of the pipeline and its properties, including measurements of the pipeline's angular transfer function. The Bolocat cataloging tool was used to extract a new catalog, which includes 8594 sources, with 591 in the expanded regions. We have demonstrated that the Bolocat 40" and 80" apertures are accurate even in the presence of strong extended background emission. The number of sources is lower than in v1.0, but the amount of flux and area included in identified sources is larger. Comment: 36 pages, 16 figures, accepted to ApJS. Data available from http://irsa.ipac.caltech.edu/data/BOLOCAM_GPS

    Economic evaluation of short treatment for multidrug-resistant tuberculosis, Ethiopia and South Africa: the STREAM trial

    OBJECTIVE STREAM was a phase-III non-inferiority randomised controlled trial (RCT) to evaluate a shortened regimen for multidrug-resistant tuberculosis (MDR-TB), and included the first-ever within-trial economic evaluation of such regimens, reported here. METHODS We compared the costs of ‘Long’ (20-22 months) and ‘Short’ (9-11 months) regimens in Ethiopia and South Africa. Cost data were collected from trial participants, and health system costs were estimated using ‘bottom-up’ and ‘top-down’ costing approaches. A cost-effectiveness analysis was conducted with the trial primary outcome as the measure of effectiveness, including a probabilistic sensitivity analysis (PSA) to illustrate decision uncertainty. FINDINGS The Short regimen reduced healthcare costs per case by 21% in South Africa (US$8,341 Long vs US$6,619 Short) and 25% in Ethiopia (US$6,097 Long vs US$4,552 Short). The largest component of this saving was medication in South Africa (67%) and social support in Ethiopia (35%). In Ethiopia, participants on the Short regimen reported reductions in dietary supplementation expenditure (US$225 per case, 95% CI 133-297) and greater productivity (667 additional hours worked, 95% CI 193-1127). Patient cost savings also arose from fewer visits to health facilities (Ethiopia US$13, 95% CI 11-14; South Africa US$64, 95% CI 50-77, per case). The probability of cost-effectiveness was >95% when favourable outcomes were valued at <US$19,000 (Ethiopia) or <US$14,500 (South Africa). CONCLUSION The Short regimen provided substantial health system cost savings and reduced the financial burden on participants. Shorter regimens are likely to be cost-effective in most settings, and are an effective strategy to support the WHO goal of eliminating catastrophic costs in TB.
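
    The probabilistic sensitivity analysis mentioned above can be illustrated with a short Monte Carlo sketch. The Python below is not the trial's analysis; the distributions, cost savings, and willingness-to-pay values are illustrative assumptions used only to show how a probability of cost-effectiveness is computed from net monetary benefit.

```python
# Hypothetical sketch of a probabilistic sensitivity analysis: draw incremental
# costs and effects, and report the probability the shorter regimen is
# cost-effective at a given willingness-to-pay. All numbers are illustrative.
import numpy as np

def prob_cost_effective(d_cost, d_effect, wtp):
    """Fraction of simulations with positive net monetary benefit."""
    nmb = wtp * d_effect - d_cost
    return np.mean(nmb > 0)

rng = np.random.default_rng(3)
n = 10_000
d_cost = rng.normal(loc=-1700.0, scale=400.0, size=n)   # incremental cost (US$, negative = saving)
d_effect = rng.normal(loc=0.0, scale=0.05, size=n)      # incremental favourable outcomes
for wtp in (0, 5000, 15000):
    print("WTP", wtp, "-> P(cost-effective):", prob_cost_effective(d_cost, d_effect, wtp))
```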

    Quantifying the overall added value of dynamical downscaling and the contribution from different spatial scales

    This study evaluates the added value in the representation of surface climate variables from an ensemble of regional climate model (RCM) simulations by comparing the relative skill of the RCM simulations and their driving data over a wide range of RCM experimental setups and climate statistics. The methodology is specifically designed to compare results across different variables and metrics, and it incorporates a rigorous approach to separate the added value occurring at different spatial scales. Results show that the RCMs’ added value strongly depends on the type of driving data, the climate variable, and the region of interest but depends rather weakly on the choice of the statistical measure, the season, and the RCM physical configuration. Decomposing climate statistics according to different spatial scales shows that improvements are coming from the small scales when considering the representation of spatial patterns, but from the large-scale contribution in the case of absolute values. Our results also show that a large part of the added value can be attained using some simple postprocessing methods
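
    One common way to quantify added value of this kind is a normalised skill-difference score, optionally computed separately on large- and small-scale components of the fields. The Python sketch below is an illustrative approximation of such a diagnostic, not the study's exact metric; the smoothing scale and the synthetic fields are assumptions.

```python
# Hypothetical sketch of an added-value diagnostic: compare the squared error
# of an RCM field and its driving data against observations, and split the
# fields into large- and small-scale parts with a simple smoothing filter.
import numpy as np
from scipy.ndimage import uniform_filter

def added_value(rcm, driving, obs):
    """Positive where the RCM is closer to observations than its driving data."""
    e_rcm = (rcm - obs) ** 2
    e_drv = (driving - obs) ** 2
    return (e_drv - e_rcm) / (np.maximum(e_drv, e_rcm) + 1e-12)

def split_scales(field, size=11):
    """Large scale = smoothed field; small scale = residual."""
    large = uniform_filter(field, size=size, mode="nearest")
    return large, field - large

# Synthetic 2D fields standing in for a surface climate variable.
rng = np.random.default_rng(4)
obs = rng.normal(size=(100, 100))
driving = obs + rng.normal(scale=1.0, size=obs.shape)
rcm = obs + rng.normal(scale=0.7, size=obs.shape)        # better small scales
print("mean added value:", added_value(rcm, driving, obs).mean())
obs_l, obs_s = split_scales(obs)
rcm_l, rcm_s = split_scales(rcm)
drv_l, drv_s = split_scales(driving)
print("small-scale added value:", added_value(rcm_s, drv_s, obs_s).mean())
```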

    Modeling payback from research into the efficacy of left-ventricular assist devices as destination therapy

    Objectives: Ongoing developments in design have improved the outlook for left-ventricular assist device (LVAD) implantation as a therapy in end-stage heart failure. Nevertheless, early cost-effectiveness assessments, based on first-generation devices, have not been encouraging. Against this background, we set out (i) to examine the survival benefit that LVADs would need to generate before they could be deemed cost-effective; (ii) to provide insight into the likelihood that this benefit will be achieved; and (iii) from the perspective of a healthcare provider, to assess the value of discovering the actual size of this benefit by means of a Bayesian value-of-information analysis. Methods: Cost-effectiveness assessments are made from the perspective of the healthcare provider, using current UK norms for the value of a quality-adjusted life-year (QALY). The treatment model is grounded in published analyses of the Randomized Evaluation of Mechanical Assistance for the Treatment of Congestive Heart Failure (REMATCH) trial of first-generation LVADs, translated into a UK cost setting. The prospects for patient survival with second-generation devices are assessed using Bayesian prior distributions elicited from a group of leading clinicians in the field. Results: Using established thresholds, cost-effectiveness probabilities under these priors are found to be low (.2 percent) for devices costing as much as £60,000. Sensitivity of the conclusions to both device cost and QALY valuation is examined. Conclusions: In the event that the price of the device in use were to fall to £40,000, the value of the survival information can readily justify investment in further trials
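
    A value-of-information calculation of the kind described above can be sketched in a few lines. The Python below is illustrative, not the paper's model: the prior on the QALY gain, the device cost, and the willingness-to-pay threshold are assumed values used only to show how the expected value of perfect information (EVPI) is computed.

```python
# Hypothetical sketch of a value-of-information calculation: with a prior over
# the incremental QALY gain, the expected value of perfect information (EVPI)
# is the gap between deciding after learning the truth and deciding now.
# Costs and priors below are illustrative, not the paper's elicited values.
import numpy as np

rng = np.random.default_rng(5)
wtp = 30_000.0                 # assumed UK-style willingness to pay per QALY (GBP)
device_cost = 60_000.0         # assumed incremental cost of the LVAD strategy (GBP)
qaly_gain = rng.lognormal(mean=np.log(1.0), sigma=0.6, size=100_000)  # prior draws

nmb_lvad = wtp * qaly_gain - device_cost     # net monetary benefit vs. comparator
nmb_none = np.zeros_like(nmb_lvad)           # comparator defined as zero benefit

evpi = np.mean(np.maximum(nmb_lvad, nmb_none)) - max(nmb_lvad.mean(), nmb_none.mean())
print("P(cost-effective):", np.mean(nmb_lvad > 0))
print("per-patient EVPI (GBP):", round(evpi, 1))
```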