
    The enhancement and decrement of the Sunyaev-Zel'dovich effect towards the ROSAT Cluster RXJ0658-5557

    Published in: Astrophys. J. 513 (1999) 23. Abstract: We report simultaneous observations at 1.2 and 2 mm, made with a double-channel photometer on the SEST Telescope, of the X-ray cluster RXJ0658-5557 in search of the Sunyaev-Zel'dovich (S-Z) effect. The S-Z data were analyzed using the relativistically correct expression for the Comptonization parameter, and from the detected decrement we find a Comptonization parameter of (2.60 +/- 0.79) × 10^{-4}, consistent with the value computed from the X-ray (ROSAT and ASCA) observations. The quoted uncertainty includes contributions from the statistical uncertainty of the detection, systematics, and calibration. The 1.2 mm channel data alone give a larger Comptonization parameter; this result is discussed in terms of contamination from foreground sources and/or dust in the cluster, or a possible systematic effect. We then use the combined analysis of the ROSAT and ASCA X-ray observations to determine an isothermal model for the S-Z surface brightness. Since the cluster is asymmetrical and probably in a merging process, such models are only approximate; the associated uncertainty can, however, be estimated by exploring a set of alternative models, which yields a global uncertainty on the Comptonization parameter of a factor of 1.3. Combining the S-Z and X-ray measurements, we determine a value for the Hubble constant. The 2 mm data are consistent with H_0(q_0 = 1/2) = 53^{+38}_{-28} km s^{-1} Mpc^{-1}, where the uncertainty is dominated by the uncertainty in models of the X-ray plasma halo.
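    The distance estimate rests on the different density dependences of the S-Z decrement and the X-ray surface brightness along the line of sight. A schematic of this standard argument for an isothermal cluster (a sketch of the generic method, not the paper's specific modeling) is:

    ```latex
    % S-Z decrement: linear in the electron density n_e
    y \propto \int n_e T_e \, dl \sim n_e T_e L
    % X-ray surface brightness: quadratic in n_e
    S_X \propto \int n_e^2 \Lambda(T_e) \, dl \sim n_e^2 \Lambda(T_e) L
    % Eliminating n_e gives the physical path length L; comparing it with the
    % angular size \theta yields the angular diameter distance and hence H_0:
    L \propto \frac{y^2}{S_X} \frac{\Lambda(T_e)}{T_e^2},
    \qquad D_A \simeq \frac{L}{\theta},
    \qquad H_0 \simeq \frac{c z}{D_A} \quad (\text{low } z).
    ```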

    Surgical treatment of stage IV colorectal cancer with synchronous liver metastases : a systematic review and network meta-analysis

    Background: The ideal treatment approach for colorectal cancer (CRC) with synchronous liver metastases (SCRLM) remains debated. We performed a network meta-analysis (NMA) comparing the 'bowel-first' approach (BFA), simultaneous resection (SIM), and the 'liver-first' approach (LFA). Methods: A systematic search of comparative studies in CRC with SCRLM was undertaken using the Embase, PubMed, Web of Science, and CENTRAL databases. Outcome measures included postoperative complications, 30- and 90-day mortality, chemotherapy use, treatment completion rate, 3- and 5-year recurrence-free survival, and 3- and 5-year overall survival (OS). Pairwise and network meta-analyses were performed to compare strategies, and heterogeneity was assessed using the Higgins I² statistic. Results: One prospective and 43 retrospective studies reporting on 10,848 patients were included. Patients undergoing the LFA were more likely to have rectal primaries and a higher metastatic load. The SIM approach resulted in a higher risk of major morbidity and 30-day mortality. Compared to the BFA, the LFA more frequently resulted in failure to complete treatment as planned (34% versus 6%). Pairwise and network meta-analysis showed similar 5-year OS between the LFA and BFA, and a more favorable 5-year OS after SIM compared to the LFA (odds ratio 0.25-0.90, p = 0.02, I² = 0%), but not compared to the BFA. Conclusion: Despite a higher tumor load in LFA patients compared to BFA patients, survival was similar, although a lower rate of treatment completion was observed with the LFA. Uncertainty remains substantial due to imprecise estimates of treatment effects. In the absence of prospective trials, treatment of stage IV CRC patients should be individually tailored.
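    The heterogeneity measure quoted in the results (I²) and a pooled odds ratio follow from a standard inverse-variance calculation; a minimal sketch with made-up study data (not values from this review) is:

    ```python
    import numpy as np

    # Hypothetical per-study log odds ratios and their standard errors
    log_or = np.array([-0.8, -0.4, -0.6, -0.2])
    se = np.array([0.30, 0.25, 0.40, 0.35])

    w = 1.0 / se**2                              # inverse-variance weights
    pooled = np.sum(w * log_or) / np.sum(w)      # fixed-effect pooled log odds ratio

    # Cochran's Q and Higgins I^2 (share of variability attributable to heterogeneity)
    Q = np.sum(w * (log_or - pooled) ** 2)
    df = len(log_or) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    print(f"pooled OR = {np.exp(pooled):.2f}, Q = {Q:.2f}, I^2 = {I2:.1f}%")
    ```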

    Radio emissions from double RHESSI TGFs

    A detailed analysis of Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) terrestrial gamma-ray flashes (TGFs) is performed in association with World Wide Lightning Location Network (WWLLN) sources and very low frequency (VLF) sferics recorded at Duke University. The RHESSI clock offset is evaluated and found to have changed on 5 August 2005 and 21 October 2013, based on the analysis of TGF-WWLLN matches. The clock offsets were determined for all three periods of observation with standard deviations of less than 100 μs. This result opens the possibility of precise comparative analyses of RHESSI TGFs with other types of data (WWLLN, radio measurements, etc.). For multiple-peak TGFs, WWLLN detections are observed to be simultaneous with the last TGF peak in all 16 cases of multipeak RHESSI TGFs coincident with WWLLN sources. VLF magnetic field sferics were recorded at Duke University for two of these 16 events. These radio measurements also attribute the VLF sferics to the second peak of the double TGFs, with no detectable radio emission during the first TGF peak. Possible scenarios explaining these observations are proposed. Double (multipeak) TGFs could help distinguish between the VLF radio emission radiated by recoil currents in the +IC leader channel and the VLF emission from the TGF-producing electrons.
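    The TGF-WWLLN association described here comes down to matching clock-offset-corrected RHESSI timestamps against WWLLN source times within a short window; a minimal sketch (the offset and tolerance values below are placeholders, not the study's) is:

    ```python
    import numpy as np

    def match_tgfs_to_wwlln(tgf_times, wwlln_times, clock_offset, tol=200e-6):
        """Return (tgf_index, wwlln_index) pairs whose corrected times agree within tol seconds."""
        corrected = np.asarray(tgf_times) + clock_offset   # apply the estimated RHESSI clock offset
        wwlln = np.sort(np.asarray(wwlln_times))
        matches = []
        for i, t in enumerate(corrected):
            j = np.searchsorted(wwlln, t)                  # nearest WWLLN source via binary search
            for k in (j - 1, j):
                if 0 <= k < len(wwlln) and abs(wwlln[k] - t) <= tol:
                    matches.append((i, k))
                    break
        return matches
    ```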

    Simultaneous localization and map-building using active vision

    An active approach to sensing can provide the focused measurement capability over a wide field of view that allows a correctly formulated Simultaneous Localization and Map-Building (SLAM) system to be implemented with vision, permitting repeatable long-term localization using only naturally occurring, automatically detected features. In this paper, we present the first example of a general system for autonomous localization using active vision, enabled here by a high-performance stereo head, addressing issues such as uncertainty-based measurement selection, automatic map maintenance, and goal-directed steering. We present varied real-time experiments in a complex environment.
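    The measurement-selection and map-maintenance machinery in such a system sits on top of a standard EKF-style SLAM update; a minimal, generic sketch of a single measurement update (not the paper's active stereo-head implementation) is:

    ```python
    import numpy as np

    def ekf_update(x, P, z, h, H, R):
        """One EKF measurement update.
        x, P : state mean and covariance (robot pose plus landmark positions)
        z    : actual measurement; h : predicted measurement h(x)
        H    : Jacobian of the measurement model at x; R : measurement noise covariance
        """
        y = z - h                                  # innovation
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new
    ```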

    Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation

    There is increasing consensus in the hydrologic literature that an appropriate framework for streamflow forecasting and simulation should include explicit recognition of forcing, parameter, and model structural error. This paper presents a novel Markov chain Monte Carlo (MCMC) sampler, entitled differential evolution adaptive Metropolis (DREAM), that is especially designed to efficiently estimate the posterior probability density function of hydrologic model parameters in complex, high-dimensional sampling problems. This MCMC scheme adaptively updates the scale and orientation of the proposal distribution during sampling while maintaining detailed balance and ergodicity. It is then demonstrated how DREAM can be used to analyze forcing data error during watershed model calibration, using a five-parameter rainfall-runoff model with streamflow data from two different catchments. Explicit treatment of precipitation error during hydrologic model calibration not only results in more appropriate prediction uncertainty bounds but also significantly alters the posterior distribution of the watershed model parameters, which has significant implications for regionalization studies. The approach also provides important new ways to estimate areal average watershed precipitation, information that is of utmost importance for testing hydrologic theory, diagnosing structural errors in models, and appropriately benchmarking rainfall measurement devices.
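    At the heart of DREAM's adaptive proposal is a differential-evolution jump, in which each chain moves along the difference of two other randomly chosen chains; a stripped-down sketch of that step (omitting DREAM's subspace sampling, multiple difference pairs, and outlier handling) is:

    ```python
    import numpy as np

    def de_proposal(chains, i, gamma=None, eps_scale=1e-6, rng=None):
        """Differential-evolution proposal for chain i, given the current chain population.
        chains : (N, d) array holding the current state of each of the N parallel chains."""
        rng = np.random.default_rng() if rng is None else rng
        N, d = chains.shape
        if gamma is None:
            gamma = 2.38 / np.sqrt(2 * d)              # standard jump rate for a single difference pair
        a, b = rng.choice([k for k in range(N) if k != i], size=2, replace=False)
        eps = eps_scale * rng.standard_normal(d)       # small noise keeps the chain ergodic
        return chains[i] + gamma * (chains[a] - chains[b]) + eps
    ```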

    Simultaneous Optimal Uncertainty Apportionment and Robust Design Optimization of Systems Governed by Ordinary Differential Equations

    The inclusion of uncertainty in design is of paramount practical importance because all real-life systems are affected by it. Designs that ignore uncertainty often lead to poor robustness, suboptimal performance, and higher build costs. The treatment of small geometric uncertainty in the context of manufacturing tolerances is a well-studied topic. Traditional sequential design methodologies have recently been replaced by concurrent optimal design methodologies in which optimal system parameters are determined simultaneously with optimally allocated tolerances; this reduces manufacturing costs while increasing performance. However, state-of-the-art approaches remain limited in that they can only treat geometry-related uncertainties that are small in magnitude. This work proposes a novel framework that performs robust design optimization concurrently with optimal uncertainty apportionment for dynamical systems governed by ordinary differential equations. The proposed framework considerably expands the capabilities of contemporary methods by enabling the treatment of both geometric and non-geometric uncertainties in a unified manner; additionally, uncertainties may be large in magnitude and the governing constitutive relations may be highly nonlinear. In the proposed framework, uncertainties are modeled using Generalized Polynomial Chaos and propagated using a least-squares collocation method. The computational efficiency of this approach allows statistical moments of the uncertain system to be included explicitly in the optimization-based design process. The framework formulates design problems as constrained multi-objective optimization problems, enabling the characterization of a Pareto optimal trade-off curve that is offset from the traditional deterministic optimal trade-off curve. The Pareto offset is shown to result from the additional statistical moment information in the objective and constraint relations that accounts for the system uncertainties. The Pareto trade-off curve from the new framework therefore characterizes the entire family of systems within the probability space; consequently, designers are able to produce robust and optimally performing systems at an optimal manufacturing cost. A kinematic tolerance analysis case study is presented first to illustrate how the proposed methodology can be applied to treat geometric tolerances. A nonlinear vehicle suspension design problem, subject to parametric uncertainty, then illustrates the capability of the new framework to produce an optimal design at an optimal manufacturing cost while accounting for the entire family of systems within the associated probability space. This case study highlights the general nature of the new framework, which is capable of optimally allocating uncertainties of multiple types and with large magnitudes in a single calculation.
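    As an illustration of the least-squares collocation step, a minimal one-dimensional sketch with a standard-normal uncertain parameter expanded in probabilists' Hermite polynomials (a toy response, not the suspension problem from this work) is:

    ```python
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def pce_moments(model, order=4, n_points=50, seed=0):
        """Fit a 1-D Hermite polynomial chaos expansion by least squares and
        return the mean and variance of model(xi) for xi ~ N(0, 1)."""
        rng = np.random.default_rng(seed)
        xi = rng.standard_normal(n_points)            # collocation points drawn from the input density
        Psi = hermevander(xi, order)                  # probabilists' Hermite basis at those points
        coeffs, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)
        norms = np.array([math.factorial(k) for k in range(order + 1)])  # E[He_k(xi)^2] = k!
        mean = coeffs[0]
        var = float(np.sum(coeffs[1:] ** 2 * norms[1:]))
        return mean, var

    # Toy nonlinear response with a single uncertain input
    mean, var = pce_moments(lambda xi: np.exp(0.3 * xi) + 0.1 * xi ** 2)
    ```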

    An Improved Distance and Mass Estimate for Sgr A* from a Multistar Orbit Analysis

    We present new, more precise measurements of the mass and distance of our Galaxy's central supermassive black hole, Sgr A*. These results stem from a new analysis that more than doubles the time baseline for astrometry of faint stars orbiting Sgr A*, combining two decades of speckle imaging and adaptive optics data. Specifically, we improve our analysis of the speckle images by using information about a star's orbit from the deep adaptive optics data (2005-2013) to inform the search for the star in the speckle years (1995-2005). When this new analysis technique is combined with the first complete re-reduction of Keck Galactic Center speckle images using speckle holography, we are able to track the short-period star S0-38 (K-band magnitude = 17, orbital period = 19 years) through the speckle years. We use the kinematic measurements from speckle holography and adaptive optics to estimate the orbits of S0-38 and S0-2 and thereby improve our constraints on the mass (M_bh) and distance (R_0) of Sgr A*: M_bh = (4.02 ± 0.16 ± 0.04) × 10^6 M_⊙ and R_0 = 7.86 ± 0.14 ± 0.04 kpc. The uncertainties in M_bh and R_0 as determined by the combined orbital fit of S0-2 and S0-38 are improved by factors of 2 and 2.5, respectively, compared to an orbital fit of S0-2 alone, and by a factor of ~2.5 compared to previous results from stellar orbits. This analysis also limits the extended dark mass within 0.01 pc to less than 0.13 × 10^6 M_⊙ at 99.7% confidence, a factor of 3 lower than prior work.
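    The joint mass-distance constraint ultimately follows from Kepler's third law applied to the astrometric orbits, since the physical semimajor axis scales with the measured angular axis times R_0. A rough back-of-the-envelope check using approximate literature values for S0-2 (the ~0.125 arcsec semimajor axis and ~16-year period are assumptions here, not numbers from this abstract) is:

    ```python
    # Kepler's third law in convenient units: M [M_sun] = a[AU]^3 / P[yr]^2
    R0_pc = 7.86e3            # adopted distance to Sgr A* in parsecs (from the abstract)
    a_arcsec = 0.125          # approximate angular semimajor axis of S0-2 (assumed value)
    P_yr = 16.0               # approximate orbital period of S0-2 in years (assumed value)

    a_au = a_arcsec * R0_pc   # 1 arcsec at 1 pc subtends 1 AU, so a[AU] = a[arcsec] * R0[pc]
    M_bh = a_au ** 3 / P_yr ** 2   # enclosed mass in solar masses
    print(f"M_bh ~ {M_bh:.2e} M_sun")   # ~3.7e6, consistent with the fitted 4.02e6 M_sun
    ```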