
    The bulk of the black hole growth since z ~ 1 occurs in a secular universe: no major merger-AGN connection

    What is the relevance of major mergers and interactions as triggering mechanisms for active galactic nucleus (AGN) activity? To answer this long-standing question, we analyze 140 XMM-Newton-selected AGN host galaxies and a matched control sample of 1264 inactive galaxies over z ~ 0.3–1.0 and M_∗ < 10^(11.7) M_⊙ with high-resolution Hubble Space Telescope/Advanced Camera for Surveys imaging from the COSMOS field. The visual analysis of their morphologies by 10 independent human classifiers yields a measure of the fraction of distorted morphologies in the AGN and control samples, i.e., it quantifies the signature of recent mergers which might potentially be responsible for fueling/triggering the AGN. We find that (1) the vast majority (>85%) of the AGN host galaxies do not show strong distortions and (2) there is no significant difference in the distortion fractions between active and inactive galaxies. Our findings provide the best direct evidence that, since z ~ 1, the bulk of black hole (BH) accretion has not been triggered by major galaxy mergers, therefore arguing that the alternative mechanisms, i.e., internal secular processes and minor interactions, are the leading triggers for the episodes of major BH growth. We also exclude an alternative interpretation of our results: a substantial time lag between merging and the observability of the AGN phase could wash out the most significant merging signatures, explaining the lack of enhancement of strong distortions in the AGN hosts. We show that this alternative scenario is unlikely because of (1) recent major mergers being ruled out for the majority of sources by the high fraction of disk-hosted AGNs, (2) the lack of a significant X-ray signal in merging inactive galaxies as a signature of a potential buried AGN, and (3) the low levels of soft X-ray obscuration for AGNs hosted by interacting galaxies, in contrast to model predictions.
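    The core comparison described above is between the distortion fraction in the AGN host sample and in the matched control sample. The sketch below is not the authors' analysis code; it only illustrates, with hypothetical distorted-galaxy counts, how such fractions and a simple two-proportion z-test could be computed (only the sample sizes 140 and 1264 come from the abstract).

```python
# Illustrative sketch (not the authors' pipeline): compare distortion fractions
# in an AGN host sample and a matched control sample. Distorted counts are hypothetical.
import numpy as np
from scipy.stats import norm

def distortion_fraction(n_distorted, n_total):
    """Fraction of distorted galaxies and its binomial standard error."""
    f = n_distorted / n_total
    err = np.sqrt(f * (1.0 - f) / n_total)
    return f, err

def two_proportion_ztest(k1, n1, k2, n2):
    """Two-sided z-test for a difference between two distortion fractions."""
    p1, p2 = k1 / n1, k2 / n2
    p_pool = (k1 + k2) / (n1 + n2)                       # pooled proportion
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2.0 * norm.sf(abs(z))                      # z statistic, p-value

# Sample sizes from the abstract; distorted counts invented for illustration only.
f_agn, e_agn = distortion_fraction(21, 140)
f_ctl, e_ctl = distortion_fraction(180, 1264)
z, p = two_proportion_ztest(21, 140, 180, 1264)
print(f"AGN distortion fraction: {f_agn:.2f} +/- {e_agn:.2f}")
print(f"Control distortion fraction: {f_ctl:.2f} +/- {e_ctl:.2f}")
print(f"z = {z:.2f}, p = {p:.3f}")
```

    A non-significant p-value in such a test is the kind of result the abstract summarises as "no significant difference in the distortion fractions between active and inactive galaxies".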

    Contemporary angiography in the diagnosis and treatment of cardiovascular disease


    Guidance on the use of MRI for treatment planning in radiotherapy clinical trials

    The aim of this article is to propose meaningful guidance covering the technical and safety issues involved when designing or conducting radiotherapy (RT) clinical trials that use MRI for treatment planning. The complexity of imaging requirements will depend on the trial aims, design and MRI methods used. The use of MRI within the RT pathway is becoming more prevalent and clinically appropriate as access to MRI increases, treatment planning systems become more versatile and potential indications for MRI planning in RT are documented. Novel MRI-planning opportunities are often initiated and validated within clinical trials. The guidance in this document is intended to assist researchers designing RT clinical trials involving MRI, so that they may provide sufficient information about the appropriate methods to be used for image acquisition, post-processing and quality assurance, such that participating sites complete MRI to consistent standards. It has been produced in collaboration with the National Radiotherapy Trials Quality Assurance Group (RTTQA). As the use of MRI in RT develops, it is highly recommended that researchers writing clinical trial protocols include imaging guidance as part of their clinical trial documentation, covering the trial-specific requirements for MRI procedures. Many of the considerations and recommendations in this guidance may well apply to MR-guided treatment machines, where clinical trials will be crucial. Similarly, many of these recommendations will apply to the general use of MRI in RT outside of clinical trials. This document contains a large number of recommendations, not all of which will be relevant to any particular trial. Designers of RT clinical trials must therefore take this into account and use their own judgement as to the appropriate compromise between the accessibility of the trial and its technical rigour.

    Can high-frequency ultrasound predict metastatic lymph nodes in patients with invasive breast cancer?

    Aim To determine whether high-frequency ultrasound can predict the presence of metastatic axillary lymph nodes, with a high specificity and positive predictive value, in patients with invasive breast cancer. The clinical aim is to identify patients with axillary disease requiring surgery who would not normally, on clinical grounds, have an axillary dissection, so potentially improving outcome and survival rates. Materials and methods The ipsilateral and contralateral axillae of 42 consecutive patients with invasive breast cancer were scanned prior to treatment using a B-mode frequency of 13 MHz and a power Doppler frequency of 7 MHz. The presence or absence of an echogenic centre for each lymph node detected was recorded, and measurements were also taken to determine the long-to-short axis (L/S) ratio and the widest and narrowest parts of the cortex. Power Doppler was also used to determine vascularity. The contralateral axilla was used as a control for each patient. Results In this study of patients with invasive breast cancer, ipsilateral lymph nodes with a cortical bulge ≥3 mm and/or at least two lymph nodes with absent echogenic centres indicated the presence of metastatic axillary lymph nodes (10 patients). The sensitivity and specificity were 52.6% and 100%, respectively; the positive and negative predictive values were 100% and 71.9%, respectively; the P value was 0.001 and the kappa score was 0.55. Conclusion This indicates that high-frequency ultrasound can be used to accurately predict metastatic lymph nodes in a proportion of patients with invasive breast cancer, which may alter patient management.
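    The performance figures quoted in the abstract are standard confusion-matrix quantities. The sketch below is not from the study; it simply shows how sensitivity, specificity, PPV and NPV are computed, using counts inferred to be consistent with the reported values (42 patients, sensitivity 52.6%, specificity 100%, NPV 71.9%), which should be treated as illustrative.

```python
# Illustrative sketch: diagnostic performance of the ultrasound criterion
# (cortical bulge >= 3 mm and/or >= 2 nodes with absent echogenic centres).
# Counts below are inferred from the reported statistics, not taken from the paper's tables.
def diagnostic_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # proportion of metastatic axillae detected
    specificity = tn / (tn + fp)   # proportion of negative axillae correctly cleared
    ppv = tp / (tp + fp)           # probability a positive scan is truly metastatic
    npv = tn / (tn + fn)           # probability a negative scan is truly clear
    return sensitivity, specificity, ppv, npv

# 19 node-positive and 23 node-negative patients; 10 true positives, no false positives.
sens, spec, ppv, npv = diagnostic_metrics(tp=10, fp=0, tn=23, fn=9)
print(f"Sensitivity {sens:.1%}, specificity {spec:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```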

    The use of craniofacial superimposition for disaster victim identification

    Skull-to-face comparison is utilised for human identification where there is a suspected identity and the usual methods of identification, such as DNA or dental comparison, are not possible or practical. This research aimed to compare the reliability of manual and computerised craniofacial superimposition techniques and to establish the application of these techniques for disaster victim identification, where there may be a large database of passport-style images, such as the MPUB Interpol database. Twenty skulls (10 female; 10 male) from the William Bass Skeletal Collection at the University of Tennessee were compared to face pools of 20 photographs of similar sex, age and ethnic group. A traditional manual photographic method and a new 3D computer-based method were used. The results suggested that profile and three-quarter views of the ante-mortem face were the most valuable for craniofacial superimposition. However, the poor identification rate achieved using images in frontal view suggests that the MPUB Interpol database would not be optimal for disaster victim identification, and that passport-style images do not provide enough distinguishing facial detail. This suggests that multiple ante-mortem images with a variety of facial expressions should be utilised for identification purposes. There was no significant difference in success between the manual and computer methods.

    Evaluation of Mesoscale Model Phenomenological Verification Techniques

    Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations, and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of a certain phenomenon but are offset from the observations by small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of the literature on phenomenologically based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one to verify sea breeze forecasts, and three were capable of verifying several phenomena. The AMU also determined the feasibility of transitioning each technique into operations and rated the operational capability of each technique on a subjective 1-10 scale: 1 indicates that the technique is only in the initial stages of development; 2-5 indicates that the technique is still undergoing modifications and is not ready for operations; 6-8 indicates a higher probability of integrating the technique into AWIPS with code modifications; and 9-10 indicates that the technique was created for AWIPS and is ready for implementation. Eight of the techniques were assigned a rating of 5 or below. The other two received ratings of 6 and 7, and none of the techniques received a rating of 9-10. At the current time, there are no phenomenological model verification techniques ready for operational use. However, several of the techniques described in this report may become viable in the future and should be monitored for updates in the literature. The desire to use a phenomenological verification technique is widespread in the modeling community, and it is likely that other techniques besides those described herein are being developed but not yet published. Therefore, the AMU recommends that the literature continue to be monitored for updates to the techniques described in this report and for new techniques whose results have not yet been published.
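    The weakness of traditional point-by-point verification described above (the "double penalty" for realistic but slightly displaced features) is easy to see in a toy example. The sketch below is not from the AMU report; it only demonstrates, with hypothetical values, how matched forecast/observation pairs yield bias and RMSE, and why a shifted feature scores poorly.

```python
# Illustrative sketch of traditional point-by-point verification: model output is
# paired with observations at the same time and location, then error statistics
# are computed. All values below are hypothetical.
import numpy as np

def point_verification(forecast, observed):
    """Bias and RMSE from co-located forecast/observation pairs."""
    error = forecast - observed
    bias = error.mean()
    rmse = np.sqrt((error ** 2).mean())
    return bias, rmse

# A realistic feature displaced by one grid point is penalized twice: once where it
# was forecast but not observed, and once where it was observed but not forecast.
observed = np.array([0.0, 0.0, 10.0, 0.0, 0.0])   # e.g. rainfall observed at point 2
forecast = np.array([0.0, 0.0, 0.0, 10.0, 0.0])   # same feature, shifted one point
print(point_verification(forecast, observed))      # zero bias but large RMSE
```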

    A New Automatic Method to Identify Galaxy Mergers I. Description and Application to the STAGES Survey

    We present an automatic method to identify galaxy mergers using the morphological information contained in the residual images of galaxies after the subtraction of a Sersic model. The removal of the bulk signal from the host galaxy light is done with the aim of detecting the fainter minor mergers. The specific morphological parameters used in the merger diagnostic suggested here are the Residual Flux Fraction and the asymmetry of the residuals. The new diagnostic has been calibrated and optimized so that the resulting merger sample is very complete; however, the contamination by non-mergers is also high. If the same optimization method is adopted for combinations of other structural parameters, such as the CAS system, the merger indicator we introduce yields merger samples of equal or higher statistical quality than those obtained from the other structural parameters. We explore the ability of the method presented here to select minor mergers by identifying a sample of visually classified mergers that would not have been picked up by the CAS system when using its usual limits. Given the low prevalence of mergers among the general population of galaxies and the optimization used here, we find that the merger diagnostic introduced in this work is best used as a negative merger test, i.e., it is very effective at selecting non-merging galaxies. As with all currently available automatic methods, the sample of merger candidates selected is contaminated by non-mergers, and further steps are needed to produce a clean sample. This merger diagnostic has been developed using the HST/ACS F606W images of the A901/02 cluster (z = 0.165) obtained by the STAGES team. In particular, we have focused on a mass- and magnitude-limited sample (log M/M_⊙ > 9.0, R_Vega < 23.5 mag) which includes 905 cluster galaxies and 655 field galaxies of all morphological types. Comment: 25 pages, 14 figures, 4 tables. To appear in MNRAS.
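    The abstract names two residual-based quantities without defining them. The sketch below is a simplified illustration, not the STAGES pipeline: the exact definitions, background corrections and thresholds in the paper differ, and the function names and masking scheme here are assumptions made only to convey the idea of measuring flux and asymmetry in the image left after Sersic-model subtraction.

```python
# Simplified sketch (not the authors' code) of residual-based merger diagnostics:
# a Residual Flux Fraction-like quantity and the asymmetry of the residual image.
import numpy as np

def residual_flux_fraction(image, model, mask):
    """Fraction of the galaxy flux left in |image - model| within the source mask."""
    residual = np.abs(image - model)[mask]
    return residual.sum() / image[mask].sum()

def residual_asymmetry(image, model):
    """Asymmetry of the residual under a 180-degree rotation about the image centre."""
    residual = image - model
    rotated = np.rot90(residual, 2)
    return np.abs(residual - rotated).sum() / (2.0 * np.abs(residual).sum())

# Hypothetical usage with a galaxy cutout and a fitted Sersic model image:
# rff = residual_flux_fraction(cutout, sersic_model, segmentation_mask)
# a_res = residual_asymmetry(cutout, sersic_model)
# Large values of both flag candidate mergers; low values select non-merging galaxies,
# matching the "negative merger test" use described in the abstract.
```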