
    International consensus for neuroblastoma molecular diagnostics: report from the International Neuroblastoma Risk Group (INRG) Biology Committee

    Neuroblastoma serves as a paradigm for utilising tumour genomic data to determine patient prognosis and treatment allocation. However, before the establishment of the International Neuroblastoma Risk Group (INRG) Task Force in 2004, international consensus on markers, methodology, and data interpretation did not exist, compromising the reliability of decisive genetic markers and inhibiting translational research efforts. The objectives of the INRG Biology Committee were to identify highly prognostic genetic aberrations to be included in the new INRG risk classification schema and to develop precise definitions, decisive biomarkers, and technique standardisation. Review of the INRG database (n=8800 patients) by the INRG Task Force enabled identification of the most significant neuroblastoma biomarkers. In addition, the Biology Committee compared the standard operating procedures of different cooperative groups to arrive at international consensus on methodology, nomenclature, and future directions. On the basis of an evidence-based review of the INRG database, consensus was reached to include MYCN status, 11q23 allelic status, and ploidy in the INRG classification system. Standardised operating procedures for analysing these genetic factors were adopted, and criteria for proper nomenclature were developed. Neuroblastoma treatment planning is highly dependent on tumour cell genomic features, and it is likely that a comprehensive panel of DNA-based biomarkers will be used in future risk assignment algorithms applying genome-wide techniques. Consensus on methodology and interpretation is essential for uniform INRG classification and will greatly facilitate international cooperative clinical and translational research studies.
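    As an illustration of how such consensus markers could feed a risk-assignment algorithm, the hypothetical Python sketch below combines the three genomic factors named above into a coarse risk hint. The class, function, and decision rules are invented for illustration; the actual INRG schema also weighs stage, age at diagnosis, histology, and other factors.

        # Hypothetical sketch only: how the three consensus genomic markers
        # (MYCN status, 11q allelic status, ploidy) might enter a
        # risk-assignment rule. Labels and rules are invented; the real INRG
        # schema also uses stage, age at diagnosis, and histology.
        from dataclasses import dataclass

        @dataclass
        class TumourGenomics:
            mycn_amplified: bool   # MYCN status
            loss_11q: bool         # 11q23 allelic status (loss present?)
            hyperdiploid: bool     # ploidy (DNA index > 1)

        def genomic_risk_hint(g: TumourGenomics) -> str:
            """Return a coarse, illustrative risk hint from genomics alone."""
            if g.mycn_amplified:
                return "high"          # MYCN amplification dominates
            if g.loss_11q:
                return "intermediate"  # 11q loss without MYCN amplification
            if g.hyperdiploid:
                return "low"           # hyperdiploidy is a favourable marker
            return "indeterminate"     # genomics alone cannot decide

        print(genomic_risk_hint(TumourGenomics(False, True, False)))  # intermediate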

    MANAGING VARIANT DISCREPANCY IN HEREDITARY CANCER: CLINICAL PRACTICE, BARRIERS, AND DESIRED RESOURCES

    Variants are changes in the DNA whose phenotypic effects may or may not be definitively understood. Because variant interpretation is a complex process, sources sometimes disagree on the classification of a variant, a disagreement called a variant discrepancy. This study aimed to determine the practice of genetic counselors regarding variant discrepancies and to identify the barriers to counseling a variant discrepancy in hereditary cancer genetic testing. This investigation was unique because it was the first to address variant discrepancies from a clinical point of view. An electronic survey was sent to genetic counselors in the NSGC Cancer Special Interest Group. The vast majority of counselors (93%) had seen a variant discrepancy in practice. The most commonly selected barriers to counseling a variant discrepancy were lack of data sharing (90%) and lack of a central database (76%). Most counselors responded that the ideal database would be owned by a non-profit organization (59%) and would obtain information directly from laboratories (91%). When asked how they approached counseling sessions involving variant discrepancies, the free responses emphasized that counselors consider family history and psychosocial concerns, showing that genetic counselors tailored the session to each individual. Variant discrepancies are an ongoing concern for clinical cancer genetic counselors, as demonstrated by the fact that counselors desired further resources to aid in addressing variant discrepancies, including a centralized database (89%), guidelines from a major organization (88%), continuing education about the issue (74%), and functional studies (58%).
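    To make the central concept concrete, the sketch below shows one way a discrepancy could be flagged automatically: a variant counts as discrepant when different sources assign it different classifications. The variants, sources, and calls are invented; this is not the survey instrument or any named database's API.

        # Invented example: flag a "variant discrepancy" whenever two or more
        # sources classify the same variant differently.
        from collections import defaultdict

        classifications = [
            # (variant, source, classification) -- all values are made up
            ("BRCA2 c.68-7T>A", "Lab A", "likely benign"),
            ("BRCA2 c.68-7T>A", "Lab B", "uncertain significance"),
            ("TP53 c.743G>A",   "Lab A", "pathogenic"),
            ("TP53 c.743G>A",   "Lab C", "pathogenic"),
        ]

        calls_by_variant = defaultdict(set)
        for variant, source, call in classifications:
            calls_by_variant[variant].add(call)

        # A discrepancy is any variant with more than one distinct call.
        for variant, calls in calls_by_variant.items():
            if len(calls) > 1:
                print(f"discrepancy: {variant} -> {sorted(calls)}")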

    Community Seismic Network

    The article describes the design of the Community Seismic Network, a dense open seismic network based on low-cost sensors. Input comes from sensors hosted by community volunteers, either connected directly to their personal computers or built into mobile devices. The server is cloud-based, both for robustness and to dynamically handle the load of impulsive earthquake events. The main product of the network is a map of peak acceleration, delivered within seconds of the ground shaking. The lateral variations in the level of shaking will be valuable to first responders, and the waveform information from a dense network will allow detailed mapping of the rupture process. Sensors in buildings may also be useful for monitoring the state of health of the structure after major shaking.
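    The per-station quantity behind such a map is peak ground acceleration. The sketch below, which is illustrative and not the CSN codebase, computes it from a short window of three-axis samples; the sample values and the gravity-removal assumption are invented.

        # Illustrative sketch (not CSN code): peak ground acceleration from a
        # window of 3-axis accelerometer samples, gravity assumed removed.
        import math

        def peak_acceleration(samples):
            """samples: iterable of (ax, ay, az) in m/s^2.
            Returns the peak magnitude of the acceleration vector."""
            return max(math.sqrt(ax*ax + ay*ay + az*az)
                       for ax, ay, az in samples)

        window = [(0.02, -0.01, 0.00), (0.35, 0.10, -0.05), (0.08, 0.02, 0.01)]
        print(f"PGA = {peak_acceleration(window):.2f} m/s^2")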

    Very-high-energy gamma-ray emission from high-redshift blazars

    We study the detectability and properties of very-high-energy (VHE) gamma-ray emission (in the energy band above 100 GeV) from high-redshift sources. We report the detection of VHE gamma-ray flux from blazars with redshifts z>0.5. We use data from the Fermi telescope in the energy band above 100 GeV and identify significant sources via cross-correlation of the arrival directions of individual VHE gamma-rays with the positions of known Fermi sources. Thirteen high-redshift sources are detected in the VHE band by the Fermi/LAT telescope. The present statistics of the Fermi signal from these sources are too low for a sensible study of the suppression of the VHE flux by pair production on Extragalactic Background Light photons. We find that the detection of these sources with ground-based gamma-ray telescopes would be challenging. However, several sources, including the BL Lacs PKS 0426-380 at z=1.11, KUV 00311-1938 at z=0.61, B3 1307+433 at z=0.69, PG 1246+586 at z=0.84, and Ton 116 at z=1.065, as well as the flat-spectrum radio quasar 4C +55.17 at z=0.89, should be detectable by HESS-II, MAGIC-II, and CTA. A high-statistics study of a much larger number of VHE gamma-ray sources at cosmological distances would be possible with the proposed high-altitude Cherenkov telescope 5@5.
    Comment: 10 pages, 14 figures
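    The cross-correlation step described here amounts to an angular cross-match: each photon's arrival direction is associated with a catalogued source when their separation falls within an acceptance radius. The sketch below illustrates that logic; the radius, source coordinates, and photon directions are invented, and this is not the paper's actual pipeline.

        # Illustrative angular cross-match: associate each VHE photon with a
        # catalogued source if within an acceptance radius. All numbers are
        # invented for the example.
        import math

        def angular_separation(ra1, dec1, ra2, dec2):
            """Great-circle separation in degrees between two sky positions."""
            ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
            c = (math.sin(dec1) * math.sin(dec2)
                 + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
            return math.degrees(math.acos(max(-1.0, min(1.0, c))))

        sources = {"SRC A": (67.17, -37.94), "SRC B": (149.41, 55.38)}
        photons = [(67.20, -37.90), (10.00, 10.00)]  # arrival directions (deg)
        radius = 0.1                                 # acceptance radius (deg)

        for ra, dec in photons:
            matches = [name for name, (sra, sdec) in sources.items()
                       if angular_separation(ra, dec, sra, sdec) <= radius]
            print((ra, dec), "->", matches or "unassociated")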

    The Millennium Run Observatory: First Light

    Simulations of galaxy evolution aim to capture our current understanding as well as to make predictions for testing by future experiments. Simulations and observations are often compared in an indirect fashion: physical quantities are estimated from the data and compared to models. However, many applications can benefit from a more direct approach, where the observing process is also simulated and the models are seen fully from the observer's perspective. To facilitate this, we have developed the Millennium Run Observatory (MRObs), a theoretical virtual observatory which uses virtual telescopes to 'observe' semi-analytic galaxy formation models based on the suite of Millennium Run dark matter simulations. The MRObs produces data that can be processed and analyzed using the standard software packages developed for real observations. At present, we produce images in forty filters from the rest-frame UV to IR for two stellar population synthesis models, three different models of IGM absorption, and two cosmologies (WMAP1/7). Galaxy distributions for a large number of mock lightcones can be 'observed' using models of major ground- and space-based telescopes. The data include lightcone catalogues linked to structural properties of galaxies, pre-observation model images, mock telescope images, and Source Extractor products that can all be traced back to the higher level dark matter, semi-analytic galaxy, and lightcone catalogues available in the Millennium database. Here, we describe our methods and announce a first public release of simulated surveys (e.g., SDSS, CFHT-LS, GOODS, GOODS/ERS, CANDELS, and HUDF). The MRObs browser, an online tool, further facilitates exploration of the simulated data. We demonstrate the benefits of a direct approach through a number of example applications (galaxy number counts in CANDELS, clusters, morphologies, and dropout selections).
    Comment: MNRAS, in press. Millennium Run Observatory data products, online tools, and more available through http://galformod.mpa-garching.mpg.de/mrobs
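    A core step in any such virtual telescope is converting a model galaxy's apparent magnitude into noisy detector counts for a mock image. The toy sketch below shows that conversion under an assumed zero-point, exposure time, and sky level; none of these values, nor the function names, come from MRObs itself.

        # Toy sketch of one virtual-telescope step: magnitude -> expected
        # counts -> one noisy realisation. Zero-point, exposure, and sky
        # level are assumptions, not MRObs parameters.
        import random

        ZEROPOINT = 25.0    # magnitude yielding 1 count/s (assumed)
        EXPOSURE = 1500.0   # exposure time in seconds (assumed)
        SKY = 40.0          # sky counts in the aperture (assumed)

        def expected_counts(mag):
            """Source counts over the exposure, from m = ZP - 2.5*log10(rate)."""
            return EXPOSURE * 10 ** (-0.4 * (mag - ZEROPOINT))

        def observe(mag, rng=random.Random(42)):
            """One noisy realisation: Gaussian approximation to Poisson noise."""
            total = expected_counts(mag) + SKY
            return rng.gauss(total, total ** 0.5) - SKY

        for m in (22.0, 25.0, 28.0):
            print(f"m={m}: expected {expected_counts(m):.1f}, observed {observe(m):.1f}")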

    The prospects for mathematical logic in the twenty-first century

    The four authors present their speculations about the future developments of mathematical logic in the twenty-first century. The areas of recursion theory, proof theory and logic for computer science, model theory, and set theory are discussed independently.
    Comment: Association for Symbolic Logic

    A New 3D Tool for Planning Plastic Surgery

    Face plastic surgery (PS) plays a major role in today's medicine. For both reconstructive and cosmetic surgery, achieving harmony of facial features is an important, if not the major, goal. Several systems have been proposed for presenting possible outcomes of the surgical procedure to the patient and surgeon. In this paper, we present a new 3D system able to automatically suggest, for selected facial features such as the nose or chin, shapes that aesthetically match the patient's face. The basic idea is to suggest shape changes that bring the face closer to similar but more harmonious faces. To this end, our system compares the 3D scan of the patient with a database of scans of harmonious faces, excluding the feature to be corrected. The corresponding features of the k most similar harmonious faces, as well as their average, are then suitably pasted onto the patient's face, producing k+1 aesthetically effective surgery simulations. The system has been fully implemented and tested. To demonstrate the system, a 3D database of harmonious faces was collected and a number of PS treatments were simulated. The ratings of the outcomes of the simulations, provided by panels of human judges, show that the system and the underlying idea are effective.
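    The selection step described above is essentially a k-nearest-neighbour query over face descriptors that exclude the feature being corrected. The sketch below illustrates that idea with invented vectors and dimensions; it is not the paper's implementation, and the descriptor and feature parameterisations are assumptions.

        # Schematic k-NN sketch of the selection step: find the k most similar
        # harmonious faces and return their feature shapes plus the average,
        # i.e. k+1 candidate suggestions. All data are invented.
        import numpy as np

        def suggest_features(patient_vec, db_vecs, db_features, k=3):
            """db_vecs: (n, d) face descriptors (feature excluded);
            db_features: (n, m) shape parameters of the feature itself."""
            dists = np.linalg.norm(db_vecs - patient_vec, axis=1)
            nearest = np.argsort(dists)[:k]          # k most similar faces
            candidates = db_features[nearest]        # k individual suggestions
            average = candidates.mean(axis=0)        # plus their average
            return np.vstack([candidates, average])  # k+1 suggestions

        rng = np.random.default_rng(0)
        db_vecs = rng.normal(size=(50, 8))       # 50 faces, 8-D descriptors
        db_features = rng.normal(size=(50, 5))   # e.g. 5 nose-shape parameters
        patient = rng.normal(size=8)
        print(suggest_features(patient, db_vecs, db_features).shape)  # (4, 5)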