
    Accuracy of MSI Testing in Predicting Germline Mutations of MSH2 and MLH1: A Case Study in Bayesian Meta-Analysis of Diagnostic Tests Without a Gold Standard

    Microsatellite instability (MSI) testing is a common screening procedure used to identify families that may harbor mutations of a mismatch repair gene and therefore may be at high risk for hereditary colorectal cancer. A reliable estimate of the sensitivity and specificity of MSI for detecting germline mutations of mismatch repair genes is critical in genetic counseling and colorectal cancer prevention. Several studies have published results of both MSI and mutation analysis on the same subjects. In this article we perform a meta-analysis of these studies and obtain estimates that can be directly used in counseling and screening. In particular, we estimate the sensitivity of MSI for detecting mutations of MSH2 and MLH1 to be 0.78 (0.69--0.86). Statistically, challenges arise from the following: a) traditional mutation analysis methods used in these studies cannot be considered a gold standard for the identification of mutations; b) studies are heterogeneous in both the design and the populations considered; and c) studies may include different patterns of missing data resulting from partial testing of the populations sampled. We address these challenges in the context of a Bayesian meta-analytic implementation of the Hui-Walter design, adapted to account for various forms of incomplete data. Posterior inference is handled via a Gibbs sampler.
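    The core of a Hui-Walter-type analysis can be sketched as a small Gibbs sampler: with two imperfect tests applied in two populations of different prevalence, the latent disease status, the prevalences, and each test's sensitivity and specificity are sampled in turn. The sketch below uses simulated data and Beta(1, 1) priors; it is illustrative only and omits the missing-data patterns and study heterogeneity the article actually handles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the classic Hui-Walter setting: two conditionally independent,
# imperfect binary tests in two populations with different prevalences.
# All parameter values below are illustrative, not the article's estimates.
true_prev = [0.10, 0.40]
true_sens = [0.85, 0.90]          # test 1, test 2
true_spec = [0.95, 0.90]
n_per_pop = 1000

pop = np.repeat([0, 1], n_per_pop)
d_true = rng.random(2 * n_per_pop) < np.array(true_prev)[pop]
x1 = np.where(d_true, rng.random(2 * n_per_pop) < true_sens[0],
              rng.random(2 * n_per_pop) < 1 - true_spec[0]).astype(int)
x2 = np.where(d_true, rng.random(2 * n_per_pop) < true_sens[1],
              rng.random(2 * n_per_pop) < 1 - true_spec[1]).astype(int)

# Gibbs sampler with Beta(1, 1) priors on all parameters.
d = x1.copy()                     # initialize latent disease status from test 1
prev = np.array([0.5, 0.5]); sens = np.array([0.8, 0.8]); spec = np.array([0.8, 0.8])
draws = []
for it in range(2000):
    # 1) latent disease status given current parameters (vectorized over subjects)
    lik1 = (sens[0]**x1 * (1 - sens[0])**(1 - x1)) * (sens[1]**x2 * (1 - sens[1])**(1 - x2))
    lik0 = ((1 - spec[0])**x1 * spec[0]**(1 - x1)) * ((1 - spec[1])**x2 * spec[1]**(1 - x2))
    pd = prev[pop] * lik1 / (prev[pop] * lik1 + (1 - prev[pop]) * lik0)
    d = (rng.random(len(pd)) < pd).astype(int)
    # 2) prevalence in each population
    for k in range(2):
        m = pop == k
        prev[k] = rng.beta(1 + d[m].sum(), 1 + (1 - d[m]).sum())
    # 3) sensitivity and specificity of each test
    for j, x in enumerate((x1, x2)):
        sens[j] = rng.beta(1 + (x * d).sum(), 1 + ((1 - x) * d).sum())
        spec[j] = rng.beta(1 + ((1 - x) * (1 - d)).sum(), 1 + (x * (1 - d)).sum())
    if it >= 500:                 # discard burn-in
        draws.append(np.concatenate([prev, sens, spec]))

post = np.mean(draws, axis=0)     # posterior means
print("prev:", post[:2], "sens:", post[2:4], "spec:", post[4:6])
```

    Using two populations with different prevalences is what makes the model identifiable without a gold standard; with a single population, the data alone cannot separate test error from disease status.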

    RANDOM EFFECTS MODELS IN A META-ANALYSIS OF THE ACCURACY OF DIAGNOSTIC TESTS WITHIN A GOLD STANDARD IN THE PRESENCE OF MISSING DATA

    In evaluating the accuracy of diagnostic tests, it is common to apply two imperfect tests jointly or sequentially to a study population. In a recent meta-analysis of the accuracy of microsatellite instability (MSI) testing and traditional mutation analysis (MUT) in predicting germline mutations of the mismatch repair (MMR) genes, a Bayesian approach (Chen, Watson, and Parmigiani 2005) was proposed to handle missing data resulting from partial testing and the lack of a gold standard. In this paper, we demonstrate improved estimation of the sensitivities and specificities of MSI and MUT by using a nonlinear mixed model and a Bayesian hierarchical model, both of which account for heterogeneity across studies through study-specific random effects. The methods can be used to estimate the accuracy of two imperfect diagnostic tests in other meta-analyses when the prevalence of disease, the sensitivities, and/or the specificities of the diagnostic tests are heterogeneous among studies. Furthermore, simulation studies demonstrate the importance of carefully selecting appropriate random effects for the estimation of diagnostic accuracy in this setting.
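    The role of between-study random effects can be illustrated with a much simpler frequentist analogue: a DerSimonian-Laird random-effects pooling of study-specific sensitivities on the logit scale. The per-study counts below are hypothetical, and this is not the paper's nonlinear mixed or Bayesian hierarchical model, only a sketch of why heterogeneity changes the pooled estimate.

```python
import numpy as np

# Hypothetical per-study counts: true positives among diseased subjects.
tp = np.array([48, 70, 38, 50, 55, 85, 28])
n  = np.array([50, 100, 40, 70, 60, 110, 30])

# Logit-transformed sensitivities with approximate within-study variances
# (0.5 continuity correction to avoid zero cells).
p = (tp + 0.5) / (n + 1.0)
y = np.log(p / (1 - p))
v = 1.0 / (tp + 0.5) + 1.0 / (n - tp + 0.5)

# DerSimonian-Laird estimate of the between-study variance tau^2,
# then the random-effects pooled sensitivity.
w = 1.0 / v
ybar = (w * y).sum() / w.sum()
Q = (w * (y - ybar) ** 2).sum()
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
w_re = 1.0 / (v + tau2)
mu = (w_re * y).sum() / w_re.sum()
pooled_sens = 1.0 / (1.0 + np.exp(-mu))
print(f"tau^2 = {tau2:.3f}, pooled sensitivity = {pooled_sens:.3f}")
```

    When tau^2 is large, the random-effects weights flatten out and small studies gain influence; ignoring the heterogeneity (tau^2 = 0) would overweight the largest studies and understate uncertainty.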

    MULTIPLE MODEL EVALUATION ABSENT A GOLD STANDARD VIA MODEL COMBINATION

    We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate the measurement error parameters, the true outcome (i.e., the gold standard), and a relative weighting of the predictive scores. We describe conditions necessary to estimate the gold standard and for these estimates to be calibrated, and we detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study evaluating a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures, and a relative weighting of the scores. Finally, we compare the scores against the gold-standard genotype and find that Mendelian scores are, on average, the more refined and better calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
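    The key idea of estimating a latent gold standard from several error-prone measurements can be sketched with a Dawid-Skene-style EM algorithm for binary latent class models: alternate between computing each subject's posterior genotype probability and re-estimating prevalence and per-assay error rates. The data are simulated and the model is a generic latent class model, not the paper's specific formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a latent binary genotype measured by three error-prone assays
# (all sensitivities/specificities are illustrative).
n = 1500
true_prev, sens_t, spec_t = 0.3, [0.95, 0.90, 0.85], [0.98, 0.95, 0.92]
z = (rng.random(n) < true_prev).astype(int)
X = np.stack([np.where(z == 1, rng.random(n) < s, rng.random(n) < 1 - c)
              for s, c in zip(sens_t, spec_t)], axis=1).astype(int)

# EM for the latent-class model: no assay is treated as a gold standard.
pz = X.mean(axis=1)                      # initialize from the assay average
for _ in range(200):
    # M-step: prevalence and per-assay operating characteristics
    prev = pz.mean()
    sens = (pz[:, None] * X).sum(0) / pz.sum()
    spec = ((1 - pz)[:, None] * (1 - X)).sum(0) / (1 - pz).sum()
    # E-step: posterior probability of carrying the variant
    l1 = prev * np.prod(sens**X * (1 - sens)**(1 - X), axis=1)
    l0 = (1 - prev) * np.prod((1 - spec)**X * spec**(1 - X), axis=1)
    pz = l1 / (l1 + l0)

z_hat = (pz > 0.5).astype(int)           # estimated "gold standard" genotype
print("agreement with simulated truth:", (z_hat == z).mean())
```

    The estimated posterior probabilities `pz` play the role of the reconstructed gold standard against which the prediction scores can then be compared.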

    HTC-DC Net: Monocular Height Estimation from Single Remote Sensing Images

    3D geo-information is of great significance for understanding the living environment; however, 3D perception from remote sensing data, especially on a large scale, remains restricted. To tackle this problem, we propose a method for monocular height estimation from optical imagery, which is currently one of the richest sources of remote sensing data. As an ill-posed problem, monocular height estimation requires well-designed networks with enhanced representations to improve performance. Moreover, the distribution of height values is long-tailed, with low-height pixels (e.g., the background) as the head, so trained networks are usually biased and tend to underestimate building heights. To solve these problems, instead of formalizing the task as a pure regression problem, we propose HTC-DC Net, which follows the classification-regression paradigm, with the head-tail cut (HTC) and the distribution-based constraints (DCs) as the main contributions. HTC-DC Net is composed of a backbone network as the feature extractor, the HTC-AdaBins module, and a hybrid regression process. The HTC-AdaBins module serves as the classification phase, determining bins adaptive to each input image. It is equipped with a vision transformer encoder to combine local context with holistic information, and it involves an HTC to address the long-tailed distribution in monocular height estimation by balancing the performance on foreground and background pixels. The hybrid regression process performs the regression via smoothing of the bins from the classification phase and is trained via the DCs. The proposed network is tested on three datasets of different resolutions, namely ISPRS Vaihingen (0.09 m), DFC19 (1.3 m), and GBH (3 m). Experimental results show the superiority of the proposed network over existing methods by large margins, and extensive ablation studies demonstrate the effectiveness of each design component. (Comment: 18 pages, 10 figures; submitted to IEEE Transactions on Geoscience and Remote Sensing.)
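    The classification-regression paradigm described above can be reduced to one line of arithmetic: the network classifies each pixel over a set of adaptively chosen bins, and the regressed height is the probability-weighted average of the bin centers (AdaBins-style smoothing). The bin centers and logits below are made-up numbers, not actual network outputs.

```python
import numpy as np

# Adaptive bin centers for one image (metres) and one pixel's classification
# logits over those bins -- illustrative values only.
bin_centers = np.array([0.0, 2.5, 8.0, 15.0, 30.0])
logits = np.array([0.2, 2.0, 1.0, -1.0, -3.0])

p = np.exp(logits - logits.max())
p /= p.sum()                          # softmax over bins
height = (p * bin_centers).sum()      # hybrid regression: smoothed bin average
print(f"predicted height: {height:.2f} m")
```

    Because the output is a smooth average rather than a hard argmax over bins, the network keeps sub-bin precision while the classification view makes it easier to counteract the long-tailed height distribution.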

    A Novel Preparation Method for 5-Aminosalicylic Acid Loaded Eudragit S100 Nanoparticles

    In this study, the solution-enhanced dispersion by supercritical fluids (SEDS) technique was applied to the preparation of 5-aminosalicylic acid (5-ASA) loaded Eudragit S100 (EU S100) nanoparticles. The effects of various process variables, including pressure, temperature, 5-ASA concentration, and solution flow rate, on the morphology, particle size, 5-ASA loading, and entrapment efficiency of the nanoparticles were investigated. Under appropriate conditions, the drug-loaded nanoparticles exhibited a spherical shape and small particle size with a narrow particle size distribution. In addition, the prepared nanoparticles were characterized by X-ray diffraction, differential scanning calorimetry, and Fourier transform infrared spectroscopy. The results showed that 5-ASA was embedded in EU S100 in an amorphous state after SEDS processing and that the SEDS process did not induce degradation of 5-ASA.

    BayesMendel: An R Environment for Mendelian Risk Prediction

    Several important syndromes are caused by deleterious germline mutations of individual genes. In both clinical and research applications it is useful to evaluate the probability that an individual carries an inherited genetic variant of these genes, and to predict the risk of disease for that individual, using information on his/her family history. Mendelian risk prediction models accomplish these goals by integrating Mendelian principles and state-of-the-art statistical models to describe phenotype/genotype relationships. Here we introduce an R library called BayesMendel that allows implementation of Mendelian models in research and counseling settings. BayesMendel is implemented in an object-oriented structure in the language R and is distributed freely as an open-source library. In its first release, it includes two major cancer syndromes: the breast-ovarian cancer syndrome and the hereditary non-polyposis colorectal cancer syndrome, along with up-to-date estimates of penetrance and prevalence for the corresponding genes. Input genetic parameters can be easily modified by users. BayesMendel can also serve as a generic tool for genetic epidemiologists to flexibly implement their own Mendelian models for novel syndromes and local subpopulations, without reprogramming complex statistical analyses and prediction tools.
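    The Bayesian core of Mendelian risk prediction is a simple posterior update: combine a prior carrier probability with gene-specific penetrances via Bayes' rule. The sketch below shows that computation for a single affected individual; the prevalence and penetrance numbers are hypothetical, not BayesMendel's gene-specific estimates, and real models propagate this update through an entire pedigree.

```python
# Posterior probability that an affected individual carries a deleterious
# variant, given carrier prevalence and penetrances (illustrative values).
prevalence = 0.006        # prior (population) carrier probability
pen_carrier = 0.60        # lifetime disease risk if carrier
pen_noncarrier = 0.08     # lifetime disease risk if non-carrier

# Bayes' rule: P(carrier | affected)
posterior = (prevalence * pen_carrier) / (
    prevalence * pen_carrier + (1 - prevalence) * pen_noncarrier)
print(f"P(carrier | affected) = {posterior:.3f}")
```

    Family history enters by replacing the population prevalence with a prior computed from relatives' genot, phenotype, and Mendelian transmission probabilities, which is the bookkeeping the library automates.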

    GAMUS: A Geometry-aware Multi-modal Semantic Segmentation Benchmark for Remote Sensing Data

    Geometric information in normalized digital surface models (nDSM) is highly correlated with the semantic class of the land cover. Jointly exploiting the two modalities (RGB and nDSM height) therefore has great potential to improve segmentation performance. However, this remains an under-explored field in remote sensing due to the following challenges. First, existing datasets are relatively small in scale and limited in diversity, which restricts validation. Second, there is a lack of unified benchmarks for performance assessment, which makes it difficult to compare the effectiveness of different models. Last, sophisticated multi-modal semantic segmentation methods have not been deeply explored for remote sensing data. To cope with these challenges, in this paper we introduce a new remote-sensing benchmark dataset for multi-modal semantic segmentation based on RGB-Height (RGB-H) data. Towards a fair and comprehensive analysis of existing methods, the proposed benchmark consists of 1) a large-scale dataset including co-registered RGB and nDSM pairs with pixel-wise semantic labels; and 2) a comprehensive evaluation and analysis of existing multi-modal fusion strategies for both convolutional and Transformer-based networks on remote sensing data. Furthermore, we propose a novel and effective Transformer-based intermediary multi-modal fusion (TIMF) module that improves semantic segmentation performance through adaptive token-level multi-modal fusion. The designed benchmark can foster future research on developing new methods for multi-modal learning on remote sensing data. Extensive analyses of those methods are conducted, and valuable insights are provided through the experimental results. Code for the benchmark and baselines can be accessed at https://github.com/EarthNets/RSI-MMSegmentation. (Comment: 13 pages.)

    Exacerbated climate risks induced by precipitation extremes in the Yangtze River basin under warming scenarios

    The Yangtze River basin is a typical region of the world that has a well-developed economy but is also greatly affected by multiple climate extremes. An improved understanding of future climate trends and associated exposures in this region is urgently needed to address socioeconomic risks. This research aims to quantify historical and projected population exposure to precipitation extremes in the Yangtze basin using meteorological records and downscaled climate models. The study found that the hazard zone for precipitation extremes during the baseline period was primarily located in the mid-lower Yangtze basin, particularly around the Poyang Lake watershed. Climate projections for 2050 indicate a further increase in the occurrence of precipitation extremes in this hazard zone, while a decrease in extreme events is detectable in the upper Yangtze basin under higher radiative forcing. Future socioeconomic scenarios suggest a tendency for population growth and migration towards the lower Yangtze basin, resulting in aggravated climate risks in megacities. Multi-model simulations indicate that population exposure to precipitation extremes in the lower Yangtze basin will increase by 9–22% around 2050, with both climate and population factors contributing positively. Shanghai, Changsha, Hangzhou, Ganzhou, and Huanggang are identified as hotspot cities facing the highest foreseeable risks of precipitation extremes in the Yangtze basin.
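    Exposure studies of this kind typically define population exposure as population times event frequency summed over grid cells, and attribute its change to a climate effect (frequency change at fixed population), a population effect (population change at fixed frequency), and an interaction term. The toy numbers below only illustrate that decomposition; they are not the study's data.

```python
import numpy as np

# Three hypothetical grid cells: population (millions) and frequency of
# extreme-precipitation events (days/year), baseline vs. future.
pop_base  = np.array([2.0, 5.0, 8.0])
pop_fut   = np.array([2.2, 6.0, 10.0])
freq_base = np.array([3.0, 4.0, 5.0])
freq_fut  = np.array([3.5, 4.6, 6.0])

exp_base = (pop_base * freq_base).sum()              # baseline exposure
exp_fut  = (pop_fut * freq_fut).sum()                # future exposure
climate_eff = (pop_base * (freq_fut - freq_base)).sum()
pop_eff     = ((pop_fut - pop_base) * freq_base).sum()
interaction = exp_fut - exp_base - climate_eff - pop_eff
print(exp_base, exp_fut, climate_eff, pop_eff, interaction)
```

    A statement such as "both climate and population factors contributing positively" corresponds to `climate_eff > 0` and `pop_eff > 0` in this decomposition.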

    Random Effects Models in a Meta-Analysis of the Accuracy of Two Diagnostic Tests Without a Gold Standard

    In studies of the accuracy of diagnostic tests, it is common that both the diagnostic test itself and the reference test are imperfect. This is the case for the microsatellite instability test, which is routinely used as a prescreening procedure to identify individuals with Lynch syndrome, the most common hereditary colorectal cancer syndrome. The microsatellite instability test is known to have imperfect sensitivity and specificity. Meanwhile, the reference test, mutation analysis, is also imperfect. We evaluate this test via a random effects meta-analysis of 17 studies. Study-specific random effects account for between-study heterogeneity in mutation prevalence, test sensitivities and specificities under a nonlinear mixed effects model and a Bayesian hierarchical model. Using model selection techniques, we explore a range of random effects models to identify a best-fitting model. We also evaluate sensitivity to the conditional independence assumption between the microsatellite instability test and the mutation analysis by allowing for correlation between them. Finally, we use simulations to illustrate the importance of including appropriate random effects and the impact of overfitting, underfitting, and misfitting on model performance. Our approach can be used to estimate the accuracy of two imperfect diagnostic tests from a meta-analysis of multiple studies or a multicenter study when the prevalence of disease, test sensitivities and/or specificities may be heterogeneous among studies or centers
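    The relaxation of conditional independence mentioned above is often parameterized with a covariance term added to the joint cell probabilities of the two tests among the diseased (a Dendukuri-Joseph-style formulation); setting the covariance to zero recovers independence. The parameter values below are illustrative, not estimates from the 17 studies.

```python
# Joint probabilities of two imperfect tests with conditional dependence
# among the diseased; cov_d = 0 gives the conditional independence model.
prev, s1, s2, c1, c2 = 0.25, 0.80, 0.85, 0.95, 0.90
cov_d = 0.03   # covariance between the tests, diseased subjects only

def cell(t1, t2):
    """P(T1 = t1, T2 = t2), marginalized over latent disease status."""
    sign = 1 if t1 == t2 else -1          # agreement cells gain, disagreement cells lose
    pd = (s1 if t1 else 1 - s1) * (s2 if t2 else 1 - s2) + sign * cov_d
    pn = ((1 - c1) if t1 else c1) * ((1 - c2) if t2 else c2)
    return prev * pd + (1 - prev) * pn

probs = {(a, b): cell(a, b) for a in (0, 1) for b in (0, 1)}
print(probs, sum(probs.values()))
```

    Positive covariance inflates the double-positive and double-negative cells, so ignoring it makes the two tests look more accurate than they are; a sensitivity analysis varies `cov_d` over its admissible range.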

    A Novel Synchronization Method in Terahertz Large-Scale Antenna Array System

    We focus on accurate time delay estimation, the design of training pilots, and hybrid matrix optimization in large-scale antenna array Terahertz (THz) broadband communication systems. In contrast to existing research based on narrow-band arrays, we address the time delay estimation of broadband arrays. In THz broadband communication systems, the data symbol duration is short relative to the propagation time across the antenna array. In large-scale antenna systems, the signals received at each antenna are therefore no longer phase-shifted copies of the same symbol but entirely different symbols, so the traditional narrow-band structure is no longer suitable. Based on this observation, we first put forward a system model based on large-scale antenna arrays and a time delay line (TDL) structure. Second, we derive the Cramer-Rao lower bound (CRLB) of the time delay estimation and present a time delay estimation algorithm that attains the CRLB. Third, by minimizing the CRLB, we address the design of the training pilot and the optimized TDL structure under constant-envelope training pilot and constant-modulus TDL constraints. Finally, we present numerical simulation results. The results show that the proposed method attains the CRLB, that the TDL structure significantly outperforms the traditional model, and that the optimal pilot design outperforms the pseudo-random pilot structure.
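    The basic pilot-aided delay estimation problem can be sketched with a classical cross-correlation estimator: a known constant-envelope training sequence is delayed and noise-corrupted, and the delay estimate is the lag that maximizes the correlation with the known pilot. This is a generic textbook estimator, not the paper's TDL-specific, CRLB-attaining algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Known constant-envelope (BPSK) training pilot, delayed and noisy at the receiver.
pilot = rng.choice([-1.0, 1.0], size=256)
true_delay = 17                                  # samples
rx = np.concatenate([np.zeros(true_delay), pilot]) \
     + 0.3 * rng.standard_normal(256 + true_delay)

# Cross-correlation over candidate integer lags; the peak is the delay estimate.
corr = np.correlate(rx, pilot, mode="valid")
delay_hat = int(np.argmax(corr))
print("estimated delay:", delay_hat)
```

    With `mode="valid"`, `corr[k]` is the inner product of the pilot with the received window starting at lag `k`, so its peak sits at the true delay; sub-sample accuracy and the CRLB analysis require interpolating around this peak.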