109 research outputs found

    Utilization of a labeled tracking oligonucleotide for visualization and quality control of spotted 70-mer arrays

    BACKGROUND: Spotted 70-mer oligonucleotide arrays offer potentially greater specificity and an alternative to expensive cDNA library maintenance and amplification. Since microarray fabrication is a considerable source of data variance, we previously directly tagged cDNA probes with a third fluorophore for prehybridization quality control. Because fluorescently modifying oligonucleotide sets is cost-prohibitive, a co-spotted Staphylococcus aureus-specific fluorescein-labeled "tracking" oligonucleotide is described here to monitor fabrication variables of a Mycobacterium tuberculosis oligonucleotide microarray. RESULTS: Significantly (p < 0.01) improved DNA retention was achieved by printing in 15% DMSO/1.5 M betaine compared with the vendor-recommended buffers. Introduction of the tracking oligonucleotide did not affect hybridization efficiency or introduce ratio measurement bias in hybridizations between M. tuberculosis H37Rv and M. tuberculosis mprA. Linearity between the mean log Cy3/Cy5 ratios of differentially expressed genes from arrays either possessing or lacking the tracking oligonucleotide was observed (R² = 0.90, p < 0.05), and there were no significant differences in Pearson's correlation coefficients of ratio data between replicates possessing (0.72 ± 0.07), replicates lacking (0.74 ± 0.10), or replicates with and without (0.70 ± 0.04) the tracking oligonucleotide. ANOVA confirmed that the tracking oligonucleotide introduced no bias. Titrating the target-specific oligonucleotide (40 μM to 0.78 μM) in the presence of 0.5 μM tracking oligonucleotide revealed fluorescein fluorescence inversely related to target-specific oligonucleotide molarity, making the tracking oligonucleotide signal useful for quality control measurements and for differentiating false negatives (synthesis failures and mechanical misses) from true negatives (no gene expression).
CONCLUSIONS: This novel approach enables prehybridization array visualization for spotted oligonucleotide arrays and sets the stage for more sophisticated slide qualification and data filtering applications.
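The replicate-comparison statistics described above (Pearson's correlation between replicate ratio data and an R² for the fitted line between array types) can be sketched as follows. This is an illustrative example on synthetic log2(Cy3/Cy5) values, not the paper's data or pipeline:

```python
import numpy as np

# Synthetic stand-ins for per-gene log2(Cy3/Cy5) ratios from replicate arrays
# with and without the tracking oligonucleotide (hypothetical values).
rng = np.random.default_rng(0)
true_log_ratios = rng.normal(0.0, 1.0, size=200)          # hypothetical gene set
with_tracking = true_log_ratios + rng.normal(0.0, 0.4, size=200)
without_tracking = true_log_ratios + rng.normal(0.0, 0.4, size=200)

# Pearson correlation between replicate ratio data
# (the paper reports values around 0.70-0.74 for real arrays).
r = np.corrcoef(with_tracking, without_tracking)[0, 1]

# Least-squares line between the two array types; R^2 of the fit.
slope, intercept = np.polyfit(with_tracking, without_tracking, 1)
r_squared = r ** 2

print(f"Pearson r = {r:.2f}, R^2 = {r_squared:.2f}")
```

With real arrays, each vector would hold one mean ratio per differentially expressed gene rather than simulated values.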

    Nearshore Monitoring with X-Band Radar: Maximising Utility in Dynamic and Complex Environments

    Coastal management and engineering applications require data that quantify the nature and magnitude of changes in nearshore bathymetry. However, bathymetric surveys are usually infrequent due to high costs and complex logistics. This study demonstrates that ground‐based X‐band radar offers a cost‐effective means to monitor nearshore changes at relatively high frequency and over large areas. A new data quality and processing framework was developed to reduce uncertainties in the estimates of radar‐derived bathymetry and tested using data from an 18‐month installation at Thorpeness (UK). In addition to data calibration and validation, two new elements are integrated to reduce the influence of data scatter and outliers: (a) an automated selection of periods of ‘good data’ and (b) the application of a depth‐memory stabilisation. For conditions when the wave height is >1 m, the accuracy of the radar‐derived depths is shown to be ±0.5 m (95% confidence interval) at 40 × 40 m spatial resolution. At Thorpeness, radar‐derived bathymetry changes exceeding this error were observed at timescales ranging from three weeks to six months. These data enabled quantification of changes in nearshore sediment volume at frequencies and spatial cover that would be difficult and/or expensive to obtain by other methods. It is shown that the volume of nearshore sediment movement occurring at timescales as short as a few weeks is comparable with the annual longshore transport rates reported in this area. The use of radar can provide an early warning of changes in offshore bathymetry likely to impact vulnerable coastal locations.
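The "depth‐memory stabilisation" mentioned above can be pictured as a running filter that damps scatter in a time series of radar‐derived depth estimates and rejects outliers. The sketch below is a minimal illustration under that assumption; the paper's actual algorithm, thresholds, and smoothing constants may differ:

```python
import numpy as np

def stabilise_depths(estimates, alpha=0.2, outlier_threshold=2.0):
    """Hypothetical depth-memory filter: each accepted estimate moves the
    remembered depth by a fraction `alpha`; jumps larger than
    `outlier_threshold` metres are treated as outliers and ignored."""
    memory = float(estimates[0])
    out = [memory]
    for d in estimates[1:]:
        if abs(d - memory) <= outlier_threshold:      # plausible update
            memory = (1 - alpha) * memory + alpha * d
        out.append(memory)                            # outliers leave memory unchanged
    return np.array(out)

# Example: a spurious 15 m spike in otherwise ~8 m water is suppressed.
raw = np.array([8.0, 8.3, 15.0, 8.1, 7.9, 8.2])
print(stabilise_depths(raw))
```

In practice such a filter would run per grid cell (e.g. per 40 × 40 m bin), with the threshold tied to the ±0.5 m depth uncertainty.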

    Assessing incomplete deprotection of microarray oligonucleotides in situ

    En masse analysis of gene structure and function by array technologies will have a lasting and profound effect on biology and medicine. This impact can be compromised by low quality of probes within arrays, which we show can be caused by incomplete removal of chemical protecting groups. To solve this quality control problem, we present a sensitive, specific and facile method to detect these groups in situ on arrays using monoclonal antibodies and existing instrumentation. Screening of microarrays with these monoclonal antibodies should guide the consideration given to data derived from them and should enhance the accuracy of the results obtained.

    Variability in the analysis of a single neuroimaging dataset by many teams

    Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses. The flexibility of analytic approaches is exemplified by the fact that no two teams chose identical workflows to analyze the data. This flexibility resulted in sizeable variation in hypothesis test results, even for teams whose statistical maps were highly correlated at intermediate stages of their analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Importantly, meta-analytic approaches that aggregated information across teams yielded significant consensus in activated regions across teams. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset. Our findings show that analytic flexibility can have substantial effects on scientific conclusions, and demonstrate factors related to variability in fMRI. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for multiple analyses of the same data. Potential approaches to mitigate issues related to analytical variability are discussed.
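One common way to aggregate evidence across independent analysis teams, as the meta-analytic step above does, is Stouffer's method: per-team z-statistics for a region are summed and rescaled into a single consensus z. This sketch illustrates the general idea with hypothetical team z-scores; it is not the study's specific aggregation procedure:

```python
import numpy as np

def stouffer_z(z_values):
    """Combine independent z-statistics into one consensus z
    (Stouffer's method with equal weights)."""
    z = np.asarray(z_values, dtype=float)
    return z.sum() / np.sqrt(z.size)

# Hypothetical z-scores from five teams for one hypothesis/region:
team_z = [2.1, 1.8, 2.5, 0.9, 1.6]
print(f"consensus z = {stouffer_z(team_z):.2f}")
```

Because the consensus z grows with the number of concordant teams, modest individual effects can still yield a significant pooled result, which is how aggregation recovered consensus despite the workflow variability.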

    The Rhine outflow studied by the analysis of ERS1/2 SAR data and numerical simulations

    Synthetic aperture radar (SAR) images acquired by the First and Second European Remote Sensing Satellites (ERS1/2) over coastal waters near estuaries often show sea surface signatures of river outflow fronts. In particular, the analysis of several SAR images showing the Rhine outflow region indicates that the outflow front is visible as a line of high radar backscatter. Location and form of the outflow front depend strongly on tidal phase and Rhine discharge. In order to simulate the dynamics of the Rhine plume in the outflow region, a two-layer, nonlinear numerical model based on the hydrostatic shallow water equations has been developed. Due to a numerical technique for moveable lateral boundaries, the model allows for the simulation of localized layers with an outcropping interface (front). The model is forced by imposing tidal and residual transport and river discharge at the open boundaries. The evolution of the Rhine plume as calculated by the numerical model is discussed with respect to tidal phase and Rhine discharge. Using a simple radar backscatter model relating the surface velocity convergence and shear to the relative radar backscatter, it is shown that the observed signatures of the Rhine outflow front can be explained by the variation of the surface velocity convergence and shear as calculated by the numerical model.
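The "simple radar backscatter model" above can be illustrated by a linear relation in which the relative backscatter modulation is proportional to the negative surface velocity divergence, so that convergent flow at the front brightens the image. The velocity field and proportionality coefficient below are hypothetical, chosen only to show the mechanism:

```python
import numpy as np

def relative_backscatter(u, dx, coeff=50.0):
    """u: 1-D cross-front surface velocity [m/s]; dx: grid spacing [m].
    Returns the fractional backscatter modulation under a linear model:
    convergence (du/dx < 0) increases backscatter."""
    du_dx = np.gradient(u, dx)          # 1-D surface velocity divergence
    return -coeff * du_dx               # hypothetical linear coefficient

# Idealized converging flow across a front centred at x = 0:
x = np.linspace(-500, 500, 101)         # metres across the front
u = -0.3 * np.tanh(x / 100.0)           # flow converges toward x = 0
mod = relative_backscatter(u, dx=10.0)
print(f"peak modulation at x = {x[np.argmax(mod)]:.0f} m")
```

The modulation peaks exactly where the convergence is strongest, reproducing the observed bright line along the outflow front.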