
    The identification of informative genes from multiple datasets with increasing complexity

    Background In microarray data analysis, factors such as data quality, biological variation, and the increasingly multi-layered nature of complex biological systems complicate the modelling of regulatory networks that can represent and capture the interactions among genes. We believe that the use of multiple datasets derived from related biological systems leads to more robust models. We therefore developed a novel framework for modelling regulatory networks that involves training and evaluation on independent datasets. Our approach comprises the following steps: (1) ordering the datasets based on their level of noise and informativeness; (2) selecting a Bayesian classifier of appropriate complexity by evaluating predictive performance on independent datasets; (3) comparing the different gene selections and the influence of increasing model complexity; (4) functional analysis of the informative genes.

    Results In this paper, we identify the most appropriate model complexity using cross-validation and independent test-set validation for predicting gene expression in three published datasets related to myogenesis and muscle differentiation. Furthermore, we demonstrate that models trained on simpler datasets can be used to identify interactions among genes and to select the most informative ones. We also show that these models explain the myogenesis-related genes (genes of interest) significantly better than others (P < 0.004), since the improvement in their rankings is much more pronounced. Finally, after further evaluating our results on synthetic datasets, we show that our approach outperforms the concordance method of Lai et al. in identifying informative genes from multiple datasets with increasing complexity, while additionally modelling the interactions between genes.

    Conclusions We show that Bayesian networks derived from simpler controlled systems perform better than those trained on datasets from more complex biological systems. Furthermore, highly predictive genes from the pool of differentially expressed genes that are consistent across independent datasets are more likely to be fundamentally involved in the biological process under study. We conclude that networks trained on simpler controlled systems, such as in vitro experiments, can be used to model and capture interactions among genes in more complex datasets, such as in vivo experiments, where these interactions would otherwise be concealed by a multitude of other ongoing events.
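
    Step (2) of the framework, choosing a model's complexity by its predictive performance on held-out and independent data, can be illustrated with a short sketch. This is a minimal stand-in (simulated data, a scikit-learn naive Bayes classifier, with the number of selected genes as the complexity knob), not the authors' Bayesian-network implementation:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical stand-ins for two expression matrices (samples x genes);
# real inputs would be the myogenesis datasets described above.
X_train, y_train = rng.normal(size=(40, 500)), rng.integers(0, 2, 40)
X_indep, y_indep = rng.normal(size=(30, 500)), rng.integers(0, 2, 30)

for k in (5, 10, 25, 50, 100):  # complexity = number of genes retained
    model = make_pipeline(SelectKBest(f_classif, k=k), GaussianNB())
    cv_acc = cross_val_score(model, X_train, y_train, cv=5).mean()
    indep_acc = model.fit(X_train, y_train).score(X_indep, y_indep)
    print(f"k={k:3d}  CV={cv_acc:.2f}  independent={indep_acc:.2f}")
```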

    Constraints on Nucleon Decay via "Invisible" Modes from the Sudbury Neutrino Observatory

    Data from the Sudbury Neutrino Observatory have been used to constrain the lifetime for nucleon decay to "invisible" modes, such as n → 3ν. The analysis was based on a search for gamma rays from the de-excitation of the residual nucleus that would result from the disappearance of either a proton or a neutron from 16O. A limit of τ_inv > 2 × 10^29 years is obtained at 90% confidence for either neutron or proton decay modes. This is about an order of magnitude more stringent than previous constraints on invisible proton decay modes and 400 times more stringent than those on similar neutron modes.

    Comment: Update includes the missing efficiency factor (limits change by a factor of 2). Submitted to Physical Review Letters.
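
    The quoted limit follows from standard counting arithmetic: τ_inv > N·T·ε / S90, where N is the number of nuclei monitored, T the livetime, ε the detection efficiency, and S90 the 90% CL upper limit on signal counts. The sketch below uses placeholder inputs chosen only to reproduce the order of magnitude; they are not SNO's published exposure, efficiency, or signal bound:

```python
# tau_inv > N * T * eps / S90: the counting arithmetic behind the limit.
# All inputs are placeholders, not SNO's published values.
N_O16 = 2.0e31       # hypothetical number of 16O nuclei monitored
livetime_yr = 1.0    # hypothetical livetime in years
efficiency = 0.5     # hypothetical de-excitation gamma detection efficiency
S90 = 50.0           # hypothetical 90% CL upper limit on signal counts

tau_inv = N_O16 * livetime_yr * efficiency / S90
print(f"tau_inv > {tau_inv:.0e} years at 90% CL")  # 2e+29 with these inputs
```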

    First Neutrino Observations from the Sudbury Neutrino Observatory

    The first neutrino observations from the Sudbury Neutrino Observatory are presented from preliminary analyses. Based on energy, direction, and location, the data in the region of interest appear to be dominated by 8B solar neutrinos, detected via the charged-current reaction on deuterium and elastic scattering from electrons, with very little background. Measurements of radioactive backgrounds indicate that the measurement of all active neutrino types via the neutral-current reaction on deuterium will be possible with small systematic uncertainties. Quantitative results for the fluxes observed with these reactions will be provided when further calibrations have been completed.

    Comment: LaTeX, 7 pages, 10 figures. Invited paper at the Neutrino 2000 Conference, Sudbury, Canada, June 16-21, 2000; to be published in the Proceedings.

    Partitioning the Proteome: Phase Separation for Targeted Analysis of Membrane Proteins in Human Post-Mortem Brain

    Neuroproteomics is a powerful platform for targeted and hypothesis-driven research, providing comprehensive insights into cellular and sub-cellular disease states, gene × environment effects, and cellular responses to medication in human, animal, and cell culture models. Analysis of sub-proteomes is becoming increasingly important in clinical proteomics, enriching for otherwise undetectable proteins that are possible markers for disease. Membrane proteins are one such sub-proteome class that merits in-depth targeted analysis, particularly in psychiatric disorders. As membrane proteins are notoriously difficult to analyse using traditional proteomics methods, we evaluate a paradigm to enrich for and study membrane proteins from human post-mortem brain tissue. This is the first study to extensively characterise the integral trans-membrane-spanning proteins present in human brain. Using Triton X-114 phase separation and LC-MS/MS analysis, we enriched for and identified 494 membrane proteins, with 194 trans-membrane helices present, ranging from 1 to 21 helices per protein. Isolated proteins included glutamate receptors, G proteins, voltage-gated and calcium channels, synaptic proteins, and myelin proteins, all of which warrant quantitative proteomic investigation in psychiatric and neurological disorders. Overall, our sub-proteome analysis reduced sample complexity and enriched for integral membrane proteins 2.3-fold, thus allowing for more manageable, reproducible, and targeted proteomics in case vs. control biomarker studies. This study provides a valuable reference for future neuroproteomic investigations of membrane proteins and validates the use of Triton X-114 detergent phase extraction on human post-mortem brain.
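
    The reported fold enrichment is a simple ratio: the fraction of integral membrane proteins among identifications in the detergent phase divided by the same fraction in an unfractionated sample. A hedged illustration follows; the detergent-phase counts come from the abstract, while the whole-lysate counts are hypothetical values chosen only to reproduce the reported 2.3-fold figure:

```python
# Fold enrichment = fraction of membrane proteins in the detergent phase
# divided by the fraction in the unfractionated sample.
membrane_in_phase, total_in_phase = 194, 494      # from the abstract
membrane_in_lysate, total_in_lysate = 680, 4000   # hypothetical baseline

fold = (membrane_in_phase / total_in_phase) / (membrane_in_lysate / total_in_lysate)
print(f"integral membrane protein enrichment: {fold:.1f}-fold")
```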

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.

    Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.

    Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).

    Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
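
    The two headline analyses, ordinal association between grade and outcome and the scale's discriminative accuracy, map directly onto standard library calls. A minimal sketch on simulated data (not the study's code or variable names):

```python
import numpy as np
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
grade = rng.integers(1, 6, size=1000)        # Nassar grade 1-5
risk = 0.04 * grade                          # outcome risk rises with grade
converted = rng.random(1000) < risk          # e.g. conversion to open surgery

tau, p_value = kendalltau(grade, converted)  # ordinal association
auc = roc_auc_score(converted, grade)        # discriminative accuracy of the scale
print(f"Kendall's tau = {tau:.2f} (p = {p_value:.2g}), AUROC = {auc:.2f}")
```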

    Determinants of change in subtropical tree diameter growth with ontogenetic stage

    We evaluated the degree to which the relative growth rate (RGR) of saplings and large trees is related to seven functional traits that describe physiological behavior, and to soil environmental factors related to topography and fertility, for 57 subtropical tree species in Dinghushan, China. Which species-mean functional traits and soil environmental factors were related to RGR varied with ontogenetic stage. Sapling RGR was more strongly related to functional traits than large-tree RGR, whereas large-tree RGR was more associated with the soil environment than was sapling RGR. The strongest single predictors of RGR were wood density for saplings and slope aspect for large trees. The stepwise regression model for large trees accounted for a larger proportion of variability in RGR (R² = 0.95) than the model for saplings (R² = 0.55). Functional diversity analysis revealed that habitat filtering likely contributes to the substantial changes in the regulation of RGR as communities transition from saplings to large trees.
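
    RGR here follows the standard definition from repeated size measurements, RGR = (ln d2 − ln d1) / (t2 − t1); the paper's exact formulation is not quoted, so the sketch below is a hedged illustration with hypothetical diameters, including a single-predictor fit of the kind reported (e.g. sapling RGR against wood density):

```python
import numpy as np

# Hypothetical repeated diameter measurements (cm) over a 5-year census interval
d1 = np.array([5.0, 12.0, 30.0])
d2 = np.array([6.1, 13.5, 31.2])
years = 5.0
rgr = (np.log(d2) - np.log(d1)) / years   # RGR = (ln d2 - ln d1) / (t2 - t1)

# Single-predictor fit, e.g. RGR against wood density (hypothetical trait values)
wood_density = np.array([0.72, 0.55, 0.40])
slope, intercept = np.polyfit(wood_density, rgr, 1)
r_squared = np.corrcoef(wood_density, rgr)[0, 1] ** 2
print(f"RGR per year: {np.round(rgr, 4)}; slope = {slope:.3f}, R^2 = {r_squared:.2f}")
```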

    Effectiveness of Biodiversity Surrogates for Conservation Planning: Different Measures of Effectiveness Generate a Kaleidoscope of Variation

    Conservation planners represent many aspects of biodiversity by using surrogates with spatial distributions that are readily observed or quantified, but tests of their effectiveness have produced varied and conflicting results. We identified four factors likely to have a strong influence on the apparent effectiveness of surrogates: (1) the choice of surrogate; (2) differences among study regions, which might be large and unquantified; (3) the test method, that is, how effectiveness is quantified; and (4) the test features that the surrogates are intended to represent. Analysis of an unusually rich dataset enabled us, for the first time, to disentangle these factors and to compare their individual and interacting influences. Using two data-rich regions, we estimated effectiveness with five alternative methods: two forms of incidental representation, two forms of species accumulation index, and irreplaceability correlation, to assess the performance of ‘forest ecosystems’ and ‘environmental units’ as surrogates for six groups of threatened species (the test features): mammals, birds, reptiles, frogs, plants, and all of these combined. Four methods tested the effectiveness of the surrogates by selecting areas for conservation of the surrogates and then estimating how effective those areas were at representing test features; one method measured the spatial match between conservation priorities for surrogates and test features. For the methods that selected conservation areas, we measured effectiveness either (1) when representation targets for the surrogates were achieved (incidental representation) or (2) progressively as areas were selected (species accumulation index). We estimated the spatial correlation of conservation priorities using an index known as summed irreplaceability. In general, the effectiveness of surrogates for our taxa (mostly threatened species) was low, although environmental units tended to be more effective than forest ecosystems. The surrogates were most effective for plants and mammals and least effective for frogs and reptiles. The five testing methods differed in how they ranked the effectiveness of the two surrogates in relation to the different groups of test features, and there were differences between study areas in the effectiveness of surrogates for different test-feature groups. Overall, the effectiveness of the surrogates was sensitive to all four factors. This indicates the need for caution in generalizing surrogacy tests.
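
    Of the five testing methods, incidental representation is the most mechanical: select areas until the surrogate's targets are met, then measure how many test features those areas capture by accident. A minimal sketch of that logic on simulated presence/absence data (assumed logic, not the study's code):

```python
import numpy as np

rng = np.random.default_rng(2)
n_areas, n_surrogates, n_features = 200, 12, 60
surr = rng.random((n_areas, n_surrogates)) < 0.15   # surrogate classes per area
feat = rng.random((n_areas, n_features)) < 0.05     # test features per area

# Greedily add the area that covers the most still-unmet surrogate classes
unmet, chosen = set(range(n_surrogates)), []
while unmet:
    gains = surr[:, sorted(unmet)].sum(axis=1)
    best = int(np.argmax(gains))
    if gains[best] == 0:   # no area can cover the remaining classes
        break
    chosen.append(best)
    unmet -= set(np.flatnonzero(surr[best]).tolist())

# Incidental representation: test features captured by the chosen areas
covered = feat[chosen].any(axis=0).mean()
print(f"{len(chosen)} areas meet surrogate targets; "
      f"{covered:.0%} of test features incidentally represented")
```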

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all-cause 30-day readmissions and complications in a prospective population-based cohort.

    Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all-cause 30-day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two-level hierarchical structure with patients (level 1) nested within hospitals (level 2).

    Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission, but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications.

    Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics.
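
    The model described, a logistic regression with patient-level fixed effects and a random intercept per hospital, can be sketched with statsmodels' variational Bayes mixed GLM. The variable names and simulated data below are illustrative assumptions, not the study's dataset or code:

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "hospital": rng.integers(0, 40, n),     # level-2 grouping factor
    "asa": rng.integers(1, 5, n),           # illustrative patient covariates
    "op_minutes": rng.normal(60, 20, n),
})
hospital_effect = rng.normal(0, 0.3, 40)[df["hospital"]]
logit = -3 + 0.4 * df["asa"] + 0.01 * df["op_minutes"] + hospital_effect
df["readmit"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Fixed effects for patient covariates; random intercept per hospital
model = BinomialBayesMixedGLM.from_formula(
    "readmit ~ asa + op_minutes", {"hospital": "0 + C(hospital)"}, df)
print(model.fit_vb().summary())
```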

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery.

    METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only, and the impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin.

    RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes.

    CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
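
    The unadjusted comparison reduces to an odds ratio with a Wald confidence interval on a 2×2 table. The sketch below reconstructs approximate counts from the percentages quoted in the abstract (1744 delayed of 4304 operated patients without neoadjuvant therapy, hence 2560 non-delayed); the rounded counts are approximations, not the study's exact data:

```python
import math

# Approximate counts reconstructed from the abstract's percentages
delayed_r0, delayed_n = round(0.937 * 1744), 1744   # delayed group
nondel_r0, nondel_n = round(0.919 * 2560), 2560     # non-delayed group

a, b = delayed_r0, delayed_n - delayed_r0   # delayed: R0 / not R0
c, d = nondel_r0, nondel_n - nondel_r0      # non-delayed: R0 / not R0
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Wald SE on log scale
lo = odds_ratio * math.exp(-1.96 * se_log_or)
hi = odds_ratio * math.exp(1.96 * se_log_or)
print(f"unadjusted OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```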