
    Sensitivity and reproducibility of standardized-competitive RT-PCR for transcript quantification and its comparison with real time RT-PCR

    BACKGROUND: Probe-based detection assays form the mainstay of transcript quantification. Problems with these assays include the varying hybridization efficiencies of the probes used for transcript quantification and the expense involved. We examined the ability of a standardized competitive RT-PCR (StaRT PCR) assay to quantify transcripts of four cell cycle-associated genes (RB, E2F1, CDKN2A and PCNA) in two cell lines (T24 and LD419) and compared its efficacy with the established TaqMan real-time quantitative RT-PCR assay. We also assessed the sensitivity, reproducibility and consistency of StaRT PCR. The StaRT PCR assay is based on the incorporation of competitive templates (CT) in precisely standardized quantities along with the native template (NT) in a PCR reaction. This enables transcript quantification by comparing the NT and CT band intensities at the end of the PCR amplification. The CT serves as an ideal internal control. Transcript numbers are expressed as copies per million transcripts of a control gene such as β-actin (ACTB).

    RESULTS: The NT and CT were amplified at remarkably similar rates throughout the StaRT PCR amplification cycles, and the coefficient of variation was lowest (<3.8%) when the NT/CT ratio was kept as close to 1:1 as possible. The variability between the rates of amplification in different tubes subjected to the same StaRT PCR reaction was very low and within the range of experimental noise. Further, StaRT PCR was sensitive enough to detect variations as small as 10% in endogenous actin transcript quantity (p < 0.01 by the paired Student's t-test). StaRT PCR correlated well with the TaqMan real-time RT-PCR assay in terms of transcript quantification efficacy (p < 0.01 for all four genes by the Spearman rank correlation method) and the ability to discriminate between cell types and confluence patterns.

    CONCLUSION: StaRT PCR is thus a reliable and sensitive technique that can be applied to medium- to high-throughput quantitative transcript measurement. Further, it correlates well with TaqMan real-time PCR in quantitative and discriminatory ability. This label-free, inexpensive technique may provide the ability to generate prognostically important molecular signatures unique to individual tumors and may enable identification of novel therapeutic targets.
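    As a concrete illustration of the quantification scheme described above, the following minimal Python sketch computes transcript abundance from hypothetical numbers; the function name, spike-in amounts, and all values are illustrative, not taken from the study. Because the CT is spiked in at a known copy number and amplifies at the same rate as the NT, the NT copy number follows directly from the measured NT/CT band-intensity ratio and is then normalized to ACTB.

    ```python
    # Minimal sketch of StaRT PCR quantification (hypothetical numbers):
    # the competitive template (CT) is spiked in at a known copy number
    # and amplifies at the same rate as the native template (NT), so the
    # NT copy number follows from the measured NT/CT band-intensity ratio.
    # Transcripts are then reported per million copies of a control gene.

    def nt_copies(band_ratio_nt_ct: float, ct_copies: float) -> float:
        """NT copies = (NT/CT band-intensity ratio) x known CT copies."""
        return band_ratio_nt_ct * ct_copies

    # hypothetical measurements for one target gene and the ACTB control
    gene_nt = nt_copies(band_ratio_nt_ct=1.2, ct_copies=1e5)  # target gene
    actb_nt = nt_copies(band_ratio_nt_ct=0.9, ct_copies=1e7)  # beta-actin

    copies_per_million_actb = gene_nt / actb_nt * 1e6
    print(f"{copies_per_million_actb:.0f} copies per 10^6 ACTB transcripts")
    ```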

    Instantons, Quivers and Noncommutative Donaldson-Thomas Theory

    We construct noncommutative Donaldson-Thomas invariants associated with abelian orbifold singularities by analysing the instanton contributions to a six-dimensional topological gauge theory. The noncommutative deformation of this gauge theory localizes on noncommutative instantons, which can be classified in terms of three-dimensional Young diagrams with a colouring of boxes according to the orbifold group. We construct a moduli space for these gauge field configurations which allows us to compute its virtual numbers via the counting of representations of a quiver with relations. The quiver encodes the instanton dynamics of the noncommutative gauge theory, and is associated to the geometry of the singularity via the generalized McKay correspondence. The index of BPS states which computes the noncommutative Donaldson-Thomas invariants is realized via topological quantum mechanics based on the quiver data. We illustrate these constructions with several explicit examples, involving also higher rank Coulomb branch invariants and geometries with compact divisors, and connect our approach with others in the literature.

    Comment: 95 pages, 5 figures; v2: clarifying comments added, discussions using tilting strengthened, references added and updated; v3: minor corrections, final version to be published in Nuclear Physics
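    The combinatorial core of this counting can be made concrete with a short sketch (not from the paper): three-dimensional Young diagrams, i.e. plane partitions, are enumerated by the MacMahon function M(q) = ∏_{k≥1} (1 − q^k)^{−k}, and rank-one Donaldson-Thomas counting on affine 3-space is governed by M(−q); the orbifold colouring described above refines this count by grading boxes by colour, which the sketch below does not implement.

    ```python
    # Count plane partitions (3d Young diagrams) of n boxes via the
    # MacMahon generating function  M(q) = prod_{k>=1} 1/(1-q^k)^k.
    # Rank-one Donaldson-Thomas counting on C^3 is governed by M(-q);
    # orbifold colourings refine this by tracking boxes per colour
    # (not implemented in this sketch).

    N = 12  # truncation order: count plane partitions with up to N boxes

    coeff = [1] + [0] * N  # series coefficients, starting from 1
    for k in range(1, N + 1):
        # multiply k times by the geometric series 1/(1 - q^k)
        for _ in range(k):
            for n in range(k, N + 1):
                coeff[n] += coeff[n - k]

    print(coeff)  # [1, 1, 3, 6, 13, 24, 48, 86, 160, 282, 500, 859, 1479]
    ```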

    Extreme drought impacts have been underestimated in grasslands and shrublands globally

    Climate change is increasing the frequency and severity of short-term (~1 y) drought events, the most common duration of drought, globally. Yet the impact of this intensification of drought on ecosystem functioning remains poorly resolved. This is due in part to the widely disparate approaches ecologists have employed to study drought, variation in the severity and duration of the droughts studied, and differences among ecosystems in the vegetation, edaphic and climatic attributes that can mediate drought impacts. To overcome these problems and better identify the factors that modulate drought responses, we used a coordinated distributed experiment to quantify the impact of short-term drought on grassland and shrubland ecosystems. With a standardized approach, we imposed approximately one year of drought at 100 sites on six continents. Here we show that the loss of a foundational ecosystem function, aboveground net primary production (ANPP), was 60% greater at sites that experienced statistically extreme drought (a 1-in-100-y event) than at sites where drought was nominal (historically more common) in magnitude (35% vs. 21% reduction, respectively). This reduction in a key carbon cycle process after a single year of extreme drought greatly exceeds previously reported losses for grasslands and shrublands. Our global experiment also revealed high variability in drought response, but showed that relative reductions in ANPP were greater in drier ecosystems and in those with fewer plant species. Overall, our results demonstrate with unprecedented rigor that the global impacts of projected increases in drought severity have been significantly underestimated and that drier and less diverse sites are likely to be most vulnerable to extreme drought.
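    A minimal sketch of the two quantities reported above, assuming hypothetical data, follows. It also assumes that "statistically extreme" is operationalized as treatment-year precipitation below the 1st percentile of a site's historical record, one plausible reading of a 1-in-100-y event; the study's exact criterion may differ.

    ```python
    # Hedged sketch (hypothetical data):
    # (1) relative ANPP reduction at a site: (control - drought) / control;
    # (2) whether the imposed drought was "statistically extreme", taken
    #     here as drier than the 1st percentile (1-in-100-y) of the site's
    #     historical precipitation record.

    import numpy as np

    def relative_anpp_reduction(anpp_control: float, anpp_drought: float) -> float:
        """Fractional loss of aboveground net primary production."""
        return (anpp_control - anpp_drought) / anpp_control

    def is_extreme_drought(precip_treatment: float, precip_history: np.ndarray) -> bool:
        """Treatment-year precipitation below the historical 1st percentile."""
        return precip_treatment < np.percentile(precip_history, 1)

    # hypothetical site: 100-y precipitation record, mean 500 mm, sd 100 mm
    rng = np.random.default_rng(0)
    history = rng.normal(500, 100, size=100)

    print(relative_anpp_reduction(320.0, 208.0))  # 0.35, i.e. a 35% loss
    print(is_extreme_drought(250.0, history))
    ```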

    The handbook for standardized field and laboratory measurements in terrestrial climate change experiments and observational studies (ClimEx)

    1. Climate change is a world-wide threat to biodiversity and ecosystem structure, functioning and services. To understand the underlying drivers and mechanisms, and to predict the consequences for nature and people, we urgently need a better understanding of the direction and magnitude of climate change impacts across the soil–plant–atmosphere continuum. An increasing number of climate change studies are creating new opportunities for meaningful and high-quality generalizations and improved process understanding. However, significant challenges exist related to data availability and/or compatibility across studies, compromising opportunities for data re-use, synthesis and upscaling. Many of these challenges relate to the lack of an established 'best practice' for measuring key impacts and responses. This constrains our current understanding of complex processes and mechanisms in terrestrial ecosystems related to climate change.

    2. To overcome these challenges, we collected best-practice methods emerging from major ecological research networks and experiments, as synthesized by 115 experts from across a wide range of scientific disciplines. Our handbook contains guidance on the selection of response variables for different purposes, protocols for standardized measurements of 66 such response variables and advice on data management. Specifically, we recommend a minimum subset of variables that should be collected in all climate change studies to allow data re-use and synthesis, and give guidance on additional variables critical for different types of synthesis and upscaling. The goal of this community effort is to facilitate awareness of the importance and broader application of standardized methods to promote data re-use, availability, compatibility and transparency. We envision improved research practices that will increase returns on investments in individual research projects, facilitate second-order research outputs and create opportunities for collaboration across scientific communities. Ultimately, this should significantly improve the quality and impact of the science, which is required to fulfil society's needs in a changing world.
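    As an illustration only, a standardized, machine-readable record like the sketch below is one way such a minimum variable subset could be stored for re-use and synthesis; the field names are hypothetical and do not reproduce the handbook's actual recommended variables or protocols.

    ```python
    # Illustrative sketch only: a standardized record for a climate change
    # study, with a hypothetical minimum variable subset (the handbook's
    # actual recommended variables and protocols are not reproduced here).

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class SiteYearRecord:
        site_id: str
        year: int
        air_temperature_c: float   # hypothetical core variable
        soil_moisture_vwc: float   # volumetric water content, m3/m3
        anpp_g_per_m2: float       # aboveground net primary production
        protocol_version: str      # which standardized protocol was followed

    record = SiteYearRecord("SITE-001", 2020, 9.4, 0.23, 410.0, "ClimEx-v1")
    print(json.dumps(asdict(record), indent=2))  # machine-readable for re-use
    ```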

    Serum bile acids as a prognostic biomarker in biliary atresia following Kasai portoenterostomy

    Background and aims: In biliary atresia, serum bilirubin is commonly used to predict outcomes after Kasai portoenterostomy (KP). Infants with persistently high levels invariably need liver transplant, but those achieving normalized levels have a less certain disease course. We hypothesized that serum bile acid levels could help predict outcomes in the latter group.

    Approach and results: Participants with biliary atresia from the Childhood Liver Disease Research Network were included if they had normalized bilirubin levels 6 months after KP and stored serum samples from the 6-month post-KP clinic visit (n = 137). Bile acids were measured from the stored serum samples and used to divide participants into ≤40 μmol/L (n = 43) or >40 μmol/L (n = 94) groups. At 2 years of age, the ≤40 μmol/L group, compared with the >40 μmol/L group, had significantly lower total bilirubin, aspartate aminotransferase, alanine aminotransferase, gamma-glutamyltransferase, bile acids, and spleen size, as well as significantly higher albumin and platelet counts. Furthermore, during 734 person-years of follow-up, those in the ≤40 μmol/L group were significantly less likely to develop splenomegaly, ascites, gastrointestinal bleeding, or clinically evident portal hypertension. The ≤40 μmol/L group had a 10-year cumulative incidence of liver transplant/death of 8.5% (95% CI: 1.1%-26.1%), compared with 42.9% (95% CI: 28.6%-56.4%) for the >40 μmol/L group (p = 0.001).

    Conclusions: Serum bile acid levels may be a useful prognostic biomarker for infants achieving normalized bilirubin levels after KP.
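    A minimal sketch, assuming hypothetical patient data, of the stratification step described above: participants are split at the 40 μmol/L serum bile acid threshold and the composite endpoint (liver transplant or death) is tallied per group.

    ```python
    # Hedged sketch (hypothetical data): stratify infants by the 6-month
    # post-Kasai serum bile acid threshold used above (40 umol/L) and
    # tally the composite endpoint (liver transplant or death) per group.

    patients = [
        # (patient_id, bile_acids_umol_per_l, transplant_or_death)
        ("BA-01", 18.0, False),
        ("BA-02", 95.0, True),
        ("BA-03", 36.5, False),
        ("BA-04", 61.0, False),
    ]

    low = [p for p in patients if p[1] <= 40.0]   # <=40 umol/L group
    high = [p for p in patients if p[1] > 40.0]   # >40 umol/L group

    for name, group in (("<=40 umol/L", low), (">40 umol/L", high)):
        events = sum(p[2] for p in group)
        print(f"{name}: {events}/{len(group)} transplant/death events")
    ```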