71 research outputs found

    Effect of RNA quality on transcript intensity levels in microarray analysis of human post-mortem brain tissues

    Background: Large-scale gene expression analysis of post-mortem brain tissue offers unique opportunities for investigating genetic mechanisms of psychiatric and neurodegenerative disorders. However, the microarray data analysis associated with these studies is a challenging task. Here we address the issue of low-RNA-quality data and the corresponding data analysis strategies.

    Results: A detailed analysis of the effects of post-chip RNA quality on the measured abundance of transcripts is presented. Affymetrix GeneChip data (HG-U133_AB and HG-U133_Plus_2.0) derived from ten different brain regions were investigated. Post-chip RNA quality, assessed by the 5'/3' ratio of housekeeping genes, was found to introduce pronounced systematic noise into the measured transcript expression levels. According to this study, RNA quality effects have 1) a "random" component introduced by the technology and 2) a systematic component that depends on features of the transcripts and probes. The random component mainly accounts for numerous negative correlations of low-abundance transcripts; these correlations are not reproducible and arise mainly from an increased relative level of noise. Three major contributors to the systematic noise component were identified: the probe set distribution, the length of the mRNA species, and the stability of the mRNA species. Positive correlations reflect the 5'-end to 3'-end direction of mRNA degradation, whereas negative correlations result from the compensatory increase of stable, 3'-end-probed transcripts. Systematic components introduce irrelevant gene correlations among the expressed transcripts and can strongly influence the results of the main experiment. A linear model correcting the effect of RNA quality on measured intensities was introduced.

    In addition, the contribution of a number of pre-mortem and post-mortem attributes to the overall detected RNA quality effect was investigated. Brain pH, duration of the agonal stage, post-mortem interval before sampling and the donor's age at death, within the considered limits, were found to have no significant contribution.

    Conclusion: Basic conclusions for data analysis in expression profiling studies are as follows: 1) testing for RNA quality dependency should be included in the preprocessing of the data; 2) investigating inter-gene correlation without regard to RNA quality effects can be misleading; 3) data normalization procedures relying on housekeeping genes either do not influence the correlation structure (if 3'-end intensities are used) or increase it for negatively correlated transcripts (if 5'-end or median intensities are included in the normalization procedure); 4) sample sets should be matched with regard to RNA quality; 5) RMA preprocessing is more sensitive to the RNA quality effect than MAS 5.0.
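A linear correction of this kind can be sketched as follows. This is a minimal illustration, not the authors' published model: it assumes the quality effect on each probe set's log2 intensity is linear in the sample's 5'/3' housekeeping-gene ratio, fits a per-probe-set slope by least squares, and subtracts the quality-dependent component while preserving each probe set's mean.

```python
import numpy as np

def correct_rna_quality(log_intensity, ratio_53):
    """Remove a linear RNA-quality component from log2 intensities.

    log_intensity: (n_probesets, n_samples) array of log2 expression values
    ratio_53:      (n_samples,) array of 5'/3' ratios of housekeeping genes
    """
    centered_ratio = ratio_53 - ratio_53.mean()
    corrected = np.empty_like(log_intensity)
    for i, y in enumerate(log_intensity):
        # least-squares slope of intensity vs. RNA-quality ratio for this probe set
        slope = np.dot(centered_ratio, y - y.mean()) / np.dot(centered_ratio, centered_ratio)
        # subtract the quality-dependent part; the probe-set mean is unchanged
        corrected[i] = y - slope * centered_ratio
    return corrected
```

In practice one would fit such a model only to probe sets where the quality dependency is statistically significant, in line with the paper's recommendation to test for RNA quality dependency during preprocessing.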

    Phenocopy – A Strategy to Qualify Chemical Compounds during Hit-to-Lead and/or Lead Optimization

    A phenocopy is defined as an environmentally induced phenotype of one individual that is identical to the genotype-determined phenotype of another individual. The phenocopy phenomenon has been translated to the drug discovery process, as phenotypes produced by treating biological systems with new chemical entities (NCEs) may resemble environmentally induced phenotypic modifications. Various NCEs that inhibit the kinase activity of Transforming Growth Factor β Receptor I (TGF-βR1) were qualified by high-throughput RNA expression profiling. This chemical genomics approach provided precise, time-dependent insight into TGF-β biology and furthermore allowed a comprehensive analysis of each NCE's off-target effects. Evaluating off-target effects by the phenocopy approach allows a more accurate and integrated view of optimized compounds, supplementing classical biological evaluation parameters such as potency and selectivity. It therefore has the potential to become a novel method for ranking compounds during various drug discovery phases.

    FemtoSpeX: a versatile optical pump–soft X-ray probe facility with 100 fs X-ray pulses of variable polarization

    Here the major upgrades of the femtoslicing facility at BESSY II (Khan et al., 2006) are reviewed, giving a tutorial on how elliptically polarized ultrashort soft X-ray pulses from electron storage rings are generated at high repetition rates. Employing a 6 kHz femtosecond-laser system consisting of two amplifiers seeded by one Ti:Sa oscillator, the total average flux of photons of 100 fs duration (FWHM) has been increased by a factor of 120, up to 10^6 photons s^-1 (0.1% bandwidth)^-1 on the sample in the range from 250 to 1400 eV. Thanks to a new beamline design, a factor of 20 enhancement in flux and improved stability, together with the top-up mode of the accelerator, have been achieved. The previously unavoidable problem of increased picosecond background at higher repetition rates, caused by 'halo' photons, has also been solved by hopping between different 'camshaft' bunches in a dedicated fill pattern ('3+1 camshaft fill') of the storage ring. In addition to increased X-ray performance at variable (linear and elliptical) polarization, the sample excitation in pump-probe experiments has been considerably extended using an optical parametric amplifier that supports the range from the near-UV to the far-IR regime. Dedicated endstations covering ultrafast magnetism experiments based on time-resolved X-ray circular dichroism have been either upgraded or, in the case of time-resolved resonant soft X-ray diffraction and reflection, newly constructed and adapted to femtoslicing requirements. Experiments at temperatures down to 6 K and magnetic fields up to 0.5 T are supported. The FemtoSpeX facility is now operated as a 24 h user facility, enabling a new class of experiments in ultrafast magnetism and in the field of transient phenomena and phase transitions in solids.

    IgG and Fcγ Receptors in Intestinal Immunity and Inflammation.

    Fcγ receptors (FcγR) are cell surface glycoproteins that mediate cellular effector functions of immunoglobulin G (IgG) antibodies. Genetic variation in FcγR genes can influence susceptibility to a variety of antibody-mediated autoimmune and inflammatory disorders, including systemic lupus erythematosus (SLE) and rheumatoid arthritis (RA). More recently, however, genetic studies have implicated altered FcγR signaling in the pathogenesis of inflammatory bowel disease (IBD), a condition classically associated with dysregulated innate and T cell immunity. Specifically, a variant of the activating receptor FcγRIIA with low affinity for IgG confers protection against the development of ulcerative colitis, a subset of IBD, prompting a re-evaluation of the role of IgG and FcγRs in the immunity of the gastrointestinal tract, an organ system traditionally associated with IgA. In this review, we summarize our current understanding of IgG and FcγR function at this unique host-environment interface, from the pathogenesis of colitis and defense against enteropathogens to contributions to maternal-fetal cross-talk and susceptibility to cancer. Finally, we discuss the therapeutic implications of this information, both in terms of how FcγR signaling pathways may be targeted for the treatment of IBD and how FcγR engagement may influence the efficacy of therapeutic monoclonal antibodies in IBD.

    High-resolution transcriptomic and epigenetic profiling identifies novel regulators of COPD

    Patients with chronic obstructive pulmonary disease (COPD) are still waiting for curative treatments. Considering its environmental cause, we hypothesized that COPD would be associated with altered epigenetic signaling in lung cells. We generated genome-wide DNA methylation maps at single-CpG resolution of primary human lung fibroblasts (HLFs) across COPD stages. We show that the epigenetic landscape changes early in COPD, with DNA methylation changes occurring predominantly in regulatory regions. RNA sequencing of matched fibroblasts demonstrated dysregulation of genes involved in proliferation, DNA repair, and extracellular matrix organization. Data integration identified 110 candidate regulators of disease phenotypes that were linked to fibroblast repair processes using phenotypic screens. Our study provides high-resolution multi-omic maps of HLFs across COPD stages. We reveal novel transcriptomic and epigenetic signatures associated with COPD onset and progression and identify new candidate regulators involved in the pathogenesis of chronic lung diseases. The presence of various epigenetic factors among the candidates demonstrates that epigenetic regulation in COPD is an exciting research field that holds promise for novel therapeutic avenues for patients.

    Disentangling type 2 diabetes and metformin treatment signatures in the human gut microbiota

    In recent years, several associations between common chronic human disorders and altered gut microbiome composition and function have been reported(1,2). In most of these reports, treatment regimens were not controlled for, and conclusions could thus be confounded by the effects of various drugs on the microbiota, which may obscure microbial causes, protective factors or diagnostically relevant signals. Our study addresses disease and drug signatures in the human gut microbiome of type 2 diabetes mellitus (T2D). Two previous quantitative gut metagenomics studies of T2D patients that were unstratified for treatment yielded divergent conclusions regarding its associated gut microbial dysbiosis(3,4). Here we show, using 784 available human gut metagenomes, how antidiabetic medication confounds these results, and analyse in detail the effects of the most widely used antidiabetic drug, metformin. We provide support for microbial mediation of the therapeutic effects of metformin through short-chain fatty acid production, as well as for potential microbiota-mediated mechanisms behind known intestinal adverse effects, in the form of a relative increase in abundance of Escherichia species. Controlling for metformin treatment, we report a unified signature of gut microbiome shifts in T2D with a depletion of butyrate-producing taxa(3,4). These in turn cause functional microbiome shifts, in part alleviated by metformin-induced changes. Overall, the present study emphasizes the need to disentangle gut microbiota signatures of specific human diseases from those of medication.

    Comparison of normalization methods for Illumina BeadChip HumanHT-12 v3

    Background: Normalization of microarrays is standard practice to account for and minimize effects that are not due to the controlled factors in an experiment. There is an overwhelming number of different methods that can be applied, none of which is ideally suited to all experimental designs. It is therefore important to identify a normalization method appropriate for the experimental setup under consideration that is neither too negligent nor too stringent; the major aim is to derive optimal results from the underlying experiment. Comparisons of different normalization methods have been conducted before, but none of them, to our knowledge, compared more than a handful of methods.

    Results: In the present study, 25 different ways of pre-processing Illumina Sentrix BeadChip array data are compared, including methods provided by the BeadStudio software. Using different statistical measures, we point out the ideal versus the actual observations. Additionally, we compare qRT-PCR measurements of transcripts from different ranges of expression intensity to the corresponding normalized values of the microarray data. Taking all these measures together, the method best suited to our dataset is identified.

    Conclusions: Pre-processing of microarray gene expression experiments has been shown to influence downstream analysis to a great extent and therefore has to be chosen carefully based on the design of the experiment. This study provides a recommendation for deciding which normalization method is best suited for a particular experimental setup.
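The abstract does not name its recommended method, but quantile normalization is one widely used option typically included in such comparisons. A minimal sketch, assuming a probes-by-samples matrix and ignoring tie handling, forces every sample to share the same empirical distribution:

```python
import numpy as np

def quantile_normalize(expr):
    """Quantile-normalize a (n_probes, n_samples) expression matrix.

    Each column is mapped onto a common reference distribution:
    the mean across samples of each sorted rank.
    """
    ranks = expr.argsort(axis=0).argsort(axis=0)       # rank of each value within its column
    mean_sorted = np.sort(expr, axis=0).mean(axis=1)   # reference distribution (per rank)
    return mean_sorted[ranks]                          # map every value to its rank's reference
```

After this transformation all samples have identical value distributions, which removes between-array intensity biases but, as the abstract cautions, may be too stringent for some designs.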