
    Assessment of the role of transcript for GATA-4 as a marker of unfavorable outcome in human adrenocortical neoplasms

    BACKGROUND: Malignant neoplasia of the adrenal cortex is usually associated with very poor prognosis. When adrenocortical neoplasms are diagnosed in the early stages, distinguishing carcinoma from adenoma can be very difficult, since there is as yet no reliable marker to predict tumor recurrence or dissemination. GATA transcription factors play an essential role in the developmental control of cell fate, cell proliferation and differentiation, organ morphogenesis, and tissue-specific gene expression. Normal mouse adrenal cortex expresses GATA-6, whereas its malignant counterpart expresses only GATA-4. The goal of the present study was to assess whether this reciprocal change in the expression of GATA factors might be relevant for predicting the prognosis of human adrenocortical neoplasms. Since human adrenal cortices express the luteinizing hormone (LH/hCG) receptor and gonadotropins are known to up-regulate GATA-4 in gonadal tumor cell lines, we also studied the expression of the LH/hCG receptor. METHODS: We conducted a study of 13 non-metastasizing (NM) and 10 metastasizing/recurrent (MR) tumors obtained from a group of twenty-two adult and pediatric patients. The expression of GATA-4, GATA-6, and the LH/hCG receptor (LHR) in normal and tumoral human adrenal cortices was analysed using reverse transcriptase-polymerase chain reaction (RT-PCR) complemented by dot blot hybridization. RESULTS: Messenger RNA for GATA-6 was detected in normal adrenal tissue as well as in all NM and MR tumors. GATA-4, in turn, was detected in normal adrenal tissue, in 11 of the 13 NM tumors, and in 9 of the 10 MR tumors, with larger amounts of mRNA found in tumors showing aggressive clinical behavior. Transcripts for the LH receptor were observed in both normal tissue and neoplasms, with more intense LHR transcript accumulation in tumors with a better clinical outcome. CONCLUSION: Our data suggest that the expression of GATA-6 in the human adrenal cortex is not affected by tumorigenesis. GATA-4 expression is more abundant in MR tumors, whereas NM tumors express LHR more intensely. Further studies with larger cohorts are needed to test whether relative expression levels of LHR or GATA-4 might be used as predictors of prognosis.

    The three major axes of terrestrial ecosystem function.

    The leaf economics spectrum [1,2] and the global spectrum of plant forms and functions [3] revealed fundamental axes of variation in plant traits, which represent different ecological strategies that are shaped by the evolutionary development of plant species [2]. Ecosystem functions depend on environmental conditions and the traits of species that comprise the ecological communities [4]. However, the axes of variation of ecosystem functions are largely unknown, which limits our understanding of how ecosystems respond as a whole to anthropogenic drivers, climate and environmental variability [4,5]. Here we derive a set of ecosystem functions [6] from a dataset of surface gas exchange measurements across major terrestrial biomes. We find that most of the variability within ecosystem functions (71.8%) is captured by three key axes. The first axis reflects maximum ecosystem productivity and is mostly explained by vegetation structure. The second axis reflects ecosystem water-use strategies and is jointly explained by variation in vegetation height and climate. The third axis, which represents ecosystem carbon-use efficiency, features a gradient related to aridity, and is explained primarily by variation in vegetation structure. We show that two state-of-the-art land surface models reproduce the first and most important axis of ecosystem functions. However, the models tend to simulate more strongly correlated functions than those observed, which limits their ability to accurately predict the full range of responses to environmental changes in carbon, water and energy cycling in terrestrial ecosystems [7,8].
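
    A three-axis result of this kind is the type of output a principal component analysis (PCA) of standardised per-site ecosystem functions would produce; the paper's exact pipeline is more involved, so the following Python sketch is only illustrative, and the column names (gpp_max, evapotranspiration, etc.) are placeholders rather than the study's variables.

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical site-by-function table: one row per flux-tower site.
    # Column names are illustrative placeholders, not the study's variables.
    rng = np.random.default_rng(0)
    functions = pd.DataFrame(
        rng.random((50, 5)),
        columns=["gpp_max", "evapotranspiration", "water_use_efficiency",
                 "carbon_use_efficiency", "net_radiation"],
    )

    # Standardise each function, then project onto the leading axes.
    scaled = StandardScaler().fit_transform(functions)
    pca = PCA(n_components=3)
    scores = pca.fit_transform(scaled)

    # Fraction of total variance captured by the first three axes
    # (71.8% in the study; random data will of course differ).
    print(pca.explained_variance_ratio_.sum())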

    Cytokine-associated neutrophil extracellular traps and antinuclear antibodies in Plasmodium falciparum infected children under six years of age

    Background: In Plasmodium falciparum-infected children, the relationships between blood cell histopathology, blood plasma components, development of immunocompetence and disease severity remain poorly understood. Blood from Nigerian children with uncomplicated malaria was analysed to gain insight into these relationships. This investigation presents evidence for circulating neutrophil extracellular traps (NETs) and antinuclear IgG antibodies (ANA). The presence of NETs and ANA to double-stranded DNA, along with the cytokine profiles found, suggests autoimmune mechanisms that could produce pathogenesis in children but immunoprotection in adults. Methods: Peripheral blood smear slides and blood samples obtained from 21 Nigerian children under six years of age, presenting with uncomplicated malaria, before and seven days after initiation of sulphadoxine-pyrimethamine (SP) treatment were analysed. The slides were stained with Giemsa and with DAPI. Levels of the pro-inflammatory cytokines IFN-γ, IL-2, TNF, CRP, and IL-6, the anti-inflammatory cytokines TGF-β and IL-10, and ANA were determined by immunoassay. Results: The children exhibited circulating NETs with adherent parasites and erythrocytes, elevated ANA levels, a Th2-dominated cytokine profile, and left-shifted leukocyte differential counts. Nonspecific ANA levels were significant in 86% of the children pre-treatment and in 100% of the children seven days after SP treatment, but in only 33% of age-matched control samples collected during the season of low parasite transmission. Levels of ANA specific for dsDNA were significant in 81% of the children both pre-treatment and post-treatment. Conclusion: The results of this investigation suggest that NET formation and ANA to dsDNA may induce pathology in falciparum-infected children but activate a protective mechanism against falciparum malaria in adults. The significance of in vivo circulating chromatin in NETs and dsDNA ANA as a causative factor in the hyporesponsiveness of CpG oligonucleotide-based malaria vaccines is discussed.

    Epigallocatechin-3-gallate: a useful, effective and safe clinical approach for targeted prevention and individualised treatment of neurological diseases?


    Single-dose administration and the influence of the timing of the booster dose on immunogenicity and efficacy of ChAdOx1 nCoV-19 (AZD1222) vaccine: a pooled analysis of four randomised trials.

    BACKGROUND: The ChAdOx1 nCoV-19 (AZD1222) vaccine has been approved for emergency use by the UK regulatory authority, the Medicines and Healthcare products Regulatory Agency, with a regimen of two standard doses given with an interval of 4-12 weeks. The planned roll-out in the UK will involve vaccinating people in high-risk categories with their first dose immediately, and delivering the second dose 12 weeks later. Here, we provide both a further prespecified pooled analysis of trials of ChAdOx1 nCoV-19 and exploratory analyses of the impact on immunogenicity and efficacy of extending the interval between priming and booster doses. In addition, we show the immunogenicity and protection afforded by the first dose, before a booster dose has been offered. METHODS: We present data from three single-blind randomised controlled trials - one phase 1/2 study in the UK (COV001), one phase 2/3 study in the UK (COV002), and a phase 3 study in Brazil (COV003) - and one double-blind phase 1/2 study in South Africa (COV005). As previously described, individuals 18 years and older were randomly assigned 1:1 to receive two standard doses of ChAdOx1 nCoV-19 (5 × 10¹⁰ viral particles) or a control vaccine or saline placebo. In the UK trial, a subset of participants received a lower dose (2·2 × 10¹⁰ viral particles) of ChAdOx1 nCoV-19 for the first dose. The primary outcome was virologically confirmed symptomatic COVID-19 disease, defined as a nucleic acid amplification test (NAAT)-positive swab combined with at least one qualifying symptom (fever ≥37·8°C, cough, shortness of breath, or anosmia or ageusia) more than 14 days after the second dose. Secondary efficacy analyses included cases occurring at least 22 days after the first dose. Antibody responses measured by immunoassay and by pseudovirus neutralisation were exploratory outcomes. All cases of COVID-19 with a NAAT-positive swab were adjudicated for inclusion in the analysis by a masked independent endpoint review committee. The primary analysis included all participants who were SARS-CoV-2 N protein seronegative at baseline, had had at least 14 days of follow-up after the second dose, and had no evidence of previous SARS-CoV-2 infection from NAAT swabs. Safety was assessed in all participants who received at least one dose. The four trials are registered at ISRCTN89951424 (COV003) and ClinicalTrials.gov, NCT04324606 (COV001), NCT04400838 (COV002), and NCT04444674 (COV005). FINDINGS: Between April 23 and Dec 6, 2020, 24 422 participants were recruited and vaccinated across the four studies, of whom 17 178 were included in the primary analysis (8597 receiving ChAdOx1 nCoV-19 and 8581 receiving control vaccine). The data cutoff for these analyses was Dec 7, 2020. 332 NAAT-positive infections met the primary endpoint of symptomatic infection more than 14 days after the second dose. Overall vaccine efficacy more than 14 days after the second dose was 66·7% (95% CI 57·4-74·0), with 84 (1·0%) cases in the 8597 participants in the ChAdOx1 nCoV-19 group and 248 (2·9%) in the 8581 participants in the control group. There were no hospital admissions for COVID-19 in the ChAdOx1 nCoV-19 group after the initial 21-day exclusion period, and 15 in the control group. 108 (0·9%) of 12 282 participants in the ChAdOx1 nCoV-19 group and 127 (1·1%) of 11 962 participants in the control group had serious adverse events.
There were seven deaths considered unrelated to vaccination (two in the ChAdOx1 nCoV-19 group and five in the control group), including one COVID-19-related death in a participant in the control group. Exploratory analyses showed that vaccine efficacy after a single standard dose of vaccine from day 22 to day 90 after vaccination was 76·0% (59·3-85·9). Our modelling analysis indicated that protection did not wane during this initial 3-month period. Similarly, antibody levels were maintained during this period with minimal waning by day 90 (geometric mean ratio [GMR] 0·66 [95% CI 0·59-0·74]). In the participants who received two standard doses, efficacy after the second dose was higher in those with a longer prime-boost interval (vaccine efficacy 81·3% [95% CI 60·3-91·2] at ≥12 weeks) than in those with a short interval (vaccine efficacy 55·1% [33·0-69·9] at <6 weeks). These observations are supported by immunogenicity data showing binding antibody responses more than two-fold higher after an interval of 12 or more weeks than after an interval of less than 6 weeks in those aged 18-55 years (GMR 2·32 [2·01-2·68]). INTERPRETATION: The results of this primary analysis of two doses of ChAdOx1 nCoV-19 were consistent with those seen in the interim analysis of the trials and confirm that the vaccine is efficacious, with results varying by dose interval in exploratory analyses. A 3-month dose interval might have advantages over a programme with a short dose interval for roll-out of a pandemic vaccine to protect the largest number of individuals in the population as early as possible when supplies are scarce, while also improving protection after receiving a second dose. FUNDING: UK Research and Innovation, National Institutes of Health Research (NIHR), the Coalition for Epidemic Preparedness Innovations, the Bill & Melinda Gates Foundation, the Lemann Foundation, Rede D'Or, the Brava and Telles Foundation, NIHR Oxford Biomedical Research Centre, Thames Valley and South Midland's NIHR Clinical Research Network, and AstraZeneca.
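
    As a rough arithmetic check on the headline figure, vaccine efficacy can be approximated as one minus the relative risk computed from the reported case counts; the published 66·7% comes from the trial's adjusted models, so this back-of-envelope sketch is only expected to come close.

    # Rough relative-risk approximation of vaccine efficacy from the reported counts.
    # The published 66.7% estimate comes from adjusted models; this is only a sanity check.
    cases_vaccine, n_vaccine = 84, 8597    # ChAdOx1 nCoV-19 group
    cases_control, n_control = 248, 8581   # control group

    attack_vaccine = cases_vaccine / n_vaccine   # ~1.0% attack rate
    attack_control = cases_control / n_control   # ~2.9% attack rate

    vaccine_efficacy = 1 - attack_vaccine / attack_control
    print(f"{vaccine_efficacy:.1%}")             # ~66.2%, close to the reported 66.7%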

    Can Self-Reported Strokes Be Used to Study Stroke Incidence and Risk Factors? Evidence From the Health and Retirement Study

    Background and Purpose - Most stroke incidence studies use geographically localized (community) samples, with few national data sources available. Such samples preclude research on contextual risk factors, but national samples frequently collect only self-reported stroke. We examine whether incidence estimates from clinically verified studies are consistent with estimates from a nationally representative US sample assessing self-reported stroke. Methods - Health and Retirement Study (HRS) participants (n=17 056) aged 50+ years were followed for self- or proxy-reported first stroke (1293 events) from 1998 to 2006 (average, 6.8 years). We compared incidence rates by race, sex, and age strata with those previously documented in leading geographically localized studies with medically verified stroke. We also examined whether cardiovascular risk factor effect estimates in HRS are comparable to those reported in studies with clinically verified strokes. Results - The weighted first-stroke incidence rate was 10.0 events/1000 person-years. Total age-stratified incidence rates in whites were mostly comparable with those reported elsewhere and were not systematically higher or lower. However, among blacks in HRS, incidence rates generally appeared higher than those previously reported. HRS estimates were most comparable with those reported in the Cardiovascular Health Study. Incidence rates approximately doubled per decade of age and were higher in men and blacks. After demographic adjustment, all risk factors predicted stroke incidence in whites; smoking, hypertension, diabetes, and heart disease predicted incident stroke in blacks. Conclusions - Associations between known risk factors and stroke incidence were verified in HRS, suggesting that misreporting is nonsystematic. HRS may provide valuable data for stroke surveillance and examination of classical and contextual risk factors. (Stroke. 2009;40:873-879.)
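
    The headline rate is a standard events-per-person-time calculation; a minimal unweighted sketch from the rounded figures above follows (the published 10.0 events/1000 person-years is survey-weighted, so this crude number is expected to differ slightly).

    # Crude incidence-rate sketch from the rounded figures in the abstract.
    # The published 10.0 events/1000 person-years is survey-weighted, so this
    # unweighted back-of-envelope estimate will not match it exactly.
    events = 1293                 # first strokes observed
    participants = 17_056         # HRS participants aged 50+
    mean_follow_up_years = 6.8    # average follow-up per participant

    person_years = participants * mean_follow_up_years
    rate_per_1000_py = 1000 * events / person_years
    print(round(rate_per_1000_py, 1))   # ~11.1 per 1000 person-years (crude)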

    Level and Change in Cognitive Test Scores Predict Risk of First Stroke

    To determine whether cognitive test scores and cognitive decline predict the incidence of first diagnosed stroke, stroke-free Health and Retirement Study participants were followed on average 7.6 years for self- or proxy-reported first stroke (1,483 events). Predictors included baseline performance on a modified Telephone Interview for Cognitive Status (Mental Status) and a Word Recall test, and decline between the baseline and second assessments in either measure. Hazard ratios (HRs) were estimated using Cox proportional hazards models for the whole sample and stratified according to five major cardiovascular risk factors. The study was a national cohort study of noninstitutionalized adults with a mean baseline age of 64 ± 9.9; participants were Health and Retirement Study participants (n=19,699) aged 50 and older. Word Recall (HR per 1 standard deviation difference=0.92, 95% confidence interval (CI)=0.86-0.97) and Mental Status (HR=0.89, 95% CI=0.84-0.95) predicted incident stroke. Mental Status predicted stroke risk in those with (HR=0.93, 95% CI=0.87-0.99) and without (HR=0.81, 95% CI=0.72-0.91) one or more vascular risk factors. Declines in Word Recall predicted a 16% elevation in subsequent stroke risk (95% CI=1.01-1.34), and declines in Mental Status predicted a 37% elevation in stroke risk (95% CI=1.11-1.70). Cognitive test scores predict future stroke risk, independent of other major vascular risk factors.
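
    Hazard ratios of this kind are typically obtained by fitting a Cox proportional hazards model to time-to-event data; a minimal sketch with the lifelines package and purely synthetic data follows (the variable names are illustrative placeholders, not the HRS measures).

    # Minimal Cox proportional-hazards sketch using the lifelines package.
    # The data and column names are synthetic placeholders, not the HRS variables.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "mental_status_z": rng.normal(size=n),                # standardised cognitive score
        "age": rng.integers(50, 90, size=n),                  # baseline age in years
        "followup_years": rng.exponential(scale=8, size=n),   # time to stroke or censoring
        "stroke": rng.integers(0, 2, size=n),                 # 1 = incident stroke observed
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="stroke")
    cph.print_summary()   # hazard ratios are exp(coef); HR < 1 means lower risk per unit increase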