Environmental gradients and the evolution of successional habitat specialization: A test case with 14 Neotropical forest sites
Successional gradients are ubiquitous in nature, yet few studies have systematically examined the evolutionary origins of taxa that specialize at different successional stages. Here we quantify successional habitat specialization in Neotropical forest trees and evaluate its evolutionary lability along a precipitation gradient. Theoretically, successional habitat specialization should be more evolutionarily conserved in wet forests than in dry forests due to more extreme microenvironmental differentiation between early and late-successional stages in wet forest. We applied a robust multinomial classification model to samples of primary and secondary forest trees from 14 Neotropical lowland forest sites spanning a precipitation gradient from 788 to 4000 mm annual rainfall, identifying species that are old-growth specialists and secondary forest specialists in each site. We constructed phylogenies for the classified taxa at each site and for the entire set of classified taxa and tested whether successional habitat specialization is phylogenetically conserved. We further investigated differences in the functional traits of species specializing in secondary vs. old-growth forest along the precipitation gradient, expecting different trait associations with secondary forest specialists in wet vs. dry forests since water availability is more limiting in dry forests and light availability more limiting in wet forests. Successional habitat specialization is non-randomly distributed in the angiosperm phylogeny, with a tendency towards phylogenetic conservatism overall and a trend towards stronger conservatism in wet forests than in dry forests. However, the specialists come from all the major branches of the angiosperm phylogeny, and very few functional traits showed any consistent relationships with successional habitat specialization in either wet or dry forests. Synthesis. The niche conservatism evident in the habitat specialization of Neotropical trees suggests a role for radiation into different successional habitats in the evolution of species-rich genera, though the diversity of functional traits that lead to success in different successional habitats complicates analyses at the community scale. Examining the distribution of particular lineages with respect to successional gradients may provide more insight into the role of successional habitat specialization in the evolution of species-rich taxa.
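As an illustration of what a test for phylogenetic conservatism of a binary habitat trait can look like, the following is a minimal sketch of a label-permutation test on mean pairwise phylogenetic distance among specialists. The distance matrix, specialist labels, and the permutation approach itself are illustrative assumptions; they do not reproduce the study's data or its specific phylogenetic methods.

```python
import numpy as np

def mean_pairwise_distance(dist, idx):
    """Mean phylogenetic (patristic) distance among a subset of tips."""
    sub = dist[np.ix_(idx, idx)]
    n = len(idx)
    return sub[np.triu_indices(n, k=1)].mean()

def conservatism_test(dist, is_specialist, n_perm=9999, seed=0):
    """Permutation test: are specialists more closely related than expected
    by chance?  Small p-values suggest phylogenetic conservatism."""
    rng = np.random.default_rng(seed)
    obs_idx = np.flatnonzero(is_specialist)
    observed = mean_pairwise_distance(dist, obs_idx)
    n_spec = len(obs_idx)
    null = np.array([
        mean_pairwise_distance(dist, rng.choice(len(is_specialist), n_spec, replace=False))
        for _ in range(n_perm)
    ])
    # one-sided: smaller observed distance than the null => phylogenetic clustering
    p = (np.sum(null <= observed) + 1) / (n_perm + 1)
    return observed, null.mean(), p

# toy example: 6 tips, a made-up patristic distance matrix, 3 "specialists"
dist = np.array([
    [0, 2, 4, 6, 6, 6],
    [2, 0, 4, 6, 6, 6],
    [4, 4, 0, 6, 6, 6],
    [6, 6, 6, 0, 2, 4],
    [6, 6, 6, 2, 0, 4],
    [6, 6, 6, 4, 4, 0],
], dtype=float)
is_specialist = np.array([1, 1, 1, 0, 0, 0], dtype=bool)
print(conservatism_test(dist, is_specialist))
```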
Single nucleotide polymorphisms in DNA repair genes as risk factors associated to prostate cancer progression
Background
Besides serum levels of PSA, there is a lack of prostate cancer-specific biomarkers. New biological markers associated with tumor behavior are needed, as they would be valuable for better individualizing treatment. The aim of this study was to elucidate the relationship between single nucleotide polymorphisms (SNPs) in genes involved in DNA repair and prostate cancer progression.
Methods
A total of 494 prostate cancer patients from a Spanish multicenter study were genotyped for 10 SNPs in the XRCC1, ERCC2, ERCC1, LIG4, ATM and TP53 genes. SNP genotyping was performed on a Biotrove OpenArray® NT Cycler. Clinical tumor stage, diagnostic PSA serum levels, and Gleason score at diagnosis were obtained for all participants. Genotypic and allelic frequencies were determined using the web-based environment SNPator.
Results
SNPs rs11615 (ERCC1) and rs17503908 (ATM) appeared as risk factors for prostate cancer aggressiveness. Patients wild homozygous for these SNPs (AA and TT, respectively) were at higher risk for developing cT2b-cT4 tumors (OR = 2.21 (95% confidence interval (CI) 1.47-3.31), p < 0.001) and Gleason scores ≥ 7 (OR = 2.22 (95% CI 1.38-3.57), p < 0.001), respectively. Moreover, those patients wild homozygous for both SNPs had the greatest risk of presenting D'Amico high-risk tumors (OR = 2.57 (95% CI 1.28-5.16)).
Conclusions
Genetic variants in DNA repair genes are associated with prostate cancer progression and should be taken into account when assessing the malignancy of prostate cancer. This work was funded by a grant from the Instituto de Salud Carlos III (Ministerio de Economía y Competitividad, Spain), ID: PI12/01867. Almudena Valenciano has a grant from the Instituto Canario de Investigación del Cáncer (ICIC).
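For readers unfamiliar with how effect sizes of this kind are derived, the following is a minimal sketch of an odds ratio with a Wald 95% confidence interval computed from a 2x2 genotype-by-outcome table. The counts are hypothetical and do not reproduce the study's data or its exact statistical model.

```python
import numpy as np
from scipy import stats

def odds_ratio_ci(a, b, c, d, alpha=0.05):
    """Odds ratio and Wald CI for a 2x2 table:
                 outcome+  outcome-
    exposed         a         b
    unexposed       c         d
    """
    or_ = (a * d) / (b * c)
    se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    z = stats.norm.ppf(1 - alpha / 2)
    lo = np.exp(np.log(or_) - z * se_log_or)
    hi = np.exp(np.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# hypothetical counts: wild-type homozygotes vs. others, high-grade vs. low-grade tumors
print(odds_ratio_ci(a=60, b=90, c=40, d=130))
```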
The impact of carotid plaque presence and morphology on mortality outcome in cardiological patients
BACKGROUND: Carotid plaque severity and morphology can affect cardiovascular prognosis. We evaluated the importance of both echographically assessed carotid artery plaque geometry and morphology as predictors of death in hospitalised cardiological patients. METHODS: 541 patients admitted to a cardiology division (age = 66 ± 11 years, 411 men) were studied with ultrasound duplex carotid scanning and subsequently followed up for a median of 34 months. Echo evaluation assessed plaque severity and morphology (presence of heterogeneity and profile). RESULTS: 361 patients showed carotid stenosis (67% with <50% stenosis, 18% with 50-69% stenosis, 9% with >70% stenosis, 4% with near occlusion and 2% with total occlusion). During the follow-up period, there were 83 all-cause deaths (15% of the total population). Using Cox proportional hazards modelling, age (RR 1.06, 95% CI 1.03-1.09, p < 0.001), ejection fraction > 50% (RR = 0.62, 95% CI 0.4-0.96, p = 0.03), treatment with statins (RR = 0.52, 95% CI 0.29-0.95, p = 0.34) and the presence of a heterogeneous plaque (RR 1.6; 95% CI, 1.2 to 2.14, p = 0.002) were independent predictors of death. Kaplan-Meier survival estimates showed the best outcome in patients without plaque, an intermediate outcome in patients with homogeneous plaques and the worst outcome in patients with heterogeneous plaques (90% vs 79% vs 73%, p = 0.0001). CONCLUSION: In hospitalised cardiological patients, carotid plaque presence and morphology assessed by ultrasound are independent predictors of death.
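Below is a minimal sketch of the kind of Cox proportional-hazards analysis described above, using the lifelines Python package on a synthetic data frame. The column names and simulated values are assumptions for illustration only, not the study's dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500

# synthetic cohort: follow-up time (months), death indicator, and candidate predictors
df = pd.DataFrame({
    "time_months": rng.exponential(scale=40, size=n).round(1) + 0.1,
    "died": rng.integers(0, 2, size=n),
    "age": rng.normal(66, 11, size=n).round(),
    "ef_over_50": rng.integers(0, 2, size=n),          # ejection fraction > 50%
    "statin": rng.integers(0, 2, size=n),
    "heterogeneous_plaque": rng.integers(0, 2, size=n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs and p-values
```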
Clear and independent associations of several HLA-DRB1 alleles with differential antibody responses to hepatitis B vaccination in youth
To confirm and refine associations of human leukocyte antigen (HLA) genotypes with variable antibody (Ab) responses to hepatitis B vaccination, we have analyzed 255 HIV-1 seropositive (HIV+) youth and 80 HIV-1 seronegatives (HIV-) enrolled into prospective studies. In univariate analyses that focused on HLA-DRB1, -DQA1, and -DQB1 alleles and haplotypes, the DRB1*03 allele group and DRB1*0701 were negatively associated with the responder phenotype (serum Ab concentration ≥ 10 mIU/mL) (P = 0.026 and 0.043, respectively). Collectively, DRB1*03 and DRB1*0701 were found in 42 (53.8%) out of 78 non-responders (serum Ab <10 mIU/mL), 65 (40.6%) out of 160 medium responders (serum Ab 10-1,000 mIU/mL), and 27 (27.8%) out of 97 high responders (serum Ab >1,000 mIU/mL) (P < 0.001 for trend). Meanwhile, DRB1*08 was positively associated with the responder phenotype (P = 0.010), mostly due to DRB1*0804 (P = 0.008). These immunogenetic relationships were all independent of non-genetic factors, including HIV-1 infection status and immunodeficiency. Alternative analyses confined to HIV+ youth or Hispanic youth led to similar findings. In contrast, analyses of more than 80 non-coding, single nucleotide polymorphisms within and beyond the three HLA class II genes revealed no clear associations. Overall, several HLA-DRB1 alleles were major predictors of differential Ab responses to hepatitis B vaccination in youth, suggesting that T-helper cell-dependent pathways mediated through HLA class II antigen presentation are critical to effective immune response to recombinant vaccines.
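The "P for trend" across the three ordered responder groups can be illustrated with a Cochran-Armitage test for trend. The sketch below uses the carrier counts quoted in the abstract (42/78, 65/160, 27/97), but the implementation is generic and not necessarily the exact procedure the authors applied.

```python
import numpy as np
from scipy.stats import chi2

def cochran_armitage_trend(carriers, totals, scores=None):
    """Cochran-Armitage test for trend in proportions across ordered groups.

    carriers : counts with the attribute (e.g. DRB1*03/0701 carriers) per group
    totals   : group sizes
    scores   : ordinal scores for the groups (default 0, 1, 2, ...)
    """
    x = np.asarray(carriers, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.arange(len(x), dtype=float) if scores is None else np.asarray(scores, float)
    N, R1 = n.sum(), x.sum()
    num = N * (N * np.sum(t * x) - R1 * np.sum(t * n)) ** 2
    den = R1 * (N - R1) * (N * np.sum(t**2 * n) - np.sum(t * n) ** 2)
    stat = num / den                 # chi-square statistic with 1 df
    return stat, chi2.sf(stat, df=1)

# carrier counts and group sizes reported in the abstract:
# non-responders, medium responders, high responders
stat, p = cochran_armitage_trend(carriers=[42, 65, 27], totals=[78, 160, 97])
print(f"chi2 = {stat:.2f}, p = {p:.4g}")   # p well below 0.001, consistent with the abstract
```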
Frequency drift in MR spectroscopy at 3T
Purpose: Heating of gradient coils and passive shim components is a common cause of instability in the B0 field, especially when gradient-intensive sequences are used. The aim of the study was to set a benchmark for typical drift encountered during MR spectroscopy (MRS) and to assess the need for real-time field-frequency locking on MRI scanners by comparing field drift data from a large number of sites.
Method: A standardized protocol was developed for 80 participating sites using 99 3T MR scanners from 3 major vendors. Phantom water signals were acquired before and after an EPI sequence. The protocol consisted of: minimal preparatory imaging; a short pre-fMRI PRESS acquisition; a ten-minute fMRI acquisition; and a long post-fMRI PRESS acquisition. Both pre- and post-fMRI PRESS were non-water-suppressed. Real-time frequency stabilization/adjustment was switched off when appropriate. Sixty scanners repeated the protocol to provide a second dataset. In addition, a three-hour post-fMRI MRS acquisition was performed at one site to observe the change of gradient temperature and drift rate. Spectral analysis was performed using MATLAB. Frequency drift in pre-fMRI PRESS data was compared with the first 5:20 minutes and the full 30:00 minutes of data after fMRI. Median (interquartile range) drifts were measured and shown in violin plots. Paired t-tests were performed to compare frequency drift pre- and post-fMRI. A simulated in vivo spectrum was generated using FID-A to visualize the effect of the observed frequency drifts. The simulated spectrum was convolved with the frequency trace for the most extreme cases. Impacts of frequency drift on NAA and GABA were also simulated as a function of linear drift. Data from the repeated protocol were compared with the corresponding first dataset using Pearson's and intraclass correlation coefficients (ICC).
Results: Of the data collected from 99 scanners, 4 were excluded for various reasons, so data from 95 scanners were ultimately analyzed. For the first 5:20 min (64 transients), the median (interquartile range) drift was 0.44 (1.29) Hz before fMRI and 0.83 (1.29) Hz after. This increased to 3.15 (4.02) Hz for the full 30 min (360 transients) run. Average drift rates were 0.29 Hz/min before fMRI and 0.43 Hz/min after. Paired t-tests indicated that drift increased after fMRI, as expected (p < 0.05). Simulated spectra convolved with the frequency drift showed that the intensity of the NAA singlet was reduced by up to 26%, 44% and 18% for GE, Philips and Siemens scanners after fMRI, respectively. ICCs indicated good agreement between datasets acquired on separate days. The single-site long acquisition showed that the drift rate was reduced to 0.03 Hz/min approximately three hours after fMRI.
Discussion: This study analyzed frequency drift data from 95 3T MRI scanners. Median levels of drift were relatively low (5-min average under 1 Hz), but the most extreme cases suffered from higher levels of drift. The extent of drift varied across scanners, and both linear and nonlinear drifts were observed.
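The following is a minimal sketch of how per-scanner drift could be summarized (median, IQR, mean rate in Hz/min) and compared pre- vs. post-fMRI with a paired t-test. The drift values below are synthetic placeholders; the published analysis was performed in MATLAB on the actual phantom data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_scanners = 95

# synthetic per-scanner absolute drift over the first 5:20 min (Hz), before and after fMRI
drift_pre = rng.gamma(shape=1.2, scale=0.5, size=n_scanners)
drift_post = drift_pre + rng.gamma(shape=1.0, scale=0.4, size=n_scanners)

def summarize(drift_hz, minutes=5.33):
    """Median, interquartile range, and mean drift rate (Hz/min) over the window."""
    med = np.median(drift_hz)
    iqr = np.subtract(*np.percentile(drift_hz, [75, 25]))
    return med, iqr, np.mean(drift_hz) / minutes

print("pre :", summarize(drift_pre))
print("post:", summarize(drift_post))

# paired t-test: did drift increase after the gradient-intensive fMRI run?
t_stat, p = stats.ttest_rel(drift_post, drift_pre)
print(f"paired t = {t_stat:.2f}, p = {p:.3g}")
```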
Nursing considerations to complement the Surviving Sepsis Campaign guidelines
Objectives: To provide a series of recommendations based on the best available evidence to guide clinicians providing nursing care to patients with severe sepsis.
Design: Modified Delphi method involving international experts and key individuals in subgroup work and electronic-based discussion among the entire group to achieve consensus.
Methods: We used the Surviving Sepsis Campaign guidelines as a framework to inform the structure and content of these guidelines. We used the Grades of Recommendation, Assessment, Development, and Evaluation (GRADE) system to rate the quality of evidence from high (A) to very low (D) and to determine the strength of recommendations, with grade 1 indicating clear benefit in the septic population and grade 2 indicating less confidence in the benefits in the septic population. In areas without complete agreement between all authors, a process of electronic discussion of all evidence was undertaken until consensus was reached. This process was conducted independently of any funding.
Results: Sixty-three recommendations relating to the nursing care of severe sepsis patients are made. Prevention recommendations relate to education, accountability, surveillance of nosocomial infections, hand hygiene, and prevention of respiratory, central line-related, surgical site, and urinary tract infections, whereas infection management recommendations relate to both control of the infection source and transmission-based precautions. Recommendations related to initial resuscitation include improved recognition of the deteriorating patient, diagnosis of severe sepsis, seeking further assistance, and initiating early resuscitation measures. Important elements of hemodynamic support relate to improving both tissue oxygenation and macrocirculation. Recommendations related to supportive nursing care incorporate aspects of nutrition, mouth and eye care, and pressure ulcer prevention and management. Pediatric recommendations relate to the use of antibiotics, steroids, vasopressors and inotropes, fluid resuscitation, sedation and analgesia, and the role of therapeutic end points.
Conclusion: Consensus was reached regarding many aspects of nursing care of the severe sepsis patient. Despite this, there is an urgent need for further evidence to better inform this area of critical care
An original phylogenetic approach identified mitochondrial haplogroup T1a1 as inversely associated with breast cancer risk in BRCA2 mutation carriers
Introduction: Individuals carrying pathogenic mutations in the BRCA1 and BRCA2 genes have a high lifetime risk of breast cancer. BRCA1 and BRCA2 are involved in DNA double-strand break repair, DNA alterations that can be caused by exposure to reactive oxygen species, a main source of which are mitochondria. Mitochondrial genome variations affect electron transport chain efficiency and reactive oxygen species production. Individuals with different mitochondrial haplogroups differ in their metabolism and sensitivity to oxidative stress. Variability in mitochondrial genetic background can alter reactive oxygen species production and thereby modify cancer risk. In the present study, we tested the hypothesis that mitochondrial haplogroups modify breast cancer risk in BRCA1/2 mutation carriers. Methods: We genotyped 22,214 (11,421 affected, 10,793 unaffected) mutation carriers belonging to the Consortium of Investigators of Modifiers of BRCA1/2 for 129 mitochondrial polymorphisms using the iCOGS array. Haplogroup inference and association detection were performed using a phylogenetic approach. ALTree was applied to explore the reference mitochondrial evolutionary tree and detect subclades enriched in affected or unaffected individuals. Results: We discovered that subclade T1a1 was depleted in affected BRCA2 mutation carriers compared with the rest of clade T (hazard ratio (HR) = 0.55; 95% confidence interval (CI), 0.34 to 0.88; P = 0.01). Compared with the most frequent haplogroups in the general population (that is, the H and T clades), the T1a1 haplogroup had an HR of 0.62 (95% CI, 0.40 to 0.95; P = 0.03). We also identified three potential susceptibility loci, including G13708A/rs28359178, which has demonstrated an inverse association with familial breast cancer risk. Conclusions: This study illustrates how original approaches such as the phylogeny-based method we used can empower classical molecular epidemiological studies aimed at identifying association or risk modification effects.
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions.
Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
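The reported probabilities of superiority can be approximated from the published odds ratios and credible intervals alone, assuming a roughly normal posterior for log(OR). The sketch below is a back-of-the-envelope reconstruction for illustration, not the trial's bayesian cumulative logistic model.

```python
import numpy as np

def prob_superiority(or_median, ci_low, ci_high, n_draws=1_000_000, seed=0):
    """Approximate P(OR > 1) from a reported posterior median and 95% credible
    interval, assuming the posterior of log(OR) is approximately normal."""
    rng = np.random.default_rng(seed)
    mu = np.log(or_median)
    sigma = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    draws = rng.normal(mu, sigma, n_draws)
    return np.mean(draws > 0.0)

# values reported for fixed-dose hydrocortisone vs. no hydrocortisone
print(prob_superiority(1.43, 0.91, 2.27))   # close to the reported 93%
# values reported for shock-dependent hydrocortisone vs. no hydrocortisone
print(prob_superiority(1.22, 0.76, 1.94))   # close to the reported 80%
```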
Carbon sequestration potential of second-growth forest regeneration in the Latin American tropics
Regrowth of tropical secondary forests following complete or nearly complete removal of forest vegetation actively stores carbon in aboveground biomass, partially counterbalancing carbon emissions from deforestation, forest degradation, burning of fossil fuels, and other anthropogenic sources. We estimate the age and spatial extent of lowland second-growth forests in the Latin American tropics and model their potential aboveground carbon accumulation over four decades. Our model shows that, in 2008, second-growth forests (1 to 60 years old) covered 2.4 million km2 of land (28.1% of the total study area). Over 40 years, these lands can potentially accumulate a total aboveground carbon stock of 8.48 Pg C (petagrams of carbon) in aboveground biomass via low-cost natural regeneration or assisted regeneration, corresponding to a total CO2 sequestration of 31.09 Pg CO2. This total is equivalent to carbon emissions from fossil fuel use and industrial processes in all of Latin America and the Caribbean from 1993 to 2014. Ten countries account for 95% of this carbon storage potential, led by Brazil, Colombia, Mexico, and Venezuela. We model future land-use scenarios to guide national carbon mitigation policies. Permitting natural regeneration on 40% of lowland pastures potentially stores an additional 2.0 Pg C over 40 years. Our study provides information and maps to guide national-level forest-based carbon mitigation plans on the basis of estimated rates of natural regeneration and pasture abandonment. Coupled with avoided deforestation and sustainable forest management, natural regeneration of second-growth forests provides a low-cost mechanism that yields a high carbon sequestration potential with multiple benefits for biodiversity and ecosystem services.
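The relationship between the 8.48 Pg C stock and the 31.09 Pg CO2 figure quoted above follows from the molar-mass ratio of CO2 to carbon (44/12, roughly 3.67). A short arithmetic sketch using the abstract's own numbers:

```python
# Converting an aboveground carbon stock (Pg C) to CO2 equivalents (Pg CO2)
# uses the molar-mass ratio of CO2 to carbon: 44 / 12 ≈ 3.667.
C_TO_CO2 = 44.0 / 12.0

carbon_stock_pg_c = 8.48                       # potential 40-year accumulation (from the abstract)
co2_pg = carbon_stock_pg_c * C_TO_CO2
print(f"{co2_pg:.2f} Pg CO2")                  # ≈ 31.09 Pg CO2, matching the abstract
```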
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non-critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support-free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support-free days among critically ill patients was 10 (-1 to 16) in the ACE inhibitor group (n = 231), 8 (-1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support-free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT02735707