
    Short-Term Soil Organic Matter and Carbon Responses to Contrasting Grazing Intensities in Integrated Crop-Livestock Systems

    Integrating crops and livestock under no-till management may improve soil organic matter (SOM) buildup and soil C sequestration. Grazing cover crops is one way to combine crops and livestock within a farm system, and further SOM and soil C increases can be achieved by adding perennial grasses to crop rotations. However, the effects of grazing intensity in such systems are not fully understood. This 2-yr study investigated short-term effects of cropping system [winter cover crops-summer cotton (Gossypium hirsutum L.) and winter cover crops-summer bahiagrass (Paspalum notatum Flüggé) rotations], grazing intensity (no grazing, heavy, moderate, and light grazing), and N fertilization (34 and 90 kg N ha-1) on SOM and soil C in the surface soil (0-15 cm) and deep soil (0-90 cm) under no-till. Preliminary results indicate that treatments containing bahiagrass increased SOM by 1.5 g kg-1 compared with winter-grazed cover crops-cotton systems (P = 0.017). There were no differences among treatments in soil total C stock (15.4 Mg ha-1) or particulate OM-C (4.8 Mg ha-1) at the 15-cm depth (P > 0.1). Carbon concentration increased from 8.0 to 12.6 g kg-1 as aggregate fraction decreased from 250–2000 to <53 µm (P < 0.001). Nonetheless, C stock was not affected by aggregate fraction, with each fraction containing 3.8 Mg C ha-1 on average. Carbon stocks at the 0-15, 15-30, 30-60, and 60-90-cm depths did not differ among treatments (P = 0.743), totaling 30.4 Mg C ha-1 in the soil profile. Long-term studies are needed to better understand the role of cropping system and grazing intensity in SOM and C responses in surface and deep soil.
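
    As a back-of-the-envelope check, the concentration figures (g C kg-1) above relate to the stock figures (Mg C ha-1) through layer depth and bulk density; the sketch below uses an assumed, illustrative bulk density, since the abstract does not report one.

```python
# Minimal sketch: converting a soil C concentration to an area-based C stock.
# The bulk density (1.3 Mg m^-3) is an assumed value for illustration only;
# the study does not report it.

def c_stock_mg_ha(conc_g_kg: float, bulk_density_mg_m3: float, depth_m: float) -> float:
    """C stock (Mg C ha^-1) for a single soil layer."""
    soil_mass_mg_ha = bulk_density_mg_m3 * depth_m * 10_000  # Mg of soil per hectare
    return soil_mass_mg_ha * conc_g_kg / 1000                # g kg^-1 -> Mg C per Mg soil

# 8.0 g C kg^-1 over 0-15 cm at an assumed 1.3 Mg m^-3 gives ~15.6 Mg C ha^-1,
# the same order of magnitude as the 15.4 Mg ha^-1 reported for that layer.
print(round(c_stock_mg_ha(8.0, 1.3, 0.15), 1))
```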

    Relationship between Field Measurements in Three Brachiaria Species with Leaf Area Index and Light Interception by Indirect Methods

    Brachiaria species play a strategic role in ruminant production systems in Brazil, covering an estimated pasture area of approximately 90 million hectares (Karia et al., 2006); however, these pastures are subject to different degrees of degradation due to inadequate management. In pasture management, field measurements such as canopy height are used by managers to set targets for the optimal point to cut the forage and for the post-grazing residue, in order to maximize production by harvesting at the peak of herbage mass accumulation and to avoid problems associated with overgrazing by setting an adequate post-grazing height for regrowth. The use of light interception (LI) and leaf area index (LAI) has been recommended as a tool for pasture management, based on the theory that when the canopy reaches 95% light interception the forage is near its maximum growth rate; the corresponding LAI is called the critical LAI (Brougham, 1956). The residual LAI refers to the leaf area of the post-grazed stubble and is used to establish the minimum leaf area necessary to ensure efficient pasture regrowth (Lemos et al., 2014). Light interception and LAI are difficult to measure at the farm level because of the high cost of the equipment and the limited technical feasibility of the process. The objective of this study was to evaluate the relationship of LI and LAI, measured with two different instruments, with canopy height and soil cover in three Brachiaria species.
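
    As a rough illustration of the critical-LAI concept described above, a Beer–Lambert type canopy extinction model links LI and LAI; the extinction coefficients in the sketch below are assumed, generic values and are not taken from this study.

```python
# Sketch of the LI-LAI relation underlying the 95% "critical LAI" criterion,
# assuming a Beer-Lambert type canopy: LI = 1 - exp(-k * LAI).
# The extinction coefficients k are illustrative assumptions, not study values.
import math

def light_interception(lai: float, k: float) -> float:
    """Fraction of incident light intercepted by a canopy with leaf area index `lai`."""
    return 1.0 - math.exp(-k * lai)

def critical_lai(k: float, target_li: float = 0.95) -> float:
    """LAI at which the canopy reaches the target light interception (default 95%)."""
    return -math.log(1.0 - target_li) / k

for k in (0.5, 0.7, 0.9):  # plausible range from erect to more prostrate canopies
    lai_c = critical_lai(k)
    print(f"k = {k}: critical LAI ~ {lai_c:.2f}, LI at that LAI = {light_interception(lai_c, k):.2f}")
```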

    ATLANTIC EPIPHYTES: a data set of vascular and non-vascular epiphyte plants and lichens from the Atlantic Forest

    Epiphytes are hyper-diverse and one of the most frequently undervalued life forms in plant surveys and biodiversity inventories. Epiphytes of the Atlantic Forest, one of the most endangered ecosystems in the world, have high endemism and radiated recently in the Pliocene. We aimed to (1) compile an extensive Atlantic Forest data set of vascular and non-vascular epiphyte plants (including hemiepiphytes) and lichen epiphytes, covering species occurrence and abundance; and (2) describe the distribution of epiphytes in the Atlantic Forest in order to guide future sampling efforts. Our work presents the first epiphyte data set with information on the abundance and occurrence of epiphyte and phorophyte species. All data compiled here come from three main sources provided by the authors: published sources (peer-reviewed articles, books, and theses), unpublished data, and herbarium data. The data set comprises 2,095 species from 89,270 holo/hemiepiphyte records in the Atlantic Forest of Brazil, Argentina, Paraguay, and Uruguay, recorded from 1824 to early 2018. Most of the records are qualitative (occurrence only, 88%) and well distributed throughout the Atlantic Forest. For quantitative records, the most common sampling method was individual trees (71%), followed by plot sampling (19%) and transect sampling (10%). Angiosperms (81%) were the most frequently recorded group, and Bromeliaceae and Orchidaceae were the families with the greatest numbers of records (27,272 and 21,945, respectively). Ferns and lycophytes had fewer records than angiosperms; among them, Polypodiaceae was the most recorded family, with records concentrated in the Southern and Southeastern regions. Data on non-vascular plants and lichens were scarce, with a few disjunct records concentrated in the Northeastern region of the Atlantic Forest. Among non-vascular plant records, Lejeuneaceae, a family of liverworts, was the most recorded family. We hope that our effort to organize scattered epiphyte data helps advance the knowledge of epiphyte ecology, as well as our understanding of macroecological and biogeographical patterns in the Atlantic Forest. No copyright restrictions are associated with the data set. Please cite this Ecology Data Paper if the data are used in publications and teaching events. © 2019 The Authors. Ecology © 2019 The Ecological Society of America.
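
    For readers wanting to reproduce headline summaries such as the 88% occurrence-only share or the family record counts, a minimal sketch of the kind of tabulation involved is shown below; the column names are hypothetical placeholders and do not necessarily match the published data set's schema.

```python
# Minimal sketch of tallying record-level summaries from a flat table of epiphyte
# records. Column names ("record_type", "sampling_method", "family") are
# hypothetical placeholders, not the published data set's field names.
import pandas as pd

records = pd.DataFrame({
    "record_type":     ["occurrence", "occurrence", "abundance", "occurrence"],
    "sampling_method": [None, None, "individual_tree", None],
    "family":          ["Bromeliaceae", "Orchidaceae", "Polypodiaceae", "Bromeliaceae"],
})

print(records["record_type"].value_counts(normalize=True))  # share of occurrence vs. abundance records
print(records["family"].value_counts())                     # most-recorded families
```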

    Canagliflozin and renal outcomes in type 2 diabetes and nephropathy

    BACKGROUND Type 2 diabetes mellitus is the leading cause of kidney failure worldwide, but few effective long-term treatments are available. In cardiovascular trials of inhibitors of sodium–glucose cotransporter 2 (SGLT2), exploratory results have suggested that such drugs may improve renal outcomes in patients with type 2 diabetes. METHODS In this double-blind, randomized trial, we assigned patients with type 2 diabetes and albuminuric chronic kidney disease to receive canagliflozin, an oral SGLT2 inhibitor, at a dose of 100 mg daily or placebo. All the patients had an estimated glomerular filtration rate (GFR) of 30 to <90 ml per minute per 1.73 m2 of body-surface area and albuminuria (ratio of albumin [mg] to creatinine [g], >300 to 5000) and were treated with renin–angiotensin system blockade. The primary outcome was a composite of end-stage kidney disease (dialysis, transplantation, or a sustained estimated GFR of <15 ml per minute per 1.73 m2), a doubling of the serum creatinine level, or death from renal or cardiovascular causes. Prespecified secondary outcomes were tested hierarchically. RESULTS The trial was stopped early after a planned interim analysis on the recommendation of the data and safety monitoring committee. At that time, 4401 patients had undergone randomization, with a median follow-up of 2.62 years. The relative risk of the primary outcome was 30% lower in the canagliflozin group than in the placebo group, with event rates of 43.2 and 61.2 per 1000 patient-years, respectively (hazard ratio, 0.70; 95% confidence interval [CI], 0.59 to 0.82; P=0.00001). The relative risk of the renal-specific composite of end-stage kidney disease, a doubling of the creatinine level, or death from renal causes was lower by 34% (hazard ratio, 0.66; 95% CI, 0.53 to 0.81; P<0.001), and the relative risk of end-stage kidney disease was lower by 32% (hazard ratio, 0.68; 95% CI, 0.54 to 0.86; P=0.002). The canagliflozin group also had a lower risk of cardiovascular death, myocardial infarction, or stroke (hazard ratio, 0.80; 95% CI, 0.67 to 0.95; P=0.01) and hospitalization for heart failure (hazard ratio, 0.61; 95% CI, 0.47 to 0.80; P<0.001). There were no significant differences in rates of amputation or fracture. CONCLUSIONS In patients with type 2 diabetes and kidney disease, the risk of kidney failure and cardiovascular events was lower in the canagliflozin group than in the placebo group at a median follow-up of 2.62 years
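
    The "30% lower" relative risk quoted above is one minus the hazard ratio, and the reported crude event rates point the same way; a small arithmetic sketch using only numbers from the abstract:

```python
# Arithmetic behind the reported effect sizes (numbers taken from the abstract).
hr = 0.70                                         # primary-outcome hazard ratio
canagliflozin_rate, placebo_rate = 43.2, 61.2     # events per 1000 patient-years

relative_reduction = 1 - hr                            # 0.30 -> "30% lower relative risk"
crude_rate_ratio = canagliflozin_rate / placebo_rate   # ~0.71, consistent with the HR

print(f"relative reduction implied by HR: {relative_reduction:.0%}")
print(f"crude event-rate ratio: {crude_rate_ratio:.2f}")
```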

    Impact of COVID-19 on cardiovascular testing in the United States versus the rest of the world

    Objectives: This study sought to quantify and compare the decline in volumes of cardiovascular procedures between the United States and non-US institutions during the early phase of the coronavirus disease-2019 (COVID-19) pandemic. Background: The COVID-19 pandemic has disrupted the care of many non-COVID-19 illnesses. Reductions in diagnostic cardiovascular testing around the world have led to concerns over the implications of reduced testing for cardiovascular disease (CVD) morbidity and mortality. Methods: Data were submitted to the INCAPS-COVID (International Atomic Energy Agency Non-Invasive Cardiology Protocols Study of COVID-19), a multinational registry comprising 909 institutions in 108 countries (including 155 facilities in 40 U.S. states), assessing the impact of the COVID-19 pandemic on volumes of diagnostic cardiovascular procedures. Data were obtained for April 2020 and compared with volumes of baseline procedures from March 2019. We compared laboratory characteristics, practices, and procedure volumes between U.S. and non-U.S. facilities and between U.S. geographic regions and identified factors associated with volume reduction in the United States. Results: Reductions in the volumes of procedures in the United States were similar to those in non-U.S. facilities (68% vs. 63%, respectively; p = 0.237), although U.S. facilities reported greater reductions in invasive coronary angiography (69% vs. 53%, respectively; p < 0.001). Significantly more U.S. facilities reported increased use of telehealth and patient screening measures than non-U.S. facilities, such as temperature checks, symptom screenings, and COVID-19 testing. Reductions in volumes of procedures differed between U.S. regions, with larger declines observed in the Northeast (76%) and Midwest (74%) than in the South (62%) and West (44%). Prevalence of COVID-19, staff redeployments, outpatient centers, and urban centers were associated with greater reductions in volume in U.S. facilities in a multivariable analysis. Conclusions: We observed marked reductions in U.S. cardiovascular testing in the early phase of the pandemic and significant variability between U.S. regions. The association between reductions of volumes and COVID-19 prevalence in the United States highlighted the need for proactive efforts to maintain access to cardiovascular testing in areas most affected by outbreaks of COVID-19 infection
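
    The headline percentages above are reductions in procedure volume for April 2020 relative to the March 2019 baseline; a one-line sketch of that metric follows, with illustrative volumes rather than registry values.

```python
# Sketch of the volume-reduction metric (April 2020 vs. March 2019 baseline).
# The example volumes are illustrative placeholders, not registry data.
def percent_reduction(baseline_volume: float, current_volume: float) -> float:
    return 100.0 * (baseline_volume - current_volume) / baseline_volume

print(percent_reduction(baseline_volume=1000, current_volume=320))  # 68.0 (% reduction)
```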

    Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study

    Summary Background Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally. Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income countries globally, and identified factors associated with mortality. Methods We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis, exomphalos, anorectal malformation, and Hirschsprung’s disease. Recruitment was of consecutive patients for a minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause, in-hospital mortality for all conditions combined and each condition individually, stratified by country income status. We did a complete case analysis. Findings We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal malformation, and 517 with Hirschsprung’s disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male. Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3). Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups). Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries; p≤0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11], p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20 [1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention (ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed (ventilation 1·96 [1·41–2·71], p=0·0001; parenteral nutrition 1·35 [1·05–1·74], p=0·018). Administration of parenteral nutrition (0·61 [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65 [0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality.
Interpretation Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger than 5 years by 2030.
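
    The crude in-hospital mortality proportions quoted in the Findings can be reproduced directly from the reported counts, as sketched below; note that the risk ratios in the abstract come from an adjusted model, not from these crude figures.

```python
# Reproducing the crude in-hospital mortality proportions from the reported counts.
deaths_and_totals = {
    "low-income":    (37, 93),
    "middle-income": (583, 2860),
    "high-income":   (50, 896),
}

for group, (deaths, total) in deaths_and_totals.items():
    print(f"{group}: {100 * deaths / total:.1f}% ({deaths}/{total})")
```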

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two

    Background The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, the expression of PD-1 on the surface seemed to be restricted to the tumor microenvironment, while CD4+ T cells had high expression of PD-1 also in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001). Conclusions In conclusion, we demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.

    Fermentative production of β-Carotene from sugarcane bagasse hydrolysate by Rhodotorula glutinis CCT-2186

    β-Carotene is a red-orange pigment that serves as a precursor of important pharmaceutical molecules such as vitamin A (retinol), making it highly significant in the industrial sector. Consequently, there is an ongoing quest for more sustainable production methods. In this study, glucose and xylose, two primary sugars derived from sugarcane bagasse (SCB), were utilized as substrates for β-carotene production by Rhodotorula glutinis CCT-2186. To this end, SCB was pretreated with NaOH at different total solids (TS) concentrations (10%, 15%, and 20%) to remove lignin, and each sample was enzymatically hydrolyzed at two substrate loadings (5% and 10%). SCB pretreated at 10%, 15%, and 20% TS exhibited glucose hydrolysis yields (wt%) of 93.10%, 91.88%, and 90.77%, respectively. The resulting hydrolysate was employed for β-carotene production in batch fermentation. After 72 h of fermentation, the SCB hydrolysate yielded a β-carotene concentration of 118.56 ± 3.01 mg/L. These findings showcase the robustness of R. glutinis as a biocatalyst for converting SCB into β-carotene.
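
    Two derived quantities commonly reported alongside such results are sketched below; the hydrolysis-yield formula (glucose released relative to the glucan's theoretical glucose, via the 1.111 anhydro-correction factor) is the conventional definition and is assumed here rather than taken from the paper, while the productivity line uses the reported titer.

```python
# Sketch of common derived quantities for lignocellulosic fermentation results.
# The yield definition below is the conventional one and is an assumption here;
# the productivity calculation uses the titer reported in the abstract.
def hydrolysis_yield_pct(glucose_released_g: float, glucan_g: float) -> float:
    """Glucose released as % of the theoretical glucose from the glucan (factor 1.111)."""
    return 100.0 * glucose_released_g / (glucan_g * 1.111)

def volumetric_productivity(titer_mg_per_l: float, hours: float) -> float:
    """Product formed per litre per hour (mg/L/h)."""
    return titer_mg_per_l / hours

# Illustrative placeholder masses, not measurements from the study:
print(round(hydrolysis_yield_pct(glucose_released_g=9.3, glucan_g=9.0), 1))
# Productivity from the reported titer: ~1.65 mg/L/h over the 72-h batch.
print(round(volumetric_productivity(118.56, 72), 2))
```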