
    Essential Oils of Three Aromatic Plant Species as Natural Herbicides for Environmentally Friendly Agriculture

    Natural herbicides based on essential oils (EOs) extracted from aromatic plants are gaining relevance in contemporary agriculture. Owing to their allelopathic properties, they inhibit the germination and growth of different species and generally offer the advantage of high specificity. For this reason, analysis of the effects of these natural compounds on noxious weeds is continuously increasing. In the present study, three commercial EOs extracted from Mentha piperita L., Thymbra capitata (L.) Cav. and Santolina chamaecyparissus L. were tested on two invasive weeds with an increasing presence in southern Europe, Erigeron bonariensis L. and Araujia sericifera Brot. Five concentrations (0.125, 0.25, 0.50, 1 and 2 µL mL⁻¹) of each essential oil were tested in a randomized design, with five replicates of 20 seeds each for E. bonariensis and 10 replicates of 10 seeds each for A. sericifera. Two higher concentrations (4 and 8 µL mL⁻¹) of the three EOs were applied with the irrigation water to plants of both species at the vegetative growth stage, with seven replicates per treatment and species. The results confirmed significant inhibitory effects on seed germination and early seedling development, especially in E. bonariensis; of the three EOs, peppermint had the strongest effect, completely preventing germination in both species. Multivariate analysis, performed on several morphological traits scored after one month of treatment in young plants, showed a different pattern: the highest inhibition was recorded in A. sericifera, and the greatest reduction in growth in the treatment with the highest dose of Santolina EO. These results reveal the efficacy of these natural compounds and the specificity of their toxicity according to species and developmental stage.
    Bellache, M.; Torres-Pagan, N.; Verdeguer Sancho, MM.; Benfekih, LA.; Vicente, O.; Sestras, RE.; Sestras, AF.... (2022). Essential Oils of Three Aromatic Plant Species as Natural Herbicides for Environmentally Friendly Agriculture. Sustainability. 14(6):1-22. https://doi.org/10.3390/su14063596
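
    The germination assay described above is a standard dose-response design. As a rough illustration only, the sketch below shows how such replicate counts could be summarized and compared across concentrations in Python; the concentrations follow the abstract, but the CSV file, column names and the species/oil subset are hypothetical placeholders, not the authors' actual analysis.

        # Sketch: summarizing a germination dose-response assay (hypothetical data layout).
        # Assumes one row per replicate dish with columns 'species', 'oil',
        # 'concentration_uL_per_mL', 'germinated' and 'total_seeds' (names are placeholders).
        import pandas as pd
        from scipy import stats

        df = pd.read_csv("germination_assay.csv")  # hypothetical file
        df["germination_pct"] = 100 * df["germinated"] / df["total_seeds"]

        # Mean germination per species, oil and concentration.
        summary = (df.groupby(["species", "oil", "concentration_uL_per_mL"])["germination_pct"]
                     .agg(["mean", "std", "count"])
                     .reset_index())
        print(summary)

        # One-way ANOVA across concentrations for one oil/species combination.
        subset = df[(df["oil"] == "Mentha piperita") & (df["species"] == "Erigeron bonariensis")]
        groups = [g["germination_pct"].to_numpy()
                  for _, g in subset.groupby("concentration_uL_per_mL")]
        f_stat, p_value = stats.f_oneway(*groups)
        print(f"ANOVA across concentrations: F={f_stat:.2f}, p={p_value:.4f}")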

    Actas Portuguesas de Horticultura

    Statice production in the south of the Comunidad Valenciana is highly relevant to floriculture. It is a halotolerant species whose fertigation is currently managed with low-salinity water into which high nutrient loads are incorporated. Both the high-quality yields achieved and the lower cost of fertilizers compared with the other crop inputs explain why this practice has become established. It is known, however, that statice has low mineral nutrient requirements, so large nutrient losses through leaching are to be expected, which are not acceptable from a sustainability perspective. Minimizing mineral inputs without reducing yield or harvest quality is therefore a clear objective. The harvest period starts at the end of October and continues until the end of May, with seasonal variations in production rates. This study was designed to determine the effects of a 50% reduction in mineral inputs on yield and harvest quality throughout the growing cycle. To this end, in the same greenhouse and over a complete growing cycle, the production of flower stems of two populations of Limonium sinuatum cv. Duel Violet, planted on the same date, was monitored and compared. For each treatment, both the quantity (number of flower stems, fresh and dry weights) and the quality (stem length, panicle length, dry weight to fresh weight ratios) of the harvested flower stems were determined weekly. In parallel, relationships were obtained between the biomass produced and both the intercepted radiation and the thermal integral. No qualitative differences between the two treatments were observed during almost the entire cultivation period. Quantitatively, from November to March the responses of the two treatments were similar, with yield depending on both intercepted radiation and the thermal integral. In spring, however, with higher mean temperatures and longer photoperiods leading to higher production rates, the lower mineral input resulted in a slight decrease in total yield (number of stems) at the end of the crop cycle, which the plants compensated for by increasing the weight, and the quality, of the flower stems.
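
    The abstract above relates accumulated biomass to intercepted radiation and to the thermal integral (accumulated growing degree-days). As an illustration of those two routine calculations, the sketch below computes a thermal integral from daily mean temperatures and fits a simple linear relationship between cumulative biomass and cumulative intercepted radiation; the base temperature, file names and column names are assumptions for illustration, not values from the study.

        # Sketch: thermal integral (growing degree-days) and biomass vs. radiation (illustrative).
        # Assumes daily climate records with columns 'date', 'tmean_C', 'intercepted_rad_MJ_m2',
        # and weekly totals with 'cum_rad_MJ_m2', 'cum_dry_weight_g_m2' (all hypothetical).
        import pandas as pd
        from scipy import stats

        T_BASE = 7.0  # assumed base temperature in degrees C (placeholder, not from the study)

        daily = pd.read_csv("greenhouse_climate.csv", parse_dates=["date"])
        daily["gdd"] = (daily["tmean_C"] - T_BASE).clip(lower=0)  # degree-days contributed per day
        daily["thermal_integral"] = daily["gdd"].cumsum()         # accumulated degree-days
        daily["cum_radiation"] = daily["intercepted_rad_MJ_m2"].cumsum()

        weekly = pd.read_csv("weekly_harvest.csv")  # hypothetical weekly cumulative totals

        # Linear fit of cumulative dry biomass against cumulative intercepted radiation.
        slope, intercept, r, p, se = stats.linregress(
            weekly["cum_rad_MJ_m2"], weekly["cum_dry_weight_g_m2"])
        print(f"Slope (g dry matter per MJ intercepted): {slope:.2f}, r^2 = {r**2:.2f}")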

    Complement C3 Deficiency Attenuates Chronic Hypoxia-Induced Pulmonary Hypertension in Mice

    Background: Evidence suggests a role of both innate and adaptive immunity in the development of pulmonary arterial hypertension. The complement system is a key sentry of the innate immune system and bridges innate and adaptive immunity. To date there are no studies addressing a role for the complement system in pulmonary arterial hypertension. Methodology/Principal Findings: Immunofluorescent staining revealed significant C3d deposition in lung sections from IPAH patients and from C57BL/6J wild-type mice exposed to three weeks of chronic hypoxia to induce pulmonary hypertension. Right ventricular systolic pressure and right ventricular hypertrophy were increased in hypoxic vs. normoxic wild-type mice; these increases were attenuated in hypoxic C3-/- mice. Likewise, pulmonary vascular remodeling was attenuated in C3-/- mice compared with wild-type mice, as determined by the number of muscularized peripheral arterioles and morphometric analysis of vessel wall thickness. The loss of C3 attenuated the increase in interleukin-6 and intercellular adhesion molecule-1 expression in response to chronic hypoxia, but not endothelin-1 levels. In wild-type mice, but not C3-/- mice, chronic hypoxia led to platelet activation, as assessed by bleeding time and by flow cytometry of cell surface P-selectin expression on platelets. In addition, tissue factor expression and fibrin deposition were increased in the lungs of wild-type mice in response to chronic hypoxia. These pro-thrombotic effects of hypoxia were abrogated in C3-/- mice. Conclusions: Herein, we provide compelling genetic evidence that the complement system plays a pathophysiologic role in the development of PAH in mice, promoting pulmonary vascular remodeling and a pro-thrombotic phenotype. In addition, we demonstrate C3d deposition in IPAH patients, suggesting that complement activation plays a role in the development of PAH in humans.

    Inhibitor of apoptosis proteins, NAIP, cIAP1 and cIAP2 expression during macrophage differentiation and M1/M2 polarization

    Monocytes and macrophages constitute the first line of defense of the immune system against external pathogens. Macrophages have a highly plastic phenotype depending on environmental conditions; the extremes of this phenotypic spectrum are a pro-inflammatory defensive role (M1 phenotype) and an anti-inflammatory tissue-repair one (M2 phenotype). The Inhibitor of Apoptosis (IAP) proteins have important roles in the regulation of several cellular processes, including innate and adaptive immunity. In this study we have analyzed the differential expression of the IAPs NAIP, cIAP1 and cIAP2 during macrophage differentiation and polarization into M1 or M2. In polarized THP-1 cells and primary human macrophages, NAIP is abundantly expressed in M2 macrophages, while cIAP1 and cIAP2 show an inverse pattern of expression in polarized macrophages, with elevated expression levels of cIAP1 in M2 and cIAP2 preferentially expressed in M1. Interestingly, treatment with the IAP antagonist SMC-LCL161 induced the upregulation of NAIP in M2, the downregulation of cIAP1 in M1 and M2, and an induction of cIAP2 in M1 macrophages. This work was supported by Universidad de Granada, Plan Propio 2015;#P3B: FAM, VMC (http://investigacion.ugr.es/pages/planpropio/2015/resoluciones/p3b_def_28072015); Universidad de Granada CEI BioTic;#CAEP2-84: VMC (http://biotic.ugr.es/pages/resolucionprovisionalenseaanzapractica22demayo/!); and Canadian Institutes of Health Research;#231421, #318176, #361847: STB, ECL, RK (http://www.cihr-irsc.gc.ca/e/193.html). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

    Renal malformations associated with mutations of developmental genes: messages from the clinic

    Renal tract malformations (RTMs) account for about 40% of children with end-stage renal failure. RTMs can be caused by mutations of genes normally active in the developing kidney and lower renal tract. Moreover, some RTMs occur in the context of multi-organ malformation syndromes. For these reasons, and because genetic testing is becoming more widely available, pediatric nephrologists should work closely with clinical geneticists to make genetic diagnoses in children with RTMs, followed by appropriate family counseling. Here we highlight families with renal cysts and diabetes, renal coloboma and Fraser syndromes, and a child with microdeletion of chromosome 19q who had a rare combination of malformations. Such diagnoses provide families with often long-sought answers to the question “why was our child born with kidney disease”. Precise genetic diagnoses will also help to define cohorts of children with RTMs for long-term clinical outcome studies

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international, cohort study which enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade at least III). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk.
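
    A minimal sketch of the kind of propensity-score-matched comparison described in the Methods (drain placement as the exposure, SSI as one outcome) is shown below; the dataframe, covariate names and 1:1 nearest-neighbour matching with replacement are simplifying assumptions for illustration, not the COMPASS analysis itself.

        # Sketch: propensity score matching followed by logistic regression (illustrative only).
        # Assumes a dataframe with 'drain' (0/1), 'ssi' (0/1) and numeric baseline covariates
        # 'age', 'male', 'asa_grade', 'laparoscopic' (all names hypothetical).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        df = pd.read_csv("colorectal_cohort.csv")  # hypothetical file
        covariates = ["age", "male", "asa_grade", "laparoscopic"]

        # 1. Estimate the propensity of drain placement from baseline covariates.
        ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["drain"])
        df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

        # 2. 1:1 nearest-neighbour matching on the propensity score (with replacement).
        treated = df[df["drain"] == 1]
        control = df[df["drain"] == 0]
        nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
        _, idx = nn.kneighbors(treated[["ps"]])
        matched = pd.concat([treated, control.iloc[idx.ravel()]])

        # 3. Odds ratio for SSI with drain placement in the matched cohort.
        logit = sm.Logit(matched["ssi"], sm.add_constant(matched["drain"])).fit(disp=False)
        print("OR for SSI with drain:", float(np.exp(logit.params["drain"])))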

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common for emergency laparotomy than for elective surgery in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
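
    The Results above combine multivariable logistic regression with bootstrapped simulation. As a simplified illustration of the bootstrap step only, the sketch below resamples a hypothetical patient-level dataset to put a percentile confidence interval around an unadjusted odds ratio for 30-day mortality by checklist use; the column names are placeholders and no risk adjustment is performed.

        # Sketch: bootstrap percentile CI for an unadjusted odds ratio (illustrative only).
        # Assumes binary columns 'checklist_used' and 'death_30d' (hypothetical names).
        import numpy as np
        import pandas as pd

        df = pd.read_csv("emergency_laparotomy.csv")  # hypothetical file

        def odds_ratio(d: pd.DataFrame) -> float:
            # 2x2 table: exposure = checklist use, outcome = 30-day mortality.
            a = ((d["checklist_used"] == 1) & (d["death_30d"] == 1)).sum()
            b = ((d["checklist_used"] == 1) & (d["death_30d"] == 0)).sum()
            c = ((d["checklist_used"] == 0) & (d["death_30d"] == 1)).sum()
            e = ((d["checklist_used"] == 0) & (d["death_30d"] == 0)).sum()
            return (a * e) / (b * c)

        boot = np.array([odds_ratio(df.sample(frac=1, replace=True)) for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"OR = {odds_ratio(df):.2f} (bootstrap 95% CI {lo:.2f} to {hi:.2f})")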

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in low- and middle-income countries (LMICs) are needed to assess measures aiming to reduce this preventable complication.
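
    As a worked check of the crude figures reported above, the short sketch below recomputes the 30-day SSI incidence in each HDI group from the counts given in the abstract and runs a simple chi-square test of the unadjusted association; it does not reproduce the Bayesian multilevel risk-adjusted model used in the study.

        # Worked check: crude 30-day SSI incidence by HDI group, using counts from the abstract.
        from scipy.stats import chi2_contingency

        counts = {             # (patients with SSI, total patients)
            "high HDI":   (691, 7339),
            "middle HDI": (549, 3918),
            "low HDI":    (298, 1282),
        }
        for group, (ssi, total) in counts.items():
            print(f"{group}: {ssi}/{total} = {100 * ssi / total:.1f}%")  # 9.4%, 14.0%, 23.2%

        # 3x2 table of SSI vs no SSI across HDI groups (crude association, no risk adjustment).
        table = [[ssi, total - ssi] for ssi, total in counts.values()]
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.2g}")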