
    C sequestration in Eucalyptus spp. plantations established on agricultural land in northern Spain

    One of the strategies proposed for fixing greenhouse gases (CO2, CH4 and N2O) is to increase the area of land covered by forest. Spain is one of the countries that has contributed most to the increase in forest area in Europe, largely through the establishment of forest plots on marginal agricultural land. During the 11-year period between the second and third forest inventories (1987 and 1998), carbon storage in tree biomass in northern Spain increased by 50%. In the present study, a first estimate was made of the carbon gains in biomass and soil in eucalyptus plantations established on agricultural land. A total of 25 pairs of plots were selected, each on agricultural land of which part had been converted to a Eucalyptus sp. plantation. The amount of C accumulated in tree biomass, humus and soil at three depths (0-5, 5-15 and 15-30 cm) was determined. Given the better soil fertility and greater soil depth, which favour tree growth, the rate of C accumulation in biomass was considerably higher than in plantations established on forest soils, with accumulations of more than 17 Mg ha-1 year-1. Accumulation of C in the humus also exceeded 1 Mg ha-1 year-1. In the mineral soil, C losses decreased with time since the land-use change, mainly from 15 years after establishment onwards and in plantations on schists and slates; slight losses were observed up until that time.

    Chronic kidney disease in the context of multimorbidity patterns: the role of physical performance

    Background: Chronic kidney disease (CKD) is known to be associated with several co-occurring conditions. We aimed to explore multimorbidity patterns associated with CKD, as well as the impact of physical performance and CKD severity on them, in a population of older outpatients. Methods: Our series consisted of 2252 patients enrolled in the Screening of CKD among Older People across Europe multicenter observational study. Hypertension, stroke, transient ischemic attack, cancer, hip fracture, osteoporosis, Parkinson's disease, asthma, chronic obstructive pulmonary disease, congestive heart failure, angina, myocardial infarction, atrial fibrillation, anemia, CKD (defined as GFR < 60, < 45 or < 30 ml/min/1.73 m2), cognitive impairment, depression, hearing impairment and vision impairment were included in the analyses. Physical performance was assessed by the Short Physical Performance Battery (SPPB) and used as a stratification variable. Pairs of co-occurring diseases were analyzed by logistic regression. Patterns of multimorbidity were investigated by hierarchical cluster analysis. Results: CKD was among the most frequently observed conditions and was rarely observed without any other co-occurring disease. CKD was significantly associated with hypertension, anemia, heart failure, atrial fibrillation, myocardial infarction and hip fracture. When stratifying by SPPB, CKD was also significantly associated with vision impairment in the SPPB = 5-8 group and with hearing impairment in the SPPB = 0-4 group. Cluster analysis identified two main clusters, one including CKD, hypertension and sensory impairments, and the second including all other conditions. Stratifying by SPPB, CKD contributed to a cluster including diabetes, anemia, osteoporosis, hypertension and sensory impairments in the SPPB = 0-4 group. When defining CKD as eGFR < 45 or < 30 ml/min/1.73 m2, the strength of the association of CKD with hypertension, sensory impairments, osteoporosis, anemia and CHF increased with CKD severity in the pairwise analysis. Severe CKD (eGFR < 30 ml/min/1.73 m2) contributed to a wide cluster including cardiovascular, respiratory and neurologic diseases, as well as osteoporosis, hip fracture and cancer. Conclusions: CKD and its severity may contribute significantly to specific multimorbidity patterns, at least on the basis of the cluster analysis. Physical performance as assessed by the SPPB may be associated with non-negligible changes in both co-occurring pairs and multimorbidity clusters.
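    As a rough illustration of the two analysis steps named above (pairwise logistic regression between conditions and hierarchical clustering of conditions), the following sketch uses toy 0/1 disease indicators; the variable names and data are hypothetical and not taken from the SCOPE study.

        # Illustrative sketch only: pairwise odds ratios for co-occurrence with CKD,
        # then hierarchical clustering of conditions. Toy data, not the SCOPE cohort.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(0)
        diseases = ["ckd", "hypertension", "anemia", "chf", "afib", "hip_fracture"]
        # 0/1 presence indicators for each condition in a toy cohort of 500 patients
        df = pd.DataFrame(rng.integers(0, 2, size=(500, len(diseases))), columns=diseases)

        # Step 1: one logistic model per disease pair, giving the odds ratio with CKD
        for other in diseases[1:]:
            model = sm.Logit(df[other], sm.add_constant(df[["ckd"]])).fit(disp=0)
            print(f"CKD vs {other}: OR = {np.exp(model.params['ckd']):.2f}")

        # Step 2: hierarchical clustering of conditions, using Jaccard distance
        # between their presence/absence profiles across patients
        dist = pdist(df.T.values.astype(bool), metric="jaccard")
        clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
        print(dict(zip(diseases, clusters)))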

    Use of tocilizumab in kidney transplant recipients with COVID-19

    Acute respiratory distress syndrome associated with coronavirus infection is related to a cytokine storm with large interleukin-6 (IL-6) release. The IL-6-receptor blocker tocilizumab may control the aberrant host immune response in patients with coronavirus disease 2019 (COVID-19). In this pandemic, kidney transplant (KT) recipients are a high-risk population for severe infection and have shown poor outcomes. We present a multicenter cohort study of 80 KT patients with severe COVID-19 treated with tocilizumab during hospital admission. A high mortality rate was identified (32.5%), related to older age (hazard ratio [HR] 3.12 for those older than 60 years, P = .039). IL-6 and other inflammatory markers, including lactic acid dehydrogenase, ferritin, and D-dimer, increased early after tocilizumab administration, and their values were higher in nonsurvivors. In contrast, C-reactive protein (CRP) levels decreased after tocilizumab, and this decrease correlated positively with survival (mean 12.3 mg/L in survivors vs. 33 mg/L in nonsurvivors). Each mg/L of CRP soon after tocilizumab increased the risk of death by 1% (HR 1.01 [confidence interval 1.004-1.024], P = .003). Although patients who died presented with a worse respiratory situation at admission, this difference was not significant at tocilizumab administration and had no impact on outcome in the multivariate analysis. Tocilizumab may be effective in controlling cytokine storm in COVID-19, but randomized trials are needed.
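    To translate the reported hazard ratio into concrete numbers, the back-of-the-envelope sketch below assumes the usual Cox-model interpretation that the hazard scales multiplicatively per mg/L of CRP; the CRP differences are illustrative, apart from the roughly 21 mg/L survivor/non-survivor gap quoted above.

        # Rough interpretation sketch: HR 1.01 per mg/L of post-tocilizumab CRP implies
        # a relative hazard of 1.01 ** delta for a CRP difference of delta mg/L.
        hr_per_mg = 1.01
        for delta in (10, 20.7, 50):  # 20.7 mg/L ~ the survivor vs non-survivor CRP gap quoted above
            print(f"CRP +{delta} mg/L -> relative hazard ~ {hr_per_mg ** delta:.2f}")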

    Correction to: Two years later: Is the SARS-CoV-2 pandemic still having an impact on emergency surgery? An international cross-sectional survey among WSES members

    Background: The SARS-CoV-2 pandemic is still ongoing and remains a major challenge for health care services worldwide. In the first WSES COVID-19 emergency surgery survey, a strong negative impact on emergency surgery (ES) had already been described early in the pandemic. However, knowledge is limited about the current effects of the pandemic on patient flow through emergency rooms, daily routine and decision making in ES, as well as how these have changed over the last two pandemic years. This second WSES COVID-19 emergency surgery survey investigates the impact of the SARS-CoV-2 pandemic on ES during the course of the pandemic. Methods: A web survey was distributed to medical specialists in ES during a four-week period from January 2022, investigating the impact of the pandemic on patients and on septic diseases requiring ES, structural problems due to the pandemic, and time-to-intervention in ES routine. Results: 367 collaborators from 59 countries responded to the survey. The majority indicated that the pandemic still significantly affects the treatment and outcome of surgical emergency patients (83.1% and 78.5%, respectively). As reasons, the collaborators reported a decreased caseload in ES (44.7%), but patients presenting with more prolonged and severe diseases, especially perforated appendicitis (62.1%) and diverticulitis (57.5%). Moreover, approximately 50% of the participants still observe a delay in time-to-intervention in ES compared with the situation before the pandemic. Relevant causes of prolonged time-to-intervention in ES during the pandemic are persistent problems with in-hospital logistics, shortages of medical staff, and limited operating room and intensive care capacities. This leads not only to the need to triage ES patients or transfer them to other hospitals, reported by 64.0% and 48.8% of the collaborators, respectively, but also to paradigm shifts towards non-operative treatment approaches, reported by 67.3% of the participants, especially in uncomplicated appendicitis, cholecystitis and multiple-recurrent diverticulitis. Conclusions: The SARS-CoV-2 pandemic still significantly affects the care and outcome of patients in ES. Well-known problems with in-hospital logistics have not been sufficiently resolved; moreover, medical staff shortages and reduced capacities have been dramatically aggravated over the last two pandemic years.

    Infected pancreatic necrosis: outcomes and clinical predictors of mortality. A post hoc analysis of the MANCTRA-1 international study

    The identification of high-risk patients in the early stages of infected pancreatic necrosis (IPN) is critical, because it could help clinicians adopt more effective management strategies. We conducted a post hoc analysis of the MANCTRA-1 international study to assess the association between clinical risk factors and mortality among adult patients with IPN. Univariable and multivariable logistic regression models were used to identify prognostic factors of mortality. We identified 247 consecutive patients with IPN hospitalised between January 2019 and December 2020. History of uncontrolled arterial hypertension (p = 0.032; 95% CI 1.135-15.882; aOR 4.245), qSOFA (p = 0.005; 95% CI 1.359-5.879; aOR 2.828), renal failure (p = 0.022; 95% CI 1.138-5.442; aOR 2.489), and haemodynamic failure (p = 0.018; 95% CI 1.184-5.978; aOR 2.661) were identified as independent predictors of mortality in IPN patients. Cholangitis (p = 0.003; 95% CI 1.598-9.930; aOR 3.983), abdominal compartment syndrome (p = 0.032; 95% CI 1.090-6.967; aOR 2.735), and gastrointestinal/intra-abdominal bleeding (p = 0.009; 95% CI 1.286-5.712; aOR 2.710) were independently associated with the risk of mortality. Upfront open surgical necrosectomy was strongly associated with the risk of mortality (p < 0.001; 95% CI 1.912-7.442; aOR 3.772), whereas endoscopic drainage of pancreatic necrosis (p = 0.018; 95% CI 0.138-0.834; aOR 0.339) and enteral nutrition (p = 0.003; 95% CI 0.143-0.716; aOR 0.320) were identified as protective factors. Organ failure, acute cholangitis, and upfront open surgical necrosectomy were the most significant predictors of mortality. Our study confirmed that, even in a subgroup of particularly ill patients such as those with IPN, upfront open surgery should be avoided as much as possible. Study protocol registered in ClinicalTrials.gov (ID NCT04747990).
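    The adjusted odds ratios (aOR) and 95% CIs quoted above are standard outputs of a multivariable logistic model; the sketch below shows one conventional way to obtain them with statsmodels, using made-up column names and simulated data rather than the MANCTRA-1 dataset.

        # Illustrative sketch only: deriving adjusted odds ratios and 95% CIs from a
        # multivariable logistic model. Simulated data, hypothetical variable names.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 247
        df = pd.DataFrame({
            "uncontrolled_htn": rng.integers(0, 2, n),
            "renal_failure": rng.integers(0, 2, n),
            "cholangitis": rng.integers(0, 2, n),
            "open_necrosectomy": rng.integers(0, 2, n),
        })
        # toy binary outcome loosely linked to the covariates
        logit = -2 + 0.6 * df.sum(axis=1)
        df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(df.drop(columns="died"))
        fit = sm.Logit(df["died"], X).fit(disp=0)
        aor = np.exp(fit.params).rename("aOR")                                # adjusted odds ratios
        ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})   # CIs on the OR scale
        print(pd.concat([aor, ci], axis=1))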

    Evaluation of appendicitis risk prediction models in adults with suspected appendicitis

    Background Appendicitis is the most common general surgical emergency worldwide, but its diagnosis remains challenging. The aim of this study was to determine whether existing risk prediction models can reliably identify patients presenting to hospital in the UK with acute right iliac fossa (RIF) pain who are at low risk of appendicitis. Methods A systematic search was completed to identify all existing appendicitis risk prediction models. Models were validated using UK data from an international prospective cohort study that captured consecutive patients aged 16–45 years presenting to hospital with acute RIF pain in March to June 2017. The main outcome was best achievable model specificity (proportion of patients who did not have appendicitis correctly classified as low risk) whilst maintaining a failure rate below 5 per cent (proportion of patients identified as low risk who actually had appendicitis). Results Some 5345 patients across 154 UK hospitals were identified, of whom two-thirds (3613 of 5345, 67·6 per cent) were women. Women were more than twice as likely to undergo surgery with removal of a histologically normal appendix (272 of 964, 28·2 per cent) than men (120 of 993, 12·1 per cent) (relative risk 2·33, 95 per cent c.i. 1·92 to 2·84; P < 0·001). Of 15 validated risk prediction models, the Adult Appendicitis Score performed best (cut-off score 8 or less, specificity 63·1 per cent, failure rate 3·7 per cent). The Appendicitis Inflammatory Response Score performed best for men (cut-off score 2 or less, specificity 24·7 per cent, failure rate 2·4 per cent). Conclusion Women in the UK had a disproportionate risk of admission without surgical intervention and high rates of normal appendicectomy. Risk prediction models were identified that could support shared decision-making by identifying adults in the UK at low risk of appendicitis.
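    The two validation metrics used here, specificity and failure rate at a given score cut-off, can be computed directly from the predicted risk classes; the sketch below is a minimal illustration with invented scores and outcomes, not study data.

        # Minimal sketch: specificity (non-appendicitis patients classified low risk)
        # and failure rate (low-risk patients who actually had appendicitis) at a cut-off.
        import numpy as np

        def specificity_and_failure_rate(scores, has_appendicitis, cutoff):
            scores = np.asarray(scores)
            has_appendicitis = np.asarray(has_appendicitis, dtype=bool)
            low_risk = scores <= cutoff
            specificity = low_risk[~has_appendicitis].mean()   # among patients without appendicitis
            failure_rate = has_appendicitis[low_risk].mean()   # among patients called low risk
            return specificity, failure_rate

        # invented example with an Adult Appendicitis Score-like cut-off of 8
        scores = [3, 5, 7, 8, 9, 11, 12, 10, 14, 4]
        appendicitis = [0, 0, 0, 1, 1, 1, 1, 0, 1, 0]
        spec, fail = specificity_and_failure_rate(scores, appendicitis, cutoff=8)
        print(f"specificity = {spec:.2f}, failure rate = {fail:.2f}")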

    Development and long-term dynamics of old-growth beech-fir forests in the Pyrenees: Evidence from dendroecology and dynamic vegetation modelling

    Ecological knowledge on long-term forest dynamics and development has been primarily derived from the study of old-growth forests. Centuries of forest management have decreased the extent of temperate old-growth forests in Europe and altered managed forests. Disentangling the effects of past human disturbances and climate on current species composition is crucial for understanding the long-term development of forests under global change. In this study, we investigated disturbance and recruitment dynamics in two forests in the Western Pyrenees (Spain) with contrasting management history: an old-growth forest and a long-untouched forest, both dominated by the two shade-tolerant species Fagus sylvatica (European beech) and Abies alba (silver fir). We used dendroecological methods in seven plots to analyse forest structure, growth patterns and disturbance histories in these forests. We benchmarked these data with the dynamic vegetation model ForClim to examine the effects of natural and human-induced disturbances on forest development, structure and species composition. Disturbance regimes differed between the study forests, but neither showed evidence of stand-replacing disturbances, either natural or human-induced. Low disturbance rates and continuous recruitment of beech and fir dominated the old-growth forest over the last 400 years. In contrast, the long-untouched forest was intensively disturbed in 1700–1780, probably by logging, with lower natural disturbance rates thereafter. Beech and fir recruitment preferentially occurred after more intense disturbances, despite the high shade tolerance of both species. The higher fir abundance in the long-untouched forest than in the old-growth forest appeared to be related to its human-induced disturbances. ForClim closely simulated the potential natural vegetation of the forests, with a dominance of beech over fir, but overestimated the presence of less shade-tolerant species. The previously observed local fir decline may result from natural forest successional processes after logging. Within ∼200 years after logging cessation, some structural attributes of the long-untouched forest converged towards those of the old-growth forest, but legacy effects still affected species composition and structure. Natural disturbance regimes in beech-fir forests of the Western Pyrenees induce temporal fluctuations between beech and fir abundance, with a natural tendency for beech dominance in advanced developmental stages with low disturbance rates. This work was funded by projects AGL2015-73190-JIN, PID2019-110273RB-I00, AGL2016-76769-C2-2-R and PID2020-119204RB-C22 of the Spanish Ministry of Science and Innovation MCIN/AEI. DM-B was also funded by contract RYC-2017-23389, and CP-C by contract RYC-2018-024939.

    On the site-level suitability of biomass models

    Tree biomass estimates in environmental studies are based on allometric models, which are known to vary with species, site, and other forest characteristics. The UNFCCC published a guideline for evaluating the appropriateness of biomass models before application, but it misconstrues the concept of model suitability and also allows the selection of models with systematic deviations in their predictions. Here we present an alternative approach based on non-parametric techniques. The approach was tested for pure stands, but the methodology is likewise applicable to mixed forests. The proposed tests perform well in rejecting a model if its predictions for the targeted population are systematically deviant. It is demonstrated that the suitability of an allometric model is a matter of accuracy. The proposed method also allows the model to be localized. The presented approach can improve the transparency of global forest monitoring systems and can be implemented with relatively little effort. © 2015 Elsevier Ltd
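    The abstract does not spell out the test statistics, but one simple non-parametric check in this spirit is a signed-rank test of whether site-level prediction errors are centred on zero; the sketch below illustrates that idea with a hypothetical allometric model and simulated sample trees, not the paper's actual procedure.

        # Illustrative sketch only: a non-parametric check for systematic deviation of an
        # allometric biomass model at one site. Model form and data are hypothetical.
        import numpy as np
        from scipy.stats import wilcoxon

        def predicted_biomass(dbh_cm, a=0.1, b=2.4):
            # hypothetical allometric model: biomass (kg) = a * dbh^b
            return a * dbh_cm ** b

        rng = np.random.default_rng(42)
        dbh = rng.uniform(10, 50, size=30)                                    # sample trees at the site
        observed = predicted_biomass(dbh) * rng.normal(1.15, 0.10, size=30)   # model under-predicts by ~15%
        errors = observed - predicted_biomass(dbh)

        stat, p = wilcoxon(errors)  # H0: signed prediction errors are symmetric around zero
        print(f"Wilcoxon p = {p:.4f} -> {'reject' if p < 0.05 else 'retain'} the model for this site")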

    Productivity model and reference diagram for short rotation biomass crops of poplar grown in Mediterranean environments

    A Reference Diagram (RD) was constructed for first rotations of the Euroamerican poplar 'I-214' grown as short rotation coppice (SRC). Data from 144 plots, established at eleven sites in Mediterranean environments, were used to develop the model. The density at establishment of the plantations ranged between 6,666 and 33,333 stools ha-1, covering the usual density range used in short rotation forestry (SRF). The RD was based on a density-independent mortality model that relates the density of living stools to the average height of the dominant shoots and the initial plantation density, and it includes a system of two simultaneously fitted equations relating (a) the quadratic mean basal diameter of dominant shoots to the average height of the dominant shoots and the final density, and (b) the total above-ground woody dry biomass to the quadratic mean basal diameter and the final density. The isolines in the RD represent mortality, quadratic mean basal diameter of dominant shoots and total above-ground woody dry biomass at the end of a first rotation of three years. The final yield in terms of biomass ranged from 1 to 85 Mg dm ha-1. The RD enables rapid and straightforward comparison of different situations, both at planting and at harvesting, and is a useful tool, based on a wide range of empirical data, for management and decision making regarding short rotation poplar crops. © 2014 Elsevier Ltd
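    The abstract specifies which variables enter each equation of the Reference Diagram but not their functional forms or coefficients; purely as an illustrative sketch, the system could be written in the following generic power-law shape, where f and the exponents a_i, b_i are placeholders rather than the fitted model.

        \begin{aligned}
        N   &= f(H_{dom}, N_0)               && \text{density-independent mortality model}\\
        d_q &= a_0\, H_{dom}^{a_1}\, N^{a_2} && \text{quadratic mean basal diameter of dominant shoots}\\
        W   &= b_0\, d_q^{b_1}\, N^{b_2}     && \text{total above-ground woody dry biomass}
        \end{aligned}

    Here H_dom is the average height of the dominant shoots, N_0 the initial stool density, N the density of living stools at the end of the rotation, d_q the quadratic mean basal diameter and W the total above-ground woody dry biomass.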