
    The search for novel diagnostic and prognostic biomarkers in cholangiocarcinoma

    The poor prognosis of cholangiocarcinoma (CCA) is due in part to late diagnosis, which is currently achieved by a combination of clinical, radiological and histological approaches. The biomarkers currently determined in serum and biopsy samples to assist in CCA diagnosis are not sufficiently sensitive and specific. The identification of new biomarkers, preferably those obtainable by minimally invasive methods such as liquid biopsy, is therefore important. The development of innovative technologies has made it possible to identify a significant number of genetic, epigenetic, proteomic and metabolomic CCA features with potential clinical usefulness in early diagnosis, prognosis or prediction of treatment response. Potential new candidates must be rigorously evaluated before entering routine clinical application; unfortunately, to date, no such biomarker has been validated for these purposes. This review provides an update on currently used biomarkers and on promising candidates that could enter clinical practice in the near future. This article is part of a Special Issue entitled: Cholangiocytes in Health and Disease, edited by Jesus Banales, Marco Marzioni, Nicholas LaRusso and Peter Jansen.

    Screening risk assessment tools for assessing the environmental impact in an abandoned pyritic mine in Spain

    This is the author's version of a work accepted for publication in Science of the Total Environment; changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality-control mechanisms, may not be reflected in this document. A definitive version was subsequently published in Science of the Total Environment 409.4 (2011): 692-703, http://dx.doi.org/10.1016/j.scitotenv.2010.10.056
    This paper describes a new methodology for assessing the site-specific environmental impact of contaminants. The proposed method integrates traditional risk assessment approaches with real, variable environmental characteristics at a local scale. Environmental impact on selected receptors was classified, for each environmental compartment, into 5 categories derived from the whole (chronic and acute) risk assessment using 8 risk levels. Risk levels were established according to three hazard quotients (HQs), each representing the ratio of exposure to acute or chronic toxicity values. This tool allowed all the elements involved in the standard risk assessment to be integrated into a single impact category. The methodology was applied to an abandoned metal mine in Spain, where high levels of As, Cd, Zn and Cu were detected. Risks affecting potential receptors such as aquatic and soil organisms and terrestrial vertebrates were assessed. Overall, the results showed that the impact on the ecosystem is likely high and that further investigation or remedial actions are necessary. Some proposals to refine the risk assessment for a more realistic diagnosis are included. This work was financed by the Madrid Community through EIADES Project S-505/AMB/0296, and by the Spanish Ministry of Education and Science, project CTM-2007-66401-CO2/TECN
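The hazard-quotient logic described in this abstract can be sketched in a few lines. This is a minimal illustration only, not the paper's actual implementation: the cut-off values and category names below are hypothetical placeholders, whereas the study derives 8 risk levels and 5 impact categories from three HQs.

```python
def hazard_quotient(exposure, toxicity_value):
    """Ratio of an estimated exposure to a toxicity reference value;
    HQ > 1 conventionally flags potential risk."""
    return exposure / toxicity_value

def risk_category(chronic_hq, acute_hq):
    """Map a pair of hazard quotients to a coarse impact category.
    Cut-offs and labels here are hypothetical, for illustration only."""
    worst = max(chronic_hq, acute_hq)
    if worst < 0.1:
        return "negligible"
    if worst < 1.0:
        return "low"
    if worst < 10.0:
        return "moderate"
    return "high"
```

The appeal of the approach is that one scalar per receptor-compartment pair summarizes both chronic and acute exposure pathways.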

    Dietary diversity and nutritional adequacy among an older Spanish population with metabolic syndrome in the PREDIMED-Plus study: a cross-sectional analysis

    Dietary guidelines emphasize the importance of a varied diet to provide an adequate nutrient intake. However, older age is often associated with consumption of monotonous diets that can be nutritionally inadequate, increasing the risk for the development or progression of diet-related chronic diseases such as metabolic syndrome (MetS). To assess the association between dietary diversity (DD) and nutrient intake adequacy, and to identify demographic variables associated with DD, we cross-sectionally analyzed baseline data from the PREDIMED-Plus trial: 6587 Spanish adults aged 55-75 years with overweight/obesity who also had MetS. An energy-adjusted dietary diversity score (DDS) was calculated using a 143-item validated semi-quantitative food frequency questionnaire (FFQ). Nutrient inadequacy was defined as an intake below 2/3 of the dietary reference intake (DRI) for at least four of 17 nutrients proposed by the Institute of Medicine (IOM). Logistic regression models were used to evaluate the association between DDS and the risk of nutritionally inadequate intakes. The highest DDS quartile contained more women and fewer current smokers. Compared with subjects in the highest DDS quartile, those in the lowest DDS quartile had a higher risk of inadequate nutrient intake: odds ratio (OR) = 28.56 (95% confidence interval (CI) 20.80-39.21). When we estimated food variety for each of the food groups, participants in the lowest quartile had a higher risk of inadequate nutrient intake for the groups of vegetables, OR = 14.03 (95% CI 10.55-18.65), fruits, OR = 11.62 (95% CI 6.81-19.81), dairy products, OR = 6.54 (95% CI 4.64-9.22), and protein foods, OR = 6.60 (95% CI 1.96-22.24). As DDS decreased, the risk of inadequate nutrient intake rose.
    Given the impact of nutrient intake adequacy on the prevention of non-communicable diseases, health policies should focus on the promotion of a healthy, varied diet, specifically promoting the intake of vegetables and fruit among population groups with lower DDS, such as men, smokers or widow(er)s.
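The inadequacy criterion used in this study (intake below 2/3 of the DRI for at least 4 of 17 nutrients) is simple enough to express directly. A minimal sketch, with the nutrient names and DRI values left to the caller as hypothetical inputs:

```python
def is_nutritionally_inadequate(intakes, dris, fraction=2/3, min_nutrients=4):
    """Flag a diet as inadequate when intake falls below `fraction` of
    the dietary reference intake (DRI) for at least `min_nutrients`
    nutrients, mirroring the rule described in the abstract.

    intakes -- mapping nutrient name -> daily intake
    dris    -- mapping nutrient name -> dietary reference intake
    """
    below = sum(
        1 for nutrient, dri in dris.items()
        if intakes.get(nutrient, 0.0) < fraction * dri
    )
    return below >= min_nutrients
```

A missing nutrient counts as zero intake, which is the conservative choice for a screening rule of this kind.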

    Ultra-processed foods consumption as a promoting factor of greenhouse gas emissions, water, energy, and land use: A longitudinal assessment

    Background: Dietary patterns can produce an environmental impact. Changes in people's diets, such as increased consumption of ultra-processed food (UPF), can influence not only human health but also environmental sustainability. Objectives: To assess the impact of 2-year changes in UPF consumption on greenhouse gas emissions and on water, energy and land use. Design: A 2-year longitudinal study after a dietary intervention, including 5879 participants from a Southern European population aged 55-75 years with metabolic syndrome. Methods: Food intake was assessed using a validated 143-item food frequency questionnaire, which allowed foods to be classified according to the NOVA system. In addition, sociodemographic data, Mediterranean diet adherence, and physical activity were obtained from validated questionnaires. Greenhouse gas emissions and water, energy and land use were calculated by means of the Agribalyse® 3.0.1 database of environmental impact indicators for food items. Changes in UPF consumption over a 2-year period were analyzed. Statistical analyses were conducted using general linear models. Results: Participants with the largest reductions in UPF consumption reduced their impact by 0.6 kg of CO2eq and 5.3 MJ of energy. Water use was the only indicator that increased as the percentage of UPF was reduced. Conclusions: Low consumption of ultra-processed foods may contribute to environmental sustainability. The processing level of consumed food should be considered not only in nutritional advice for health but also for environmental protection.
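The footprint calculation described in the methods amounts to a weighted sum of per-food indicators over the diet. A minimal sketch, assuming a hypothetical per-kilogram indicator table in the spirit of an Agribalyse-style database (the food names and numbers below are illustrative, not taken from the actual database):

```python
# Hypothetical per-kilogram environmental indicators (illustrative values).
FOOTPRINT = {
    "bread":      {"co2eq_kg": 1.1,  "energy_mj": 15.0, "water_l": 500.0,  "land_m2": 2.0},
    "beef":       {"co2eq_kg": 27.0, "energy_mj": 60.0, "water_l": 1500.0, "land_m2": 30.0},
    "soft_drink": {"co2eq_kg": 0.5,  "energy_mj": 6.0,  "water_l": 300.0,  "land_m2": 0.3},
}

def diet_footprint(intake_kg):
    """Sum each environmental indicator over the foods in a diet,
    weighted by the consumed mass in kilograms."""
    totals = {"co2eq_kg": 0.0, "energy_mj": 0.0, "water_l": 0.0, "land_m2": 0.0}
    for food, kg in intake_kg.items():
        for indicator, per_kg in FOOTPRINT[food].items():
            totals[indicator] += per_kg * kg
    return totals
```

Comparing such totals between two FFQ waves gives the 2-year change in each indicator analyzed in the study.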

    Dynamics of disease characteristics and clinical management of critically ill COVID-19 patients over the time course of the pandemic: an analysis of the prospective, international, multicentre RISC-19-ICU registry.

    BACKGROUND It remains elusive how the characteristics, the course of disease, the clinical management and the outcomes of critically ill COVID-19 patients admitted to intensive care units (ICU) worldwide have changed over the course of the pandemic. METHODS Prospective, observational registry constituted by 90 ICUs across 22 countries worldwide, including patients with a laboratory-confirmed, critical presentation of COVID-19 requiring advanced organ support. Hierarchical, generalized linear mixed-effect models accounting for hospital and country variability were employed to analyse the continuous evolution of the studied variables over the pandemic. RESULTS Four thousand forty-one patients were included from March 2020 to September 2021. Over this period, the age of the admitted patients increased (62 [95% CI 60-63] years vs 64 [62-66] years, p < 0.001), the severity of organ dysfunction at ICU admission decreased (Sequential Organ Failure Assessment 8.2 [7.6-9.0] vs 5.8 [5.3-6.4], p < 0.001), and more female patients (26 [23-29]% vs 41 [35-48]%, p < 0.001) were admitted. The time span between symptom onset and hospitalization as well as ICU admission became longer later in the pandemic (6.7 [6.2-7.2] days vs 9.7 [8.9-10.5] days, p < 0.001). The PaO2/FiO2 at admission was lower (132 [123-141] mmHg vs 101 [91-113] mmHg, p < 0.001) but showed faster improvement over the initial 5 days of ICU stay in late 2021 compared to early 2020 (34 [20-48] mmHg vs 70 [41-100] mmHg, p = 0.05). The number of patients treated with steroids and tocilizumab increased, while the use of therapeutic anticoagulation showed an inverse U-shaped behaviour over the course of the pandemic.
    The proportion of patients treated with high-flow oxygen (5 [4-7]% vs 20 [14-29]%, p < 0.001) and non-invasive mechanical ventilation (14 [11-18]% vs 24 [17-33]%, p < 0.001) increased throughout the pandemic, concomitant with a decrease in invasive mechanical ventilation (82 [76-86]% vs 74 [64-82]%, p < 0.001). ICU mortality (23 [19-26]% vs 17 [12-25]%, p < 0.001) and length of stay (14 [13-16] days vs 11 [10-13] days, p < 0.001) decreased over 19 months of the pandemic. CONCLUSION The characteristics and disease course of critically ill COVID-19 patients, together with their clinical management, have evolved continuously throughout the pandemic, leading to a younger, less severely ill ICU population with distinctly different clinical, pulmonary and inflammatory presentations than at the onset of the pandemic.

    Impact of the first wave of the SARS-CoV-2 pandemic on the outcome of neurosurgical patients: A nationwide study in Spain

    Objective To assess the effect of the first wave of the SARS-CoV-2 pandemic on the outcome of neurosurgical patients in Spain. Settings The initial flood of COVID-19 patients overwhelmed an unprepared healthcare system, and different measures were taken to deal with this overburden. The effect of these measures on neurosurgical patients, as well as the effect of COVID-19 itself, has not been thoroughly studied. Participants This was a multicentre, nationwide, observational retrospective study of patients who underwent any neurosurgical operation from March to July 2020. Interventions An exploratory factorial analysis was performed to select the most relevant variables of the sample. Primary and secondary outcome measures Univariate and multivariate analyses were performed to identify independent predictors of mortality and of postoperative SARS-CoV-2 infection. Results Sixteen hospitals registered 1677 operated patients. Overall mortality was 6.4%, and 2.9% (44 patients) suffered a perioperative SARS-CoV-2 infection; of those infections, 24 were diagnosed postoperatively. Age (OR 1.05), perioperative SARS-CoV-2 infection (OR 4.7), community COVID-19 incidence (cases/10^5 people/week) (OR 1.006), postoperative neurological worsening (OR 5.9), postoperative need for airway support (OR 5.38), ASA grade ≥3 (OR 2.5) and preoperative GCS 3-8 (OR 2.82) were independently associated with mortality. For postoperative SARS-CoV-2 infection, a screening swab test <72 hours preoperatively (OR 0.76), community COVID-19 incidence (cases/10^5 people/week) (OR 1.011), preoperative cognitive impairment (OR 2.784), postoperative sepsis (OR 3.807) and an absence of postoperative complications (OR 0.188) were independent predictors. Conclusions Perioperative SARS-CoV-2 infection in neurosurgical patients was associated with an almost fivefold increase in mortality. Community COVID-19 incidence (cases/10^5 people/week) was a statistically independent predictor of mortality.
Trial registration number CEIM 20/217
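The adjusted odds ratios reported above come from logistic regression, where each OR is the exponential of a fitted coefficient. A minimal sketch of that relationship; the Wald-interval helper and its arguments are generic statistics, not values or code from the study:

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient (log-odds scale)
    into an odds ratio."""
    return math.exp(beta)

def or_confidence_interval(beta, se, z=1.96):
    """Approximate 95% CI for the OR: a Wald interval built on the
    log-odds scale and exponentiated back."""
    return math.exp(beta - z * se), math.exp(beta + z * se)
```

An OR of 1.0 (coefficient 0) means no association; values such as the study's OR 4.7 for perioperative infection correspond to a positive coefficient near ln(4.7).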

    Spread of a SARS-CoV-2 variant through Europe in the summer of 2020

    Following its emergence in late 2019, the spread of SARS-CoV-2 [1,2] has been tracked by phylogenetic analysis of viral genome sequences in unprecedented detail [3-5]. Although the virus spread globally in early 2020 before borders closed, intercontinental travel has since been greatly reduced. However, travel within Europe resumed in the summer of 2020. Here we report on a SARS-CoV-2 variant, 20E (EU1), that was identified in Spain in early summer 2020 and subsequently spread across Europe. We find no evidence that this variant has increased transmissibility, but instead demonstrate how rising incidence in Spain, the resumption of travel, and the lack of effective screening and containment may explain the variant's success. Despite travel restrictions, we estimate that 20E (EU1) was introduced hundreds of times to European countries by summertime travellers, which is likely to have undermined local efforts to minimize infection with SARS-CoV-2. Our results illustrate how a variant can rapidly become dominant even in the absence of a substantial transmission advantage in favourable epidemiological settings. Genomic surveillance is critical for understanding how travel can affect transmission of SARS-CoV-2, and thus for informing future containment strategies as travel resumes.

    Contributions of mean and shape of blood pressure distribution to worldwide trends and variations in raised blood pressure: A pooled analysis of 1018 population-based measurement studies with 88.6 million participants

    © The Author(s) 2018. Background: Change in the prevalence of raised blood pressure could be due both to shifts in the entire distribution of blood pressure (representing the combined effects of public health interventions and secular trends) and to changes in its high-blood-pressure tail (representing successful clinical interventions to control blood pressure in the hypertensive population). Our aim was to quantify the contributions of these two phenomena to the worldwide trends in the prevalence of raised blood pressure. Methods: We pooled 1018 population-based studies with blood pressure measurements on 88.6 million participants from 1985 to 2016. We first calculated mean systolic blood pressure (SBP), mean diastolic blood pressure (DBP) and the prevalence of raised blood pressure by sex and 10-year age group from 20-29 years to 70-79 years in each study, taking into account complex survey design and survey sample weights, where relevant. We used a linear mixed-effects model to quantify the association between (probit-transformed) prevalence of raised blood pressure and age-group- and sex-specific mean blood pressure. We calculated the contributions of change in mean SBP and DBP, and of change in the prevalence-mean association, to the change in prevalence of raised blood pressure. Results: In 2005-16, at the same level of population mean SBP and DBP, men and women in South Asia and in Central Asia, the Middle East and North Africa would have the highest prevalence of raised blood pressure, and men and women in the high-income Asia Pacific and high-income Western regions would have the lowest. In most region-sex-age groups where the prevalence of raised blood pressure declined, one half or more of the decline was due to the decline in mean blood pressure. Where the prevalence of raised blood pressure increased, the change was entirely driven by increasing mean blood pressure, offset partly by the change in the prevalence-mean association.
    Conclusions: Change in mean blood pressure is the main driver of the worldwide change in the prevalence of raised blood pressure, but change in the high-blood-pressure tail of the distribution has also contributed to the change in prevalence, especially in older age groups.
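The probit transform used in the methods maps a prevalence in (0, 1) onto the standard-normal quantile scale, so that prevalence can be regressed linearly on mean blood pressure. A minimal sketch using Python's standard library; the intercept and slope in the second function are hypothetical stand-ins for fitted values, not results from the study:

```python
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal, mean 0, sd 1

def probit(prevalence):
    """Probit transform: the standard-normal quantile of a prevalence
    in (0, 1), the scale on which the prevalence-mean association is
    modelled linearly."""
    return _STD_NORMAL.inv_cdf(prevalence)

def prevalence_from_mean(mean_sbp, intercept, slope):
    """Invert a fitted linear prevalence-mean association: predicted
    prevalence of raised blood pressure at a given mean SBP.
    `intercept` and `slope` are hypothetical fitted coefficients."""
    return _STD_NORMAL.cdf(intercept + slope * mean_sbp)
```

Decomposing a prevalence change then amounts to asking how much of it is explained by moving `mean_sbp` (a distribution shift) versus changing the intercept/slope (a change in the prevalence-mean association, i.e. the tail).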

    Repositioning of the global epicentre of non-optimal cholesterol

    High blood cholesterol is typically considered a feature of wealthy western countries (1,2). However, dietary and behavioural determinants of blood cholesterol are changing rapidly throughout the world (3), and countries are using lipid-lowering medications at varying rates. These changes can have distinct effects on the levels of high-density lipoprotein (HDL) cholesterol and non-HDL cholesterol, which have different effects on human health (4,5). However, trends in HDL and non-HDL cholesterol levels over time have not previously been reported in a global analysis. Here we pooled 1,127 population-based studies that measured blood lipids in 102.6 million individuals aged 18 years and older to estimate trends from 1980 to 2018 in mean total, non-HDL and HDL cholesterol levels for 200 countries. Globally, there was little change in total or non-HDL cholesterol from 1980 to 2018. This was a net effect of increases in low- and middle-income countries, especially in east and southeast Asia, and decreases in high-income western countries, especially those in northwestern Europe, and in central and eastern Europe. As a result, the countries with the highest levels of non-HDL cholesterol, which is a marker of cardiovascular risk, changed from those in western Europe, such as Belgium, Finland, Greenland, Iceland, Norway, Sweden, Switzerland and Malta in 1980, to those in Asia and the Pacific, such as Tokelau, Malaysia, the Philippines and Thailand. In 2017, high non-HDL cholesterol was responsible for an estimated 3.9 million (95% credible interval 3.7 million-4.2 million) worldwide deaths, half of which occurred in east, southeast and south Asia.
    The global repositioning of lipid-related risk, with non-optimal cholesterol shifting from a distinct feature of high-income countries in northwestern Europe, north America and Australasia to one that affects countries in east and southeast Asia and Oceania, should motivate the use of population-based policies and personal interventions to improve nutrition and enhance access to treatment throughout the world.

    The ATLAS fast tracKer system

    The ATLAS Fast TracKer (FTK) was designed to provide full tracking for the ATLAS high-level trigger by using pattern recognition based on Associative Memory (AM) chips and fitting in high-speed field-programmable gate arrays. The tracks found by the FTK are based on inputs from all modules of the pixel and silicon microstrip trackers. The as-built FTK system and components are described, as is the online software used to control them while running in the ATLAS data acquisition system. Also described are the simulation of the FTK hardware and the optimization of the AM pattern banks. An optimization for long-lived particles with large impact parameter values is included. A test of the FTK system with the data playback facility that allowed the FTK to be commissioned during the shutdown between Run 2 and Run 3 of the LHC is reported. The resulting tracks from part of the FTK system, covering a limited η-ϕ region of the detector, are compared with the output from the FTK simulation, and FTK performance is shown to be in good agreement with the simulation. © The ATLAS Collaboration
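The AM-based pattern recognition described above can be caricatured in software: hit coordinates are coarsened into "superstrips", and a stored pattern fires when its superstrip is hit on every detector layer. This is a toy sketch of the idea only; the layer count, strip width and data layout below are hypothetical, and in the real FTK the comparison runs massively in parallel inside the AM chips rather than in a Python loop.

```python
def to_superstrip(hit_position, strip_width):
    """Coarsen a hit coordinate into a superstrip index, the
    granularity at which AM pattern banks are stored."""
    return int(hit_position // strip_width)

def match_patterns(hits_by_layer, pattern_bank, strip_width=10.0):
    """Return the patterns whose superstrip is hit on every layer.

    hits_by_layer -- mapping layer index -> list of hit coordinates
    pattern_bank  -- list of tuples, one superstrip index per layer
    """
    strips = {
        layer: {to_superstrip(h, strip_width) for h in hits}
        for layer, hits in hits_by_layer.items()
    }
    return [
        pattern for pattern in pattern_bank
        if all(ss in strips.get(layer, set())
               for layer, ss in enumerate(pattern))
    ]
```

Fired patterns ("roads") then seed the precise track fits that the FTK performs in FPGAs using the full-resolution hits.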