
    Shortened Version of the Token Test: Normative data for Spanish-speaking pediatric population

    OBJECTIVE: To generate normative data for the Shortened Version of the Token Test in Spanish-speaking pediatric populations. METHOD: The sample consisted of 4,373 healthy children from nine countries in Latin America (Chile, Cuba, Ecuador, Guatemala, Honduras, Mexico, Paraguay, Peru, and Puerto Rico) and Spain. Each participant was administered the Shortened Version of the Token Test as part of a larger neuropsychological battery. Shortened Version of the Token Test total scores were normed using multiple linear regressions and the standard deviations of the residual values. Age, age², sex, and mean level of parental education (MLPE) were included as predictors in the analyses. RESULTS: The final multiple linear regression models showed main effects for age in all countries, such that scores increased linearly as a function of age. In addition, age² had a significant effect in all countries except Guatemala and Puerto Rico. The models showed that children whose parent(s) had an MLPE >12 years obtained higher scores than children whose parents had an MLPE ≤12 years in Ecuador, Guatemala, Honduras, Mexico, Paraguay, Peru, Puerto Rico, and Spain. The child's sex did not have an effect on the Shortened Version of the Token Test total score in any of the countries. CONCLUSIONS: This is the largest Spanish-speaking pediatric normative study in the world, and it will allow neuropsychologists from these countries to interpret the Shortened Version of the Token Test more accurately when it is used in pediatric populations.
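    The regression-based norming described in this abstract (and in the d2 abstract below) can be illustrated with a short sketch: fit a linear model of the test score on age, age², sex, and MLPE group, then keep the standard deviation of the residuals for later standardization. The library, variable names, and simulated data below are illustrative assumptions, not the study's data or final models.

```python
# Minimal sketch of regression-based norming, assuming a model of the form
# score ~ age + age^2 + sex + MLPE group, as described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.uniform(6, 17, n),
    "sex": rng.integers(0, 2, n),        # 0/1 coding assumed for illustration
    "mlpe_high": rng.integers(0, 2, n),  # 1 = parental education > 12 years (assumed)
})
# Simulated raw test scores (placeholder data, not study data).
df["score"] = 20 + 1.2 * df["age"] - 0.03 * df["age"] ** 2 \
    + 1.5 * df["mlpe_high"] + rng.normal(0, 3, n)

# Fit the normative regression model and keep the residual SD used for norming.
model = smf.ols("score ~ age + I(age**2) + sex + mlpe_high", data=df).fit()
residual_sd = np.sqrt(model.mse_resid)
print(model.params, residual_sd)
```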

    Concentration Endurance Test (d2): Normative data for Spanish-speaking pediatric population

    OBJECTIVE: To generate normative data for the Concentration Endurance Test (d2) in Spanish-speaking pediatric populations. METHOD: The sample consisted of 4,373 healthy children from nine countries in Latin America (Chile, Cuba, Ecuador, Guatemala, Honduras, Mexico, Paraguay, Peru, and Puerto Rico) and Spain. Each participant was administered the d2 test as part of a larger neuropsychological battery. The Total number of items processed (TN), Total number of correct responses (CR), Total performance (TP), and Concentration performance (CP) scores were normed using multiple linear regressions and standard deviations of residual values. Age, age2, sex, and mean level of parental education (MLPE) were included as predictors in the analyses. RESULTS: The final multiple linear regression models showed main effects for age on all scores, such that scores increased linearly as a function of age. TN scores were affected by age2 for Guatemala and Puerto Rico; CR scores were affected by age2 for Mexico; TP scores were affected by age2 for Chile, Mexico, Puerto Rico, and Spain; and CP scores for Mexico and Spain. Models indicated that children whose parents had a MLPE >12 years obtained higher scores compared to children whose parents had a MLPE≤12 years for Mexico and Spain in all scores, and Puerto Rico for TN, CR, and TP, and Guatemala and Paraguay for CP scores. Sex affect the scores for Ecuador and Honduras (CP scores). CONCLUSIONS: This is the largest Spanish-speaking pediatric normative study in the world, and it will allow neuropsychologists from these countries to have a more accurate approach to interpret the d2 test in pediatric populations
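    Once such a model is fitted, an individual child's score is interpreted relative to the demographically predicted score. A minimal, self-contained sketch of that step follows; the coefficients and residual SD are placeholders, not published normative values.

```python
# Sketch of applying regression-based norms: convert an observed raw score into a
# demographically adjusted z-score and percentile. All numeric values are placeholders.
from scipy.stats import norm

def adjusted_z(observed, age, sex, mlpe_high,
               b0=20.0, b_age=1.2, b_age2=-0.03, b_sex=0.0, b_mlpe=1.5,
               residual_sd=3.0):
    """Return (z, percentile) for an observed score given demographic predictors."""
    predicted = b0 + b_age * age + b_age2 * age**2 + b_sex * sex + b_mlpe * mlpe_high
    z = (observed - predicted) / residual_sd
    return z, norm.cdf(z) * 100

z, pct = adjusted_z(observed=32.0, age=10.0, sex=1, mlpe_high=0)
print(f"z = {z:.2f}, percentile = {pct:.1f}")
```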

    Microbial Biomarker Transition in High-Altitude Sinter Mounds From El Tatio (Chile) Through Different Stages of Hydrothermal Activity

    Geothermal springs support microbial communities at elevated temperatures in ecosystems with high preservation potential, which makes them interesting analogs for the early evolution of the biogeosphere. The El Tatio geyser field in the Atacama Desert has astrobiological relevance due to the unique occurrence of geothermal features with steep hydrothermal gradients in an otherwise high-altitude, hyper-arid environment. We present here the results of our multidisciplinary field and molecular study of biogeochemical evidence for habitability and preservation in silica sinter at El Tatio. We sampled three morphologically similar geyser mounds characterized by differences in water activity (i.e., episodic liquid water, steam, and an inactive geyser lacking hydrothermal activity). Multiple approaches were employed to determine past and present biological signatures and dominant metabolisms. Lipid biomarkers indicated a relative abundance of thermophiles (dicarboxylic acids) and sulfate-reducing bacteria (branched carboxylic acids) in the sinter collected from the liquid water mound; photosynthetic microorganisms such as cyanobacteria (alkanes and isoprenoids) in the steam sinter mound; and archaea (squalane and crocetane) as well as purple sulfur bacteria (cyclopropyl acids) in the dry sinter from the inactive geyser. The three sinter structures preserved biosignatures representative of primary (thermophilic) and secondary (including endoliths and environmental contaminants) microbial communities. Sequencing of environmental 16S rRNA genes and immunoassays generally corroborated the lipid-based microbial identification. The multiplex immunoassays and the compound-specific isotopic analysis of carboxylic acids, alkanols, and alkanes indicated that the principal microbial pathway for carbon fixation in the three sinter mounds was the Calvin cycle, with a relatively larger contribution of the reductive acetyl-CoA pathway in the dry system. Other inferred metabolic traits varied from the liquid mound (iron and sulfur chemistry), to the steam mound (nitrogen cycle), to the dry mound (perchlorate reduction). The combined results revealed different stages of colonization that reflect differences in the lifetimes of the mounds: primary communities dominated the biosignatures preserved in sinters from the still-active geysers (liquid and steam mounds), in contrast to the surviving metabolisms and microbial communities at the end of the lifetime of the inactive geothermal mound.

    Tenacidad a la fractura de compuestos cermets 3Al2O3*2SiO2/Ag manufacturados por molienda de alta energía

    The fabrication of ceramic-matrix composite materials reinforced with metallic particles has given rise to a class of materials known as cermet composites, whose precursor constituents give them properties distinct from those of conventional materials. This work establishes a fabrication route for 3Al2O3*2SiO2-based cermet composites reinforced with metallic Ag particles, starting from a powder mixture with a chemical composition of 3Al2O3*2SiO2 / 1 wt.% Ag, with the aim of increasing fracture toughness relative to the base ceramic. The powder mixture is subjected to dry high-energy milling in a planetary mill for 2 hours at 200 rpm. The powders are then compacted into cylindrical samples 20 mm in diameter and 3 mm thick by cold uniaxial pressing at 200 MPa. The samples are sintered at 1500°C and 1600°C for one and two hours in an electric resistance furnace under a controlled nitrogen atmosphere. The resulting composites are characterized microstructurally by optical and scanning electron microscopy. Density and the mechanical properties of hardness and fracture toughness are determined, the latter two by the indentation method. The results show the feasibility of fabricating cermet composites, as well as the changes in density, hardness, and fracture toughness relative to the unreinforced 3Al2O3*2SiO2 ceramic.
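    The indentation method mentioned above estimates fracture toughness from the radial cracks emanating from a Vickers indent. The abstract does not state which indentation equation the authors used, so the sketch below assumes the commonly used Anstis et al. (1981) relation and purely illustrative input values.

```python
# Illustrative Vickers indentation hardness and fracture toughness calculation,
# assuming the Anstis et al. (1981) relation K_IC = 0.016 * (E/H)^0.5 * P / c^1.5.
# The equation choice and all numerical inputs are assumptions, not study data.
import math

def vickers_hardness(load_N: float, diagonal_m: float) -> float:
    """Vickers hardness (Pa) from indentation load and mean indent diagonal."""
    return 1.8544 * load_N / diagonal_m**2

def indentation_toughness(load_N: float, E_Pa: float, H_Pa: float, crack_m: float) -> float:
    """Fracture toughness K_IC (Pa*m^0.5) from radial crack length c (center to tip)."""
    return 0.016 * math.sqrt(E_Pa / H_Pa) * load_N / crack_m**1.5

P = 98.1      # 10 kgf indentation load, in newtons (assumed)
d = 120e-6    # mean indent diagonal, in metres (assumed)
c = 180e-6    # radial crack length, in metres (assumed)
E = 220e9     # Young's modulus of the composite, in Pa (assumed)

H = vickers_hardness(P, d)
K_IC = indentation_toughness(P, E, H, c)
print(f"H = {H/1e9:.1f} GPa, K_IC = {K_IC/1e6:.2f} MPa*m^0.5")
```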

    Clustering COVID-19 ARDS patients through the first days of ICU admission. An analysis of the CIBERESUCICOVID Cohort

    Background: Acute respiratory distress syndrome (ARDS) can be classified into sub-phenotypes according to different inflammatory/clinical status. Prognostic enrichment has been achieved by grouping patients into hypoinflammatory or hyperinflammatory sub-phenotypes, even though the time of analysis may change the classification according to treatment response or disease evolution. We aimed to evaluate whether patients can be clustered into more than one group, how cluster membership changes when baseline or day-3 data are used, and the prognosis of patients according to whether or not they changed cluster. Methods: Multicenter, observational, prospective and retrospective study of patients admitted due to ARDS related to COVID-19 infection in Spain. Patients were grouped according to a mixed-type data clustering algorithm (k-prototypes) using readily available continuous and categorical variables at baseline and on day 3. Results: Of 6205 patients, 3743 (60%) were included in the study. According to silhouette analysis, patients were grouped into two clusters. At baseline, 1402 (37%) patients were included in cluster 1 and 2341 (63%) in cluster 2. On day 3, 1557 (42%) patients were included in cluster 1 and 2086 (57%) in cluster 2. The patients included in cluster 2 were older, were more frequently hypertensive, and had a higher prevalence of shock and organ dysfunction, higher inflammatory biomarkers, and worse respiratory indices at both time points. The 90-day mortality was higher in cluster 2 in both clustering analyses (43.8% [n = 1025] versus 27.3% [n = 383] at baseline, and 49% [n = 1023] versus 20.6% [n = 321] on day 3). Four hundred and fifty-eight (33%) patients clustered in the first group at baseline were clustered in the second group on day 3. In contrast, 638 (27%) patients clustered in the second group at baseline were clustered in the first group on day 3. Conclusions: During the first days, patients can be clustered into two groups, and the clustering may change as patients continue to evolve. This means that, although the vast majority of patients remain in the same cluster, a minority of up to 33% of the patients analyzed may be re-categorized into a different cluster based on their progress. Such changes can significantly impact their prognosis.
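    The clustering step described in the methods (k-prototypes over mixed continuous and categorical variables, with silhouette analysis to select the number of clusters) can be sketched as follows. The synthetic variables and the package choices (kmodes and gower) are assumptions for illustration; the abstract does not specify the implementation or the full variable set.

```python
# Minimal sketch of mixed-type clustering with k-prototypes and silhouette-based
# selection of the number of clusters. Synthetic data, not the CIBERESUCICOVID cohort.
import numpy as np
import pandas as pd
from kmodes.kprototypes import KPrototypes
from sklearn.metrics import silhouette_score
import gower

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.normal(65, 12, n),                # continuous
    "pao2_fio2": rng.normal(150, 60, n),         # continuous
    "shock": rng.integers(0, 2, n).astype(str),  # categorical (coded "0"/"1")
})
categorical_idx = [2]  # column positions of categorical variables

# Gower distances handle mixed data types and feed the silhouette analysis.
dist = gower.gower_matrix(df, cat_features=[False, False, True])

for k in (2, 3, 4):
    model = KPrototypes(n_clusters=k, init="Cao", random_state=0)
    labels = model.fit_predict(df, categorical=categorical_idx)
    print(f"k = {k}: mean silhouette = "
          f"{silhouette_score(dist, labels, metric='precomputed'):.3f}")
```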

    The evolution of the ventilatory ratio is a prognostic factor in mechanically ventilated COVID-19 ARDS patients

    Background: Mortality due to COVID-19 is high, especially in patients requiring mechanical ventilation. The purpose of the study was to investigate associations between mortality and variables measured during the first three days of mechanical ventilation in patients with COVID-19 intubated at ICU admission. Methods: This multicenter, observational cohort study included consecutive patients with COVID-19 admitted to 44 Spanish ICUs between February 25 and July 31, 2020, who required intubation at ICU admission and mechanical ventilation for more than three days. We collected demographic and clinical data prior to admission, information about clinical evolution at days 1 and 3 of mechanical ventilation, and outcomes. Results: Of the 2,095 patients with COVID-19 admitted to the ICU, 1,118 (53.3%) were intubated on day 1 and remained under mechanical ventilation on day 3. From day 1 to day 3, PaO2/FiO2 increased from 115.6 [80.0-171.2] to 180.0 [135.4-227.9] mmHg and the ventilatory ratio from 1.73 [1.33-2.25] to 1.96 [1.61-2.40]. In-hospital mortality was 38.7%. A greater increase between ICU admission and day 3 in the ventilatory ratio (OR 1.04 [CI 1.01-1.07], p = 0.030) and creatinine levels (OR 1.05 [CI 1.01-1.09], p = 0.005), and a lower increase in platelet counts (OR 0.96 [CI 0.93-1.00], p = 0.037), were independently associated with a higher risk of death. No association between mortality and the variation in PaO2/FiO2 was observed (OR 0.99 [CI 0.95 to 1.02], p = 0.47). Conclusions: A higher ventilatory ratio and its increase at day 3 are associated with mortality in patients with COVID-19 receiving mechanical ventilation at ICU admission. No such association was found for the variation in PaO2/FiO2.
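    The ventilatory ratio referenced throughout this abstract is commonly defined as measured minute ventilation times measured PaCO2, divided by predicted minute ventilation (predicted body weight × 100 mL/min) times an ideal PaCO2 of 37.5 mmHg. The abstract does not restate the formula, so the sketch below uses this common definition with illustrative, non-study values.

```python
# Sketch of the ventilatory ratio as commonly defined:
# VR = (minute ventilation [mL/min] * PaCO2 [mmHg]) / (PBW [kg] * 100 * 37.5).
# Definition and example values are illustrative assumptions, not study data.

def ventilatory_ratio(minute_ventilation_ml_min: float, paco2_mmHg: float,
                      predicted_body_weight_kg: float) -> float:
    """Ventilatory ratio; values near 1.0 indicate near-normal CO2 clearance efficiency."""
    predicted_ve = predicted_body_weight_kg * 100.0  # expected minute ventilation, mL/min
    return (minute_ventilation_ml_min * paco2_mmHg) / (predicted_ve * 37.5)

# Example: 9 L/min measured minute ventilation, PaCO2 of 55 mmHg, PBW of 70 kg.
vr = ventilatory_ratio(9000.0, 55.0, 70.0)
print(f"ventilatory ratio = {vr:.2f}")  # ≈ 1.89, similar in magnitude to the day-3 values above
```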

    The PREDICT study uncovers three clinical courses of acutely decompensated cirrhosis that have distinct pathophysiology

    Acute decompensation (AD) of cirrhosis is defined as the acute development of ascites, gastrointestinal hemorrhage, hepatic encephalopathy, infection or any combination thereof, requiring hospitalization. The presence of organ failure(s) in patients with AD defines acute-on-chronic liver failure (ACLF). The PREDICT study is a European, prospective, observational study, designed to characterize the clinical course of AD and to identify predictors of ACLF. A total of 1,071 patients with AD were enrolled. We collected detailed pre-specified information on the 3-month period prior to enrollment, and clinical and laboratory data at enrollment. Patients were then closely followed up for 3 months. Outcomes (liver transplantation and death) at 1 year were also recorded. Three groups of patients were identified. Pre-ACLF patients (n = 218) developed ACLF and had 3-month and 1-year mortality rates of 53.7% and 67.4%, respectively. Unstable decompensated cirrhosis (UDC) patients (n = 233) required ≥1 readmission but did not develop ACLF and had mortality rates of 21.0% and 35.6%, respectively. Stable decompensated cirrhosis (SDC) patients (n = 620) were not readmitted, did not develop ACLF and had a 1-year mortality rate of only 9.5%. The 3 groups differed significantly regarding the grade and course of systemic inflammation (high-grade at enrollment with aggravation during follow-up in pre-ACLF; low-grade at enrollment with a subsequent steady course in UDC; and low-grade at enrollment with subsequent improvement in SDC) and the prevalence of surrogates of severe portal hypertension throughout the study (high in UDC vs. low in pre-ACLF and SDC). Acute decompensation without ACLF is a heterogeneous condition with 3 different clinical courses and 2 major pathophysiological mechanisms: systemic inflammation and portal hypertension. Predicting the development of ACLF remains a major future challenge. ClinicalTrials.gov number: NCT03056612. Lay summary: Herein, we describe, for the first time, 3 different clinical courses of acute decompensation (AD) of cirrhosis after hospital admission. The first clinical course includes patients who develop acute-on-chronic liver failure (ACLF) and have a high short-term risk of death - termed pre-ACLF. The second clinical course (unstable decompensated cirrhosis) includes patients requiring frequent hospitalizations unrelated to ACLF and is associated with a lower mortality risk than pre-ACLF. Finally, the third clinical course (stable decompensated cirrhosis) includes two-thirds of all patients admitted to hospital with AD - patients in this group rarely require hospital readmission and have a much lower 1-year mortality risk.

    Enabling planetary science across light-years. Ariel Definition Study Report

    Ariel, the Atmospheric Remote-sensing Infrared Exoplanet Large-survey, was adopted as the fourth medium-class mission in ESA's Cosmic Vision programme, to be launched in 2029. During its 4-year mission, Ariel will study what exoplanets are made of, how they formed, and how they evolve, by surveying a diverse sample of about 1000 extrasolar planets simultaneously in visible and infrared wavelengths. It is the first mission dedicated to measuring the chemical composition and thermal structures of hundreds of transiting exoplanets, enabling planetary science far beyond the boundaries of the Solar System. The payload consists of an off-axis Cassegrain telescope (primary mirror: a 1100 mm x 730 mm ellipse) and two separate instruments (FGS and AIRS) that together cover the 0.5-7.8 micron spectral range simultaneously. The satellite is best placed into an L2 orbit to maximise thermal stability and the field of regard. The payload module is passively cooled via a series of V-Groove radiators; the AIRS detectors are the only items that require active cooling, provided by a Ne Joule-Thomson cooler. The Ariel payload is developed by a consortium of more than 50 institutes from 16 ESA countries (the UK, France, Italy, Belgium, Poland, Spain, Austria, Denmark, Ireland, Portugal, the Czech Republic, Hungary, the Netherlands, Sweden, Norway, and Estonia), with a contribution from NASA.

    Measuring performance on the Healthcare Access and Quality Index for 195 countries and territories and selected subnational locations: A systematic analysis from the Global Burden of Disease Study 2016

    Background: A key component of achieving universal health coverage is ensuring that all populations have access to quality health care. Examining where gains have occurred or progress has faltered across and within countries is crucial to guiding decisions and strategies for future improvement. We used the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016) to assess personal health-care access and quality with the Healthcare Access and Quality (HAQ) Index for 195 countries and territories, as well as subnational locations in seven countries, from 1990 to 2016. Methods: Drawing from established methods and updated estimates from GBD 2016, we used 32 causes from which death should not occur in the presence of effective care to approximate personal health-care access and quality by location and over time. To better isolate potential effects of personal health-care access and quality from underlying risk factor patterns, we risk-standardised cause-specific deaths due to non-cancers by location-year, replacing the local joint exposure of environmental and behavioural risks with the global level of exposure. Supported by the expansion of cancer registry data in GBD 2016, we used mortality-to-incidence ratios for cancers instead of risk-standardised death rates to provide a stronger signal of the effects of personal health care and access on cancer survival. We transformed each cause to a scale of 0-100, with 0 as the first percentile (worst) observed between 1990 and 2016, and 100 as the 99th percentile (best); we set these thresholds at the country level, and then applied them to subnational locations. We applied a principal components analysis to construct the HAQ Index using all scaled cause values, providing an overall score of 0-100 of personal health-care access and quality by location over time. We then compared HAQ Index levels and trends by quintiles on the Socio-demographic Index (SDI), a summary measure of overall development. As derived from the broader GBD study and other data sources, we examined relationships between national HAQ Index scores and potential correlates of performance, such as total health spending per capita. Findings: In 2016, HAQ Index performance spanned from a high of 97·1 (95% UI 95·8-98·1) in Iceland, followed by 96·6 (94·9-97·9) in Norway and 96·1 (94·5-97·3) in the Netherlands, to values as low as 18·6 (13·1-24·4) in the Central African Republic, 19·0 (14·3-23·7) in Somalia, and 23·4 (20·2-26·8) in Guinea-Bissau. The pace of progress achieved between 1990 and 2016 varied, with markedly faster improvements occurring between 2000 and 2016 for many countries in sub-Saharan Africa and southeast Asia, whereas several countries in Latin America and elsewhere saw progress stagnate after experiencing considerable advances in the HAQ Index between 1990 and 2000. Striking subnational disparities emerged in personal health-care access and quality, with China and India having particularly large gaps between locations with the highest and lowest scores in 2016. In China, performance ranged from 91·5 (89·1-93·6) in Beijing to 48·0 (43·4-53·2) in Tibet (a 43·5-point difference), while India saw a 30·8-point disparity, from 64·8 (59·6-68·8) in Goa to 34·0 (30·3-38·1) in Assam. Japan recorded the smallest range in subnational HAQ performance in 2016 (a 4·8-point difference), whereas differences between subnational locations with the highest and lowest HAQ Index values were more than two times as high for the USA and three times as high for England. State-level gaps in the HAQ Index in Mexico somewhat narrowed from 1990 to 2016 (from a 20·9-point to a 17·0-point difference), whereas in Brazil, disparities slightly increased across states during this time (from a 17·2-point to a 20·4-point difference). Performance on the HAQ Index showed strong linkages to overall development, with high and high-middle SDI countries generally having higher scores and faster gains for non-communicable diseases. Nonetheless, countries across the development spectrum saw substantial gains in some key health service areas from 2000 to 2016, most notably vaccine-preventable diseases. Overall, national performance on the HAQ Index was positively associated with higher levels of total health spending per capita, as well as health systems inputs, but these relationships were quite heterogeneous, particularly among low-to-middle SDI countries. Interpretation: GBD 2016 provides a more detailed understanding of past success and current challenges in improving personal health-care access and quality worldwide. Despite substantial gains since 2000, many low-SDI and middle-SDI countries face considerable challenges unless heightened policy action and investments focus on advancing access to and quality of health care across key health services, especially non-communicable diseases. Stagnating or minimal improvements experienced by several low-middle to high-middle SDI countries could reflect the complexities of re-orienting both primary and secondary health-care services beyond the more limited foci of the Millennium Development Goals. Alongside initiatives to strengthen public health programmes, the pursuit of universal health coverage hinges upon improving both access and quality worldwide, and thus requires adopting a more comprehensive view - and subsequent provision - of quality health care for all populations.
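    Two of the methods steps above, rescaling each cause to a 0-100 scale between the 1st and 99th percentiles observed across location-years and aggregating the scaled causes with a principal components analysis, can be sketched as follows. The synthetic data and the simplified aggregation are illustrative assumptions, not the GBD 2016 implementation.

```python
# Minimal sketch of the 0-100 rescaling and PCA-based aggregation described in the
# HAQ Index methods. Synthetic data; not the GBD implementation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Rows: location-years; columns: 32 causes amenable to health care (lower = better,
# e.g. risk-standardised death rates or cancer mortality-to-incidence ratios).
rates = rng.lognormal(mean=0.0, sigma=0.5, size=(600, 32))

# Scale each cause so the 1st percentile (worst, highest rate) maps to 0 and the
# 99th percentile (best, lowest rate) maps to 100.
worst, best = np.percentile(rates, [99, 1], axis=0)
scaled = np.clip((worst - rates) / (worst - best), 0, 1) * 100

# Use the first principal component of the scaled causes as an overall index,
# rescaled to 0-100 for display (a simplified stand-in for the published weighting).
component = PCA(n_components=1).fit_transform(scaled).ravel()
haq_like = 100 * (component - component.min()) / (component.max() - component.min())
print(haq_like[:5])
```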
