    Ultrahigh-Field MRI in Human Ischemic Stroke – a 7 Tesla Study

    INTRODUCTION: Magnetic resonance imaging (MRI) using field strengths up to 3 Tesla (T) has proven to be a powerful tool for stroke diagnosis. Recently, ultrahigh-field (UHF) MRI at 7 T has shown relevant diagnostic benefits in imaging of neurological diseases, but its value for stroke imaging has not yet been investigated. We present the first evaluation of a clinically feasible stroke imaging protocol at 7 T. For comparison, an established stroke imaging protocol was applied at 3 T. METHODS: In a prospective imaging study, seven patients with subacute and chronic stroke were included. Imaging at 3 T was immediately followed by 7 T imaging. Both protocols included T1-weighted 3D Magnetization-Prepared Rapid-Acquired Gradient-Echo (3D-MPRAGE), T2-weighted 2D Fluid Attenuated Inversion Recovery (2D-FLAIR), T2-weighted 2D Turbo Spin Echo (2D-T2-TSE), T2*-weighted 2D Fast Low Angle Shot Gradient Echo (2D-HemoFLASH) and 3D Time-of-Flight angiography (3D-TOF). RESULTS: The diagnostic information relevant for clinical stroke imaging obtained at 3 T was equally available at 7 T. The higher spatial resolution at 7 T revealed more anatomical detail, precisely depicting ischemic lesions and peri-infarct alterations. A clear benefit in anatomical resolution was also demonstrated for vessel imaging at 7 T. RF power deposition constraints led to longer scan times and reduced brain coverage for 2D-FLAIR, 2D-T2-TSE and 3D-TOF at 7 T versus 3 T. CONCLUSIONS: The potential of 7 T MRI for human stroke imaging is shown. Our pilot study encourages further evaluation of the diagnostic benefit of stroke imaging at 7 T in a larger study.

    Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2

    Sensory systems adapt their neural code to changes in the sensory environment, often on multiple time scales. Here, we report a new form of adaptation in a first-order auditory interneuron (AN2) of crickets. We characterize the response of the AN2 neuron to amplitude-modulated sound stimuli and find that adaptation shifts the stimulus–response curves toward higher stimulus intensities, with a time constant of 1.5 s for adaptation and recovery. The spike responses were thus reduced for low-intensity sounds. We then ask whether adaptation improves the representation of the signal and compare the experimental results with the predictions of two competing hypotheses: infomax, which predicts that information conveyed about the entire signal range should be maximized, and selective coding, which predicts that “foreground” signals should be enhanced while “background” signals are selectively suppressed. We test how adaptation changes the input–response curve when presenting signals with two or three peaks in their amplitude distributions, for which selective coding and infomax predict conflicting changes. By means of Bayesian data analysis, we quantify the shifts of the measured response curves and also find a slight reduction of their slopes. These decreases in slope are smaller, and the absolute response thresholds higher, than predicted by infomax. Most remarkably, and in contrast to the infomax principle, adaptation actually reduces the amount of encoded information when the whole range of input signals is considered. The response curve changes are also not consistent with the selective coding hypothesis, because the amount of information conveyed about the loudest part of the signal does not increase as predicted but remains nearly constant, and less information is transmitted about signals of lower intensity.
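
    The shift described here can be pictured with a minimal phenomenological sketch (not the authors' model): a logistic spike-rate-versus-intensity curve whose midpoint drifts toward the recent stimulus intensity with the 1.5 s time constant quoted above. All parameter values and names are illustrative assumptions.

```python
# Illustrative only: a logistic intensity-response curve whose midpoint adapts
# toward the recent stimulus intensity with a single time constant.
import numpy as np

TAU_S = 1.5        # adaptation/recovery time constant reported in the abstract (s)
DT_S = 0.01        # simulation step (s)
R_MAX = 200.0      # assumed maximum spike rate (spikes/s), illustrative
SLOPE_DB = 3.0     # assumed curve slope parameter (dB), illustrative

def response(intensity_db, midpoint_db):
    """Logistic stimulus-response curve: spike rate as a function of intensity."""
    return R_MAX / (1.0 + np.exp(-(intensity_db - midpoint_db) / SLOPE_DB))

def simulate(stimulus_db, midpoint0_db=50.0):
    """Let the curve midpoint relax toward the ongoing stimulus intensity."""
    midpoint = midpoint0_db
    rates = []
    for s in stimulus_db:
        midpoint += (DT_S / TAU_S) * (s - midpoint)   # first-order adaptation
        rates.append(response(s, midpoint))
    return np.array(rates)

# A loud "foreground" epoch shifts the curve rightward, so subsequent
# low-intensity sounds evoke fewer spikes, as described in the abstract.
stim = np.concatenate([np.full(300, 45.0), np.full(300, 70.0), np.full(300, 45.0)])
rates = simulate(stim)
print(rates[:5].round(1), rates[-5:].round(1))
```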

    Impact of primary kidney disease on the effects of empagliflozin in patients with chronic kidney disease: secondary analyses of the EMPA-KIDNEY trial

    Background: The EMPA-KIDNEY trial showed that empagliflozin reduced the risk of the primary composite outcome of kidney disease progression or cardiovascular death in patients with chronic kidney disease, mainly through slowing progression. We aimed to assess how the effects of empagliflozin might differ by primary kidney disease across its broad population. Methods: EMPA-KIDNEY, a randomised, controlled, phase 3 trial, was conducted at 241 centres in eight countries (Canada, China, Germany, Italy, Japan, Malaysia, the UK, and the USA). Patients were eligible if their estimated glomerular filtration rate (eGFR) was 20 to less than 45 mL/min per 1·73 m2, or 45 to less than 90 mL/min per 1·73 m2 with a urinary albumin-to-creatinine ratio (uACR) of 200 mg/g or higher at screening. They were randomly assigned (1:1) to 10 mg oral empagliflozin once daily or matching placebo. Effects on kidney disease progression (defined as a sustained ≥40% eGFR decline from randomisation, end-stage kidney disease, a sustained eGFR below 10 mL/min per 1·73 m2, or death from kidney failure) were assessed using prespecified Cox models, and eGFR slope analyses used shared parameter models. Subgroup comparisons were performed by including relevant interaction terms in models. EMPA-KIDNEY is registered with ClinicalTrials.gov, NCT03594110. Findings: Between May 15, 2019, and April 16, 2021, 6609 participants were randomly assigned and followed up for a median of 2·0 years (IQR 1·5–2·4). Prespecified subgroupings by primary kidney disease included 2057 (31·1%) participants with diabetic kidney disease, 1669 (25·3%) with glomerular disease, 1445 (21·9%) with hypertensive or renovascular disease, and 1438 (21·8%) with other or unknown causes. Kidney disease progression occurred in 384 (11·6%) of 3304 patients in the empagliflozin group and 504 (15·2%) of 3305 patients in the placebo group (hazard ratio 0·71 [95% CI 0·62–0·81]), with no evidence that the relative effect size varied significantly by primary kidney disease (p for heterogeneity=0·62). The between-group difference in chronic eGFR slopes (ie, from 2 months to final follow-up) was 1·37 mL/min per 1·73 m2 per year (95% CI 1·16–1·59), representing a 50% (42–58) reduction in the rate of chronic eGFR decline. This relative effect of empagliflozin on chronic eGFR slope was similar in analyses by different primary kidney diseases, including in explorations by type of glomerular disease and diabetes (p values for heterogeneity all >0·1). Interpretation: In a broad range of patients with chronic kidney disease at risk of progression, including a wide range of non-diabetic causes of chronic kidney disease, empagliflozin reduced the risk of kidney disease progression. Relative effect sizes were broadly similar irrespective of the cause of primary kidney disease, suggesting that SGLT2 inhibitors should be part of a standard of care to minimise the risk of kidney failure in chronic kidney disease. Funding: Boehringer Ingelheim, Eli Lilly, and UK Medical Research Council.
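
    Not from the trial report, but a quick arithmetic check of how the two slope figures quoted above relate: if the 1·37 mL/min per 1·73 m2 per year between-group difference corresponds to a 50% relative reduction in chronic eGFR decline, the implied placebo-arm decline is about 2·74 mL/min per 1·73 m2 per year. A short sketch, with the placebo slope back-calculated rather than taken from the abstract:

```python
# Back-of-envelope check relating the two slope figures quoted in the abstract.
slope_difference = 1.37      # mL/min per 1.73 m2 per year (between-group difference)
relative_reduction = 0.50    # 50% reduction in the rate of chronic eGFR decline

# Implied placebo-arm rate of decline (not stated directly in the abstract above):
placebo_decline = slope_difference / relative_reduction        # ~2.74 mL/min/1.73 m2/yr
empagliflozin_decline = placebo_decline - slope_difference     # ~1.37 mL/min/1.73 m2/yr

print(f"implied placebo decline:       {placebo_decline:.2f} mL/min/1.73 m2/yr")
print(f"implied empagliflozin decline: {empagliflozin_decline:.2f} mL/min/1.73 m2/yr")
print(f"relative reduction: {(placebo_decline - empagliflozin_decline) / placebo_decline:.0%}")
```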

    BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

    Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License
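
    The released checkpoints can be loaded with the Hugging Face transformers library; the sketch below uses the small bigscience/bloom-560m variant so it runs on modest hardware (the full 176B model needs multi-GPU or offloading setups). Treat the generation settings as illustrative defaults rather than the paper's evaluation configuration.

```python
# Minimal example of loading a released BLOOM checkpoint for text generation.
# Uses the 560M-parameter variant; the full 176B model requires far more memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "BLOOM is a multilingual language model that"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding kept short; settings are illustrative, not from the paper.
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```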

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background: GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives: This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods: During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results: Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions: The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
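
    For readers unfamiliar with the risk score used above, a small helper implementing the standard CHA2DS2-VASc point assignments (congestive heart failure 1, hypertension 1, age ≥75 years 2, diabetes 1, prior stroke/TIA 2, vascular disease 1, age 65 to 74 years 1, female sex 1). This reflects the published scoring convention, not the registry's own analysis code.

```python
# Standard CHA2DS2-VASc scoring (published convention); not GLORIA-AF analysis code.
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_or_tia, vascular_disease):
    score = 0
    score += 1 if chf else 0                                  # C: congestive heart failure
    score += 1 if hypertension else 0                         # H: hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)      # A2 / A: age bands
    score += 1 if diabetes else 0                             # D: diabetes mellitus
    score += 2 if prior_stroke_or_tia else 0                  # S2: prior stroke/TIA
    score += 1 if vascular_disease else 0                     # V: vascular disease
    score += 1 if female else 0                               # Sc: sex category (female)
    return score

# Example: a 72-year-old woman with hypertension scores 3 (high stroke risk, >=2).
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=False, prior_stroke_or_tia=False, vascular_disease=False))
```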

    Measuring performance on the Healthcare Access and Quality Index for 195 countries and territories and selected subnational locations: A systematic analysis from the Global Burden of Disease Study 2016

    Background: A key component of achieving universal health coverage is ensuring that all populations have access to quality health care. Examining where gains have occurred or progress has faltered across and within countries is crucial to guiding decisions and strategies for future improvement. We used the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016) to assess personal health-care access and quality with the Healthcare Access and Quality (HAQ) Index for 195 countries and territories, as well as subnational locations in seven countries, from 1990 to 2016. Methods: Drawing from established methods and updated estimates from GBD 2016, we used 32 causes from which death should not occur in the presence of effective care to approximate personal health-care access and quality by location and over time. To better isolate potential effects of personal health-care access and quality from underlying risk factor patterns, we risk-standardised cause-specific deaths due to non-cancers by location-year, replacing the local joint exposure of environmental and behavioural risks with the global level of exposure. Supported by the expansion of cancer registry data in GBD 2016, we used mortality-to-incidence ratios for cancers instead of risk-standardised death rates to provide a stronger signal of the effects of personal health care and access on cancer survival. We transformed each cause to a scale of 0-100, with 0 as the first percentile (worst) observed between 1990 and 2016, and 100 as the 99th percentile (best); we set these thresholds at the country level, and then applied them to subnational locations. We applied a principal components analysis to construct the HAQ Index using all scaled cause values, providing an overall score of 0-100 of personal health-care access and quality by location over time. We then compared HAQ Index levels and trends by quintiles on the Socio-demographic Index (SDI), a summary measure of overall development. As derived from the broader GBD study and other data sources, we examined relationships between national HAQ Index scores and potential correlates of performance, such as total health spending per capita. Findings: In 2016, HAQ Index performance spanned from a high of 97·1 (95% UI 95·8-98·1) in Iceland, followed by 96·6 (94·9-97·9) in Norway and 96·1 (94·5-97·3) in the Netherlands, to values as low as 18·6 (13·1-24·4) in the Central African Republic, 19·0 (14·3-23·7) in Somalia, and 23·4 (20·2-26·8) in Guinea-Bissau. The pace of progress achieved between 1990 and 2016 varied, with markedly faster improvements occurring between 2000 and 2016 for many countries in sub-Saharan Africa and southeast Asia, whereas several countries in Latin America and elsewhere saw progress stagnate after experiencing considerable advances in the HAQ Index between 1990 and 2000. Striking subnational disparities emerged in personal health-care access and quality, with China and India having particularly large gaps between locations with the highest and lowest scores in 2016. In China, performance ranged from 91·5 (89·1-93·6) in Beijing to 48·0 (43·4-53·2) in Tibet (a 43·5-point difference), while India saw a 30·8-point disparity, from 64·8 (59·6-68·8) in Goa to 34·0 (30·3-38·1) in Assam. Japan recorded the smallest range in subnational HAQ performance in 2016 (a 4·8-point difference), whereas differences between subnational locations with the highest and lowest HAQ Index values were more than two times as high for the USA and three times as high for England. State-level gaps in the HAQ Index in Mexico somewhat narrowed from 1990 to 2016 (from a 20·9-point to 17·0-point difference), whereas in Brazil, disparities slightly increased across states during this time (a 17·2-point to 20·4-point difference). Performance on the HAQ Index showed strong linkages to overall development, with high and high-middle SDI countries generally having higher scores and faster gains for non-communicable diseases. Nonetheless, countries across the development spectrum saw substantial gains in some key health service areas from 2000 to 2016, most notably vaccine-preventable diseases. Overall, national performance on the HAQ Index was positively associated with higher levels of total health spending per capita, as well as health systems inputs, but these relationships were quite heterogeneous, particularly among low-to-middle SDI countries. Interpretation: GBD 2016 provides a more detailed understanding of past success and current challenges in improving personal health-care access and quality worldwide. Despite substantial gains since 2000, many low-SDI and middle-SDI countries face considerable challenges unless heightened policy action and investments focus on advancing access to and quality of health care across key health services, especially non-communicable diseases. Stagnating or minimal improvements experienced by several low-middle to high-middle SDI countries could reflect the complexities of re-orienting both primary and secondary health-care services beyond the more limited foci of the Millennium Development Goals. Alongside initiatives to strengthen public health programmes, the pursuit of universal health coverage hinges upon improving both access and quality worldwide, and thus requires adopting a more comprehensive view, and subsequent provision, of quality health care for all populations.
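
    A toy illustration, not the GBD code, of the two scaling steps described above: each cause-specific value is rescaled to 0-100 against the 1st and 99th percentiles observed across location-years, and the scaled causes are combined through the first principal component, itself rescaled to 0-100. The full method additionally involves risk-standardisation and mortality-to-incidence ratios, which are omitted here; all data below are simulated.

```python
# Toy sketch of the HAQ Index construction steps described above (not GBD code):
# percentile-based 0-100 scaling per cause, then a PCA-based composite score.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical data: rows = location-years, columns = 32 amenable-mortality causes
# (lower values = better outcomes), purely simulated for illustration.
deaths = rng.gamma(shape=2.0, scale=10.0, size=(500, 32))

# Step 1: scale each cause to 0-100 using the 1st (worst) and 99th (best) percentiles.
worst = np.percentile(deaths, 99, axis=0)   # highest death rates observed
best = np.percentile(deaths, 1, axis=0)     # lowest death rates observed
scaled = np.clip(100.0 * (worst - deaths) / (worst - best), 0.0, 100.0)

# Step 2: combine the scaled causes via the first principal component,
# then rescale that component to a 0-100 index. (The sign of the component is
# arbitrary in this toy; the real index is oriented so higher = better.)
pc1 = PCA(n_components=1).fit_transform(scaled).ravel()
haq_like = 100.0 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())

print(haq_like[:5].round(1))
```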

    Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study

    Background: Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally. Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income countries globally, and identified factors associated with mortality. Methods: We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis, exomphalos, anorectal malformation, and Hirschsprung’s disease. Consecutive patients were recruited for a minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause, in-hospital mortality for all conditions combined and each condition individually, stratified by country income status. We did a complete case analysis. Findings: We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal malformation, and 517 with Hirschsprung’s disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male. Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3). Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups). Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries; p≤0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11], p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20 [1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention (ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed (ventilation 1·96 [1·41–2·71], p=0·0001; parenteral nutrition 1·35 [1·05–1·74], p=0·018). Administration of parenteral nutrition (0·61 [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65 [0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality. Interpretation: Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger than 5 years by 2030.
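
    As a purely illustrative companion to the crude mortality figures quoted above, the sketch below computes an unadjusted risk ratio with a Wald-type confidence interval from the raw counts; the risk ratios reported in the abstract come from adjusted models, so the crude values will differ.

```python
# Crude (unadjusted) risk ratio with a Wald log-scale 95% CI from raw counts.
# Counts are taken from the abstract; the paper's reported ratios are adjusted.
import math

def risk_ratio(events_a, total_a, events_b, total_b):
    risk_a = events_a / total_a
    risk_b = events_b / total_b
    rr = risk_a / risk_b
    # Standard error of log(RR) for a Wald-type interval.
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Low-income (37 deaths / 93 patients) vs high-income (50 deaths / 896 patients).
rr, lo, hi = risk_ratio(37, 93, 50, 896)
print(f"crude RR low- vs high-income: {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```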

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION: Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low-middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. In phase 4, the top three shortlisted interventions for low-middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low-middle-income countries.