Quantemol Electron Collisions (QEC): An Enhanced Expert System for Performing Electron Molecule Collision Calculations Using the R-Matrix Method
Collisions of low-energy electrons with molecules are important for understanding many aspects of the environment and of technology. Understanding the processes that occur in these collisions can give insights into plasma etching, edge effects in fusion plasmas, radiation damage to biological tissue, and more. A radical update of the previous expert system for computing observables relevant to these processes, Quantemol-N, is presented. The new Quantemol Electron Collisions (QEC) expert system simplifies the user experience, improves reliability, and implements new features. The QEC graphical user interface (GUI) interfaces with the Molpro quantum chemistry package for molecular target setup and with the sophisticated UKRmol+ codes to generate accurate and reliable cross-sections. These include elastic cross-sections, superelastic cross-sections between excited states, electron-impact dissociation, scattering reaction rates, dissociative electron attachment, differential cross-sections, momentum-transfer cross-sections, ionization cross-sections, and high-energy electron scattering cross-sections. With this new interface we will be implementing dissociative recombination estimations, vibrational excitations for neutrals and ions, and effective core potentials in the near future.
Scanning electrochemical microscopy as a local probe of oxygen permeability in cartilage
The use of scanning electrochemical microscopy, a high-resolution chemical imaging technique, to probe the distribution and mobility of solutes in articular cartilage is described. In this application, a mobile ultramicroelectrode is positioned close (∼1 μm) to the surface of a cartilage sample that has been equilibrated in a bathing solution containing the solute of interest. The solute is electrolyzed at a diffusion-limited rate, and the current response is measured as the ultramicroelectrode is scanned across the sample surface. The topography of the samples was determined using Ru(CN)₆⁴⁻, a solute to which the cartilage matrix was impermeable. This revealed a number of pit-like depressions corresponding to the distribution of chondrocytes, which were also observed by atomic force and light microscopy. Subsequent imaging of the same area of the cartilage sample for the diffusion-limited reduction of oxygen indicated enhanced, but heterogeneous, permeability of oxygen across the cartilage surface. In particular, areas of high permeability were observed in the cellular and pericellular regions. This is the first time that inhomogeneities in the permeability of cartilage toward simple solutes, such as oxygen, have been observed on a micrometer scale.
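For orientation, the magnitude of the diffusion-limited current in such measurements is commonly related to transport by the standard steady-state expression for a disk-shaped ultramicroelectrode (the disk geometry is an assumption here, not stated in the abstract):

\[ i_{\mathrm{ss}} = 4\, n F D c^{*} a \]

where n is the number of electrons transferred, F is the Faraday constant, D is the diffusion coefficient of the solute, c* is its bulk concentration, and a is the electrode radius. Local changes in the accessible flux of the solute, for example across a surface of varying permeability, therefore register directly as changes in the measured current.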
Efficacious Intermittent Dosing of a Novel JAK2 Inhibitor in Mouse Models of Polycythemia Vera
A high percentage of patients with the myeloproliferative disorder polycythemia vera (PV) harbor a Val617→Phe activating mutation in the Janus kinase 2 (JAK2) gene, and both cell culture and mouse models have established a functional role for this mutation in the development of the disease. We describe the properties of MRLB-11055, a highly potent inhibitor of both the wild-type (WT) and V617F forms of JAK2, which has therapeutic efficacy in erythropoietin (EPO)-driven and JAK2V617F-driven mouse models of PV. In cultured cells, MRLB-11055 blocked proliferation and induced apoptosis in a manner consistent with JAK2 pathway inhibition. MRLB-11055 effectively prevented EPO-induced STAT5 activation in the peripheral blood of acutely dosed mice and prevented EPO-induced splenomegaly and erythrocytosis in chronically dosed mice. In a bone marrow-reconstituted JAK2V617F-luciferase murine PV model, MRLB-11055 rapidly reduced the burden of JAK2V617F-expressing cells in both the spleen and the bone marrow. Using real-time in vivo imaging, we examined the kinetics of disease regression and resurgence, enabling the development of an intermittent dosing schedule that achieved significant reductions in both erythroid and myeloid populations with minimal impact on lymphoid cells. Our studies provide a rationale for the use of non-continuous treatment to provide optimal therapy for PV patients.
Morbidity and mortality after anaesthesia in early life: results of the European prospective multicentre observational study, neonate and children audit of anaesthesia practice in Europe (NECTARINE)
BACKGROUND: Neonates and infants requiring anaesthesia are at risk of physiological instability and complications, but triggers for peri-anaesthetic interventions and associations with subsequent outcome are unknown. METHODS: This prospective, observational study recruited patients up to 60 weeks' postmenstrual age undergoing anaesthesia for surgical or diagnostic procedures from 165 centres in 31 European countries between March 2016 and January 2017. The primary aim was to identify thresholds of pre-determined physiological variables that triggered a medical intervention. The secondary aims were to evaluate morbidity and mortality at 30 and 90 days and their associations with critical events. RESULTS: Infants (n=5609) born at a mean (standard deviation [SD]) of 36.2 (4.4) weeks postmenstrual age (35.7% preterm) underwent 6542 procedures within 63 (48) days of birth. Critical event(s) requiring intervention occurred in 35.2% of cases, mainly hypotension (>30% decrease in blood pressure) or reduced oxygenation (SpO2 <85%). Postmenstrual age influenced the incidence of and thresholds for intervention. Risk of critical events was increased by prior neonatal medical conditions, congenital anomalies, or both (relative risk [RR]=1.16; 95% confidence interval [CI], 1.04–1.28) and in those requiring preoperative intensive support (RR=1.27; 95% CI, 1.15–1.41). Additional complications occurred in 16.3% of patients by 30 days, and overall 90-day mortality was 3.2% (95% CI, 2.7–3.7%). Co-occurrence of intraoperative hypotension, hypoxaemia, and anaemia was associated with increased risk of morbidity (RR=3.56; 95% CI, 1.64–7.71) and mortality (RR=19.80; 95% CI, 5.87–66.7). CONCLUSIONS: Variability in the physiological thresholds that triggered an intervention, and the impact of poor tissue oxygenation on patient outcomes, highlight the need for more standardised perioperative management guidelines for neonates and infants.
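For readers parsing the effect sizes above, the relative risk is simply the ratio of event rates between the compared groups:

\[ \mathrm{RR} = \frac{\text{incidence in exposed group}}{\text{incidence in unexposed group}} \]

so, for example, RR=1.27 for infants requiring preoperative intensive support means their risk of a critical event was 1.27 times (27% higher than) that of infants without such support.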
Safety of pulsed field ablation in more than 17,000 patients with atrial fibrillation in the MANIFEST-17K study
Pulsed field ablation (PFA) is an emerging technology for the treatment of atrial fibrillation (AF), for which pre-clinical and early-stage clinical data are suggestive of some degree of preferentiality to myocardial tissue ablation without damage to adjacent structures. Here in the MANIFEST-17K study we assessed the safety of PFA by studying the post-approval use of this treatment modality. Of the 116 centers performing post-approval PFA with a pentaspline catheter, data were received from 106 centers (91.4% participation) regarding 17,642 patients undergoing PFA (mean age 64, 34.7% female, 57.8% paroxysmal AF and 35.2% persistent AF). No esophageal complications, pulmonary vein stenosis, or persistent phrenic nerve palsy were reported (transient palsy was reported in 0.06% of patients; 11 of 17,642). Major complications, reported for ~1% of patients (173 of 17,642), were pericardial tamponade (0.36%; 63 of 17,642) and vascular events (0.30%; 53 of 17,642). Stroke was rare (0.12%; 22 of 17,642) and death was even rarer (0.03%; 5 of 17,642). Unexpected complications of PFA were coronary arterial spasm in 0.14% of patients (25 of 17,642) and hemolysis-related acute renal failure necessitating hemodialysis in 0.03% of patients (5 of 17,642). Taken together, these data indicate that PFA demonstrates a favorable safety profile by avoiding much of the collateral damage seen with conventional thermal ablation. PFA has the potential to be transformative for the management of patients with AF.
Late Presentation With HIV in Africa: Phenotypes, Risk, and Risk Stratification in the REALITY Trial.
Background: Severely immunocompromised human immunodeficiency virus (HIV)-infected individuals have high mortality shortly after starting antiretroviral therapy (ART). We investigated predictors of early mortality and "late presenter" phenotypes. Methods: The Reduction of EArly MortaLITY (REALITY) trial enrolled ART-naive adults and children ≥5 years of age with CD4 counts <100 cells/µL. Results: Among 1711 included participants, 203 (12%) died. Mortality was independently higher with older age; lower CD4 count, albumin, hemoglobin, and grip strength; presence of World Health Organization stage 3/4 weight loss, fever, or vomiting; and problems with mobility or self-care at baseline (all P < .04). Receiving enhanced antimicrobial prophylaxis independently reduced mortality (P = .02). Of five late-presenter phenotypes, Group 1 (n = 355) had the highest mortality (25%; median CD4 count, 28 cells/µL), with high symptom burden, weight loss, poor mobility, and low albumin and hemoglobin. Group 2 (n = 394; 11% mortality; 43 cells/µL) also had weight loss, with high white cell, platelet, and neutrophil counts suggesting underlying inflammation/infection. Group 3 (n = 218; 10% mortality) had low CD4 counts (27 cells/µL) but low symptom burden and maintained fat mass. The remaining groups had 4%–6% mortality. Conclusions: Clinical and laboratory features identified groups with the highest mortality following ART initiation. A screening tool could identify patients with low CD4 counts for prioritizing same-day ART initiation, enhanced prophylaxis, and intensive follow-up. Clinical Trials Registration: ISRCTN43622374. REALITY was funded by the Joint Global Health Trials Scheme (JGHTS) of the UK Department for International Development, the Wellcome Trust, and the Medical Research Council (MRC) (grant number G1100693). Additional funding support was provided by the PENTA Foundation and by core support to the MRC Clinical Trials Unit at University College London (grant numbers MC_UU_12023/23 and MC_UU_12023/26). Cipla Ltd, Gilead Sciences, ViiV Healthcare/GlaxoSmithKline, and Merck Sharp & Dohme donated drugs for REALITY, and ready-to-use supplementary food was purchased from Valid International. A. J. P. is funded by the Wellcome Trust (grant number 108065/Z/15/Z). J. A. B. is funded by the JGHTS (grant number MR/M007367/1). The Malawi-Liverpool-Wellcome Trust Clinical Research Programme, University of Malawi College of Medicine (grant number 101113/Z/13/Z) and the Kenya Medical Research Institute (KEMRI)/Wellcome Trust Research Programme, Kilifi (grant number 203077/Z/16/Z) are supported by strategic awards from the Wellcome Trust, United Kingdom. Permission to publish was granted by the Director of KEMRI. This supplement was supported by funds from the Bill & Melinda Gates Foundation.
High-Value Token-Blocking: Efficient Blocking Method for Record Linkage
Data integration is an important component of Big Data analytics. One of the key challenges in data integration is record linkage, that is, matching records that represent the same real-world entity. Because of computational costs, methods referred to as blocking are employed as part of the record linkage pipeline in order to reduce the number of comparisons among records. In the past decade, a range of blocking techniques have been proposed. Real-world applications require approaches that can handle heterogeneous data sources and do not rely on labelled data. We propose high-value token-blocking (HVTB), a simple and efficient approach for blocking that is unsupervised and schema-agnostic, based on a crafted use of Term Frequency-Inverse Document Frequency (TF-IDF). We compare HVTB with multiple methods over a range of datasets, including a novel unstructured dataset composed of titles and abstracts of scientific papers. We thoroughly discuss the results in terms of accuracy, use of computational resources, and different characteristics of the datasets and records. The simplicity of HVTB yields fast computation and does not harm its accuracy when compared with existing approaches. It is shown to be significantly superior to other methods, suggesting that simpler methods for blocking should be considered before resorting to more sophisticated methods.
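The abstract does not spell out HVTB's exact scoring or selection rule, but the core idea of keying records by their highest-scoring TF-IDF ("high-value") tokens can be sketched as follows; the whitespace tokenizer, the top_k cutoff, and the pruning of singleton blocks are illustrative assumptions, not details taken from the paper.

    import math
    from collections import defaultdict

    def high_value_token_blocks(records, top_k=3):
        """Sketch of TF-IDF token blocking (illustrative, not the paper's code).

        records: list of strings (e.g. concatenated attribute values).
        Each record contributes its top_k highest-scoring TF-IDF tokens as
        blocking keys; records sharing a key land in the same block.
        """
        docs = [r.lower().split() for r in records]  # naive whitespace tokenizer
        n = len(docs)
        df = defaultdict(int)                        # document frequency per token
        for tokens in docs:
            for t in set(tokens):
                df[t] += 1
        blocks = defaultdict(set)
        for i, tokens in enumerate(docs):
            if not tokens:
                continue
            tf = defaultdict(int)                    # term frequency in this record
            for t in tokens:
                tf[t] += 1
            score = {t: (tf[t] / len(tokens)) * math.log(n / df[t]) for t in tf}
            for t in sorted(score, key=score.get, reverse=True)[:top_k]:
                blocks[t].add(i)
        # singleton blocks generate no candidate pairs, so drop them
        return {t: ids for t, ids in blocks.items() if len(ids) > 1}

Candidate pairs for the linkage step are then drawn only from records that share a block, which is where the quadratic comparison cost is avoided; rare, discriminative tokens score highly and so tend to group genuinely related records.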
A review of unsupervised and semi-supervised blocking methods for record linkage
Record linkage, also referred to as entity resolution, is the process of identifying records that represent the same real-world entity (e.g. a person) across varied data sources. To reduce the computational complexity associated with record comparisons, a task referred to as blocking is commonly performed prior to the linkage process. The blocking task involves partitioning records into blocks and treating records from different blocks as not related to the same entity. Record linkage methods are then applied within each block, significantly reducing the number of record comparisons. Most existing blocking techniques require some degree of parameter selection in order to optimise performance for a particular dataset (e.g. the attributes and blocking functions used for splitting records into blocks). Optimal parameters can be selected manually, but this is expensive in terms of time and cost and assumes that a domain expert is available. Automatic supervised blocking techniques have been proposed; however, they require a set of labelled data in which the matching status of each record is known. In the majority of real-world scenarios, we do not have any information regarding the matching status of records obtained from multiple sources. There is therefore a demand for blocking techniques that sufficiently reduce the number of record comparisons with little to no human input or labelled data. Given the importance of the problem, recent research efforts have seen the development of novel unsupervised and semi-supervised blocking techniques. In this chapter, we review existing blocking techniques and discuss their advantages and disadvantages. We also detail research areas that have recently arisen and discuss unresolved issues that are still to be addressed.
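As a minimal illustration of the blocking step described above (the single-attribute key is an assumption chosen for brevity), candidate pairs are generated only within blocks:

    from collections import defaultdict
    from itertools import combinations

    def candidate_pairs(records, key):
        """Group records by a blocking key; compare only within blocks."""
        blocks = defaultdict(list)
        for rec in records:
            blocks[key(rec)].append(rec)
        for block in blocks.values():
            yield from combinations(block, 2)

    # Blocking on the first letter of the surname: 3 records yield 1
    # candidate pair ("Smith", "Smyth") instead of the 3 exhaustive pairs.
    people = [{"surname": "Smith"}, {"surname": "Smyth"}, {"surname": "Jones"}]
    pairs = list(candidate_pairs(people, key=lambda r: r["surname"][0]))

With b roughly equal-sized blocks, the comparison count drops from n(n−1)/2 to about n²/(2b), at the risk of missing true matches whose records fall into different blocks; this recall/efficiency trade-off is exactly what the parameter choices discussed above control.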