
    Introduction of a Toric Intraocular Lens to a Non-Refractive Cataract Practice: Challenges and Outcomes

    AIM: To identify challenges inherent in introducing a toric intraocular lens (IOL) to a non-refractive cataract practice, and to evaluate the residual astigmatism achieved and its impact on patient satisfaction. METHODS: Following introduction of a toric IOL to a cataract practice in which all procedures were undertaken by a single, non-refractive surgeon (SB), pre-operative, intra-operative and post-operative data were analysed. Attenuation of anticipated post-operative astigmatism was examined, and subjectively perceived visual functioning was assessed using validated questionnaires. RESULTS: The median difference vector (DV, the induced astigmatic change [by magnitude and axis] that would enable the initial surgery to achieve the intended target) was 0.93 D; the median anticipated DV with a non-toric IOL was 2.38 D. One eye exhibited 0.75 D residual astigmatism, compared with 3.8 D anticipated residual astigmatism with a non-toric IOL. All respondents reported satisfaction of ≥ 6/10, with 37.84% entirely satisfied (10/10). Seventeen patients (38.63%) reported no symptoms of dysphotopsia (dysphotopsia score 0/10), and only 3 respondents (6.8%) reported a clinically meaningful level of dysphotopsia (≥ 4/10). Mean post-operative NEI VF-11 score was 0.54 (±0.83; scale 0–4). CONCLUSION: Use of a toric IOL to manage astigmatism during cataract surgery results in less post-operative astigmatism than a non-toric IOL, avoiding unacceptable post-operative astigmatism.
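The difference vector reported above is a vector quantity: astigmatism is conventionally mapped to a double-angle representation before subtraction (as in the Alpins method). A minimal Python sketch of that calculation, written for illustration and not taken from the study:

```python
import math

def to_double_angle(magnitude, axis_deg):
    """Map an astigmatism (magnitude in dioptres, axis in degrees)
    to Cartesian components on the double-angle plot."""
    theta = math.radians(2 * axis_deg)
    return magnitude * math.cos(theta), magnitude * math.sin(theta)

def difference_vector(achieved, target):
    """Difference vector (DV): the astigmatic change, by magnitude and
    axis, that would take the achieved result to the intended target.
    Both arguments are (magnitude, axis_in_degrees) pairs."""
    ax, ay = to_double_angle(*achieved)
    tx, ty = to_double_angle(*target)
    dx, dy = tx - ax, ty - ay
    magnitude = math.hypot(dx, dy)
    axis = (math.degrees(math.atan2(dy, dx)) / 2) % 180
    return magnitude, axis

# Hypothetical example: 0.75 D residual at 90 degrees against an
# intended target of zero astigmatism.
dv_mag, dv_axis = difference_vector(achieved=(0.75, 90), target=(0.0, 0))
```

When the target is zero astigmatism the DV magnitude reduces to the residual cylinder itself; the vector form matters when target astigmatism is non-zero or the axes differ.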

    Development and qualification of a scale-down model of a commercial mammalian cell culture bioreactor using Computational Fluid Dynamics

    Computational fluid dynamics (CFD) techniques can be used to develop and/or optimize a scale-down model and to investigate mixing, oxygen mass transfer characteristics, turbulence, strain rate, and bubble size distribution in laboratory-scale stirred-tank bioreactors. In this work, CFD was used to test and modify a laboratory-scale model of a manufacturing-scale bioreactor. The laboratory-scale model was originally established based on power per volume (P/V) and volume of gas per bioreactor volume per minute (vvm). CFD simulations of mixing time, power input, and gas volume hold-up were performed to demonstrate comparability between the laboratory-scale model and the manufacturing-scale bioreactor, and were verified with experimental measurements of mixing time and gas hold-up. The results were used to propose sparge rate and impeller agitation as factors in a Design of Experiments (DoE) study in laboratory-scale bioreactors. The impact of sparge rate and impeller agitation on cell growth, productivity, and product quality attributes was evaluated in the DoE study, and the laboratory-scale production bioreactor model was compared to the manufacturing-scale production bioreactor. The results confirmed that CFD techniques can be used to establish sparge rate and impeller agitation settings that improve a scale-down model.
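The two scaling criteria named above (P/V and vvm) can be made concrete. Assuming the usual turbulent-regime power correlation P = Np·ρ·N³·D⁵, a matched laboratory-scale agitation rate and sparge rate can be sketched as follows; the impeller power number, density, diameter, and volumes are illustrative assumptions, not values from the paper:

```python
def matched_agitation_rate(pv_target, power_number, density,
                           impeller_diameter, volume):
    """Impeller speed N (rev/s) giving the same power per volume as the
    large scale, via the turbulent-regime correlation P = Np*rho*N^3*D^5."""
    power = pv_target * volume  # required impeller power, W
    n_cubed = power / (power_number * density * impeller_diameter ** 5)
    return n_cubed ** (1.0 / 3.0)

def matched_gas_flow(vvm, volume_litres):
    """Sparge rate (L/min) that holds vvm constant across scales."""
    return vvm * volume_litres

# Hypothetical 2 L vessel matched to a 100 W/m^3 manufacturing-scale P/V.
n_small = matched_agitation_rate(pv_target=100.0, power_number=5.0,
                                 density=1000.0, impeller_diameter=0.05,
                                 volume=0.002)
flow = matched_gas_flow(vvm=0.05, volume_litres=2.0)
```

CFD then refines these first-pass operating points by checking mixing time and gas hold-up directly rather than relying on the correlation alone.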

    Developing European operational oceanography for Blue Growth, climate change adaptation and mitigation, and ecosystem-based management

    Operational approaches have been increasingly developed and used to provide marine data and information services for different socio-economic sectors of Blue Growth and to advance knowledge of the marine environment. The objective of operational oceanographic research is to develop and improve the efficiency, timeliness, robustness and product quality of this approach. This white paper aims to address key scientific challenges and research priorities for the development of operational oceanography in Europe over the next 5–10 years. Knowledge gaps and deficiencies are identified in relation to common scientific challenges in four EuroGOOS knowledge areas: European Ocean Observations, Modelling and Forecasting Technology, Coastal Operational Oceanography and Operational Ecology. The areas “European Ocean Observations” and “Modelling and Forecasting Technology” focus on further advancing the basic instruments and capacities of European operational oceanography, while “Coastal Operational Oceanography” and “Operational Ecology” aim at developing new operational approaches for the corresponding knowledge areas.

    Long-term safety and efficacy of extended-interval prophylaxis with recombinant factor IX Fc fusion protein (rFIXFc) in subjects with haemophilia B

    The safety, efficacy, and prolonged half-life of recombinant factor IX Fc fusion protein (rFIXFc) were demonstrated in the Phase 3 B-LONG (adults/adolescents ≥12 years) and Kids B-LONG (children <12 years) studies of subjects with haemophilia B (endogenous factor IX activity ≤2 IU/dl). Here, we report interim, long-term safety and efficacy data from B-YOND, the rFIXFc extension study. Eligible subjects who completed B-LONG or Kids B-LONG could enrol in B-YOND. There were four treatment groups: weekly prophylaxis (20–100 IU/kg every 7 days), individualised prophylaxis (100 IU/kg every 8–16 days), modified prophylaxis (further dosing personalisation to optimise prophylaxis), and episodic (on-demand) treatment. Subjects could change treatment groups at any point. The primary endpoint was inhibitor development. One hundred sixteen subjects enrolled in B-YOND. From the start of the parent studies to the B-YOND interim data cut, the median duration of rFIXFc treatment was 39.5 months among adults/adolescents and 21.9 months among children; 68/93 (73.1%) adults/adolescents and 9/23 (39.1%) children had ≥100 cumulative rFIXFc exposure days. No inhibitors were observed. Median annualised bleeding rates (ABRs) were low in all prophylaxis regimens: weekly (≥12 years: 2.3; <6 years: 0.0; 6 to <12 years: 2.7), individualised (≥12 years: 2.3; 6 to <12 years: 2.4), and modified (≥12 years: 2.4). One or two infusions were sufficient to control 97% (adults/adolescents) and 95% (children) of bleeding episodes. Interim data from B-YOND are consistent with data from B-LONG and Kids B-LONG, and confirm the long-term safety of rFIXFc, the absence of inhibitors, and maintenance of low ABRs with prophylactic dosing every 1 to 2 weeks.
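The annualised bleeding rate (ABR) quoted above is, per subject, the observed bleed count scaled to a full year; the study reports medians of these per-subject values. A minimal sketch of the standard calculation (not code from the study):

```python
def annualised_bleeding_rate(bleeds, observation_days):
    """Per-subject ABR: bleeding episodes scaled to a 365.25-day year."""
    if observation_days <= 0:
        raise ValueError("observation period must be positive")
    return bleeds * 365.25 / observation_days

# Hypothetical subject: 2 bleeds over roughly 18 months of follow-up.
abr = annualised_bleeding_rate(bleeds=2, observation_days=548)
```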

    Global transpiration data from sap flow measurements: the SAPFLUXNET database

    Plant transpiration links physiological responses of vegetation to water supply and demand with hydrological, energy, and carbon budgets at the land-atmosphere interface. However, despite being the main land evaporative flux at the global scale, transpiration and its response to environmental drivers are currently not well constrained by observations. Here we introduce the first global compilation of whole-plant transpiration data from sap flow measurements (SAPFLUXNET, https://sapfluxnet.creaf.cat/, last access: 8 June 2021). We harmonized and quality-controlled individual datasets supplied by contributors worldwide in a semi-automatic data workflow implemented in the R programming language. Datasets include sub-daily time series of sap flow and hydrometeorological drivers for one or more growing seasons, as well as metadata on the stand characteristics, plant attributes, and technical details of the measurements. SAPFLUXNET contains 202 globally distributed datasets with sap flow time series for 2714 plants, mostly trees, of 174 species. SAPFLUXNET has a broad bioclimatic coverage, with woodland/shrubland and temperate forest biomes especially well represented (80 % of the datasets). The measurements cover a wide variety of stand structural characteristics and plant sizes. The datasets encompass the period between 1995 and 2018, with 50 % of the datasets being at least 3 years long. Accompanying radiation and vapour pressure deficit data are available for most of the datasets, while on-site soil water content is available for 56 % of the datasets. Many datasets contain data for species that make up 90 % or more of the total stand basal area, allowing the estimation of stand transpiration in diverse ecological settings. SAPFLUXNET adds to existing plant trait datasets, ecosystem flux networks, and remote sensing products to help increase our understanding of plant water use, plant responses to drought, and ecohydrological processes. 
SAPFLUXNET version 0.1.5 is freely available from the Zenodo repository (https://doi.org/10.5281/zenodo.3971689; Poyatos et al., 2020a). The "sapfluxnetr" R package - designed to access, visualize, and process SAPFLUXNET data - is available from CRAN.
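The harmonisation workflow itself is implemented in R (the "sapfluxnetr" package). Purely as an illustration of the kind of quality control described, aggregating sub-daily series only where coverage is adequate, here is a hypothetical pandas sketch; the coverage rule and 0.8 threshold are assumptions, not SAPFLUXNET's actual criteria:

```python
import pandas as pd

def daily_mean_sap_flow(timestamps, sap_flow, min_coverage=0.8):
    """Aggregate a sub-daily sap flow series to daily means, keeping
    only days with at least min_coverage of the expected record count."""
    series = pd.Series(sap_flow, index=pd.to_datetime(timestamps)).sort_index()
    # Infer the nominal sampling interval from the median time step.
    step = series.index.to_series().diff().median()
    expected_per_day = pd.Timedelta(days=1) / step
    daily = series.resample("D").agg(["mean", "count"])
    complete = daily["count"] >= min_coverage * expected_per_day
    return daily.loc[complete, "mean"]
```

A day with only a handful of readings is dropped rather than reported as a misleadingly precise daily mean.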

    COVID-19 trajectories among 57 million adults in England: a cohort study using electronic health records

    BACKGROUND: Updatable estimates of COVID-19 onset, progression, and trajectories underpin pandemic mitigation efforts. To identify and characterise disease trajectories, we aimed to define and validate ten COVID-19 phenotypes from nationwide linked electronic health records (EHR) using an extensible framework. METHODS: In this cohort study, we used eight linked National Health Service (NHS) datasets for people in England alive on Jan 23, 2020. Data on COVID-19 testing, vaccination, primary and secondary care records, and death registrations were collected until Nov 30, 2021. We defined ten COVID-19 phenotypes reflecting clinically relevant stages of disease severity and encompassing five categories: positive SARS-CoV-2 test, primary care diagnosis, hospital admission, ventilation modality (four phenotypes), and death (three phenotypes). We constructed patient trajectories illustrating transition frequency and duration between phenotypes. Analyses were stratified by pandemic waves and vaccination status. FINDINGS: Among 57 032 174 individuals included in the cohort, 13 990 423 COVID-19 events were identified in 7 244 925 individuals, equating to an infection rate of 12·7% during the study period. Of 7 244 925 individuals, 460 737 (6·4%) were admitted to hospital and 158 020 (2·2%) died. Of 460 737 individuals who were admitted to hospital, 48 847 (10·6%) were admitted to the intensive care unit (ICU), 69 090 (15·0%) received non-invasive ventilation, and 25 928 (5·6%) received invasive ventilation. Among 384 135 patients who were admitted to hospital but did not require ventilation, mortality was higher in wave 1 (23 485 [30·4%] of 77 202 patients) than wave 2 (44 220 [23·1%] of 191 528 patients), but remained unchanged for patients admitted to the ICU. Mortality was highest among patients who received ventilatory support outside of the ICU in wave 1 (2569 [50·7%] of 5063 patients). 
15 486 (9·8%) of 158 020 COVID-19-related deaths occurred within 28 days of the first COVID-19 event without a COVID-19 diagnosis on the death certificate. 10 884 (6·9%) of 158 020 deaths were identified exclusively from mortality data, with no previous COVID-19 phenotype recorded. We observed longer patient trajectories in wave 2 than in wave 1. INTERPRETATION: Our analyses illustrate the wide spectrum of disease trajectories, as shown by differences in incidence, survival, and clinical pathways. We have provided a modular analytical framework that can be used to monitor the impact of the pandemic and to generate evidence of clinical and policy relevance using multiple EHR sources. FUNDING: British Heart Foundation Data Science Centre, led by Health Data Research UK.
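The trajectory construction described in METHODS amounts to counting transitions between consecutive phenotypes in each patient's time-ordered event sequence. A toy sketch of that step, with illustrative labels rather than the study's exact phenotype definitions:

```python
from collections import Counter

def transition_counts(trajectories):
    """Count transitions between consecutive phenotypes across patients.
    Each trajectory is a time-ordered list of phenotype labels."""
    counts = Counter()
    for trajectory in trajectories:
        counts.update(zip(trajectory, trajectory[1:]))
    return counts

# Hypothetical mini-cohort of three patients.
cohort = [
    ["positive test", "hospitalisation", "death"],
    ["positive test", "hospitalisation"],
    ["positive test"],
]
counts = transition_counts(cohort)
```

Normalising each row of the resulting count table by its total gives the transition frequencies used to compare waves and vaccination strata.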

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment.
In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
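The CHA2DS2-VASc score spelled out in the abstract is a simple additive rule: one point each for congestive heart failure, hypertension, diabetes, vascular disease, female sex, and age 65 to 74; two points each for prior stroke and age 75 or over. A direct transcription as a sketch:

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score: 1 point each for congestive heart
    failure, hypertension, diabetes, vascular disease, female sex and
    age 65-74; 2 points each for prior stroke/TIA and age >= 75."""
    score = sum([chf, hypertension, diabetes, vascular_disease, female])
    if prior_stroke_or_tia:
        score += 2
    if age >= 75:
        score += 2
    elif age >= 65:
        score += 1
    return score

# A hypothetical 76-year-old woman with hypertension scores 4,
# placing her in the high-risk (score >= 2) group described above.
example = cha2ds2_vasc(age=76, female=True, chf=False, hypertension=True,
                       diabetes=False, prior_stroke_or_tia=False,
                       vascular_disease=False)
```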

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents covering a variety of research fields, against which newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performance. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title- and title/abstract-based search engines for relevant articles in biomedical research.
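Of the three baseline methods named, Term Frequency-Inverse Document Frequency with cosine similarity is the simplest to illustrate. A self-contained toy sketch follows; it is a simplified TF-IDF variant for illustration, not the benchmark's implementation (Okapi BM25 differs chiefly by adding term-frequency saturation and document-length normalisation):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF weights (raw term frequency x log inverse document
    frequency) for a list of tokenised documents."""
    n = len(docs)
    doc_freq = Counter()
    for doc in docs:
        doc_freq.update(set(doc))
    idf = {term: math.log(n / df) for term, df in doc_freq.items()}
    return [{term: tf * idf[term] for term, tf in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dictionaries."""
    dot = sum(weight * v.get(term, 0.0) for term, weight in u.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0
```

Ranking every candidate abstract by cosine similarity against a seed article's term vector yields a recommendation list that can then be scored against the expert relevance annotations.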

    Genetic Testing to Inform Epilepsy Treatment Management From an International Study of Clinical Practice

    IMPORTANCE: It is currently unknown how often and in which ways a genetic diagnosis given to a patient with epilepsy is associated with clinical management and outcomes. OBJECTIVE: To evaluate how genetic diagnoses in patients with epilepsy are associated with clinical management and outcomes. DESIGN, SETTING, AND PARTICIPANTS: This was a retrospective cross-sectional study of patients referred for multigene panel testing between March 18, 2016, and August 3, 2020, with outcomes reported between May and November 2020. The study setting included a commercial genetic testing laboratory and multicenter clinical practices. Patients with epilepsy, regardless of sociodemographic features, who received a pathogenic/likely pathogenic (P/LP) variant were included in the study. Case report forms were completed by all health care professionals. EXPOSURES: Genetic test results. MAIN OUTCOMES AND MEASURES: Clinical management changes after a genetic diagnosis (ie, 1 P/LP variant in autosomal dominant and X-linked diseases; 2 P/LP variants in autosomal recessive diseases) and subsequent patient outcomes as reported by health care professionals on case report forms. RESULTS: Among 418 patients, median (IQR) age at the time of testing was 4 (1-10) years, with an age range of 0 to 52 years, and 53.8% (n = 225) were female individuals. The mean (SD) time from a genetic test order to case report form completion was 595 (368) days (range, 27-1673 days). A genetic diagnosis was associated with changes in clinical management for 208 patients (49.8%) and usually (81.7% of the time) within 3 months of receiving the result. The most common clinical management changes were the addition of a new medication (78 [21.7%]), the initiation of medication (51 [14.2%]), the referral of a patient to a specialist (48 [13.4%]), vigilance for subclinical or extraneurological disease features (46 [12.8%]), and the cessation of a medication (42 [11.7%]). 
Among 167 patients with follow-up clinical information available (mean [SD] time, 584 [365] days), 125 (74.9%) reported positive outcomes, 108 (64.7%) reported reduction or elimination of seizures, 37 (22.2%) had decreases in the severity of other clinical signs, and 11 (6.6%) had reduced medication adverse effects. A few patients reported worsening outcomes, including a decline in their condition (20 [12.0%]), increased seizure frequency (6 [3.6%]), and adverse medication effects (3 [1.8%]). No clinical management changes were reported for 178 patients (42.6%). CONCLUSIONS AND RELEVANCE: Results of this cross-sectional study suggest that genetic testing of individuals with epilepsy may be materially associated with clinical decision-making and improved patient outcomes.