
    Alternate day versus consecutive day oral iron supplementation in iron-depleted women: a randomized double-blind placebo-controlled study

    Background: Guidelines to treat iron deficiency recommend daily provision of oral iron, but this may decrease fractional iron absorption and increase side effects. Our objective was to compare consecutive-day versus alternate-day iron supplementation. Methods: In a double-masked, randomized, placebo-controlled trial, young Swiss women (n = 150; serum ferritin ≤30 μg/L) were assigned to either daily 100 mg iron for 90 d followed by daily placebo for another 90 d (consecutive-day group), or the same daily dose of iron and placebo on alternate days for 180 d (alternate-day group). The study period was 24 November 2021 to 10 August 2022. Co-primary outcomes, at equal total iron doses, were serum ferritin and gastrointestinal side effects; secondary outcomes were iron deficiency and serum hepcidin. Compliance and side effects were recorded daily using a mobile application. Data were analysed using mixed models and longitudinal prevalence ratios (LPR). The trial was registered at ClinicalTrials.gov (NCT05105438). Findings: 75 women were assigned to each group and included in the intention-to-treat analysis. Capsule adherence and side-effect reporting were >97% in both groups. At equal total iron doses, comparing the consecutive-day and alternate-day groups, median serum ferritin was 43.8 μg/L (IQR 31.7-58.2) versus 44.8 μg/L (33.8-53.6) (P = 0.98); the LPR for gastrointestinal side effects on days of iron intake was 1.56 (95% CI: 1.38, 1.77; P < 0.0001); and median serum hepcidin was 3.0 nM (IQR 2.0-5.0) versus 1.9 nM (1.4-2.9) (P < 0.0001). The prevalence of iron deficiency was 5.5% versus 4.3% after 3 months (P = 0.74) and 11.4% versus 3.0% after 6 months (P = 0.049). Interpretation: At equal total iron doses, compared with consecutive-day dosing, alternate-day dosing did not result in higher serum ferritin but reduced iron deficiency at 6 months and triggered fewer gastrointestinal side effects.
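The longitudinal prevalence ratio reported above can be illustrated with a minimal sketch. The diary data below are hypothetical, and the trial's actual analysis used mixed models, which this pooled ratio does not reproduce:

```python
# Minimal sketch of a longitudinal prevalence ratio (LPR) for side effects.
# Hypothetical diaries; not the trial's data or its mixed-model analysis.

def longitudinal_prevalence(diaries):
    """Fraction of reported dose-days with a side effect, pooled over subjects."""
    events = sum(sum(d) for d in diaries)
    days = sum(len(d) for d in diaries)
    return events / days

# One diary per woman: True = gastrointestinal side effect on a day of iron intake.
consecutive = [[True, True, False, True], [False, True, True, False]]
alternate = [[False, True, False, False], [False, False, True, False]]

lpr = longitudinal_prevalence(consecutive) / longitudinal_prevalence(alternate)
print(lpr)  # a ratio > 1 means more side-effect days with consecutive dosing
```

An LPR of 1.56, as reported, would mean side effects were reported on 56% more dose-days under consecutive dosing.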

    Impact of respiratory motion correction and spatial resolution on lesion detection in PET: a simulation study based on real MR dynamic data

    The aim of this study is to investigate the impact of respiratory motion correction and spatial resolution on lesion detectability in PET as a function of lesion size and tracer uptake. Real respiratory signals describing different breathing types are combined with a motion model formed from real dynamic MR data to simulate multiple dynamic PET datasets acquired from a continuously moving subject. Lung and liver lesions were simulated with diameters ranging from 6 to 12 mm and lesion to background ratio ranging from 3:1 to 6:1. Projection data for 6 and 3 mm PET scanner resolution were generated using analytic simulations and reconstructed without and with motion correction. Motion correction was achieved using motion compensated image reconstruction. The detectability performance was quantified by a receiver operating characteristic (ROC) analysis obtained using a channelized Hotelling observer and the area under the ROC curve (AUC) was calculated as the figure of merit. The results indicate that respiratory motion limits the detectability of lung and liver lesions, depending on the variation of the breathing cycle length and amplitude. Patients with large quiescent periods had a greater AUC than patients with regular breathing cycles and patients with long-term variability in respiratory cycle or higher motion amplitude. In addition, small (less than 10 mm diameter) or low contrast (3:1) lesions showed the greatest improvement in AUC as a result of applying motion correction. In particular, after applying motion correction the AUC is improved by up to 42% with current PET resolution (i.e. 6 mm) and up to 51% for higher PET resolution (i.e. 3 mm). Finally, the benefit of increasing the scanner resolution is small unless motion correction is applied. 
This investigation indicates that respiratory motion correction has a high impact on lesion detectability in PET and highlights the importance of motion correction in order to benefit from the increased resolution of future PET scanners.
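The detectability analysis above can be sketched in miniature. The channels, image statistics, and signal below are synthetic stand-ins (real channelized Hotelling observer studies use structured channels such as difference-of-Gaussians), but the template and the rank-based AUC follow the standard formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_ch, n_img = 64, 4, 200

# Hypothetical channels and images; illustrative only.
U = rng.standard_normal((n_pix, n_ch))          # channel matrix
signal = 0.5 * rng.standard_normal(n_pix)       # known lesion profile
noise_imgs = rng.standard_normal((n_img, n_pix))
signal_imgs = rng.standard_normal((n_img, n_pix)) + signal

v0 = noise_imgs @ U                              # channel outputs, lesion absent
v1 = signal_imgs @ U                             # channel outputs, lesion present
S = 0.5 * (np.cov(v0.T) + np.cov(v1.T))          # pooled channel covariance
w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))  # Hotelling template

t0, t1 = v0 @ w, v1 @ w                          # observer test statistics
# AUC as the Mann-Whitney statistic: P(present score > absent score)
auc = (t1[:, None] > t0[None, :]).mean()
print(auc > 0.5)
```

The reported 42-51% AUC improvements correspond to motion correction sharpening the effective signal profile that enters this kind of observer model.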

    Operational Ontology for Oncology (O3): A Professional Society-Based, Multistakeholder, Consensus-Driven Informatics Standard Supporting Clinical and Research Use of Real-World Data From Patients Treated for Cancer

    PURPOSE: The ongoing lack of data standardization severely undermines the potential for automated learning from the vast amount of information routinely archived in electronic health records (EHRs), radiation oncology information systems, treatment planning systems, and other cancer care and outcomes databases. We sought to create a standardized ontology for clinical data, social determinants of health, and other radiation oncology concepts and interrelationships. METHODS AND MATERIALS: The American Association of Physicists in Medicine's Big Data Science Committee was initiated in July 2019 to explore common ground from the stakeholders' collective experience of issues that typically compromise the formation of large inter- and intra-institutional databases from EHRs. The Big Data Science Committee adopted an iterative, cyclical approach to engaging stakeholders beyond its membership to optimize the integration of diverse perspectives from the community. RESULTS: We developed the Operational Ontology for Oncology (O3), which identified 42 key elements, 359 attributes, 144 value sets, and 155 relationships ranked in relative importance of clinical significance, likelihood of availability in EHRs, and the ability to modify routine clinical processes to permit aggregation. Recommendations are provided for best use and development of the O3 to 4 constituencies: device manufacturers, centers of clinical care, researchers, and professional societies. CONCLUSIONS: O3 is designed to extend and interoperate with existing global infrastructure and data science standards. The implementation of these recommendations will lower the barriers for aggregation of information that could be used to create large, representative, findable, accessible, interoperable, and reusable data sets to support the scientific objectives of grant programs. 
The construction of comprehensive real-world data sets and the application of advanced analytical techniques, including artificial intelligence, hold the potential to revolutionize patient management and improve outcomes by leveraging increased access to information derived from larger, more representative data sets.

    Effect of dietary factors and time of day on iron absorption from oral iron supplements in iron deficient women

    Guidelines generally recommend taking iron supplements in the morning away from meals and with ascorbic acid (AA) to increase iron absorption. However, there is little direct evidence on the effects of dietary factors and time of day on absorption from iron supplements. In iron-depleted women (n = 34; median serum ferritin 19.4 μg/L), we administered 100 mg iron doses labeled with ⁵⁴Fe, ⁵⁷Fe, or ⁵⁸Fe in each of six different conditions: (1) with water (reference) in the morning; (2) with 80 mg AA; (3) with 500 mg AA; (4) with coffee; (5) with breakfast including coffee and orange juice (containing ~90 mg AA); and (6) with water in the afternoon. Fractional iron absorption (FIA) from these n = 204 doses was calculated based on erythrocyte incorporation of multiple isotopic labels. Compared to the reference: 80 mg AA increased FIA by 30% (p < .001) but 500 mg AA did not further increase FIA (p = .226); coffee decreased FIA by 54% (p = .004); coffee with breakfast decreased FIA by 66% (p < .001) despite the presence of ~90 mg of AA. Serum hepcidin was higher (p < .001) and FIA was 37% lower (p = .059) in the afternoon compared to the morning. Our data suggest that to maximize efficacy, ferrous iron supplements should be consumed in the morning, away from meals or coffee, and with an AA-rich food or beverage. Compared to consuming a 100 mg iron dose in the morning with coffee or breakfast, consuming it with orange juice alone results in a ~4-fold increase in iron absorption and provides ~20 mg more absorbed iron per dose. The trial was registered at ClinicalTrials.gov (NCT04074707).
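The erythrocyte-incorporation calculation behind FIA can be sketched as follows. All inputs and constants here are illustrative assumptions rather than the study's individual data; 3.47 mg Fe per g hemoglobin and an 80% erythrocyte-incorporation fraction are conventional values in stable-isotope absorption studies:

```python
# Sketch of fractional iron absorption (FIA) from erythrocyte incorporation of a
# stable-isotope label. Hypothetical inputs; conventional constants.

def circulating_iron_mg(weight_kg, hb_g_per_l):
    blood_volume_l = 0.065 * weight_kg        # assumed blood volume (L/kg)
    return blood_volume_l * hb_g_per_l * 3.47  # 3.47 mg Fe per g hemoglobin

def fia(label_enrichment, dose_label_mg, weight_kg=60.0, hb_g_per_l=130.0,
        incorporation=0.8):
    """FIA = label found in circulation / (label dose x incorporation fraction)."""
    label_in_blood_mg = circulating_iron_mg(weight_kg, hb_g_per_l) * label_enrichment
    return label_in_blood_mg / (dose_label_mg * incorporation)

# e.g. 0.1% enrichment of circulating iron after a 6 mg isotope label
print(round(fia(0.001, 6.0), 3))
```

With these example numbers, the computed FIA is about 0.37, i.e. roughly a third of the labeled dose absorbed.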

    Oral iron supplementation in iron-deficient women: How much and how often?

    Iron deficiency and iron deficiency anemia (IDA) are major public health problems worldwide, especially in young women. Oral iron supplementation can be an effective strategy to treat and prevent IDA, but guidelines vary. Some experts recommend doses of 150–200 mg elemental iron per day, with the dose split through the day. However, recent studies suggest this may not be an optimal regimen. The fraction of iron absorbed from high doses of oral iron is low, and unabsorbed iron can cause gut irritation, inflammation, and dysbiosis, which reduce compliance. In recent studies using serum hepcidin profiles and stable iron isotopes to quantify iron absorption in young women, we have shown that: (a) oral iron doses ≥60 mg in iron-deficient women, and doses ≥100 mg in women with IDA, stimulate an acute increase in hepcidin that persists 24 h after the dose but subsides by 48 h; (b) therefore, to maximize fractional iron absorption, oral doses ≥60 mg should be given on alternate days; (c) the circadian increase in plasma hepcidin is augmented by a morning iron dose; therefore, iron doses should not be given in the afternoon or evening after a morning dose. If the rate of hemoglobin response is important, a pooled analysis of our data done for this review indicates that total iron absorption is also higher if twice the target daily iron dose is given on alternate days. In summary, these studies suggest that changing from daily to alternate-day schedules and from divided to single morning doses increases iron absorption and may reduce side effects. Thus, providing morning doses of 60–120 mg iron as a ferrous salt given with ascorbic acid on alternate days may be an optimal oral dosing regimen for women with iron deficiency and mild IDA.
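The alternate-day, morning-dose regimen suggested above is simple to express programmatically. This is a hypothetical scheduling helper for illustration only, not a clinical tool:

```python
# Sketch of the alternate-day morning dosing schedule the review suggests.
# Illustrative helper only; dose and dates are example values.
from datetime import date, timedelta

def alternate_day_schedule(start, n_doses, dose_mg=100):
    """Return (date, dose) pairs: one morning dose every second day."""
    return [(start + timedelta(days=2 * i), dose_mg) for i in range(n_doses)]

plan = alternate_day_schedule(date(2024, 1, 1), 3)
print(plan[1][0].isoformat())  # second dose falls two days after the first
```

Note the 48-hour spacing mirrors the hepcidin kinetics described above: each dose lands after the previous dose's hepcidin rise has subsided.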

    DOCREP: Document Representation for Natural Language Processing

    The field of natural language processing (NLP) revolves around the computational interpretation and generation of natural language. The language typically processed in NLP occurs in paragraphs or documents rather than in single isolated sentences. Despite this, most NLP tools operate over one sentence at a time, not utilising the context outside of the sentence nor any of the metadata associated with the underlying document. One pragmatic reason for this disparity is that representing documents and their annotations through an NLP pipeline is difficult with existing infrastructure. Representing linguistic annotations for a text document using a plain-text markup-based format is not sufficient to capture arbitrarily nested and overlapping annotations. Despite this, most linguistic text corpora and NLP tools still operate in this fashion. A document representation framework (DRF) supports the creation of linguistic annotations stored separately from the original document, overcoming this nesting and overlapping annotations problem. Despite the prevalence of pipelines in NLP, there is little published work on, or implementations of, DRFs. The main DRFs, GATE and UIMA, exhibit usability issues which have limited their uptake by the NLP community. This thesis aims to solve this problem through a novel, modern DRF, DOCREP, a portmanteau of "document representation". DOCREP is designed to be efficient, programming language and environment agnostic, and most importantly, easy to use. We want DOCREP to be powerful and simple enough to use that NLP researchers and language technology application developers would even use it in their own small projects instead of developing their own ad hoc solution. This thesis begins by presenting the design criteria for our new DRF, extending upon existing requirements from the literature with additional usability and efficiency requirements that should lead to greater use of DRFs. 
We outline how our new DRF, DOCREP, differs from existing DRFs in terms of the data model, serialisation strategy, developer interactions, support for rapid prototyping, and the expected runtime and environment requirements. We then describe our provided implementations of DOCREP in Python, C++, and Java, the most common languages in NLP, outlining their efficiency, idiomaticity, and the ways in which these implementations satisfy our design requirements. We then present two different evaluations of DOCREP. First, we evaluate its ability to model complex linguistic corpora through the conversion of the OntoNotes 5 corpus to DOCREP and UIMA, outlining the differences in modelling approaches required and efficiency when using these two DRFs. Second, we evaluate DOCREP against our usability requirements from the perspective of a computational linguist who is new to DOCREP. We walk through a number of common use cases for working with text corpora and contrast traditional approaches against their DOCREP counterparts. These two evaluations conclude that DOCREP satisfies our outlined design requirements and outperforms existing DRFs in terms of efficiency, and most importantly, usability. With DOCREP designed and evaluated, we then show how NLP applications can harness document structure. We present a novel document structure-aware tokenization framework for the first stage of full-stack NLP systems. We then present a new structure-aware NER system which achieves state-of-the-art results on multiple standard NER evaluations. The tokenization framework produces its tokenization, sentence boundary, and document structure annotations as native DOCREP annotations. The NER system consumes DOCREP annotations and utilises many components of the DOCREP runtime. 
We believe that the adoption of DOCREP throughout the NLP community will assist in the reproducibility of results, substitutability of components, and overall quality assurance of NLP systems and corpora, all of which are problematic areas within NLP research and applications. This adoption will make developing and combining NLP components into applications faster, more efficient, and more reliable.
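The core idea the abstract describes, annotations stored separately from the text so they can nest and overlap freely, is often called stand-off annotation, and can be sketched as follows. The classes and layer names here are illustrative, not the actual DOCREP API:

```python
# Sketch of stand-off annotation: spans are stored apart from the text, so
# nested and overlapping annotations coexist, which inline markup cannot express.
# Illustrative data model only; not the DOCREP data model or API.
from dataclasses import dataclass

@dataclass
class Span:
    start: int   # character offsets into the untouched original text
    end: int
    layer: str   # e.g. "token", "sentence", "ner"
    label: str = ""

text = "Sydney University researchers built DOCREP."
annotations = [
    Span(0, 17, "ner", "ORG"),    # "Sydney University"
    Span(0, 6, "token"),          # "Sydney" -- nested inside the ORG span
    Span(7, 17, "token"),         # "University"
    Span(0, 44, "sentence"),      # freely overlaps both other layers
]

# The text itself is never marked up, so any layer can be added or replaced
# without touching the document or the other layers.
org = next(a for a in annotations if a.layer == "ner")
print(text[org.start:org.end])
```

Swapping in a different tokenizer here means replacing only the "token" spans, which is the component-substitutability property the abstract argues for.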

    Effect of Nursing in the Head Elevated Tilt Position (15°) on the Incidence of Bradycardic and Hypoxemic Episodes in Preterm Infants

    OBJECTIVE: We investigated whether nursing in the head elevated tilt position (HETP), compared with the horizontal position, has any effect on the incidence of bradycardic and hypoxemic episodes in preterm infants. METHODS: Twelve spontaneously breathing preterm infants with idiopathic recurrent apnea were studied in a randomized controlled crossover trial. Nine infants were treated with aminophylline. Each spent a total of 24 hours in the horizontal prone position and a total of 24 hours in HETP (prone, 15 degrees). The position was changed in random order every 6 hours. Thoracic impedance, heart rate, and arterial oxygen saturation were recorded continuously. The frequencies of isolated hypoxemia (arterial saturation <80%), of isolated bradycardia (heart rate <90 beats per minute), and of mixed events were analyzed and compared without knowledge of the allocated position. RESULTS: In total, there were significantly fewer bradycardic and/or hypoxemic episodes in HETP than in the horizontal position (a 28.2% reduction; mean difference, 13.35 episodes/24 hours; 95% confidence interval [CI]: 5.9-20.8). The decrease was largest for isolated hypoxemic episodes (48.5%; mean difference, 11.74 episodes/24 hours; 95% CI: 6.1-17.4). Isolated bradycardic episodes (mean difference, 2.27 episodes/24 hours; 95% CI: −0.78 to 5.31) and mixed events were not decreased significantly in HETP. CONCLUSIONS: Nursing in a moderately tilted position (15 degrees) reduces hypoxemic events in preterm infants. This intervention is easy to apply, quickly reversible, and can be combined with drugs such as aminophylline.
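The episode counting described in the methods can be sketched simply. The sample trace and the episode definition (a run of consecutive samples below threshold counts as one episode) are illustrative assumptions, not the study's scoring protocol:

```python
# Sketch of counting hypoxemic episodes from an oxygen-saturation trace,
# using the paper's threshold (<80%). Sample data are hypothetical.

def count_episodes(samples, threshold=80):
    """Count runs of consecutive samples below threshold as single episodes."""
    episodes, below = 0, False
    for s in samples:
        if s < threshold and not below:
            episodes += 1       # a new dip below threshold has started
        below = s < threshold
    return episodes

spo2 = [95, 93, 79, 76, 88, 96, 78, 85]  # two distinct dips below 80%
print(count_episodes(spo2))
```

The same run-detection logic, applied to heart rate with a <90 beats-per-minute threshold, would count the bradycardic episodes.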