Open Evaluation Tool for Layout Analysis of Document Images
This paper presents an open tool for standardizing the evaluation of the layout analysis task on document images at the pixel level. We introduce a new evaluation tool that is available both as a standalone Java application and as a RESTful web service. The tool is free and open-source so that it can serve as a common tool that anyone can use and contribute to. It aims to provide as many metrics as possible for investigating layout analysis predictions, and also provides an easy way to visualize the results. The tool evaluates document segmentation at the pixel level and supports multi-labeled pixel ground truth. Finally, it has been successfully used for the ICDAR2017 competition on Layout Analysis for Challenging Medieval Manuscripts.
Comment: The 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), HIP: 4th International Workshop on Historical Document Imaging and Processing, Kyoto, Japan, 2017
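As a hypothetical illustration of what pixel-level evaluation with multi-labeled classes involves (this is not the paper's tool; the class names and the choice of intersection-over-union as the metric are assumptions), per-class scores can be computed directly from flattened label maps:

```python
# Sketch only: per-class intersection-over-union (IoU) over flattened
# ground-truth and predicted pixel label lists. Class names are invented.

def per_class_iou(gt, pred, classes):
    """IoU for each layout class, computed pixel by pixel."""
    scores = {}
    for c in classes:
        inter = sum(1 for g, p in zip(gt, pred) if g == c and p == c)
        union = sum(1 for g, p in zip(gt, pred) if g == c or p == c)
        scores[c] = inter / union if union else 1.0  # absent class: perfect
    return scores

gt   = ["text", "text", "bg", "decoration", "bg", "text"]
pred = ["text", "bg",   "bg", "decoration", "bg", "bg"]
print(per_class_iou(gt, pred, ["text", "bg", "decoration"]))
```

A real tool would operate on full-resolution label images and handle pixels carrying several labels at once, but the per-class comparison shown here is the core of the computation.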
The trauma patient in hemorrhagic shock: How is the C-priority addressed between emergency and ICU admission?
BACKGROUND: Trauma is the leading cause of death in young people, with an injury-related mortality rate of 47.6/100,000 in European high-income countries. Early deaths often result from rapidly evolving and deteriorating secondary complications, e.g. shock, hypoxia or uncontrolled hemorrhage. The present study assessed how well the ABC priorities (A: Airway, B: Breathing/Ventilation and C: Circulation with hemorrhage control), with a focus on the C-priority including coagulation management, are addressed during early trauma care, and to what extent these priorities had been controlled prior to ICU admission among patients arriving at the ER in states of moderate or severe hemorrhagic shock. METHODS: A retrospective analysis of data documented in the TraumaRegister of the ‘Deutsche Gesellschaft für Unfallchirurgie’ (TR-DGU®) was conducted. Relevant clinical and laboratory parameters reflecting the status and basic physiology of severely injured patients (ISS ≥ 25) in either moderate or severe shock according to base excess levels (BE -2 to -6 or BE < -6), as a surrogate for shock and hemorrhage combined with coagulopathy (Quick’s value < 70%), were analyzed upon ER arrival and ICU admission. RESULTS: A total of 517 datasets were eligible for analysis. Upon ICU admission, shock had been reversed to BE > -2 in 36.4% and 26.4% of patients in the respective subgroups. Two out of three patients with initially moderate shock and three out of four patients with severe shock upon ER arrival were still in shock upon ICU admission. All patients suffered from coagulation dysfunction upon ER arrival (Quick’s value ≤ 70%). Upon ICU admission, three out of four patients in both groups still had disturbed coagulation function. The number of patients with significant thrombocytopenia had increased 5-6 fold between ER and ICU admission.
CONCLUSION: The C-priority, including coagulation management, was not adequately addressed during the primary survey and initial resuscitation between ER and ICU admission in this cohort of severely injured patients.
Optimising user/system cooperation for the online learning of an evolving classifier
Touch-based human-computer interfaces enable new modes of interaction, such as the use of gesture commands. To make it easy to memorize more than a dozen commands, it is important that they can be personalized. The classifier used to recognize the drawn symbols must therefore be customizable, able to initialize itself from very little data, and evolving, able to improve during use. This work studies the importance and impact of using rejection to supervise the online learning of the classifier. The objective is to obtain a gesture command system that manages its cooperation with the user as well as possible: learning from its errors without soliciting the user too often. A compromise must therefore be struck between the number of times the user is solicited to supervise learning and the number of errors made by the system, which require correction by the user.
Vitamin D Endocrine System and COVID-19. Treatment with Calcifediol
The COVID-19 pandemic is the greatest challenge facing modern medicine and public health systems. The viral evolution of SARS-CoV-2, with the emergence of new variants with increased infectious potential, is a cause for concern. In addition, vaccination coverage remains insufficient worldwide. Therefore, there is a need to develop new therapeutic options and/or to optimize the repositioning of drugs approved for other indications for COVID-19. This may include the use of calcifediol, the prohormone of the vitamin D endocrine system (VDES), as it may have potentially useful effects for the treatment of COVID-19. We review the aspects associating COVID-19 with the VDES and the potential use of calcifediol in COVID-19. VDES/VDR stimulation may enhance innate antiviral effector mechanisms, facilitating the induction of antimicrobial peptides/autophagy, with a critical modulatory role in the subsequent host reactive hyperinflammatory phase of COVID-19: by decreasing the cytokine/chemokine storm, regulating the renin–angiotensin–bradykinin system (RAAS), modulating neutrophil activity, maintaining the integrity of the pulmonary epithelial barrier, stimulating epithelial repair, and directly and indirectly decreasing the increased coagulability and prothrombotic tendency associated with severe COVID-19 and its complications. The available evidence suggests that VDES/VDR stimulation, while maintaining optimal serum 25OHD status, may significantly reduce the risk of acute respiratory distress syndrome (ARDS) and severe COVID-19 in patients with SARS-CoV-2 infection, with possible beneficial effects on the need for mechanical ventilation and/or intensive care unit (ICU) admission, as well as on deaths in the course of the disease. The pharmacokinetic and functional characteristics of calcifediol give it superiority in rapidly optimizing 25OHD levels in COVID-19.
A pilot study and several observational intervention studies using high doses of calcifediol (0.532 mg on day 1 and 0.266 mg on days 3, 7, 14, 21, and 28) dramatically decreased the need for ICU admission and the mortality rate. We therefore propose using calcifediol at the doses described for the rapid correction of 25OHD deficiency in all patients in the early stages of COVID-19, in association, if necessary, with the new oral antiviral agents.
Measuring Vitamin D3 Metabolic Status, Comparison between Vitamin D Deficient and Sufficient Individuals
The main branch of vitamin D3 metabolism involves several hydroxylation reactions that yield mono-, di- and trihydroxylated metabolites, including the circulating and active forms—25(OH)D3 and 1,25(OH)2D3, respectively. However, most clinical trials strictly target the determination of 25(OH)D3 to offer a view of the metabolic status of vitamin D3. Given the growing interest in expanding this restricted view, we have developed a method for measuring vitamin D3 metabolism by determination of vitamin D3, 25(OH)D3, 24,25(OH)2D3, 1,25(OH)2D3 and 1,24,25(OH)3D3 in human plasma. The method is based on SPE–LC–MS/MS with a large-volume injection of human plasma (240 µL). Detection of di- and trihydroxymetabolites, found at the picogram per milliliter level, was attained by the combined action of high preconcentration and clean-up effects. The method allows information to be obtained about ratios such as the known vitamin D metabolite ratio (24,25(OH)2D3/25(OH)D3), which can provide complementary views of vitamin D3 metabolic status. The method was applied to a cohort of obese patients and a reference cohort of healthy volunteers to find metabolic correlations between target analytes, as well as differences as a function of vitamin D levels within and between cohorts.
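As a minimal sketch of the vitamin D metabolite ratio mentioned above (the function name and example concentrations are illustrative, not values from the study), the ratio is simply 24,25(OH)2D3 divided by 25(OH)D3, with the shared units cancelling out:

```python
# Illustrative only: compute the vitamin D metabolite ratio (VMR)
# from two concentrations in the same units (e.g. ng/mL).

def vitamin_d_metabolite_ratio(c_24_25, c_25):
    """VMR = 24,25(OH)2D3 / 25(OH)D3; both arguments in the same units."""
    if c_25 <= 0:
        raise ValueError("25(OH)D3 concentration must be positive")
    return c_24_25 / c_25

print(round(vitamin_d_metabolite_ratio(2.4, 30.0), 3))  # prints 0.08
```

Because it is dimensionless, the VMR can be compared across assays and cohorts, which is one reason it is useful as a complementary status marker.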
Renaissance of base deficit for the initial assessment of trauma patients: a base deficit-based classification for hypovolemic shock developed on data from 16,305 patients derived from the TraumaRegister DGU®
INTRODUCTION: The recognition and management of hypovolemic shock remain an important task during initial trauma assessment. Recently, we questioned the validity of the Advanced Trauma Life Support (ATLS) classification of hypovolemic shock by demonstrating that the suggested combination of heart rate, systolic blood pressure and Glasgow Coma Scale displays substantial deficits in reflecting clinical reality. The aim of this study was to introduce and validate a new classification of hypovolemic shock based upon base deficit (BD) at emergency department (ED) arrival. METHODS: Between 2002 and 2010, 16,305 patients were retrieved from the TraumaRegister DGU® database, classified into four strata of worsening BD [class I (BD ≤ 2 mmol/l), class II (BD > 2.0 to 6.0 mmol/l), class III (BD > 6.0 to 10 mmol/l) and class IV (BD > 10 mmol/l)] and assessed for demographics, injury characteristics, transfusion requirements and fluid resuscitation. This new BD-based classification was validated against the current ATLS classification of hypovolemic shock. RESULTS: With worsening BD, the injury severity score (ISS) increased in a step-wise pattern from 19.1 (± 11.9) in class I to 36.7 (± 17.6) in class IV, while mortality increased in parallel from 7.4% to 51.5%. Decreasing hemoglobin and prothrombin ratios, as well as the amounts of transfusions and fluid resuscitation, paralleled the increasing frequency of hypovolemic shock across the four classes. The number of blood units transfused increased from 1.5 (± 5.9) in class I patients to 20.3 (± 27.3) in class IV patients. Massive transfusion rates increased from 5% in class I to 52% in class IV. The newly introduced BD-based classification of hypovolemic shock discriminated transfusion requirements, massive transfusion and mortality rates significantly better than the conventional ATLS classification of hypovolemic shock (p < 0.001).
CONCLUSIONS: BD may be superior to the current ATLS classification of hypovolemic shock in identifying the presence of hypovolemic shock and in risk-stratifying patients in need of early blood product transfusion.
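The four BD strata quoted in the METHODS section map directly to a simple classification rule. The following sketch assumes boundary handling consistent with the ranges given in the abstract (class boundaries at 2.0, 6.0 and 10 mmol/l); it is an illustration of the strata, not the registry's software:

```python
# Sketch of the BD-based strata from the abstract:
#   class I:   BD <= 2 mmol/l
#   class II:  BD > 2.0 to 6.0 mmol/l
#   class III: BD > 6.0 to 10 mmol/l
#   class IV:  BD > 10 mmol/l

def bd_shock_class(base_deficit_mmol_l):
    """Map a base deficit (mmol/l) to hypovolemic shock class I-IV."""
    if base_deficit_mmol_l <= 2.0:
        return "I"
    if base_deficit_mmol_l <= 6.0:
        return "II"
    if base_deficit_mmol_l <= 10.0:
        return "III"
    return "IV"

for bd in (1.0, 4.0, 8.0, 12.0):
    print(bd, bd_shock_class(bd))
```

The appeal of such a rule is that it needs a single laboratory value available at ED arrival, rather than the combination of vital signs and mental status used by the ATLS classification.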
Predicting on-going hemorrhage and transfusion requirement after severe trauma: a validation of six scoring systems and algorithms on the TraumaRegister DGU®
INTRODUCTION: Early aggressive management of the acute coagulopathy of trauma may improve survival in the trauma population. However, the timely identification of lethal exsanguination remains challenging. This study validated six scoring systems and algorithms for stratifying patients by their risk of massive transfusion (MT) at a very early stage after trauma on a single dataset of severely injured patients derived from the TR-DGU (TraumaRegister DGU® of the German Trauma Society (DGU)) database. METHODS: Retrospective internal and external validation of six scoring systems and algorithms (four civilian and two military systems) predicting the risk of massive transfusion at a very early stage after trauma, on a single dataset of severely injured patients derived from the TraumaRegister DGU® database (2002-2010). The scoring systems and algorithms assessed were: the TASH (Trauma-Associated Severe Hemorrhage) score, the PWH (Prince of Wales Hospital/Rainer) score, the Vandromme score, the ABC (Assessment of Blood Consumption/Nunez) score, the Schreiber score and the Larsen score. Data from 56,573 patients were screened to extract one complete dataset matching all variables needed to calculate every system assessed in this study. The scores were applied and areas under the receiver operating characteristic curve (AUCs) were calculated. From the ROC curves, the cut-off with the best sensitivity-to-specificity trade-off was used to recalculate sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). RESULTS: A total of 5,147 patients were extracted from the TR-DGU; 95% (n = 4,889) had sustained blunt trauma. The mean age of patients was 45.7 ± 19.3 years, with a mean ISS of 24.3 ± 13.2. The overall MT rate was 5.6% (n = 289). The TASH score had the highest overall accuracy, as reflected by an AUC of 0.889, followed by the PWH score (0.860).
At the defined cut-off values for each score, the highest sensitivity was observed for the Schreiber score (85.8%), but also the lowest specificity (61.7%). The TASH score at a cut-off ≥ 8.5 showed a sensitivity of 84.4% together with a high specificity (78.4%). The PWH score had a lower sensitivity (80.6%) with comparable specificity. The Larsen score showed the lowest sensitivity (70.9%) at a specificity of 80.4%. CONCLUSIONS: Weighted and more sophisticated systems such as the TASH and PWH scores, which include larger numbers of variables, outperform simple unweighted models. Prospective validations are needed to improve the development process and use of scoring systems in the future.
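The sensitivity, specificity, PPV and NPV reported for each score at its cut-off follow from the standard 2×2 confusion-table definitions. This sketch uses made-up counts, not TR-DGU data, purely to show how the four figures are derived:

```python
# Illustrative only: derive the four validation metrics from a 2x2
# confusion table (true/false positives and negatives at a chosen cut-off).

def diagnostic_metrics(tp, fp, fn, tn):
    """Standard definitions: sensitivity, specificity, PPV, NPV."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # precision among predicted positives
        "npv": tn / (tn + fn),          # reliability of a negative prediction
    }

print(diagnostic_metrics(tp=80, fp=10, fn=20, tn=90))
```

Note that with an MT rate of only 5.6%, PPV will be low even for a score with good sensitivity and specificity, which is why both predictive values are reported alongside the ROC-derived figures.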
Long-Term Treatment and Effect of Discontinuation of Calcifediol in Postmenopausal Women with Vitamin D Deficiency: A Randomized Trial
Vitamin D plays a major role in bone health and probably also in multiple extraskeletal acute and chronic diseases. Although supplementation with calcifediol, a vitamin D metabolite, has demonstrated efficacy and safety in short-term clinical trials, its effects after long-term monthly administration have been studied less extensively. This report describes the results of a 1-year, phase III-IV, double-blind, randomized, controlled, parallel, multicenter superiority clinical trial to assess the efficacy and safety of monthly calcifediol 0.266 mg versus cholecalciferol 25,000 IU (0.625 mg) in postmenopausal women with vitamin D deficiency (25(OH)D < 20 ng/mL). A total of 303 women were randomized and 298 evaluated. Patients were randomized 1:1:1 to calcifediol 0.266 mg/month for 12 months (Group A1), calcifediol 0.266 mg/month for 4 months followed by placebo for 8 months (Group A2), and cholecalciferol 25,000 IU/month (0.625 mg/month) for 12 months (Group B). By month 4, stable 25(OH)D levels were documented with both calcifediol and cholecalciferol (intention-to-treat population): 26.8 ± 8.5 ng/mL (Group A1) and 23.1 ± 5.4 ng/mL (Group B). By month 12, 25(OH)D levels were 23.9 ± 8.0 ng/mL (Group A1) and 22.4 ± 5.5 ng/mL (Group B). When calcifediol treatment was withdrawn in Group A2, 25(OH)D levels decreased to baseline levels (28.5 ± 8.7 ng/mL at month 4 versus 14.4 ± 6.0 ng/mL at month 12). No relevant treatment-related safety issues were reported in any of the groups. The results confirm that long-term treatment with monthly calcifediol in vitamin D-deficient patients is effective and safe. The withdrawal of treatment leads to a pronounced decrease of 25(OH)D levels. Calcifediol presented a faster onset of action compared to monthly cholecalciferol. Long-term treatment produces stable and sustained 25(OH)D concentrations with no associated safety concerns. This study was funded by Faes Farma, S.A. and Bruno Farmaceutici S.p.A.
The authors wish to thank the study participants, research staff, the secondary investigators of the Osteoferol study group, the home nursing staff (Emibet), and the Faes Farma clinical research team: Paula Arranz Gutiérrez, Lorena Elgezabal González, Mariana Frau Usoz, and Iñigo Saez Riesco. Medical writing support was provided by Francisco López de Saro (Trialance SCCL), funded by Faes Farma, S.A.
Calcifediol is superior to cholecalciferol in improving vitamin D status in postmenopausal women: a randomized trial
Vitamin D has been shown to play a role in multiple diseases due to its skeletal and extraskeletal actions. Furthermore, vitamin D deficiency has become a worldwide health issue. Few supplementation guidelines mention calcifediol treatment, despite it being the direct precursor of calcitriol and the biomarker of vitamin D status. This 1-year, phase III-IV, double-blind, randomized, controlled, multicenter clinical trial assessed the efficacy and safety of calcifediol 0.266 mg soft capsules in vitamin D-deficient postmenopausal women, compared to cholecalciferol. The results reported here are from a prespecified interim analysis for the evaluation of the study's primary endpoint: the percentage of patients with serum 25-hydroxyvitamin D (25(OH)D) levels above 30 ng/ml after 4 months. A total of 303 patients were enrolled, of whom 298 were included in the intention-to-treat (ITT) population. Patients with baseline serum 25(OH)D levels < 20 ng/ml were randomized 1:1:1 to calcifediol 0.266 mg/month for 12 months, calcifediol 0.266 mg/month for 4 months followed by placebo for 8 months, and cholecalciferol 25,000 IU/month for 12 months. At month 4, 35.0% of postmenopausal women treated with calcifediol and 8.2% of those treated with cholecalciferol reached serum 25(OH)D levels above 30 ng/ml (p < 0.0001). The most remarkable difference between the two drugs in terms of mean change in serum 25(OH)D levels was observed after the first month of treatment (mean ± standard deviation change = 9.7 ± 6.7 and 5.1 ± 3.5 ng/ml in patients treated with calcifediol and cholecalciferol, respectively). No relevant treatment-related safety issues were reported in any of the groups studied. These results confirm that calcifediol is effective, faster-acting, and more potent than cholecalciferol in raising serum 25(OH)D levels, and a valuable option for the treatment of vitamin D deficiency.