47 research outputs found

    Evaluation of a New POCT Bedside Glucose Meter and Strip With Hematocrit and Interference Corrections

    Introduction: Based on the expanding role of point-of-care testing glucose meters and the need to improve accuracy and precision, the new Nova Biomedical StatStrip was evaluated and compared with the LifeScan SureStepFlexx (the current point-of-care testing meter). Methods: Specimen volume variation, within-run imprecision, lot-to-lot bias, bias relative to a plasma hexokinase assay, and analytical interferences likely to be encountered in hospitalized patients were studied. Results: Strip dosing did not affect the StatStrip meter but did affect the SureStepFlexx at 5- and 50-µL specimen volumes. Within-run precision for each glucose meter was less than 5% at 39 to 47 mg/dL of glucose, less than 1.7% at 215 to 265 mg/dL, and less than 2.6% at 370 to 470 mg/dL. Improper coding resulted in erroneous measurements on the SureStepFlexx. Each meter was compared with the Dade RxL hexokinase plasma reference method, giving the following correlation equations: StatStrip = 1.015 (hexokinase) − 1.412 (r^2 = 0.996); SureStepFlexx = 0.889 (hexokinase) + 8.865 (r^2 = 0.989). At [glucose] of 55 mg/dL, ascorbic acid interfered with the SureStepFlexx but did not affect the StatStrip. Hematocrit also affected the correlation of whole blood glucose on the SureStepFlexx to the plasma hexokinase reference glucose but did not affect the StatStrip meter. Conclusions: These studies suggest that the new StatStrip meter may be more accurate and precise (elimination of hematocrit effect and electrochemical interferences, with no error because of strip dosing or calibration) than the SureStepFlexx meter. This reduction in total error may help achieve better glycemic control in hospitalized patients.
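The reported correlation equations can be applied directly to see how each meter is expected to deviate from the plasma hexokinase reference. A minimal sketch, using only the slopes and intercepts quoted in the abstract; the function names are illustrative, not from the study:

```python
# Sketch: expected meter readings and bias vs. the hexokinase reference,
# using the regression equations quoted in the abstract. Function names
# are illustrative, not part of the original study.

def statstrip_expected(hexokinase_mg_dl):
    """Expected StatStrip reading: 1.015 * hexokinase - 1.412."""
    return 1.015 * hexokinase_mg_dl - 1.412

def surestep_expected(hexokinase_mg_dl):
    """Expected SureStepFlexx reading: 0.889 * hexokinase + 8.865."""
    return 0.889 * hexokinase_mg_dl + 8.865

plasma = 100.0  # mg/dL reference (hexokinase) glucose
for name, fn in [("StatStrip", statstrip_expected),
                 ("SureStepFlexx", surestep_expected)]:
    expected = fn(plasma)
    bias = expected - plasma
    print(f"{name}: expected {expected:.1f} mg/dL, bias {bias:+.1f} mg/dL")
```

At a reference value of 100 mg/dL, the StatStrip equation predicts a near-zero bias while the SureStepFlexx equation predicts a bias of roughly −2 mg/dL, consistent with the abstract's conclusion about relative accuracy.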

    Evaluation of the impact of hematocrit and other interference on the accuracy of hospital-based glucose meters

    Background: Most glucose meter comparisons to date have focused on performance specifications likely to impact subcutaneous dosing of insulin. We evaluated four hospital-based glucose meter technologies for accuracy, precision, and analytical interferences likely to be encountered in critically ill patients, with the goal of identifying and discriminating glucose meter performance specifications likely to impact intensive intravenous insulin dosing. Methods: Precision, both within-run and day-to-day, was evaluated on all four glucose meters. Accuracy (bias) of the meters and analytical interference were evaluated by comparing results obtained on whole blood specimens to plasma samples obtained from these whole blood specimens run on a hexokinase reference method. Results: Precision was acceptable and differed little between meters. There were significant differences in the degree to which the meters correlated with the reference hexokinase method. Ascorbic acid showed significant interference with three of the four meters. Hematocrit also affected the correlation between whole blood and plasma hexokinase glucose on three of the four glucose meters tested, with the magnitude of this interference also varying by glucose meter technology. Conclusions: Correlation to plasma hexokinase values and hematocrit interference are the main variables that differentiate glucose meters. Meters that correlate with plasma glucose measured by a reference method over a wide range of glucose concentrations and minimize the effects of hematocrit will allow better glycemic control for critically ill patients.

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    Extraction of the Muon Signals Recorded with the Surface Detector of the Pierre Auger Observatory Using Recurrent Neural Networks

    We present a method based on the use of Recurrent Neural Networks to extract the muon component from the time traces registered with water-Cherenkov detector (WCD) stations of the Surface Detector of the Pierre Auger Observatory. The design of the WCDs does not make it possible to separate the contribution of muons to the time traces from that of photons, electrons, and positrons for all events. Separating the muon and electromagnetic components is crucial for determining the nature of the primary cosmic rays and the properties of hadronic interactions at ultra-high energies. We trained a neural network to extract the muon and the electromagnetic components from the WCD traces using a large set of simulated air showers, with around 450 000 simulated events. For training and evaluating the performance of the neural network, simulated events with energies between 10^18.5 eV and 10^20 eV and zenith angles below 60 degrees were used. We also study the performance of this method on experimental data of the Pierre Auger Observatory and show that our predicted muon lateral distributions agree with the parameterizations obtained by the AGASA collaboration.
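The core idea of the method described above is a recurrent cell that consumes a time trace bin by bin and emits a per-bin estimate of one signal component. A minimal sketch under stated assumptions, not the Auger collaboration's actual network: the architecture, weights, and shapes here are purely illustrative.

```python
import numpy as np

# Sketch (not the Auger model): an Elman-style recurrent cell mapping a
# 1-D WCD-like time trace to one predicted component value per time bin.
# Weights are random placeholders; a real model would be trained on
# simulated air showers as described in the abstract.

rng = np.random.default_rng(0)
n_hidden = 8

W_in = rng.normal(scale=0.1, size=(n_hidden,))          # input -> hidden
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden
W_out = rng.normal(scale=0.1, size=(n_hidden,))         # hidden -> output

def rnn_component(trace):
    """Run the recurrent cell over a time trace; return one output per bin."""
    h = np.zeros(n_hidden)
    out = []
    for x in trace:
        h = np.tanh(W_in * x + W_h @ h)  # recurrent state update
        out.append(float(W_out @ h))      # per-bin component estimate
    return np.array(out)

trace = np.exp(-np.arange(50) / 10.0)  # toy decaying signal, 50 time bins
muon_pred = rnn_component(trace)
print(muon_pred.shape)  # one prediction per time bin
```

The recurrent state lets the per-bin prediction depend on the shape of the trace so far, which is what makes an RNN a natural fit for separating overlapping signal components in a sampled time series.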

    Performance of the 433 m surface array of the Pierre Auger Observatory


    Multiple Scenario Generation of Subsurface Models: Consistent Integration of Information from Geophysical and Geological Data through Combination of Probabilistic Inverse Problem Theory and Geostatistics

    Neutrinos with energies above 10^17 eV are detectable with the Surface Detector Array of the Pierre Auger Observatory. The identification is efficiently performed for neutrinos of all flavors interacting in the atmosphere at large zenith angles, as well as for Earth-skimming τ neutrinos with nearly tangential trajectories relative to the Earth. No neutrino candidates were found in ∼14.7 years of data taken up to 31 August 2018. This leads to restrictive upper bounds on their flux. The 90% C.L. single-flavor limit to the diffuse flux of ultra-high-energy neutrinos with an E_ν^-2 spectrum in the energy range 1.0 × 10^17 eV to 2.5 × 10^19 eV is E^2 dN_ν/dE_ν < 4.4 × 10^-9 GeV cm^-2 s^-1 sr^-1, placing strong constraints on several models of neutrino production at EeV energies and on the properties of the sources of ultra-high-energy cosmic rays.
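For an assumed E^-2 spectrum, the quoted limit on E^2 dN/dE translates directly into a differential flux limit at any energy in the quoted range. A small arithmetic sketch of that conversion, using only the numbers stated in the abstract:

```python
# Sketch: converting the quoted single-flavor limit
#   E^2 dN/dE < 4.4e-9 GeV cm^-2 s^-1 sr^-1
# into a differential limit dN/dE at a chosen energy, assuming the
# E^-2 spectrum stated in the abstract.

LIMIT_E2_FLUX = 4.4e-9  # GeV cm^-2 s^-1 sr^-1

def dnde_limit(energy_gev):
    """Differential flux limit dN/dE in GeV^-1 cm^-2 s^-1 sr^-1."""
    return LIMIT_E2_FLUX / energy_gev ** 2

e_eev = 1.0e9  # 1 EeV expressed in GeV, inside the quoted energy range
print(f"dN/dE < {dnde_limit(e_eev):.1e} GeV^-1 cm^-2 s^-1 sr^-1 at 1 EeV")
```

Because the bound is stated on E^2 dN/dE, the differential limit falls as 1/E^2: it is tightest at the highest energies probed.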

    The Dynamics of Fear at the Time of COVID-19: A Contextual Behavioral Science Perspective

    COVID-19 is the disease caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), transmitted via close contact between persons. On March 12th, 2020, in view of its worldwide escalation, the WHO declared the COVID-19 outbreak a pandemic. As the pandemic spreads, a parallel outbreak of fear and worry is also spreading. We react to fear symbolically, by arbitrarily relating it to other objects and events through derived verbal relations, so language may alter the way we experience events and consequently affects how we are functionally or dysfunctionally oriented to the world around us. In this paper we outline the different human learning processes connected to fear responding, from the simplest type to the more complex cognitive ones, approaching them from the point of view of contextual behavioral science, a modern form of behavioral thinking. We outline a model of intervention to foster psychological flexibility and more functional value-based actions. We argue that in a pandemic and in the post-pandemic phase this could be key to adapting to new and changed circumstances.