
    Visualizing Magnitude: Graphical Number Representations Help Users Detect Large Number Entry Errors

    Get PDF
    Nurses frequently have to program infusion pumps to deliver a prescribed quantity of drug over time. Occasional errors are made in this routine number entry task, resulting in patients receiving an incorrect dose of a drug. While many of these number entry errors are inconsequential, others are not; infusing 100 ml of a drug instead of 10 ml can be fatal. This paper investigates whether a supplementary graphical number representation, depicting the magnitude of a number, can help people detect number entry errors. An experiment was conducted in which 48 participants had to enter numbers from a ‘prescription sheet’ into a computer interface using a keyboard. The graphical representation was supplementary and was shown both on the ‘prescription sheet’ and on the device interface. Results show that while more errors were made overall when the graphical representation was visible, the representation helped participants detect larger number entry errors (i.e., those that were out by at least an order of magnitude). This work suggests that a graphical number entry system that visualizes the magnitude of a number can help people detect serious number entry errors.
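
    The paper does not specify the exact visual encoding, but one minimal way to depict magnitude so that an order-of-magnitude slip stands out is a bar whose length grows with log10 of the entered value. The sketch below is purely illustrative; the function name, the logarithmic scaling and the 40-character width are assumptions, not details taken from the study.

```python
import math

def magnitude_bar(value: float, max_value: float = 10_000.0, width: int = 40) -> str:
    """Render a text bar whose length grows with log10(value).

    An entry that is out by an order of magnitude (e.g. 100 ml instead of
    10 ml) produces a visibly longer bar, which is the kind of supplementary
    cue the study describes.
    """
    if value <= 0:
        return "|" + " " * width + f"| {value}"
    # Map log10(value) onto the bar width and clamp to [1, width].
    filled = round(width * min(math.log10(value) / math.log10(max_value), 1.0))
    filled = max(filled, 1)
    return "|" + "#" * filled + " " * (width - filled) + f"| {value} ml"

# A correct 10 ml entry next to a mistyped 100 ml entry:
# the erroneous bar is twice as long, making the slip easy to spot.
print(magnitude_bar(10))
print(magnitude_bar(100))
```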

    Neutron Stars in Teleparallel Gravity

    Full text link
    In this paper we deal with neutron stars, described by a perfect fluid model, in the context of the teleparallel equivalent of general relativity. We use numerical simulations to find the relationship between the angular momentum of the field and the angular momentum of the source. Such a relation is established for each stable star reached by the numerical simulation once the code is fed with an equation of state, the central energy density and the ratio between the polar and equatorial radii. We also find a regime in which a linear relation between the gravitational angular momentum and the moment of inertia (as well as the angular velocity of the fluid) holds. We give the spatial distribution of the gravitational energy and show that it depends linearly on the squared angular velocity of the source. Comment: 19 pages, 14 figures. arXiv admin note: text overlap with arXiv:1206.331
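
    In the linear regime described above, the reported relations can be written schematically as below, where I is the moment of inertia and Ω the angular velocity of the fluid; the symbols and the proportionality coefficients are illustrative notation only, not expressions taken from the paper.

```latex
L_{\mathrm{grav}} \;\approx\; \alpha\, I \;\approx\; \beta\, \Omega ,
\qquad
E_{\mathrm{grav}} \;\approx\; E_{0} \;+\; \gamma\, \Omega^{2} .
```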

    Tamari Lattices and the symmetric Thompson monoid

    Full text link
    We investigate the connection between Tamari lattices and the Thompson group F, summarized in the fact that F is a group of fractions for a certain monoid F+sym whose Cayley graph includes all Tamari lattices. Under this correspondence, the Tamari lattice operations are the counterparts of the least common multiple and greatest common divisor operations in F+sym. As an application, we show that, for every n, there exists a length-l chain in the nth Tamari lattice whose endpoints are at distance at most 12l/n. Comment: 35 pages
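
    The group-of-fractions machinery is beyond a short example, but the underlying combinatorial object is easy to generate. The sketch below is illustrative and not from the paper: it enumerates the binary trees with n internal nodes and, under one common convention, the covering relation of the nth Tamari lattice, given by a single rotation ((A,B),C) → (A,(B,C)).

```python
# Binary trees with n internal nodes, encoded as nested tuples:
# a leaf is None, an internal node is (left, right).
def trees(n):
    if n == 0:
        return [None]
    out = []
    for k in range(n):
        for left in trees(k):
            for right in trees(n - 1 - k):
                out.append((left, right))
    return out

def covers(t):
    """Trees obtained from t by one rotation ((A,B),C) -> (A,(B,C)) applied at
    any node; these are the upper covers of t in the Tamari order under the
    convention where this rotation moves upward."""
    if t is None:
        return
    left, right = t
    if left is not None:
        a, b = left
        yield (a, (b, right))          # rotate at the root
    for l2 in covers(left):            # rotate somewhere inside the left subtree
        yield (l2, right)
    for r2 in covers(right):           # rotate somewhere inside the right subtree
        yield (left, r2)

n = 3
ts = trees(n)
print(len(ts), "trees (the Catalan number C_3 = 5)")
for t in ts:
    print(t, "->", list(covers(t)))
```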

    A phase III, multi-centre, double-masked randomised controlled trial of adjunctive intraocular and peri-ocular steroid (triamcinolone acetonide) versus standard treatment in eyes undergoing vitreoretinal surgery for open globe trauma (ASCOT): statistical analysis plan.

    Get PDF
    BACKGROUND: Open globe ocular trauma complicated by intraocular scarring (proliferative vitreoretinopathy) is a relatively rare, blinding, but potentially treatable condition for which, at present, surgery is often unsatisfactory and visual results are frequently poor. To date, no pharmacological adjuncts to surgery have been proven to be effective. The aim of the Adjunctive Steroid Combination in Ocular Trauma (ASCOT) randomised controlled trial is to determine whether adjunctive steroid (triamcinolone acetonide), given at the time of surgery, can improve the outcome of vitreoretinal surgery in patients with open globe ocular trauma. This article presents the statistical analysis plan for the main publication, as approved and signed off by the Trial Steering Committee prior to the first data extraction for the Data Monitoring Committee meeting report. METHODS/DESIGN: ASCOT is a pragmatic, multi-centre, parallel-group, double-masked randomised controlled trial. The aim of the study is to recruit from 20–25 centres in the United Kingdom and randomise 300 eyes (from 300 patients) into two treatment arms. Both groups will receive standard surgical treatment and care; the intervention arm will additionally receive a pre-operative combination of triamcinolone acetonide: 4 mg/0.1 ml into the vitreous cavity and 40 mg/1 ml sub-Tenon's. Participants will be followed for 6 months post-surgery. The primary outcome is the proportion of patients achieving a clinically meaningful improvement in visual acuity in the study eye at 6 months after initial surgery, defined as an improvement of 10 letters in the ETDRS score (the standard scale for testing visual acuity). TRIAL REGISTRATION: ISRCTN30012492. Registered on 5 September 2014. EudraCT2014-002193-37. Registered on 5 September 2014.

    Is the even distribution of insecticide-treated cattle essential for tsetse control? Modelling the impact of baits in heterogeneous environments

    Get PDF
    Background: Eliminating Rhodesian sleeping sickness, the zoonotic form of Human African Trypanosomiasis, can be achieved only through interventions against the vectors, species of tsetse (Glossina). The use of insecticide-treated cattle is the most cost-effective method of controlling tsetse, but its impact might be compromised by the patchy distribution of livestock. A deterministic simulation model was used to analyse the effects of spatial heterogeneities in habitat and baits (insecticide-treated cattle and targets) on the distribution and abundance of tsetse. Methodology/Principal Findings: The simulated area comprised an operational block extending 32 km from an area of good habitat from which tsetse might invade. Within the operational block, habitat comprised good areas mixed with poor ones where survival probabilities and population densities were lower. In good habitat, the natural daily mortalities of adults averaged 6.14% for males and 3.07% for females; the population grew 8.46-fold in a year following a 90% reduction in the densities of adults and pupae, but expired when the population density of males was reduced to <0.1/km²; daily movement of adults averaged 249 m for males and 367 m for females. Baits were placed throughout the operational area, or patchily to simulate uneven distributions of cattle and targets. Gaps of 2–3 km between baits were inconsequential provided the average imposed mortality per km² across the entire operational area was maintained. Leaving gaps 5–7 km wide inside an area where baits killed 10% per day delayed effective control by 4–11 years. Corrective measures that put a few baits within the gaps were more effective than deploying extra baits on the edges. Conclusions/Significance: The uneven distribution of cattle within settled areas is unlikely to compromise the impact of insecticide-treated cattle on tsetse. However, where cattle-free areas are more than 3 km wide, insecticide-treated targets should be deployed to compensate for the lack of cattle.
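
    The full model is spatial and age-structured; purely to illustrate the bait-mortality arithmetic, a single-patch daily update might look like the sketch below. The function and parameter names are invented; only the 8.46-fold annual growth and the 10% daily bait kill are taken from the abstract.

```python
# A much-simplified, single-patch daily model (not the paper's spatial,
# age-structured simulation). The growth rate r is chosen so that
# (1 + r)**365 ≈ 8.46, the annual recovery quoted above; bait_kill = 0.10
# corresponds to the 10% daily kill discussed in the abstract.
def simulate(days, r=0.00587, bait_kill=0.10, n0=1000.0):
    """Female population after `days` daily steps of growth and bait mortality."""
    n = n0
    for _ in range(days):
        n *= (1.0 + r) * (1.0 - bait_kill)
    return n

# With a 10%/day kill the population collapses within weeks;
# in an unbaited gap (bait_kill = 0) it keeps growing.
print(f"baited:   {simulate(60):8.2f}")
print(f"unbaited: {simulate(60, bait_kill=0.0):8.2f}")
```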

    Entropy production in phase field theories

    Full text link
    Allen-Cahn (Ginzburg-Landau) dynamics for scalar fields with heat conduction in rigid bodies is treated using a non-equilibrium thermodynamic framework with weakly nonlocal internal variables. The entropy production and the entropy flux are calculated with the classical method of irreversible thermodynamics by separating full divergences. Comment: 5 pages, no figures
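
    For orientation, the standard isothermal Allen-Cahn (Ginzburg-Landau) relaxation dynamics for a scalar field φ with free energy functional F is written below; the notation is illustrative and not taken from the paper, which derives the dynamics, together with the entropy production and flux, within its thermodynamic framework rather than postulating them.

```latex
% Free energy functional and relaxation dynamics for a scalar field \varphi
% (standard isothermal Allen-Cahn form; notation not taken from the paper):
F[\varphi] = \int \Big( f(\varphi) + \tfrac{\varepsilon^{2}}{2}\,\lvert \nabla \varphi \rvert^{2} \Big)\, dV ,
\qquad
\partial_t \varphi = -\Gamma\, \frac{\delta F}{\delta \varphi}
                   = \Gamma \big( \varepsilon^{2} \nabla^{2} \varphi - f'(\varphi) \big),
\quad \Gamma > 0 .
```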

    Emergency and on-demand health care: modelling a large complex system

    No full text
    This paper describes how system dynamics was used as a central part of a whole-system review of emergency and on-demand health care in Nottingham, England. Based on interviews with 30 key individuals across health and social care, a 'conceptual map' of the system was developed, showing potential patient pathways through the system. This was used to construct a stock-flow model, populated with current activity data, in order to simulate patient flows and to identify system bottlenecks. Without intervention, assuming current trends continue, Nottingham hospitals are unlikely to reach elective admission targets or achieve the government target of 82% bed occupancy. Admissions from general practice had the greatest influence on occupancy rates. Preventing a small number of emergency admissions among elderly patients had a substantial effect, reducing bed occupancy by 1% per annum over 5 years. Modelling indicated a range of undesirable outcomes associated with continued growth in demand for emergency care, but also considerable potential to intervene to alleviate these problems, in particular by increasing the care options available in the community.
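
    The paper's stock-flow model covers many pathways and is populated with local activity data; purely as an illustration of the basic stock-flow mechanics, a single-stock bed-occupancy update is sketched below. Every number and name in it is invented and is not taken from the Nottingham model.

```python
# A minimal stock-flow sketch of bed occupancy (illustrative only; the
# parameters below are made up, not taken from the Nottingham model).
def simulate_occupancy(days, beds=1000, occupied=820.0,
                       admissions_per_day=120.0, mean_stay_days=7.0):
    """Daily stock-flow update: inflow = admissions, outflow = discharges,
    with discharges proportional to the occupied-bed stock."""
    history = []
    for _ in range(days):
        discharges = occupied / mean_stay_days
        occupied = min(beds, max(0.0, occupied + admissions_per_day - discharges))
        history.append(occupied / beds)
    return history

occ = simulate_occupancy(30)
print(f"occupancy after 30 days: {occ[-1]:.1%}")
```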

    Checking and Enforcing Security through Opacity in Healthcare Applications

    Full text link
    The Internet of Things (IoT) is a paradigm that can revolutionize health care, benefiting hospitals, doctors and patients alike. In this context, protecting IoT systems in health care against interference, including service attacks and malware, is challenging. Opacity is a confidentiality property capturing a system's ability to keep a subset of its behavior hidden from passive observers. In this work, we introduce an IoT-based heart attack detection system that could be life-saving for patients without putting their privacy at risk, through the verification and enforcement of opacity. Our main contributions are the use of a tool to verify opacity in three of its forms, so as to detect privacy leaks in our system, and the development of an efficient Symbolic Observation Graph (SOG)-based algorithm for enforcing opacity.
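
    As a toy illustration of the basic notion only (current-state opacity checked on an explicit finite transition system, not the paper's SOG-based algorithm or its tooling), the sketch below builds the passive observer's state estimates and reports a leak when an estimate contains only secret states; the names and the example system are invented.

```python
from collections import deque

# trans maps (state, label) -> set of successor states; labels in
# `observable` are seen by the passive observer, the rest are hidden.
def is_current_state_opaque(trans, init, secret, observable):
    def unobservable_closure(states):
        """All states reachable from `states` via hidden labels only."""
        seen, stack = set(states), list(states)
        while stack:
            s = stack.pop()
            for (q, a), succs in trans.items():
                if q == s and a not in observable:
                    for t in succs - seen:
                        seen.add(t)
                        stack.append(t)
        return frozenset(seen)

    start = unobservable_closure({init})
    visited, queue = {start}, deque([start])
    while queue:
        estimate = queue.popleft()
        # The secret leaks if the observer is certain the system is in a secret state.
        if estimate <= secret:
            return False
        for a in observable:
            nxt = {t for s in estimate for t in trans.get((s, a), set())}
            if nxt:
                nxt = unobservable_closure(nxt)
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append(nxt)
    return True

# Toy example: state 2 is secret; after observing 'alert' the estimate is
# {2, 3}, so the observer cannot be sure -> the system is opaque (prints True).
trans = {(0, 'tau'): {1}, (0, 'alert'): {2}, (1, 'alert'): {2, 3}}
print(is_current_state_opaque(trans, init=0, secret={2}, observable={'alert'}))
```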

    Evaluation of Directive-Based GPU Programming Models on a Block Eigensolver with Consideration of Large Sparse Matrices

    Get PDF
    Achieving high performance and performance portability for large-scale scientific applications is a major challenge on heterogeneous computing systems such as many-core CPUs and accelerators like GPUs. In this work, we implement a widely used block eigensolver, Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG), using two popular directive-based programming models (OpenMP and OpenACC) for GPU-accelerated systems. Our work differs from existing work in that it adopts a holistic approach that optimizes the full solver performance rather than narrowing the problem to small kernels (e.g., SpMM, SpMV). Our LOBPCG GPU implementation achieves a 2.8×–4.3× speedup over an optimized CPU implementation when tested with four different input matrices. The evaluated configuration compared one Skylake CPU to one Skylake CPU plus one NVIDIA V100 GPU. Our OpenMP and OpenACC LOBPCG GPU implementations gave nearly identical performance. We also consider how to create an efficient LOBPCG solver that can solve problems larger than GPU memory capacity. To this end, we create microbenchmarks representing the two dominant kernels (inner product and SpMM) in LOBPCG and then evaluate performance when using two different programming approaches: tiling the kernels, and using Unified Memory with the original kernels. Our tiled SpMM implementation achieves a 2.9× and a 48.2× speedup over the Unified Memory implementation on supercomputers with PCIe Gen3 and NVLink 2.0 CPU-to-GPU interconnects, respectively.
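
    As a host-side illustration of the tiling strategy evaluated here (a Python sketch, not the authors' directive-based GPU code), the SpMM Y = A·X can be split into row tiles so that each tile, rather than the whole matrix, has to fit in device memory at any one time; the function name and tile size below are assumptions.

```python
import numpy as np
import scipy.sparse as sp

def tiled_spmm(A, X, rows_per_tile):
    """Compute Y = A @ X one row tile at a time. In a real GPU solver each
    A_tile (and the corresponding slice of Y) is what gets staged on the
    device, so the working set never exceeds one tile plus X."""
    Y = np.empty((A.shape[0], X.shape[1]), dtype=X.dtype)
    for start in range(0, A.shape[0], rows_per_tile):
        stop = min(start + rows_per_tile, A.shape[0])
        A_tile = A[start:stop, :]          # tile that would be copied to the GPU
        Y[start:stop, :] = A_tile @ X      # device SpMM in the real implementation
    return Y

A = sp.random(10_000, 10_000, density=1e-3, format="csr", dtype=np.float64)
X = np.random.rand(10_000, 16)             # block of 16 vectors, as in a block eigensolver
assert np.allclose(tiled_spmm(A, X, rows_per_tile=2_000), A @ X)
```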