
    A simulation study of the winter bed crisis

    The winter bed crisis is a cyclical phenomenon that appears in British hospitals every year, two or three weeks after Christmas. The crisis is usually attributed to factors such as bad weather, influenza, older people, geriatricians, lack of cash or nurse shortages. However, a possible alternative explanation is that beds within the hospital are blocked because of a lack of social services for discharging hospital patients during the Christmas period. Adopting this explanation of why the bed crisis occurs, the problem was formulated as a queuing system and discrete event simulation was employed to evaluate the model numerically. The model shows that stopping discharges of rehabilitating patients for 21 days, accompanied by a cessation of planned admissions for 14 days, precipitates a bed crisis when the planned admissions recommence. The extensive 'what-if' capabilities of such models could prove crucial to the design and implementation of possible solutions to the problem.
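    To make the queuing formulation concrete, the following is a minimal, purely illustrative discrete-event sketch in Python (not the paper's model): the bed pool size, arrival rates, length-of-stay distribution and the exact freeze windows are all assumed for illustration. It mimics the mechanism described, deferring discharges for 21 days and suspending planned admissions for 14, and reports the days on which an arriving patient finds no free bed.

```python
# Illustrative sketch only: a toy event-driven bed model with assumed rates.
import heapq
import random

random.seed(1)

BEDS = 200                      # assumed bed pool
DISCHARGE_FREEZE = (350, 371)   # 21-day freeze on discharges (simulation days)
PLANNED_FREEZE = (350, 364)     # 14-day freeze on planned admissions

def run(horizon=450):
    events = []                 # (time, kind) min-heap of future events
    # schedule arrivals: ~12 emergency admissions per day, plus 10 planned
    # admissions per day outside the planned-admission freeze (assumed figures)
    for day in range(horizon):
        n_planned = 0 if PLANNED_FREEZE[0] <= day < PLANNED_FREEZE[1] else 10
        for _ in range(random.randint(8, 16) + n_planned):
            heapq.heappush(events, (day + random.random(), "arrive"))
    occupied, crisis_days = 0, set()
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrive":
            if occupied < BEDS:
                occupied += 1
                d = t + random.expovariate(1 / 8.0)   # assumed mean LOS of 8 days
                if DISCHARGE_FREEZE[0] <= d < DISCHARGE_FREEZE[1]:
                    d = DISCHARGE_FREEZE[1]           # discharge deferred past the freeze
                heapq.heappush(events, (d, "discharge"))
            else:
                crisis_days.add(int(t))               # admission blocked: no free bed
        else:
            occupied -= 1
    return sorted(crisis_days)

if __name__ == "__main__":
    print("days on which admissions were blocked:", run())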

    An OLAP-enabled software environment for modelling patient flow

    On-Line Analytical Processing (OLAP) tools use multidimensional views to provide quick access to information. They have become the de facto standard in the business world for analytical databases. In health care, caregivers and managers could benefit from being able to perform interactive data exploration and ad-hoc analysis, and possibly discover hidden trends and patterns in health data. However, health data have unique characteristics that distinguish them from common business examples, which makes the direct adaptation of already established business-oriented solutions difficult. In this paper we report the development of an OLAP system for analyzing hospital discharge data and for modeling hospital length of stay.
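    As a hedged illustration of the kind of multidimensional view described above (not the system reported in the paper), the pandas sketch below rolls length-of-stay figures up along two assumed dimensions of a discharge record; the column names and values are invented for the example.

```python
# Illustrative only: an OLAP-style roll-up over toy hospital discharge data.
import pandas as pd

discharges = pd.DataFrame({
    "specialty":        ["geriatrics", "geriatrics", "surgery", "surgery", "medicine"],
    "admission_method": ["emergency", "elective", "elective", "emergency", "emergency"],
    "age_band":         ["75+", "75+", "18-64", "65-74", "65-74"],
    "los_days":         [21, 9, 3, 7, 5],
})

# Aggregate length of stay along two dimensions; margins add grand totals,
# mimicking one slice of an OLAP cube.
cube = pd.pivot_table(discharges,
                      values="los_days",
                      index="specialty",
                      columns="admission_method",
                      aggfunc=["mean", "count"],
                      margins=True)
print(cube)
```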

    Stroboscopic back-action evasion in a dense alkali-metal vapor

    We experimentally explore quantum non-demolition (QND) measurements of atomic spin in a hot potassium vapor in the presence of spin-exchange relaxation. We demonstrate a new technique for back-action evasion by stroboscopic modulation of the probe light. With this technique we study spin noise as a function of polarization for atoms with spin greater than 1/2 and obtain good agreement with a simple theoretical model. We point out that in a system with fast spin exchange, where the spin relaxation rate changes with time, it is possible to improve the long-term sensitivity of atomic magnetometry by using QND measurements.
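    As a purely classical, illustrative aside (not a model of the experiment): probing stroboscopically at twice the Larmor frequency means the probe repeatedly samples the same transverse spin quadrature, which is the timing idea behind the back-action evasion scheme. The short NumPy sketch below only illustrates that sampling pattern; the frequency and units are arbitrary assumptions.

```python
# Illustration of the sampling pattern only: a classically precessing transverse
# spin component probed continuously versus stroboscopically at 2 * f_Larmor.
import numpy as np

f_larmor = 1.0                                   # arbitrary units
t_cont = np.linspace(0, 5, 501)                  # continuous probing times
t_strobe = np.arange(0, 5, 1 / (2 * f_larmor))   # strobe at twice the Larmor frequency

sx_cont = np.cos(2 * np.pi * f_larmor * t_cont)
sx_strobe = np.cos(2 * np.pi * f_larmor * t_strobe)

print("continuous probe sees |Sx| ranging over",
      (np.abs(sx_cont).min().round(3), np.abs(sx_cont).max().round(3)))
print("stroboscopic probe always sees |Sx| =", np.unique(np.abs(sx_strobe).round(6)))
```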

    A data warehouse environment for storing and analyzing simulation output data

    Discrete event simulation modelling has been extensively used in modelling complex systems. Although it offers great conceptual-modelling flexibility, it is both computationally expensive and data intensive. There are several examples of simulation models that generate millions of observations to achieve satisfactory point and confidence interval estimates for the model variables. In these cases, it is exceptionally cumbersome to conduct the required output and sensitivity analysis in a spreadsheet or statistical package. In this paper, we highlight the advantages of employing data warehousing techniques for storing and analyzing simulation output data. The proposed data warehouse environment is capable of providing the means for automating the necessary algorithms and procedures for estimating different parameters of the simulation. These include the initial transient in steady-state simulations and point and confidence interval estimation. Previously developed models for evaluating patient flow through hospital departments are used to demonstrate the problem and the proposed solutions.
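    A minimal sketch of the idea, assuming a toy star-style schema (one fact table of observations, one run dimension) and pushing the summary work into SQL rather than a spreadsheet; the table and column names, warm-up cut-off and data are illustrative assumptions, not the environment described in the paper.

```python
# Illustrative only: persist simulation output in a small star-style schema.
import sqlite3
import random

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_run (run_id INTEGER PRIMARY KEY, scenario TEXT, replication INTEGER);
CREATE TABLE fact_obs (run_id INTEGER, sim_time REAL, los_days REAL,
                       FOREIGN KEY (run_id) REFERENCES dim_run(run_id));
""")

random.seed(0)
for run_id, (scenario, rep) in enumerate([("baseline", 1), ("baseline", 2)], start=1):
    conn.execute("INSERT INTO dim_run VALUES (?, ?, ?)", (run_id, scenario, rep))
    conn.executemany(
        "INSERT INTO fact_obs VALUES (?, ?, ?)",
        [(run_id, t, random.expovariate(1 / 6.0)) for t in range(1000)],
    )

# Per-scenario point estimate; the warm-up cut (sim_time >= 100) stands in for
# deleting the initial transient in a steady-state run.
for row in conn.execute("""
    SELECT d.scenario, COUNT(*), AVG(f.los_days)
    FROM fact_obs f JOIN dim_run d USING (run_id)
    WHERE f.sim_time >= 100
    GROUP BY d.scenario
"""):
    print(row)
```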

    Analysis of stopping criteria for the EM algorithm in the context of patient grouping according to length of stay

    The expectation maximisation (EM) algorithm is an iterative maximum likelihood procedure often used for estimating the parameters of a mixture model. Theoretically, increases in the likelihood function are guaranteed as the algorithm iteratively improves upon previously derived parameter estimates. The algorithm is considered to have converged when all parameter estimates become stable and no further improvement can be made to the likelihood value. However, to reduce computational time, it is common practice to stop the algorithm before complete convergence using heuristic criteria. In this paper, we consider various stopping criteria and evaluate their effect on fitting Gaussian mixture models (GMMs) to patient length of stay (LOS) data. Although a GMM can be successfully fitted to positively skewed data such as LOS, the fitting procedure often requires many iterations of the EM algorithm. To our knowledge, no previous study has evaluated the effect of different stopping criteria on fitting GMMs to skewed distributions. Hence, the aim of this paper is to evaluate the effect of various stopping criteria in order to select and justify their use within a patient spell classification methodology. Results illustrate that criteria based on the change in the likelihood value and in the GMM parameters may not always be a good indicator for stopping the algorithm. In fact, we show that the change in the variance parameters should be used instead, as these parameters are the last to stabilise. In addition, we specify threshold values for the other stopping criteria.
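    The contrast between stopping criteria can be made concrete with a small, self-contained EM loop for a one-dimensional, two-component GMM (illustrative only, not the paper's implementation). The synthetic LOS-like data, starting values and thresholds are assumptions; the loop simply tracks the per-iteration change in log-likelihood, means and variances side by side.

```python
# Illustrative EM loop for a 1-D two-component Gaussian mixture.
import numpy as np

rng = np.random.default_rng(0)
# positively skewed stand-in for LOS data: a short-stay and a long-stay group
x = np.concatenate([rng.normal(3, 1, 800), rng.normal(12, 4, 200)])

w, mu, var = np.array([0.5, 0.5]), np.array([2.0, 10.0]), np.array([1.0, 1.0])
prev_ll = -np.inf
for it in range(500):
    # E-step: responsibilities under the current Gaussian components
    dens = (w / np.sqrt(2 * np.pi * var)) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means and variances
    nk = resp.sum(axis=0)
    new_mu = (resp * x[:, None]).sum(axis=0) / nk
    new_var = (resp * (x[:, None] - new_mu) ** 2).sum(axis=0) / nk
    new_w = nk / len(x)
    ll = np.log(dens.sum(axis=1)).sum()
    # the three candidate stopping quantities, monitored side by side
    d_ll, d_mu, d_var = ll - prev_ll, np.abs(new_mu - mu).max(), np.abs(new_var - var).max()
    w, mu, var, prev_ll = new_w, new_mu, new_var, ll
    if it % 10 == 0:
        print(f"iter {it:3d}  d_loglik={d_ll:.2e}  d_mu={d_mu:.2e}  d_var={d_var:.2e}")
    if d_var < 1e-6:    # variance-based criterion; the threshold is an assumed value
        break
```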

    Sylvatic Dengue Virus Type 2 Activity in Humans, Nigeria, 1966

    Using phylogenetic analysis of complete virus genomes from human isolates obtained in Nigeria in 1966, we identified sylvatic dengue virus (DENV) strains from 3 febrile patients. This finding extends current understanding of the role of sylvatic DENV in febrile disease and documents another focus of sylvatic DENV transmission in West Africa.

    Data warehouses-TOLAP-decision making

    Data warehouses (DWH) have been established as the core of decision support systems. On top of a DWH, different applications can be realised with regard to conventional reporting. On-Line Analytical Processing (OLAP) has reached maturity as an interactive and explorative way of analysing DWH data. However, DWHs are mostly organised as snapshot databases. For this reason, important questions like "how many times have products of a specific brand been sold in the past?" cannot be answered successfully; in order to control the success of reshuffling the product range it is necessary to compare the sales of "old" and "new" products. The same applies in cases where the seasonality of a particular range of products has to be analysed. On the other hand, temporal databases allow a valid time to be assigned to data. In this manner, a past state can be reconstructed during retrieval. In this paper, we address the integration of DWH and OLAP with temporal database semantics.
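    As a small hedged sketch of the valid-time idea (not the design proposed in the paper), the snippet below attaches a validity interval to each product-range row so that the assortment "as of" a past date can be reconstructed and its sales aggregated; the tables, columns and figures are invented for the example.

```python
# Illustrative only: reconstructing a past product assortment via valid time.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE product (sku TEXT, brand TEXT, valid_from TEXT, valid_to TEXT);
CREATE TABLE sale    (sku TEXT, sale_date TEXT, qty INTEGER);
""")
conn.executemany("INSERT INTO product VALUES (?,?,?,?)", [
    ("P1", "Alpha", "2019-01-01", "2020-06-30"),   # delisted product
    ("P2", "Alpha", "2019-01-01", "9999-12-31"),   # still in the range
    ("P3", "Beta",  "2020-07-01", "9999-12-31"),   # replacement product
])
conn.executemany("INSERT INTO sale VALUES (?,?,?)", [
    ("P1", "2019-12-05", 4), ("P2", "2021-03-02", 7), ("P3", "2021-03-02", 5),
])

AS_OF = "2019-12-31"
# Brand-level sales for the assortment that was valid on AS_OF, i.e. a
# reconstructed past state; rerunning with a later AS_OF gives the "new"
# assortment for comparison.
for row in conn.execute("""
    SELECT p.brand, SUM(s.qty)
    FROM product p JOIN sale s USING (sku)
    WHERE p.valid_from <= ? AND ? <= p.valid_to
    GROUP BY p.brand
""", (AS_OF, AS_OF)):
    print(row)
```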

    Improving sensor network performance with wireless energy transfer

    Recent technological advances in wireless energy transmission have given rise to Wireless Rechargeable Sensor Networks. In this new paradigm for wireless sensor networks, a mobile entity called the Mobile Charger (MC) traverses the network and replenishes the dissipated energy of sensors. In this work we first provide a formal definition of the charging dispatch decision problem and prove its computational hardness. We then investigate how to optimise the trade-offs of several critical aspects of the charging process, such as: a) the trajectory of the charger; b) the different charging policies; and c) the impact of the ratio of the energy the Mobile Charger may deliver to the sensors over the total available energy in the network. In the light of these optimisations, we then study the impact of the charging process on network lifetime for three characteristic underlying routing protocols: a greedy protocol, a clustering protocol and an energy-balancing protocol. Finally, we propose a mobile charging protocol that locally adapts the circular trajectory of the MC to the energy dissipation rate of each sub-region of the network. We compare this protocol against several MC trajectories for all three routing families in a detailed experimental evaluation. The findings demonstrate significant performance gains with respect both to the no-charger case and to the different charging alternatives; in particular, the improvements cover network lifetime as well as connectivity, coverage and energy balance properties.
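    For illustration only (this is not the authors' protocol), the toy loop below implements one simple charging policy of the kind discussed above: a Mobile Charger with a finite energy budget repeatedly travels to the sensor with the lowest residual energy and tops it up, paying a distance-proportional travel cost. Every parameter is an assumption.

```python
# Illustrative only: a greedy "visit the weakest sensor" charging policy.
import math
import random

random.seed(2)
sensors = [{"pos": (random.random(), random.random()),
            "energy": random.uniform(2.0, 5.0)} for _ in range(20)]

mc_pos, mc_budget = (0.5, 0.5), 30.0   # assumed charger start point and energy budget
TRAVEL_COST, TOPUP = 1.0, 3.0          # energy per unit distance, energy delivered per visit

while mc_budget > 0:
    # greedy policy: head for the sensor with the least residual energy
    target = min(sensors, key=lambda s: s["energy"])
    cost = TRAVEL_COST * math.dist(mc_pos, target["pos"]) + TOPUP
    if cost > mc_budget:
        break
    mc_budget -= cost
    target["energy"] += TOPUP
    mc_pos = target["pos"]

print("minimum residual sensor energy after the charging tour:",
      round(min(s["energy"] for s in sensors), 2))
```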

    A Demand and Capacity Model For Home-Based Intermediate Care: Optimizing The ‘Step Down’ Pathway

    This is the author accepted manuscript; the final version is available from IEEE via the DOI in this record. Intermediate care supports timely discharge from hospital for patients with complex healthcare needs. The purpose of 'step-down' care is to enable patients to leave hospital as soon as they are medically fit, avoiding costly discharge delays and the consequent risks to patient health and wellbeing. Determining optimal intermediate care capacity requires balancing costs to both acute hospital and community care providers. Too much community capacity results in underutilized resources and poor economic efficiency, while too little risks excessive hospital discharge delays. Application of discrete-time simulation shows that total costs across the acute-community interface can be minimized by identifying the optimal community capacity in terms of the maximum number of patients for whom home visits can be provided by the service. To our knowledge, this is the first simulation study to model the patient pathway from hospital discharge through to community visits. Simulation modeling has supported short-term resource planning in a major English healthcare system. Funded by Health Data Research UK.
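    A hedged, discrete-time sketch of the cost trade-off described above (not the authors' model): patients become medically fit each day, wait in an acute bed until a community home-visit slot is free, and the objective sums an assumed daily delay cost and an assumed cost per slot of community capacity. All figures are illustrative.

```python
# Illustrative only: toy discrete-time model of the acute-community interface.
import random

random.seed(3)

def total_cost(capacity, days=365, delay_cost=300.0, slot_cost=50.0):
    """Annual cost for a given community caseload cap (all unit costs assumed)."""
    in_service, queue, cost = [], 0, 0.0
    for _ in range(days):
        # one day of home visits elapses for everyone currently in the service
        in_service = [d - 1 for d in in_service if d > 1]
        # patients declared medically fit for discharge today (assumed 2-6 per day)
        queue += random.randint(2, 6)
        # start new home-visit episodes while community slots remain
        while queue and len(in_service) < capacity:
            queue -= 1
            in_service.append(random.randint(5, 15))   # assumed episode length (days)
        # delayed discharges occupy acute beds; capacity is paid for regardless
        cost += queue * delay_cost + capacity * slot_cost
    return cost

# sweep the caseload cap to look for the cost-minimising community capacity
for cap in range(30, 71, 10):
    print(cap, round(total_cost(cap)))
```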