807 research outputs found

    Forecasting and inventory control for hospital management

    This thesis was submitted for the degree of Doctor of Philosophy and was awarded by Brunel University. Economic stringencies have compelled Canadian hospitals to examine their administrative effectiveness critically. Improved supplies and inventory procedures adopted by leading industrial corporations suggest that hospitals might benefit from such systems. The lack of a profit incentive and the high ratio of wages to total expenses in hospitals have delayed the adoption of modern inventory management techniques. This study examined the economic status of Canadian hospitals and endeavoured to discover whether a computer-based inventory management system, incorporating short-term statistical demand forecasting, would be feasible and advantageous. Scientific forecasting for inventory management is not used by hospitals. The writer considered which technique would be best suited to their needs, taking account of benefits claimed by industrial users. Samples of demand data were subjected to a variety of simple forecasting methods, including moving averages, exponentially smoothed averages and the Box-Jenkins method. Comparisons were made in terms of the relative size of forecast errors, ease of data maintenance, and demands upon hospital clerical staff. The computer system, BRUFICH, facilitated scrutiny of the effect of each technique upon major components of the system. It is concluded that either of two methods would be appropriate: moving averages or double exponential smoothing. The latter, when combined with adaptive control through tracking signals, is easily incorporated within the total inventory system. It requires only a short run of data, tracks trend satisfactorily, and demands little operator intervention. The original system designed by this writer was adopted by the Hospital for Sick Children, Toronto, and has significantly improved their inventory management.
    Lakehead University and the Ministry of Health, Government of Ontario
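    The recommended combination of double exponential smoothing with a tracking signal can be sketched in a few lines. This is a generic illustration, not the BRUFICH implementation; the smoothing constant, the tracking-signal form (cumulative error over mean absolute deviation), and the demand figures are all illustrative.

```python
def double_exponential_forecast(demand, alpha=0.2):
    """Brown's double exponential smoothing: one-step-ahead forecasts."""
    s1 = s2 = demand[0]          # initialize both smoothed series
    forecasts = []
    for x in demand:
        # forecast before seeing x: level (2*s1 - s2) plus trend estimate
        forecasts.append(2 * s1 - s2 + (alpha / (1 - alpha)) * (s1 - s2))
        s1 = alpha * x + (1 - alpha) * s1        # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2       # second smoothing
    return forecasts

def tracking_signal(demand, forecasts):
    """Cumulative forecast error divided by mean absolute deviation."""
    errors = [d - f for d, f in zip(demand, forecasts)]
    mad = sum(abs(e) for e in errors) / len(errors)
    return sum(errors) / mad if mad else 0.0

demand = [120, 125, 123, 130, 128, 135, 140, 138, 145, 150]  # illustrative weekly issues
fc = double_exponential_forecast(demand)
ts = tracking_signal(demand, fc)
```

    A persistently large tracking signal indicates biased forecasts, which is the trigger for the adaptive control the thesis describes (e.g. resetting or retuning the smoothing constant).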

    Evaluation of sliding baseline methods for spatial estimation for cluster detection in the biosurveillance system

    Background: The Centers for Disease Control and Prevention's (CDC's) BioSense system provides near-real-time situational awareness for public health monitoring through analysis of electronic health data. Determination of anomalous spatial and temporal disease clusters is a crucial part of the daily disease monitoring task. Our study focused on finding useful anomalies at manageable alert rates according to available BioSense data history.
    Methods: The study dataset included more than 3 years of daily counts of military outpatient clinic visits for respiratory and rash syndrome groupings. We applied four spatial estimation methods in implementations of space-time scan statistics cross-checked in Matlab and C. We compared the utility of these methods according to the resultant background cluster rate (a false alarm surrogate) and sensitivity to injected cluster signals. The comparison runs used a spatial resolution based on the facility zip code in the patient record and a finer resolution based on the residence zip code.
    Results: Simple estimation methods that account for day-of-week (DOW) data patterns yielded a clear advantage both in background cluster rate and in signal sensitivity. A 28-day baseline gave the most robust results for this estimation; the preferred baseline is long enough to remove daily fluctuations but short enough to reflect recent disease trends and data representation. Background cluster rates were lower for the rash syndrome counts than for the respiratory counts, likely because of seasonality and the large scale of the respiratory counts.
    Conclusion: The spatial estimation method should be chosen according to characteristics of the selected data streams. In this dataset with strong day-of-week effects, the overall best detection performance was achieved using subregion averages over a 28-day baseline stratified by weekday or weekend/holiday behavior. Changing the estimation method for particular scenarios involving different spatial resolutions or other syndromes can yield further improvement.
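    The best-performing estimator, a stratified 28-day baseline, can be sketched as follows. The expected count for a target day is the mean of prior days in the same stratum (weekday versus weekend). The counts, dates, and the two-way weekday/weekend split are illustrative; the study additionally grouped holidays with weekends.

```python
from datetime import date, timedelta

def stratified_expectation(counts, target, baseline_days=28):
    """Expected count for `target`: mean of same-stratum days
    (weekday vs weekend) in the preceding `baseline_days` days.
    `counts` maps a date to its observed syndromic count."""
    weekend = target.weekday() >= 5
    window = [target - timedelta(days=k) for k in range(1, baseline_days + 1)]
    same = [counts[d] for d in window
            if d in counts and (d.weekday() >= 5) == weekend]
    return sum(same) / len(same)

# illustrative clinic-visit counts: weekdays ~50, weekends ~20
counts = {}
start = date(2006, 1, 2)  # a Monday
for k in range(28):
    d = start + timedelta(days=k)
    counts[d] = 20 if d.weekday() >= 5 else 50

expected = stratified_expectation(counts, start + timedelta(days=28))
```

    The observed count minus this expectation is then what feeds the space-time scan statistic, so a baseline that absorbs the day-of-week pattern directly lowers the background cluster rate.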

    Resilience of critical structures, infrastructure, and communities

    In recent years, the concept of resilience has been introduced to the field of engineering as it relates to disaster mitigation and management. However, the built environment is only one element that supports community functionality. Maintaining community functionality during and after a disaster, defined as resilience, is influenced by multiple components. This report summarizes the research activities of the first two years of an ongoing collaboration between the Politecnico di Torino and the University of California, Berkeley, in the field of disaster resilience. Chapter 1 focuses on the economic dimension of disaster resilience with an application to the San Francisco Bay Area; Chapter 2 analyzes the option of using base-isolation systems to improve the resilience of hospitals and school buildings; Chapter 3 investigates the possibility of adopting discrete event simulation models and a meta-model to measure the resilience of the emergency department of a hospital; Chapter 4 applies the meta-model developed in Chapter 3 to the hospital network in the San Francisco Bay Area, showing the potential of the model for design purposes; Chapter 5 uses a questionnaire combined with factorial analysis to evaluate the resilience of a hospital; Chapter 6 applies the concept of agent-based models to analyze the performance of socio-technical networks during an emergency, with two applications shown: a museum and a train station; Chapter 7 defines restoration fragility functions as tools to measure uncertainties in the restoration process; and Chapter 8 focuses on modeling infrastructure interdependencies using temporal networks at different spatial scales.

    Resilience of healthcare and education networks and their interactions following major earthquakes

    2021 Spring. Includes bibliographical references. Healthcare and education systems have been identified by various national and international organizations as the main pillars of communities' stability. Ensuring the continuation of vital community services such as healthcare and education is critical for minimizing social losses after extreme events. A shortage of healthcare services could have catastrophic short-term and long-term effects on a community, including an increase in morbidity and mortality, as well as population outmigration. Moreover, a shortage or lack of facilities for K-12 education, including elementary, middle, and high schools, could impact a wide range of the community's population and could lead to population outmigration. Despite their importance to communities, there is a lack of comprehensive models that can be used to quantify the recovery of functionality of healthcare systems and schools following natural disasters. In addition to capturing the recovery of functionality, understanding the correlation between these main social services institutions is critical to determining the welfare of communities following natural disasters. Although hospitals and schools are key indicators of the stability of community social services, no studies to date have been conducted to determine the level of interdependence between hospitals and schools and their collective influence on their recoveries following extreme events. In this study, comprehensive frameworks are devised for estimating the losses, functionality, and recovery of healthcare and educational services following earthquakes. Success trees and semi-Markov stochastic models coupled with dynamic optimization are used to develop socio-technical models that describe the functionality and restoration of the facilities providing these services, by integrating the physical infrastructure, the supplies, and the people who operate and use these facilities.
    New frameworks are proposed to simulate processes such as patient demand on hospitals, interactions among hospitals, student enrollment, and school administration, as well as different decisions and mitigation strategies applied by hospitals and schools, while considering the disturbance imposed by earthquake events on these processes. The complex interaction between healthcare and education networks is captured using a new agent-based model which has been developed in the context of the communities' physical, social, and economic sectors that affect overall recovery. This model is employed to simulate the functional processes within each facility while optimizing their recovery trajectories after earthquake occurrence. The results highlight significant interdependencies between hospitals and schools, including direct and indirect relationships, suggesting the need for collective coupling of their recovery to achieve full functionality of either of the two systems following natural disasters. Recognizing this high level of interdependence, a social services stability index is then established which can be used by policymakers and community leaders to quantify the impact of healthcare and educational services on community resilience and social services stability.
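    The semi-Markov idea behind such recovery models can be illustrated in miniature: a facility passes through damage states, spends a random sojourn time in each, and the recovery time is the sum of those sojourns, estimated by Monte Carlo. The state names, sojourn-time ranges, and uniform distributions below are hypothetical stand-ins, not values from the study.

```python
import random

# Functionality states and illustrative sojourn-time ranges (days)
# spent in each damaged state before repair advances; values are hypothetical.
STATES = ["non_functional", "partially_functional", "fully_functional"]
SOJOURN = {"non_functional": (10, 40), "partially_functional": (20, 60)}

def recovery_time(rng):
    """One semi-Markov sample path: total days to full functionality."""
    t = 0.0
    for state in STATES[:-1]:
        lo, hi = SOJOURN[state]
        t += rng.uniform(lo, hi)   # random sojourn time in this state
    return t

rng = random.Random(0)
samples = [recovery_time(rng) for _ in range(10000)]
mean_recovery = sum(samples) / len(samples)
```

    The full frameworks replace these uniform sojourns with calibrated distributions, add branching between states, and couple the hospital and school processes through shared community resources.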

    Improvement of surgery duration estimation using statistical methods and analysis of scheduling policies using discrete event simulation

    The United States health care system currently faces many challenges, with the most notable one being rising costs. In an effort to decrease those costs, health providers are aiming to improve efficiency in their operations. A primary source of revenue for hospitals and some clinics is the surgery department, making it a key department for improvement in efficiency. Surgery schedules drive the department and affect the operations of many other departments. The most significant challenge to creating an efficient surgery schedule is estimating surgery durations and scheduling cases in a manner that will minimize the time a surgery is off schedule and maximize utilization of resources. To identify ways to better estimate surgery durations, an analysis of the surgery scheduling process at UnityPoint Health - Des Moines, in Des Moines, Iowa, was completed. Estimated surgery durations were compared to actual durations using a t-test. Multiple linear regression models were created for the most common surgeries, including the input variables of age of the patient, anesthesiologist, operating room (OR), number of residents, and day of the week. To find optimal scheduling policies, simulation models were created, each representing a series of surgery cases in one operating room during one day. Four scheduling policies were investigated: shortest estimated time first, longest estimated time first, most common surgery first, and adding an extra twenty minutes to each case in the existing order. The performance of the policies was compared to that of the existing schedule. Using the historical data from a one-year period at UnityPoint Health - Des Moines, the estimated surgery durations for the top four surgeries by count and top surgeons were found to be statistically different from actual durations in 75% of the data sets.
    After creating multiple linear regression models for each of the top four surgeries and the surgeons performing those surgeries, the β values for each variable were compared across models. Age was found to have a minimal impact on surgery duration in all models. The binary variable indicating whether residents were present was found to have minimal impact as well. For the rest of the variables, consistencies were difficult to assess, making multiple linear regression an unsuitable method for identifying the impact of the variables investigated. On the other hand, the simulation model proved useful in identifying effective scheduling policies. Eight series based on real series were modeled individually. Each model was validated against reality, with 75% of durations simulated in the models not being statistically different from reality. Each of the four scheduling policies was modeled for each series, and the average minutes off schedule and idle time between cases were compared across models. Adding an extra twenty minutes to each case in the existing order resulted in the lowest minutes off schedule, but significantly increased the idle time between cases. Most common surgery first did not have a consistent impact on the performance indicators. Longest estimated time first did not improve the performance indicators in the majority of the cases. Shortest estimated time first resulted in the best combined performance for minutes off schedule and idle time between cases; therefore, we recommend this policy be employed when the scheduling process allows.
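    The "minutes off schedule" comparison can be sketched for one OR-day: schedule cases back-to-back from their estimates, then sum the gaps between scheduled and actual start times under each ordering. The case durations below are invented for illustration and are not the UnityPoint data; the performance metric is a simplified version of the one described above (idle time is omitted).

```python
def minutes_off_schedule(estimated, actual):
    """Schedule cases back-to-back by estimate; sum |scheduled - actual| start gaps."""
    sched_start, actual_start, off = 0, 0, 0
    for est, act in zip(estimated, actual):
        off += abs(sched_start - actual_start)
        sched_start += est     # next scheduled start
        actual_start += act    # next actual start
    return off

# illustrative (estimated, actual) durations in minutes for one OR-day
cases = [(90, 110), (45, 40), (60, 75), (30, 35)]

existing = minutes_off_schedule([e for e, _ in cases], [a for _, a in cases])
setf_order = sorted(cases, key=lambda c: c[0])      # shortest estimated time first
setf = minutes_off_schedule([e for e, _ in setf_order],
                            [a for _, a in setf_order])
```

    On this toy day, shortest estimated time first reduces the total minutes off schedule relative to the existing order, mirroring the direction of the thesis finding; the full study also weighed idle time between cases before recommending the policy.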

    A Computational Approach to Patient Flow Logistics in Hospitals

    Scheduling decisions in hospitals are often taken in a decentralized way. This means that different specialized hospital units decide autonomously on, for example, patient admissions and schedules of shared resources. Decision support in such a setting requires methods and techniques that are different from the majority of existing literature, in which centralized models are assumed. The design and analysis of such methods and techniques is the focus of this thesis. Specifically, we develop computational models to provide dynamic decision support for hospital resource management, the prediction of future resource occupancy and the application thereof. Hospital resource management targets the efficient deployment of resources like operating rooms and beds. Allocating resources to hospital units is a major managerial issue, as the relationship between resources, utilization and patient flow of different patient groups is complex. The issues are further complicated by the fact that patient arrivals are dynamic and treatment processes are stochastic. Our approach to providing decision support combines techniques from multi-agent systems and computational intelligence (CI). This combination of techniques allows the dynamics of the problem to be properly considered while reflecting the distributed decision-making practice in hospitals. Multi-agent techniques are used to model multiple hospital care units and their decision policies, multiple patient groups with stochastic treatment processes and uncertain resource availability due to overlapping patient treatment processes. The agent-based model closely resembles the real-world situation. Optimization and learning techniques from CI allow for designing and evaluating improved (adaptive) decision policies for the agent-based model, which can then be implemented easily in hospital practice.
    In order to gain insight into the functioning of this complex and dynamic problem setting, we developed an agent-based model for the hospital care units with their patients. To assess the applicability of this agent-based model, we developed an extensive simulation. Several experiments demonstrate the functionality of the simulation and show that it is an accurate representation of the real world. The simulation is used to study decision support in resource management and patient admission control. To further improve the quality of decision support, we study the prediction of future hospital resource usage. Using prediction, the future impact of taking a certain decision can be taken into account. In the problem setting at hand, for instance, predicting the resource utilization resulting from an admission decision is important to prevent future bottlenecks that may block patient flow and increase patient waiting times. The methods we investigate for the task of prediction are forward simulation and supervised learning using neural networks. In an extensive analysis we study the underlying probability distributions of resource occupancy and investigate, by stochastic techniques, how to obtain accurate and precise prediction outcomes. To optimize resource allocation decisions we consider multiple criteria that are important in the hospital problem setting. We use three conflicting objectives in the optimization: maximal patient throughput, minimal resource costs and minimal usage of back-up capacity. All criteria can be taken into account by finding decision policies that have the best trade-off between the criteria. We derived various decision policies that partly allow for adaptive resource allocations. The design of the policies allows them to be easily understandable for hospital experts. Moreover, we present a bed exchange mechanism that enables a realistic implementation of these adaptive policies in practice.
    In our optimization approach, the parameters of the different decision policies are determined using a multiobjective evolutionary algorithm (MOEA). Specifically, the MOEA optimizes the output of the simulation (i.e. the three optimization criteria) as a function of the policy parameters. Our results on resource management show that the benchmark allocations obtained from a case study are considerably improved by the optimized decision policies. Furthermore, our results show that using adaptive policies can lead to better results and that further improvements may be obtained by integrating prediction into a decision policy.
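    The core of any such multiobjective search is Pareto dominance over the three criteria. The sketch below shows that filter on a handful of hypothetical policy evaluations; it is not the MOEA itself (which would also handle variation and selection), and the numbers are invented.

```python
def dominates(a, b):
    """True if policy score a dominates b: no worse on all criteria,
    strictly better on at least one. Criteria per the thesis:
    throughput (maximize), resource cost and back-up use (minimize)."""
    thr_a, cost_a, back_a = a
    thr_b, cost_b, back_b = b
    no_worse = thr_a >= thr_b and cost_a <= cost_b and back_a <= back_b
    better = thr_a > thr_b or cost_a < cost_b or back_a < back_b
    return no_worse and better

def pareto_front(policies):
    """Keep the non-dominated policies: the trade-off set the MOEA targets."""
    return [p for p in policies
            if not any(dominates(q, p) for q in policies if q is not p)]

# hypothetical evaluations: (patient throughput, resource cost, back-up use)
policies = [(100, 80, 5), (95, 60, 3), (100, 80, 7), (90, 90, 10)]
front = pareto_front(policies)
```

    Presenting the Pareto front rather than a single optimum is what lets hospital experts choose their own trade-off between throughput, cost and back-up capacity.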

    Anomaly Detection in Time Series: Theoretical and Practical Improvements for Disease Outbreak Detection

    The automatic collection and increasing availability of health data provides a new opportunity for techniques to monitor this information. By monitoring pre-diagnostic data sources, such as over-the-counter cough medicine sales or emergency room chief complaints of cough, there exists the potential to detect disease outbreaks earlier than traditional laboratory disease confirmation results. This research is particularly important for a modern, highly-connected society, where the onset of disease outbreak can be swift and deadly, whether caused by a naturally occurring global pandemic such as swine flu or a targeted act of bioterrorism. In this dissertation, we first describe the problem and current state of research in disease outbreak detection, then provide four main additions to the field. First, we formalize a framework for analyzing health series data and detecting anomalies: using forecasting methods to predict the next day's value, subtracting the forecast to create residuals, and finally using detection algorithms on the residuals. The formalized framework indicates the link between the forecast accuracy of the forecast method and the performance of the detector, and can be used to quantify and analyze the performance of a variety of heuristic methods. Second, we describe improvements for the forecasting of health data series. The application of weather as a predictor, cross-series covariates, and ensemble forecasting each provide improvements to forecasting health data. Third, we describe improvements for detection. This includes the use of multivariate statistics for anomaly detection and additional day-of-week preprocessing to aid detection. Most significantly, we also provide a new method, based on the CuScore, for optimizing detection when the impact of the disease outbreak is known. This method can provide an optimal detector for rapid detection, or for probability of detection within a certain timeframe. 
    Finally, we describe a method for improved comparison of detection methods. We provide tools to evaluate how well a simulated data set captures the characteristics of the authentic series, and time-lag heatmaps, a new way of visualizing daily detection rates or displaying the comparison between two methods in a more informative way.
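    The forecast-residual-detect framework formalized above can be sketched end to end. Here the forecaster is a simple EWMA and the detector a one-sided CUSUM; the dissertation's own methods (weather covariates, ensembles, the CuScore-based detector) would slot into the same three stages. All parameter values and the injected outbreak are illustrative.

```python
def detect_outbreak(series, alpha=0.3, k=0.5, h=4.0):
    """Three-stage pipeline: forecast (EWMA), residual, detect (CUSUM).
    Returns indices where the standardized CUSUM exceeds threshold h."""
    ewma, residuals = series[0], []
    for x in series:
        residuals.append(x - ewma)            # residual = observed - forecast
        ewma = alpha * x + (1 - alpha) * ewma  # update one-step forecast
    # standardize residuals by a crude scale estimate (mean absolute deviation)
    mad = sum(abs(r) for r in residuals) / len(residuals) or 1.0
    cusum, alarms = 0.0, []
    for i, r in enumerate(residuals):
        cusum = max(0.0, cusum + r / mad - k)  # one-sided upper CUSUM
        if cusum > h:
            alarms.append(i)
    return alarms

baseline = [10, 12, 11, 9, 10, 11, 10, 12, 11, 10]   # quiet syndromic counts
outbreak = [30, 35, 40, 38, 42]                      # injected outbreak signal
alarms = detect_outbreak(baseline + outbreak)
```

    The framework's central point is visible even in this toy: a better forecaster leaves smaller, whiter residuals, which directly improves whatever detector runs downstream.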

    The development of a full probabilistic risk assessment model for quantifying the life safety risk in buildings in case of fire

    As part of this research, a probabilistic model was developed that can quantify the fire safety level of a building design and evaluate this calculated safety level against a predefined acceptable risk criterion. The developed methodology can objectify both prescriptive and performance-based design methods by taking into account the uncertainty of design parameters and the reliability of safety systems. The model consists of both a deterministic and a probabilistic part. The deterministic framework is built from several sub-models that simulate the spread of fire and smoke as well as the interaction with evacuating occupants. Various sub-models were developed to account for the effect of implemented safety measures such as detection, sprinklers, smoke and heat exhaust systems, etc. The probabilistic framework is built on response surface modelling, sampling techniques and limit-state design, and uses these techniques to limit the required computational effort. The final result is translated into a probability of death, an individual risk and a societal risk. The great added value of the developed methodology is that it becomes possible to compare different design approaches objectively and to include the positive effect of improved safety techniques and redundancy in the final result.
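    The limit-state sampling at the heart of such a model can be illustrated with a deliberately crude Monte Carlo sketch: a fatality is counted when the required evacuation time (RSET) exceeds the available safe egress time (ASET), with sprinkler reliability folded in. All distributions, reliabilities and the ASET multiplier are hypothetical, and the real methodology uses response surfaces rather than direct sampling of the full sub-models.

```python
import random

def individual_risk(n_samples, rng):
    """Monte Carlo estimate of P(RSET > ASET), a single limit state.
    All parameter values below are hypothetical illustrations."""
    failures = 0
    for _ in range(n_samples):
        sprinkler_works = rng.random() < 0.95   # assumed system reliability
        aset = rng.gauss(600, 120)              # available safe egress time (s)
        if sprinkler_works:
            aset *= 1.5                         # suppression buys time
        rset = rng.gauss(400, 80)               # required evacuation time (s)
        if rset > aset:
            failures += 1
    return failures / n_samples

rng = random.Random(42)
p_fatality = individual_risk(50000, rng)
```

    Comparing this probability across design variants, with and without a given safety system, is what makes the trade-off between prescriptive and performance-based designs explicit.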