
    OPTIMIZATION MODELS AND METHODOLOGIES TO SUPPORT EMERGENCY PREPAREDNESS AND POST-DISASTER RESPONSE

    This dissertation addresses three important optimization problems arising in pre-disaster emergency preparedness and post-disaster response in time-dependent, stochastic, and dynamic environments. The first problem studied is the building evacuation problem with shared information (BEPSI), which seeks a set of evacuation routes and an assignment of evacuees to these routes that minimizes total evacuation time. The BEPSI incorporates shared-information constraints in providing on-line instructions to evacuees and ensures that evacuees departing from an intermediate or source location at the same point in time receive common instructions. A mixed-integer linear program is formulated for the BEPSI, and an exact technique based on Benders decomposition is proposed for its solution. Numerical experiments conducted on a mid-sized real-world example demonstrate the effectiveness of the proposed algorithm. The second problem addressed is the network resilience problem (NRP), involving an indicator of network resilience proposed to quantify the ability of a network to recover from randomly arising disruptions caused by a disaster event. A stochastic mixed-integer program is proposed for quantifying network resilience and identifying the optimal post-event course of action. A solution technique based on concepts of Benders decomposition, column generation, and Monte Carlo simulation is proposed. Experiments were conducted to illustrate the resilience concept and the procedure for its measurement, and to assess the role of network topology in its magnitude. The last problem addressed is the urban search and rescue team deployment problem (USAR-TDP). The USAR-TDP seeks an optimal deployment of USAR teams to disaster sites, including the order of site visits, with the ultimate goal of maximizing the expected number of lives saved over the search and rescue period. A multistage stochastic program is proposed to capture problem uncertainty and dynamics. The solution technique involves solving a sequence of interrelated two-stage stochastic programs with recourse. A column generation-based technique is proposed for the solution of each problem instance arising at the start of each decision epoch over the time horizon. Numerical experiments conducted on an example of the 2010 Haiti earthquake illustrate the effectiveness of the proposed approach.
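
    The USAR-TDP is solved above as a sequence of two-stage stochastic programs with recourse. The abstract does not give the formulation itself; as a sketch, the generic textbook form it builds on makes first-stage decisions x (e.g., team-to-site deployments) before the uncertainty ξ is realized, and recourse decisions y afterward:

```latex
\min_{x \in X} \; c^{\top}x \;+\; \mathbb{E}_{\xi}\big[\,Q(x,\xi)\,\big],
\qquad
Q(x,\xi) \;=\; \min_{y \ge 0} \big\{\, q(\xi)^{\top}y \;:\; W\,y \;\ge\; h(\xi) - T(\xi)\,x \,\big\}.
```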

    Optimization of Handover, Survivability, Multi-Connectivity and Secure Slicing in 5G Cellular Networks using Matrix Exponential Models and Machine Learning

    Title from PDF of title page, viewed January 31, 2023. Dissertation advisor: Cory Beard. Vita. Includes bibliographical references (pages 173-194). Dissertation (Ph.D.)--Department of Computer Science and Electrical Engineering, University of Missouri--Kansas City, 2022.
    This work proposes optimization of cellular handovers, cellular network survivability modeling, multi-connectivity, and secure network slicing using matrix exponentials and machine learning techniques. We propose matrix exponential (ME) modeling of handover arrivals, with the potential to characterize arrivals much more accurately and to prioritize resource allocation for handovers, especially handovers for emergency or public safety needs. With the use of a ‘B’ matrix to represent a handover arrival, we have a rich set of dimensions with which to model system handover behavior. We can study multiple parameters and the interactions between system events, along with the user mobility that would trigger a handover in any given scenario. Additionally, unlike traditional handover improvement schemes, we develop a ‘Deep-Mobility’ model by implementing a deep learning neural network (DLNN) to manage network mobility, utilizing in-network deep learning and prediction. We use radio and network key performance indicators (KPIs) to train our model to analyze network traffic and handover requirements. Cellular network design must incorporate disaster response, recovery, and repair scenarios. Requirements for high reliability and low latency often fail to incorporate network survivability for mission-critical and emergency services. Our matrix exponential (ME) model shows how survivable networks can be designed by controlling the number of crews, the times taken for individual repair stages, and the balance between fast and slow repairs. Transient and steady-state representations of system repair models, namely fast and slow repairs for networks with multiple repair crews, are analyzed. Failures are modeled as exponential, per common practice, but ME distributions describe the more complex recovery processes. In some mission-critical communications, the availability requirements may exceed five or even six nines (99.9999%). To meet such a critical requirement and minimize the impact of mobility during handover, a Fade Duration Outage Probability (FDOP)-based multiple-radio-link connectivity handover method is proposed. By applying such a method, a high degree of availability can be achieved by utilizing two or more uncorrelated links selected for minimum FDOP values. Packet duplication (PD) via multi-connectivity is a method of compensating for lost packets on a wireless channel. Utilizing two or more uncorrelated links, a high degree of availability can be attained with this strategy. However, complete packet duplication is inefficient and frequently unnecessary. We provide a novel adaptive fractional packet duplication (A-FPD) mechanism for enabling and disabling packet duplication based on a variety of parameters. We have developed a ‘DeepSlice’ model by implementing a deep learning (DL) neural network to manage network load efficiency and network availability, utilizing in-network deep learning and prediction. Our neural network based ‘Secure5G’ network slicing model will proactively detect and eliminate threats based on incoming connections before they reach the 5G core network elements. These capabilities will enable network operators to sell network slicing as a service, serving diverse services efficiently over a single infrastructure with a higher level of security and reliability.
    Contents: Introduction -- Matrix exponential and deep learning neural network modeling of cellular handovers -- Survivability modeling in cellular networks -- Multi-connectivity based handover enhancement and adaptive fractional packet duplication in 5G cellular networks -- DeepSlice and Secure5G: a deep learning framework towards an efficient, reliable and secure network slicing in 5G networks -- Conclusion and future scope.
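
    The dissertation's specific ‘B’ matrix construction is not reproduced in the abstract. As a minimal sketch of the matrix-exponential idea it relies on, the following computes the survival function S(t) = α·exp(Bt)·1 for a simple phase-type holding time; the two-phase Erlang-like parameters below are assumed, purely for illustration:

```python
import numpy as np
from scipy.linalg import expm

# alpha: initial phase vector; B: sub-generator matrix of the ME distribution.
# S(t) = alpha @ expm(B t) @ 1 is the probability that the modeled holding
# time (e.g., time until the next handover arrival) exceeds t.
alpha = np.array([1.0, 0.0])            # start in phase 1
lam = 2.0                               # assumed per-phase rate
B = np.array([[-lam,  lam],
              [ 0.0, -lam]])            # 2-phase Erlang-like sub-generator

def survival(t: float) -> float:
    return float(alpha @ expm(B * t) @ np.ones(2))

for t in (0.5, 1.0, 2.0):
    print(f"P(T > {t}) = {survival(t):.4f}")
```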

    Earthquake Engineering

    The book Earthquake Engineering - From Engineering Seismology to Optimal Seismic Design of Engineering Structures contains fifteen chapters written by researchers and experts in the fields of earthquake and structural engineering. It presents the state of the art and recent progress in seismology, earthquake engineering, and structural engineering, and should be useful to graduate students, researchers, and practicing structural engineers. The book deals with seismicity, seismic hazard assessment, and system-oriented emergency response to abrupt earthquake disasters; the nature and components of strong ground motions; and several other topics of interest, such as dam-induced earthquakes, seismic stability of slopes, and landslides. It also tackles the dynamic response of underground pipes to blast loads, the optimal seismic design of RC multi-storey buildings, the finite-element analysis of cable-stayed bridges under strong ground motions, and acute psychiatric trauma intervention after earthquakes.

    Probabilistic verification of satellite systems for mission critical applications

    In this thesis, we present a quantitative approach using probabilistic verification techniques for the analysis of the reliability, availability, maintainability, and safety (RAMS) properties of satellite systems. The subject of our research is satellites used in mission-critical industrial applications. Our verification results make a strong case for using probabilistic model checking to support RAMS analysis of satellite systems. This study is intended to build a foundation that helps reliability engineers with a basic background in model checking to apply probabilistic model checking to small satellite systems. We make two major contributions. The first is the application of RAMS analysis to satellite systems. In the past, RAMS analysis has been extensively applied in the field of electrical and electronics engineering. It allows system designers and reliability engineers to predict the likelihood of failures from historical or current operational data. There is high potential for the application of RAMS analysis in the field of space science and engineering; however, there is a lack of standardisation and of suitable procedures for the correct study of RAMS characteristics of satellite systems. This thesis considers the promising application of RAMS analysis to satellite design, use, and maintenance, focusing on the system segments. Data collection and verification procedures are discussed, and a number of considerations are presented on how to predict the probability of failure. Our second contribution is leveraging the power of probabilistic model checking to analyse satellite systems. We present techniques for analysing satellite systems that differ from the more common quantitative approaches based on traditional simulation and testing; these techniques have not been applied in this context before. We present the use of probabilistic techniques via a suite of detailed examples, together with their analysis. The presentation is incremental, in terms of the complexity of the application domains and system models, with a detailed PRISM model of each scenario. We also provide results from practical work, together with a discussion of future improvements.
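
    The thesis's PRISM models are not reproduced in the abstract. As a hedged illustration of the kind of quantity such a RAMS analysis produces (not the thesis's approach, and with assumed rates), the sketch below solves a small continuous-time Markov chain for a 1-out-of-2 redundant subsystem and reports its steady-state availability:

```python
import numpy as np

# 1-out-of-2 redundant subsystem, per-unit failure rate lam, single repair
# crew with repair rate mu.  States: 0 = both up, 1 = one failed,
# 2 = both failed (system down).  All rates are assumed, for illustration.
lam, mu = 1e-4, 1e-1                    # failures/hour, repairs/hour
Q = np.array([[-2 * lam,  2 * lam,      0.0],
              [      mu, -(mu + lam),   lam],
              [     0.0,       mu,      -mu]])

# Steady-state distribution pi solves pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]            # system up if at least one unit works
print(f"steady-state availability = {availability:.9f}")
print(f"number of nines           = {-np.log10(1 - availability):.2f}")
```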

    Blood Shortage Reduction by Deployment of Lateral Transshipment Approach

    The healing effects of human blood make it one of the essential, life-saving components in a variety of medical procedures. However, assuring a timely and sufficient blood supply for use in life-critical medical procedures is one of the major challenges that most health care networks around the world persistently face and try to resolve. According to the WHO's latest statistics, 107 out of 180 countries have an insufficient supply of blood units to meet their demand. For two years in a row (2018-19), Canadian Blood Services has called for 100,000 new donors to sign up in order to meet the anticipated demand for blood. The perishability of blood components and uncertainty in both donation and demand levels are two important contributors to blood shortages. A high rate of discarded units, caused by poor inventory planning, is another worldwide issue in the blood supply chain that urgently needs to be addressed. The Canadian blood supply chain network consists of several organizational entities, each of which impacts blood unit inventory levels in its own way. In this study, an integrated supply chain model is considered, consisting of three main networked organizations: mobile collection centers, blood centers, and hospitals. The main goal of this research is to develop a mathematical optimization model that improves the proposed supply chain's performance by reducing its related costs and lowering the currently existing shortage rate from about 25% to less than 15%. Lateral transshipment and emergency ordering are the two main approaches implemented in the proposed model to improve both performance and efficiency. The Greater Toronto Area (GTA) has been considered as the case study for this research, and both models have been applied to it. Further actions and recommendations are made based on the case study's results.
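
    The study's full model is not given in the abstract. As a minimal sketch of the lateral-transshipment idea (hypothetical hospitals, inventories, demands, and costs; PuLP used only for illustration), the linear program below moves surplus units between hospitals to minimize transshipment cost plus a shortage penalty:

```python
import pulp

hospitals = ["H1", "H2", "H3"]                     # hypothetical sites
inventory = {"H1": 40, "H2": 10, "H3": 5}          # units on hand (assumed)
demand    = {"H1": 20, "H2": 25, "H3": 15}         # units required (assumed)
ship_cost, short_cost = 1.0, 20.0                  # per-unit costs (assumed)

model = pulp.LpProblem("lateral_transshipment", pulp.LpMinimize)

# x[i, j]: units transshipped from hospital i to hospital j
pairs = [(i, j) for i in hospitals for j in hospitals if i != j]
x = pulp.LpVariable.dicts("x", pairs, lowBound=0)
# s[j]: unmet demand (shortage) at hospital j
s = pulp.LpVariable.dicts("s", hospitals, lowBound=0)

model += (pulp.lpSum(ship_cost * x[i, j] for i, j in pairs)
          + pulp.lpSum(short_cost * s[j] for j in hospitals))

for j in hospitals:
    inflow = pulp.lpSum(x[i, j] for i in hospitals if i != j)
    outflow = pulp.lpSum(x[j, k] for k in hospitals if k != j)
    # units on hand plus transshipped-in units (plus shortage slack) cover demand
    model += inventory[j] + inflow - outflow + s[j] >= demand[j]
    # cannot ship out more than is on hand
    model += outflow <= inventory[j]

model.solve(pulp.PULP_CBC_CMD(msg=False))
for (i, j), var in x.items():
    if var.value() and var.value() > 0:
        print(f"ship {var.value():.0f} units {i} -> {j}")
print("total shortage:", sum(s[j].value() for j in hospitals))
```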

    Modelling and Design of Resilient Networks under Challenges

    Communication networks, in particular the Internet, face a variety of challenges that can disrupt our daily lives, in the worst cases resulting in the loss of human lives and significant financial costs. We define challenges as external events that trigger faults that eventually result in service failures. Understanding these challenges is essential for improving current networks and for designing Future Internet architectures. This dissertation presents a taxonomy of challenges that can help evaluate design choices for the current and Future Internet. Graph models for analysing critical infrastructures are examined, and a multilevel graph model is developed to study interdependencies between different networks. Furthermore, graph-theoretic heuristic optimisation algorithms are developed. These heuristics add links to increase the resilience of networks in the least costly manner and are computationally less expensive than an exhaustive search. The performance of networks under random failures, targeted attacks, and correlated area-based challenges is evaluated using the challenge simulation module that we developed. The GpENI Future Internet testbed is used to conduct experiments evaluating the performance of the heuristic algorithms developed.
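
    The dissertation's cost model and heuristics are not detailed in the abstract. As an illustrative sketch of heuristic link addition (not the author's algorithm), the following greedy routine adds a fixed budget of links, each time choosing the candidate that most increases algebraic connectivity, a common resilience proxy:

```python
import itertools
import networkx as nx

def add_links_greedily(G: nx.Graph, budget: int) -> nx.Graph:
    """Greedy heuristic: repeatedly add the non-existing link that most
    increases algebraic connectivity (a standard resilience proxy)."""
    G = G.copy()
    for _ in range(budget):
        base = nx.algebraic_connectivity(G)
        best_edge, best_gain = None, 0.0
        for u, v in itertools.combinations(G.nodes, 2):
            if G.has_edge(u, v):
                continue
            G.add_edge(u, v)
            gain = nx.algebraic_connectivity(G) - base
            G.remove_edge(u, v)
            if gain > best_gain:
                best_edge, best_gain = (u, v), gain
        if best_edge is None:
            break                        # no candidate improves resilience
        G.add_edge(*best_edge)
    return G

# toy topology: a ring of 8 nodes strengthened with 2 extra links
ring = nx.cycle_graph(8)
hardened = add_links_greedily(ring, budget=2)
print(sorted(set(hardened.edges()) - set(ring.edges())))
```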

    Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of irregularly shaped food products based on 3D reconstruction. However, 3D reconstruction comes at a high computational cost, and some of the volume measurement methods based on it have low accuracy. Another approach is the Monte Carlo method, which measures volume using random points: it only requires information on whether each random point falls inside or outside the object, and it does not require 3D reconstruction. This paper proposes volume measurement of irregularly shaped food products using a computer vision system, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of the food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from the binary images. The experimental results show that the proposed method provides high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
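
    The paper's heuristic adjustment is not described in the abstract. A minimal sketch of the underlying Monte Carlo membership test, assuming calibrated cameras and a hypothetical project(cam_id, pts) helper that maps 3-D points to pixel coordinates, is:

```python
import numpy as np

def mc_volume(silhouettes, project, bbox, n_samples=200_000, rng=None):
    """Estimate object volume: sample random 3-D points in a bounding box and
    count those whose projection falls inside the object's silhouette in every
    binary image (visual-hull style membership test).

    silhouettes: dict mapping camera id -> boolean 2-D silhouette mask
    project:     callable (cam_id, pts) -> (u, v) pixel coordinate arrays
                 (assumed to come from camera calibration; not defined here)
    bbox:        ((xmin, ymin, zmin), (xmax, ymax, zmax)) around the object
    """
    rng = np.random.default_rng(rng)
    (xmin, ymin, zmin), (xmax, ymax, zmax) = bbox
    pts = rng.uniform([xmin, ymin, zmin], [xmax, ymax, zmax], size=(n_samples, 3))

    inside = np.ones(n_samples, dtype=bool)
    for cam_id, mask in silhouettes.items():
        u, v = project(cam_id, pts)                       # pixel coordinates
        ok = (u >= 0) & (u < mask.shape[1]) & (v >= 0) & (v < mask.shape[0])
        inside &= ok                                      # off-image points are out
        inside[ok] &= mask[v[ok].astype(int), u[ok].astype(int)]

    box_volume = (xmax - xmin) * (ymax - ymin) * (zmax - zmin)
    return box_volume * inside.mean()                     # fraction inside x box volume
```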

    Big Data Computing for Geospatial Applications

    The convergence of big data and geospatial computing has brought forth challenges and opportunities for Geographic Information Science with regard to geospatial data management, processing, analysis, modeling, and visualization. This book highlights recent advancements in integrating new computing approaches, spatial methods, and data management strategies to tackle geospatial big data challenges, while demonstrating opportunities for using big data in geospatial applications. Crucial to the advancements highlighted here is the integration of computational thinking and spatial thinking, and the transformation of abstract ideas and models into concrete data structures and algorithms.