
    The safety case and the lessons learned for the reliability and maintainability case

    This paper examines the safety case and the lessons learned for the reliability and maintainability case.

    Practical Methods for Optimizing Equipment Maintenance Strategies Using an Analytic Hierarchy Process and Prognostic Algorithms

    Many large organizations report limited success using Condition Based Maintenance (CbM). This work explains some of the causes of limited success, and recommends practical methods that enable the benefits of CbM. The backbone of CbM is a Prognostics and Health Management (PHM) system. Use of PHM alone does not ensure success; it needs to be integrated into enterprise-level processes and culture, and aligned with customer expectations. To integrate PHM, this work recommends a novel life cycle framework, expanding the concept of maintenance into several levels beginning with an overarching maintenance strategy and subordinate policies, tactics, and PHM analytical methods. During the design and in-service phases of the equipment's life, an organization must prove that a maintenance policy satisfies specific safety and technical requirements, conforms to business practices, and is supported by a logistics and resourcing plan that meets end-user needs and expectations. These factors often compete with each other because they are designed and considered separately, and serve disparate customers. This work recommends using the Analytic Hierarchy Process (AHP) as a practical method for consolidating input from stakeholders and quantifying the most preferred maintenance policy. AHP forces simultaneous consideration of all factors, resolving conflicts in the trade-space of the decision process. When used within the recommended life cycle framework, it is a vehicle for justifying the decision to transition from generalized high-level concepts down to specific lower-level actions. This work demonstrates AHP using degradation data, prognostic algorithms, cost data, and stakeholder input to select the most preferred maintenance policy for a paint coating system. It concludes the following for this particular system: a proactive maintenance policy is most preferred, and a predictive (CbM) policy is more preferred than predeterminative (time-directed) and corrective policies.
A General Path Model (GPM) prognostic with Bayesian updating provides the most accurate prediction of the Remaining Useful Life (RUL). Long periods between inspections and the use of categorical variables in inspection reports severely limit the accuracy of RUL predictions. In summary, this work recommends using the proposed life cycle model, AHP, PHM, a GPM model, and embedded sensors to improve the success of a CbM policy.
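The AHP step described above reduces to eigenvector arithmetic on a pairwise-comparison matrix. A minimal sketch, assuming hypothetical judgements among the three policy families the abstract names (the numbers are illustrative, not the study's data):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights: normalised principal eigenvector of the matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(vals.real)])
    return w / w.sum()

def consistency_ratio(pairwise):
    """Saaty's CR = CI / RI; judgements with CR < 0.1 are usually accepted."""
    n = pairwise.shape[0]
    lam = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency indices
    return ci / ri

# Hypothetical pairwise judgements over three maintenance policies:
# [predictive (CbM), predeterminative (time-directed), corrective]
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])
weights = ahp_priorities(A)
cr = consistency_ratio(A)
```

With these judgements the predictive policy receives the largest weight and the consistency ratio falls below Saaty's 0.1 threshold, mirroring the ranking the study reports.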

    An Integrated Retail Supply Chain Risk Management Framework: A System Thinking Approach

    It is often taken for granted that the right products will be available to buy in retail outlets seven days a week, 52 weeks a year. Consumer perception is that of a simple service requirement, but the reality is a complex, time-sensitive system - the retail supply chain (RSC). Due to short product life-cycles with uncertain supply and demand behaviour, the RSC faces many challenges and is very vulnerable to disruptions. In addition, external risk events such as Brexit, extreme weather, the financial crisis, and terror attacks mean there is a need for effective RSC risk management (RSCRM) processes within organisations. Literature shows that although there is an increasing amount of research in RSCRM, it is highly theoretical with limited empirical evidence or applied methodologies. With active enthusiasm from industry practitioners for RSCRM methodologies and support solutions, the RSCRM research community has acknowledged that the main issue for future research is not tools and techniques, but collaborative RSC system-wide implementation. The implementation of a cross-organisational initiative such as RSCRM is a very complex task that requires real-world frameworks for real-world practitioners. Therefore, this research study explores the business requirements for developing a three-stage integrated RSCRM framework that will encourage extended RSC collaboration. Focusing on the practitioner requirements of RSCRM projects, and inspired by the laws of thermodynamics and the philosophy of System Thinking, stage one developed a conceptual reference model, The �6 Coefficient, building on the formative work of supply chain excellence and business process management. The �6 Coefficient reference model has been intricately designed to bridge the theoretical gap between practitioner and researcher, with the aim of ensuring practitioner confidence in partaking in a complex business process project.
Stage two addressed the need for a standardised vocabulary and, through the SCOR11 reference guide, acts as a calibration point for the integrated framework, ensuring easy transfer and application within supply chain industries. In their design, stages one and two are perfect complements to the final stage of the integrated framework: a risk assessment toolbox based on a hybrid simulation study capable of monitoring the disruptive behaviour of a multi-echelon RSC at both macro and micro levels, using the techniques of System Dynamics (SD) and Discrete Event Simulation (DES) modelling respectively. Empirically validated through an embedded mixed-methods case study, the results of the integrated framework application are very encouraging. The first phase, the secondary exploratory study, gained valuable empirical evidence of the barriers to successfully implementing a complex business project and also validated the use of simulation as an effective risk assessment tool. Results showed certain high-risk order policy decisions could potentially reduce total costs (TC) by over 55% and reduce delivery times by 3 days. The use of the �6 Coefficient as the communication/consultation phase of the primary RSCRM case study was hugely influential on the success of the overall hybrid simulation study development and application, with a significant increase in both practitioner and researcher confidence in running an RSCRM project. This was evident in the results of the hybrid model's macro and micro assessment of the RSC. SD results effectively monitored the behaviour of the RSC under important disruptive risks, showing that delayed effects from promotions and knowledge loss resulted in a bullwhip effect pattern upstream, with the FMCG manufacturer's TC increasing by as much as €50m. The DES analysis, focusing on the NDC function of the RSC, also showed TC sensitivity to order behaviour from retailers, although an optimisation-based risk treatment reduced TC by 30%.
Future research includes a global empirical validation of the �6 Coefficient and enhancement of the application of thermodynamic laws in business process management. The industry calibration capabilities of the integrated framework will also be extensively tested.
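The macro-level bullwhip behaviour the SD results describe, where a demand shock is amplified upstream, can be sketched with a toy order-up-to model. All names and figures below are illustrative, not the thesis's FMCG data:

```python
def simulate_retailer(steps=40, base_demand=100.0, step_demand=120.0,
                      target_inventory=200.0, correction=0.5):
    """Toy single-echelon order-up-to policy: each period's order replaces
    observed demand plus a fraction of the inventory gap. The gap-correction
    term is what amplifies a demand step into a larger order spike."""
    inventory = 200.0
    orders = []
    for t in range(steps):
        demand = base_demand if t < 10 else step_demand  # shock at t = 10
        inventory -= demand
        order = max(0.0, demand + correction * (target_inventory - inventory))
        orders.append(order)
        inventory += order  # immediate replenishment, for simplicity
    return orders

orders = simulate_retailer()
```

After the system settles, a +20% demand step produces an order overshoot of roughly +30%; chaining several such echelons (retailer, NDC, manufacturer) compounds the amplification, which is the pattern the SD model monitors at the macro level.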

    United States Department of Energy Integrated Manufacturing & Processing Predoctoral Fellowships. Final Report


    Inferring Heterogeneous Treatment Effects of Crashes on Highway Traffic: A Doubly Robust Causal Machine Learning Approach

    Highway traffic crashes exert a considerable impact on both transportation systems and the economy. In this context, accurate and dependable emergency responses are crucial for effective traffic management. However, the influence of crashes on traffic status varies across diverse factors and may be confounded by selection bias. There is therefore a need to accurately estimate the heterogeneous causal effects of crashes, providing essential insights to facilitate individual-level emergency decision-making. This paper proposes a novel causal machine learning framework to estimate the causal effect of different types of crashes on highway speed. The Neyman-Rubin Causal Model (RCM) is employed to formulate this problem from a causal perspective. The Conditional Shapley Value Index (CSVI) is proposed based on causal graph theory to filter adverse variables, and the Structural Causal Model (SCM) is then adopted to define the statistical estimand for causal effects. The treatment effects are estimated by Doubly Robust Learning (DRL) methods, which combine doubly robust causal inference with classification and regression machine learning models. Experimental results from 4815 crashes on Interstate 5 in Washington State reveal the heterogeneous treatment effects of crashes at varying distances and durations. Rear-end crashes cause more severe congestion and longer durations than other types of crashes, and sideswipe crashes have the longest delayed impact. Additionally, the findings show that rear-end crashes affect traffic more severely at night, while crashes into objects have the most significant influence during peak hours. Statistical hypothesis tests, error metrics based on matched "counterfactual outcomes", and sensitivity analyses are employed for assessment, and the results validate the accuracy and effectiveness of our method.
    Comment: 38 pages, 13 figures, 8 tables
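The doubly robust (AIPW) estimator at the heart of the DRL step can be sketched on simulated data. The data-generating process, models, and effect size below are illustrative stand-ins, not the paper's Interstate 5 dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))

# Hypothetical stand-in for the paper's setting: treatment t = "crash
# occurred", outcome y = speed change; the true effect is set to 2.0.
e_true = 1 / (1 + np.exp(-0.5 * x[:, 0]))        # true propensity
t = rng.binomial(1, e_true)
y = 1.0 + x @ np.array([1.0, -0.5]) + 2.0 * t + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])              # design with intercept

def propensity(X, t, iters=25):
    """Logistic regression via Newton-Raphson: the propensity model."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        beta += np.linalg.solve((X.T * (p * (1 - p))) @ X, X.T @ (t - p))
    return 1 / (1 + np.exp(-X @ beta))

def outcome(mask):
    """OLS outcome model fit on one arm, predicted for all units."""
    coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    return X @ coef

m1, m0, e = outcome(t == 1), outcome(t == 0), propensity(X, t)

# AIPW (doubly robust) average treatment effect: consistent if either
# the propensity model or the outcome model is correctly specified.
ate = np.mean(m1 - m0 + t * (y - m1) / e - (1 - t) * (y - m0) / (1 - e))
```

The residual correction terms vanish in expectation when the outcome models are right, and the outcome-model terms are debiased by inverse-propensity weighting when the propensity model is right; that "either one suffices" property is what makes the estimator doubly robust.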

    NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost-effective computer-aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.
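The phrase-isolation step can be illustrated with a stopword-delimited extractor: split the text at function words and keep the multi-word runs of content words as candidate print terms. The stopword list and threshold here are illustrative and far simpler than the actual NASA TERMS exclusion logic:

```python
import re

# Small illustrative stopword list; the real exclusion lists are larger.
STOPWORDS = {"a", "an", "and", "are", "as", "at", "be", "by", "for", "from",
             "in", "is", "it", "of", "on", "or", "that", "the", "to", "which"}

def isolate_phrases(text, min_words=2):
    """Split at stopwords/punctuation; keep multi-word content-word runs."""
    phrases, current = [], []
    for word in re.findall(r"[a-z]+", text.lower()):
        if word in STOPWORDS:
            if len(current) >= min_words:
                phrases.append(" ".join(current))
            current = []
        else:
            current.append(word)
    if len(current) >= min_words:
        phrases.append(" ".join(current))
    return phrases

terms = isolate_phrases(
    "Current methods for information processing and retrieval "
    "are reviewed at the NASA Scientific and Technical Information Facility.")
```

Single content words ("retrieval", "reviewed") are discarded, so only multi-term candidates survive for the later reduction and selection stages.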

    Development and assessment of dynamic storage cell codes for flood inundation modelling

    Since 1962 storage cell codes have been developed to simulate flow on fluvial and coastal floodplains. These models treat the floodplain as a series of discrete storage cells, with the flow between cells calculated explicitly using an analytical flow formula such as the Manning equation. Recently these codes have been reconfigured to use regular Cartesian grids to make full use of widely available high-resolution data captured from remote sensing platforms and stored in a raster GIS format. Such raster-based storage cell codes have many of the advantages of full two-dimensional depth-averaged schemes without the computational cost; however, their typical implementation results in a number of fundamental limitations. These include an inability to develop solutions that are independent of time step or grid size, and an unrealistic lack of sensitivity to floodplain friction. In this thesis, a new solution to these problems is proposed based on an optimal adaptive time step determined using a Courant-type condition for model stability. Comparison of this new adaptive time step scheme to analytical solutions of wave propagation/recession on flat and sloping planar surfaces and against field measurements acquired for four real flood scenarios demonstrates considerable improvement over a standard raster storage cell model. Moreover, the new scheme is shown to yield results that are independent of grid size or choice of initial time step and which show an intuitively correct sensitivity to floodplain friction over spatially complex topography. It does, however, incur a prohibitive computational cost at model grid resolutions less than 50 m. This primary research is supplemented by an examination of the data and methods used to apply, and in particular calibrate, distributed flood inundation models in practice.
Firstly, different objective functions for evaluating the overall similarity between binary predictions of flood extent and remotely sensed images of inundation patterns are examined. On the basis of the results presented, recommendations are provided regarding the use of various measures for hydrological problems. Secondly, the value of different observational data types typically available for calibrating/constraining model predictions is explored within an extended Generalised Likelihood Uncertainty Estimation (GLUE) framework. A quasi-Bayesian methodology for combining these individual evaluations that overcomes the limitations of calibration against any single measurement source/item is also presented.
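The two ingredients of the scheme, Manning-type flow between adjacent storage cells and a Courant-type time-step limit, can be sketched as below. These are generic diffusive-wave forms with illustrative parameters; the thesis's exact formulation may differ:

```python
import math

def intercell_flux(h1, h2, z1, z2, n_manning, dx, width):
    """Manning-type flux between two storage cells (diffusive-wave form):
    driven by the free-surface slope, conveyed over the shared flow depth.
    h = water depth, z = bed elevation; positive flux flows cell 1 -> 2."""
    slope = ((h1 + z1) - (h2 + z2)) / dx
    h_flow = max(h1 + z1, h2 + z2) - max(z1, z2)  # depth available to flow
    if h_flow <= 0.0 or slope == 0.0:
        return 0.0
    q = width * h_flow ** (5.0 / 3.0) * math.sqrt(abs(slope)) / n_manning
    return math.copysign(q, slope)

def courant_dt(h_max, dx, alpha=0.7, g=9.81):
    """Courant-type stability limit from the gravity-wave celerity
    sqrt(g * h): the time step must not let information cross a cell."""
    return alpha * dx / math.sqrt(g * h_max)
```

Because the permissible time step shrinks with the cell size, refining the grid multiplies both the number of cells and the number of steps, which is consistent with the prohibitive cost the thesis reports below 50 m resolution.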

    Impacts of Climate Change on Rainfall Extremes and Urban Drainage Systems

    Impacts of Climate Change on Rainfall Extremes and Urban Drainage Systems provides a state-of-the-art overview of existing methodologies and relevant results related to the assessment of climate change impacts on urban rainfall extremes as well as on urban hydrology and hydraulics. This overview focuses mainly on the difficulties and limitations of current methods, and discusses the issues and challenges facing the research community in climate change impact assessment and adaptation for urban drainage infrastructure design and management.