
    Detailed Inventory Record Inaccuracy Analysis

    This dissertation performs a methodical analysis of the behavior of inventory record inaccuracy (IRI) when it is influenced by demand, supply, and lead time uncertainty in online and offline retail environments separately. Additionally, this study identifies the susceptibility of inventory systems to IRI arising from the conventional assumption of perfect data visibility. Two alternatives to such perfect-visibility methods are presented and analyzed: the IRI resistance method and the error control method. Both effectively counter different aspects of IRI; the IRI resistance method performs better on stock-outs and lost sales, whereas the error control method maintains lower inventory. Furthermore, this research investigates the value of using a secondary source of information (automated data capture) alongside traditional inventory record keeping to control the effects of IRI. To understand the combined behavior of the pooled data sources, an infinite-horizon discounted Markov decision process (MDP) is formulated and optimized. Moreover, the traditional cost-based reward structure is abandoned to put more emphasis on the effects of IRI. Instead, a new measure, inventory performance, is developed by combining four key performance metrics: lost sales, amount of correction, fill rate, and amount of inventory counted. These metrics are placed on a common unitless scale using fuzzy logic and combined additively. The inventory model is then analyzed to understand the optimal policy structure, which is proven to be of a control-limit type.
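    As a rough illustration of the additive, fuzzy-logic combination of the four metrics described above, the following sketch maps each metric onto a unitless [0, 1] score with a simple linear membership function and combines the scores with equal additive weights. The membership shapes, bounds, weights, and sample values are illustrative assumptions, not the dissertation's calibrated parameters.

```python
# Hypothetical sketch: combine four inventory KPIs into one unitless
# "inventory performance" score via linear membership functions (a simple
# stand-in for fuzzy membership) and additive weighting. All bounds and
# weights below are assumptions for illustration.

def linear_membership(x, worst, best):
    """Map a raw KPI value onto [0, 1]: `best` maps to 1, `worst` maps to 0."""
    score = (x - worst) / (best - worst)
    return max(0.0, min(1.0, score))

def inventory_performance(lost_sales, correction, fill_rate, counted,
                          weights=(0.25, 0.25, 0.25, 0.25)):
    """Additive combination of the four unitless KPI scores."""
    scores = (
        linear_membership(lost_sales, worst=50, best=0),     # fewer lost sales is better
        linear_membership(correction, worst=30, best=0),     # less correction is better
        linear_membership(fill_rate, worst=0.7, best=1.0),   # higher fill rate is better
        linear_membership(counted, worst=100, best=0),       # less counting effort is better
    )
    return sum(w * s for w, s in zip(weights, scores))

print(inventory_performance(lost_sales=5, correction=10, fill_rate=0.95, counted=20))
```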

    Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For

    Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be satisfied by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation. However, “subject-centric explanations” (SCEs), which focus on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations), in dodging developers’ worries about intellectual property or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments, and certification and privacy seals, may hold the seeds we can use to make algorithms more responsible, explicable, and human-centered.
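    As a concrete, purely illustrative sketch of the “pedagogical” style of explanation mentioned above, the snippet below learns a small surrogate decision tree from a black-box model's predictions in a neighbourhood around a single query point, rather than taking the model apart. The dataset, models, neighbourhood size, and tree depth are all assumptions for illustration; this is not the specific SCE method discussed in the article.

```python
# Minimal sketch of a pedagogical / subject-centric style of explanation:
# query a black-box model around one data point and fit an interpretable
# surrogate to its answers. Everything here is illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=2000, n_features=6, random_state=0)
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

query = X[0]                                   # the data subject's record
rng = np.random.default_rng(0)
# Sample a local neighbourhood by perturbing the query's features.
neighbourhood = query + rng.normal(scale=0.3, size=(500, X.shape[1]))
labels = black_box.predict(neighbourhood)      # ask the black box; don't open it

surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(neighbourhood, labels)
print(export_text(surrogate))                  # a human-readable local rule set
```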

    Research on the System Safety Management in Urban Railway

    Nowadays, rail transport has become one of the most widely used forms of transport thanks to its high safety level, large capacity, and cost-effectiveness. With the continuous development of railway networks, including urban rail transit, one of the major areas of increasing attention and demand is ensuring the safety and risk management of railway operation over the whole life cycle using scientific tools (Martani 2017), particularly in developing countries such as Vietnam. The national mainline railway network in Vietnam has been built and operated entirely on a single narrow gauge (1000 mm) since the previous century, with very few updates to its manual operating technology. Up to now, the conventional technique for managing safe operation in general, and collisions in particular, on the Vietnamese railway system and its subsystems has been accident statistics alone, which is not a science-based tool comparable to the risk identification, analysis, and mitigation methods already available in many countries. Accident management at Vietnam Railways is limited to analysing accident statistics in order to avoid and minimise harm from phenomena that are observed only after an accident has occurred. Statistical analysis of train accident case studies on the Vietnamese railway demonstrates that, because hazards and failures that could result in serious system occurrences (accidents and incidents) have not been identified, recorded, and evaluated to conduct safety-driven risk analysis using a well-suited assessment methodology, risk prevention and control cannot be achieved. Not only is it hard to forecast and avoid events, but the likelihood and magnitude of danger, as well as the severity of the consequences, may also increase. As a result, Vietnam's railway system has a high number of accidents and failures. For example, Vietnam Railways' mainline network accounted for approximately 200 railway accidents in 2018, a 3% increase over the previous year, including 163 collisions between trains and road vehicles or persons, resulting in more than 100 fatalities and more than 150 casualties, and 16 further accidents, mostly derailments and signals passed at danger, without fatality or casualty but with significant damage to rolling stock and track infrastructure (VR 2021). Developing a new standardised framework for the safety management and availability of railway operation in Vietnam is therefore required, in view of the rapid development of urban rail transport in the country in recent years (VmoT 2016; VmoT 2018). UMRT Line HN2A in southwest Hanoi is the country's first elevated light rail transit line, completed and officially put into revenue service in November 2021. To date, it is the first and only railway line in Vietnam for which an operational safety assessment has been launched and is maintained over the whole life cycle. The UMRT in Hanoi has a large capacity, more complex rolling stock and infrastructure equipment, a modern communication-based train control (CBTC) signalling system, and automatic train operation without the need for driver intervention (Lindqvist 2006).
    Developing a compatible, integrated safety management system (SMS) adapted to the safe operating requirements of this UMRT is therefore a major concern, and its adequacy must be demonstrated. In actuality, the system acceptance and safety certification phase for Metro Line HN2A was prolonged by up to 2.5 years owing to non-compliance with safety requirements resulting from inadequate SMS documentation and risk assessment. These faults and hazards arose during the manufacturing and execution of the project; it is impossible to go back in time to correct them, and it is equally impossible to abandon the project without assuming responsibility for its management. At completion, the HN2A metro line will have required an expenditure of up to $868 million, so it is vital to create measures to prevent system failure and assure passenger safety. This dissertation reviews methods to solve the aforementioned challenges and presents a solution blueprint to attain the European standard level of system safety in three phases:
    • Phase 1: applicable to lines currently in operation, such as Metro Line HN2A. Focused on operational and maintenance procedures, as well as a training plan for railway personnel, in order to enhance human performance, and on completing and updating the risk assessment framework for Metro Line HN2A. The dissertation's findings are described in these applications.
    • Phase 2: applicable to lines currently in construction and manufacturing, such as Metro Line HN3, Line HN2, HCMC Line 1 and Line 2. Continue refining and enhancing the engineering management methods introduced during Phase 1. On the basis of the risk assessments by manufacturers (Line HN3 and HCMC Line 2 have European manufacturers) and the risk assessment framework described in Chapter 4, a risk management plan for each line will be developed, together with an accident database for risk assessment research and development.
    • Phase 3: applicable to lines currently in planning. Enhance safety requirements and life-cycle management, and build a proactive safety culture step by step for the railway industry.
    This material is implemented gradually throughout all three phases, beginning with the creation of the concept and concluding with an improvement in the attitude of railway personnel on the HN2A line. In addition to this overview, Chapters 4 through 9 of the dissertation present particular solutions for risk assessment, vehicle and infrastructure maintenance methods, incident management procedures, and the installation of a safety culture. This document focuses on constructing a system safety concept for railway personnel, providing stringent and scientific management practices to assure proper engineering conditions, to manage the metro line system effectively, and to ensure passenger safety in Hanoi's metro operation.
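    As a small, generic illustration of the kind of hazard identification and risk classification such a framework formalises, the sketch below scores hazards on likelihood and severity and classifies them against a simple tolerability matrix. The categories, thresholds, and example hazards are assumptions for illustration only, not the dissertation's risk assessment framework.

```python
# Illustrative hazard-log sketch: score each identified hazard on likelihood
# and severity, then classify it with a simple tolerability matrix.
# All categories, thresholds, and hazards below are assumed for illustration.

LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}
SEVERITY = {"insignificant": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def risk_class(likelihood, severity):
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score >= 15:
        return "intolerable"    # must be eliminated or controlled before operation
    if score >= 8:
        return "undesirable"    # accept only with senior management approval
    if score >= 4:
        return "tolerable"      # accept with adequate review and monitoring
    return "negligible"         # accept without further action

hazard_log = [
    ("level-crossing collision", "occasional", "catastrophic"),
    ("signal passed at danger", "remote", "critical"),
    ("platform overcrowding", "probable", "marginal"),
]
for hazard, likelihood, severity in hazard_log:
    print(f"{hazard:26s} {likelihood:11s} {severity:13s} -> {risk_class(likelihood, severity)}")
```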

    Artificial markets and intelligent agents

    Thesis (Ph.D.) -- Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2001. Includes bibliographical references (p. 173-178). In many studies of market microstructure, theoretical analysis quickly becomes intractable for all but the simplest stylized models. This thesis considers two alternative approaches, namely the use of experiments with human subjects and simulations with intelligent agents, to address some of the limitations of theoretical modeling. The thesis aims to study the design, development, and characterization of artificial markets as well as the behaviors and strategies of intelligent trading and market-making agents. Simulations and experiments are conducted to study information aggregation and dissemination in a market. A number of features of the market dynamics are examined: the price efficiency of the market, the speed at which prices converge to the rational expectations equilibrium price, and the learning dynamics of traders who possess diverse information or preferences. By constructing simple intelligent agents, not only am I able to replicate several findings of human-based experiments, but I also find intriguing differences between agent-based and human-based experiments. The importance of liquidity in securities markets motivates considerable interest in studying the behavior of market-makers. A rule-based market-maker with multiple objectives, including maintaining a fair and orderly market, maximizing profit, and minimizing inventory risk, is constructed and tested on historical transaction data. Following the same design, an adaptive market-maker is modeled in the framework of reinforcement learning; the agent is shown to be able to adapt its strategies to different noisy market environments. By Tung Chan.
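    To give a flavour of what a rule-based market-making agent of the kind described above might look like, the sketch below quotes around a noisy reference price and skews its quotes with inventory so that fills tend to push the position back toward zero. The synthetic order flow, parameters, and quoting rule are illustrative assumptions, not the thesis's agent or its historical-data test.

```python
# Toy rule-based market-maker: quote bid/ask around a reference price and
# skew the quotes with inventory to control inventory risk. Arriving traders
# have random private valuations. Everything here is an illustrative assumption.
import random

random.seed(0)
mid, inventory, cash = 100.0, 0, 0.0
half_spread, skew_per_unit, max_inventory = 0.05, 0.01, 50

for _ in range(10_000):
    mid += random.gauss(0, 0.02)                 # exogenous reference-price noise
    skew = skew_per_unit * inventory             # long inventory -> shift quotes down
    bid, ask = mid - half_spread - skew, mid + half_spread - skew

    side = random.choice(("buy", "sell"))        # an arriving trader
    reservation = mid + random.gauss(0, 0.1)     # the trader's private valuation
    if side == "sell" and bid >= reservation and inventory < max_inventory:
        inventory += 1                           # trader sells to us at our bid
        cash -= bid
    elif side == "buy" and ask <= reservation and inventory > -max_inventory:
        inventory -= 1                           # trader buys from us at our ask
        cash += ask

print(f"final inventory = {inventory}, marked-to-market P&L = {cash + inventory * mid:.2f}")
```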

    A model-based approach to System of Systems risk management

    The failure of many System of Systems (SoS) enterprises can be attributed to the inappropriate application of traditional Systems Engineering (SE) processes within the SoS domain, because of the mistaken belief that a SoS can be regarded as a single large, or complex, system. SoS Engineering (SoSE) is a sub-discipline of SE; Risk Management and Modelling and Simulation (M&S) are key areas within SoSE, both of which also lie within the traditional SE domain. Risk Management of SoS requires a different approach to that currently taken for individual systems; if risk is managed for each component system, it cannot be assumed that the aggregated effect will be to mitigate risk at the SoS level. A literature review was undertaken examining three themes: (1) SoS Engineering (SoSE), (2) M&S, and (3) Risk. Theme 1 provided insight into the activities comprising SoSE and its differences from traditional SE, with risk management identified as a key activity. The second theme discussed the application of M&S to SoS, supporting the identification of appropriate techniques and concluding that the inherent complexity of a SoS requires the use of M&S to support SoSE activities. Current risk management approaches were reviewed in theme 3, as well as the management of SoS risk. Although some specific examples of the management of SoS risk were found, no mature, general approach was identified, indicating a gap in current knowledge; however, most of these examples were underpinned by M&S approaches. It was therefore concluded that a general approach to SoS risk management utilising M&S methods would be of benefit. To fill the gap identified in current knowledge, this research proposed a new model-based approach to Risk Management in which risk identification is supported by a framework combining SoS system of interest (SoI) dimensions with holistic risk types, and the resulting risks and contributing factors are captured in a causal network. Analysis of the causal network using a model technique selection tool, developed as part of this research, allowed the causal network to be simplified through the replacement of groups of elements within the network by appropriate supporting models. The Bayesian Belief Network (BBN) was identified as a suitable method to represent SoS risk. Supporting models run in Monte Carlo simulations allowed data to be generated from which the risk BBNs could learn, thereby providing a more quantitative approach to SoS risk management. A method was developed which provided context to the BBN risk output through comparison with worst- and best-case risk probabilities. The model-based approach to Risk Management was applied to two very different case studies, Close Air Support mission planning and the wheat supply chain (UK national food security risks), demonstrating its effectiveness and adaptability. The research established that the SoS SoI is essential for effective SoS risk identification and analysis of risk transfer; that effective SoS modelling requires a range of techniques whose suitability is determined by the problem context; that the responsibility for SoS Risk Management is related to the overall SoS classification; and that the model-based approach to SoS risk management was effective for both application case studies.
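    As a minimal illustration of learning risk-network probabilities from Monte Carlo runs of supporting models, the sketch below simulates two binary contributing factors and a risk event, then estimates the conditional probability table by counting outcomes, which is the kind of data a BBN node could be trained on. The factors, probabilities, and toy supporting model are assumptions for illustration, not either case study.

```python
# Illustrative sketch: estimate P(risk event | contributing factors) from
# Monte Carlo runs of a toy "supporting model". All probabilities are assumed.
import random
from collections import defaultdict

random.seed(1)
counts = defaultdict(lambda: [0, 0])      # (factor_a, factor_b) -> [runs, risk events]

for _ in range(100_000):
    factor_a = random.random() < 0.20     # e.g. a component system is degraded
    factor_b = random.random() < 0.10     # e.g. a communication link is lost
    # Toy supporting model: the risk event becomes more likely as factors combine.
    p_event = 0.02 + 0.30 * factor_a + 0.50 * factor_b
    event = random.random() < p_event
    counts[(factor_a, factor_b)][0] += 1
    counts[(factor_a, factor_b)][1] += event

for (a, b), (runs, events) in sorted(counts.items()):
    print(f"P(risk | A={a!s:5s}, B={b!s:5s}) ~ {events / runs:.3f}  ({runs} runs)")
```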

    An Optimisation-based Framework for Complex Business Process: Healthcare Application

    The Irish healthcare system is currently facing major pressures due to rising demand, caused by population growth, ageing, and high expectations of service quality. This pressure creates a need for support from research institutions in dealing with decision areas such as resource allocation and performance measurement. While approaches such as modelling, simulation, multi-criteria decision analysis, performance management, and optimisation can, when applied skilfully, improve healthcare performance, they represent just one part of the solution. Accordingly, to achieve significant and sustainable performance, this research aims to develop a practical yet effective optimisation-based framework for managing complex processes in the healthcare domain. Through an extensive review of the literature on the aforementioned solution techniques, the limitations of using each technique on its own are identified in order to define a practical integrated approach toward developing the proposed framework. During the framework validation phase, real-time strategies were optimised to address Emergency Department performance issues in a major hospital. Results show the potential for a significant reduction in patients' average length of stay (48% of average patient throughput time) whilst reducing over-reliance on overstretched nursing resources, with staff utilisation increasing by between 7% and 10%. Given the high uncertainty in healthcare service demand, the integrated framework allows decision makers to find optimal staff schedules that improve Emergency Department performance. The proposed optimum staff schedule reduces the average waiting time of patients by 57% and reduces the proportion of patients who leave without treatment from 17% to 8%. The developed framework has been implemented by the hospital partner with a high level of success.
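    As a sketch of the kind of staff-scheduling optimisation such a framework might wrap, the example below chooses how many nurses start on each shift so that forecast hourly demand is covered at minimum total headcount, using the open-source PuLP modeller. The shift patterns, demand figures, and objective are illustrative assumptions, not the hospital's actual model or data.

```python
# Illustrative shift-coverage model: minimise rostered nurses subject to
# covering an assumed hourly demand forecast. Requires `pip install pulp`.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus

hours = range(24)
demand = [3, 2, 2, 2, 3, 4, 6, 8, 9, 9, 8, 8, 8, 8, 7, 7, 8, 9, 9, 8, 6, 5, 4, 3]
shift_starts = [0, 6, 12, 18]                       # four overlapping 8-hour shifts
covers = {s: [(h - s) % 24 < 8 for h in hours] for s in shift_starts}

prob = LpProblem("ed_nurse_schedule", LpMinimize)
staff = {s: LpVariable(f"start_{s:02d}", lowBound=0, cat="Integer") for s in shift_starts}
prob += lpSum(staff.values())                       # minimise total nurses rostered
for h in hours:                                     # cover forecast demand in every hour
    prob += lpSum(staff[s] for s in shift_starts if covers[s][h]) >= demand[h]

prob.solve()
print(LpStatus[prob.status], {s: int(staff[s].varValue) for s in shift_starts})
```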

    How markets slowly digest changes in supply and demand

    In this article we revisit the classic problem of tatonnement in price formation from a microstructure point of view, reviewing a recent body of theoretical and empirical work explaining how fluctuations in supply and demand are slowly incorporated into prices. Because revealed market liquidity is extremely low, large orders to buy or sell can only be traded incrementally, over periods of time as long as months. As a result, order flow is a highly persistent long-memory process. Maintaining compatibility with market efficiency has profound consequences for price formation, the dynamics of liquidity, and the nature of impact. We review a body of theory that makes detailed quantitative predictions about the volume and time dependence of market impact, the bid-ask spread, order book dynamics, and volatility. Comparisons to data yield some encouraging successes. This framework suggests a novel interpretation of financial information, in which agents are at best only weakly informed and all have a similar and extremely noisy impact on prices. Most of the processed information appears to come from supply and demand itself, rather than from external news. The ideas reviewed here are relevant to market microstructure regulation, agent-based models, cost-optimal execution strategies, and understanding market ecologies. (111 pages, 24 figures)
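    As a small illustration of the long-memory order flow discussed above, the sketch below generates order signs from a toy order-splitting process (each new order either starts afresh or copies the sign of an earlier order at a heavy-tailed lag) and then measures the autocorrelation of the sign series at increasing lags. The mechanism and parameters are illustrative assumptions, not the calibrated models reviewed in the article.

```python
# Toy long-memory order-sign series via order splitting, plus its empirical
# sign autocorrelation at several lags. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
signs = np.empty(n)
signs[0] = 1.0
for t in range(1, n):
    if rng.random() < 0.1:                        # occasionally start a fresh metaorder
        signs[t] = rng.choice([-1.0, 1.0])
    else:                                         # otherwise continue splitting an old one
        lag = min(t, int(rng.pareto(0.5)) + 1)    # heavy-tailed look-back
        signs[t] = signs[t - lag]

def sign_autocorrelation(x, lag):
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

for lag in (1, 10, 100, 1000):
    print(f"lag {lag:5d}: sign autocorrelation ~ {sign_autocorrelation(signs, lag):.3f}")
```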