    Worst-Case Analysis of Process Flexibility Designs

    Theoretical studies of process flexibility designs have mostly focused on expected sales. In this paper, we take a different approach by studying process flexibility designs from the worst-case point of view. To study the worst-case performances, we introduce the plant cover indices (PCIs), defined by bottlenecks in flexibility designs containing a fixed number of products. We prove that given a flexibility design, a general class of worst-case performance measures can be expressed as functions of the design’s PCIs and the given uncertainty set. This result has several major implications. First, it suggests a method to compare the worst-case performances of different flexibility designs without the need to know the specifics of the uncertainty sets. Second, we prove that under symmetric uncertainty sets and a large class of worst-case performance measures, the long chain, a celebrated sparse design, is superior to a large class of sparse flexibility designs, including any design that has a degree of two on each of its product nodes. Third, we show that under stochastic demand, the classical Jordan and Graves (JG) index can be expressed as a function of the PCIs. Furthermore, the PCIs motivate a modified JG index that is shown to be more effective in our numerical study. Finally, the PCIs lead to a heuristic for finding sparse flexibility designs that perform well under expected sales and have lower risk measures in our computational study.
    Funding: National Science Foundation (U.S.) (Grant CMMI-0758069); Masdar Institute of Science and Technology; Ford-MIT Alliance; Natural Sciences and Engineering Research Council of Canada (Postgraduate Scholarship)
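    As an illustration of the flexibility structures discussed above (this is a sketch, not code from the paper), the following evaluates a dedicated design and the long chain by the maximum sales each can achieve under one hypothetical demand scenario, using a small transportation LP. The plant capacities and the demand vector are invented for the example.

```python
# Sketch: score a flexibility design by the maximum sales it supports under a
# demand scenario, via a transportation LP. The long-chain construction and the
# demand figures are hypothetical examples, not data from the paper.
import numpy as np
from scipy.optimize import linprog

def max_sales(arcs, capacity, demand):
    """Maximise total flow on plant-product arcs subject to plant capacities
    and product demands (a small transportation LP)."""
    n_arcs = len(arcs)
    n_plants, n_products = len(capacity), len(demand)
    A = np.zeros((n_plants + n_products, n_arcs))
    for k, (i, j) in enumerate(arcs):
        A[i, k] = 1.0             # flow out of plant i
        A[n_plants + j, k] = 1.0  # flow into product j
    b = np.concatenate([capacity, demand])
    res = linprog(c=-np.ones(n_arcs), A_ub=A, b_ub=b, bounds=(0, None))
    return -res.fun

n = 4
capacity = np.full(n, 100.0)
dedicated = [(i, i) for i in range(n)]                          # each plant serves one product
long_chain = dedicated + [(i, (i + 1) % n) for i in range(n)]   # plus the "next" product, closing a chain

demand = np.array([160.0, 60.0, 60.0, 120.0])  # one hypothetical unbalanced scenario
print("dedicated :", max_sales(dedicated, capacity, demand))    # 320.0
print("long chain:", max_sales(long_chain, capacity, demand))   # 400.0
```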

    A Probabilistic Approach for Compressor Sizing and Plant Design

    Equipment sizing decisions in the oil and gas industry often have to be made based on incomplete data. The exact process conditions often rest on numerous assumptions about well performance, market conditions, environmental conditions and other factors. Since the ultimate goal is to meet production commitments, the traditional way of addressing this is to use worst-case conditions, often with additional margins added on top. This invariably leads to plants that are oversized, in some instances by large margins. In reality, operating conditions are very rarely the assumed worst-case conditions and are more benign most of the time. Plants designed for worst-case conditions therefore usually do not operate at optimum conditions once in service, have reduced flexibility, and incur both higher capital and operating expenses. The authors outline a new probabilistic methodology that provides a framework for more intelligent process-machine designs. A standardized framework using Monte Carlo simulation and risk analysis is presented that more accurately defines process uncertainty and its impact on machine performance. This paper describes a new method for the design of efficient plants. Statistical and probabilistic tools make it possible to better account for the unpredictability of component performance, as well as for ambient conditions and demand. The methodology allows designing plants that perform best under the most likely scenarios, as opposed to traditional designs that tend to work best under unlikely worst-case scenarios. A study was performed for a relatively simple scenario, but the method is not limited to it and can easily be adapted to scenarios involving entire pipeline systems, complete plants, or platform operations. Based on these considerations, significant cost reductions are possible in many cases.
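    The sketch below illustrates the general idea of probabilistic sizing (it is not the authors' tool): operating conditions are sampled from assumed distributions, a simplified adiabatic head and power model converts them into a required compressor power, and a design point is read off a percentile of the resulting distribution instead of the sampled worst case. All distributions and gas properties are hypothetical placeholders for a real process model.

```python
# Sketch: Monte Carlo sizing of a single compressor stage under assumed
# distributions for the operating conditions. Gas properties, efficiencies and
# the simplified adiabatic head/power formulas are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Uncertain operating conditions (assumed distributions).
T_in  = rng.normal(300.0, 10.0, N)           # suction temperature, K
m_dot = rng.triangular(30.0, 45.0, 60.0, N)  # mass flow, kg/s
p_rat = rng.uniform(2.5, 3.5, N)             # discharge/suction pressure ratio

# Simplified gas model (placeholder values).
k, R, MW, eta = 1.3, 8.314, 0.019, 0.80      # cp/cv, J/(mol K), kg/mol, efficiency

head  = (R / MW) * T_in * k / (k - 1.0) * (p_rat ** ((k - 1.0) / k) - 1.0)  # J/kg
power = m_dot * head / eta / 1e6                                            # MW

print(f"worst case in sample : {power.max():6.1f} MW")
print(f"95th percentile      : {np.percentile(power, 95):6.1f} MW")
print(f"most likely (median) : {np.median(power):6.1f} MW")
```

    Sizing to the 95th percentile rather than the sampled worst case is the kind of trade-off the paper argues for: a small, quantified risk of shortfall in exchange for a machine that runs near its best efficiency point most of the time.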

    Adaptive Survival Trials

    Mid-study design modifications are becoming increasingly accepted in confirmatory clinical trials, so long as appropriate methods are applied such that error rates are controlled. It is therefore unfortunate that the important case of time-to-event endpoints is not easily handled by the standard theory. We analyze current methods that allow design modifications to be based on the full interim data, i.e., not only the observed event times but also secondary endpoint and safety data from patients who are yet to have an event. We show that the final test statistic may ignore a substantial subset of the observed event times. Since it is the data corresponding to the earliest recruited patients that is ignored, this neglect becomes egregious when there is specific interest in learning about long-term survival. An alternative test incorporating all event times is proposed, where a conservative assumption is made in order to guarantee type I error control. We examine the properties of our proposed approach using the example of a clinical trial comparing two cancer therapies.
    Comment: 22 pages, 7 figures
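    For background, the sketch below shows the standard inverse-normal combination test that much of the adaptive-design literature builds on; it is not the specific test proposed in this paper, and the stage-wise z-statistics and weights are hypothetical.

```python
# Sketch: two-stage inverse-normal combination test, a standard building block
# for adaptive designs. Controls one-sided type I error at alpha under the usual
# assumption of independent stage-wise statistics and prespecified weights --
# exactly the assumption that becomes delicate with time-to-event data.
from scipy.stats import norm

def inverse_normal_combination(z1, z2, w1=0.6, alpha=0.025):
    """Combine independent stage-wise z-statistics with prespecified weights."""
    w2 = (1.0 - w1 ** 2) ** 0.5        # weights satisfy w1^2 + w2^2 = 1
    z = w1 * z1 + w2 * z2
    return z, z > norm.ppf(1.0 - alpha)

# Example: interim log-rank z = 1.1, second-stage log-rank z = 1.9 (hypothetical).
z, reject = inverse_normal_combination(1.1, 1.9)
print(f"combined z = {z:.3f}, reject H0: {reject}")
```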

    Spatially Selective Artificial-Noise Aided Transmit Optimization for MISO Multi-Eves Secrecy Rate Maximization

    Consider a MISO channel overheard by multiple eavesdroppers. Our goal is to design an artificial noise (AN)-aided transmit strategy such that the achievable secrecy rate is maximized subject to the sum power constraint. AN-aided secure transmission has recently been found to be a promising approach for blocking eavesdropping attempts. In many existing studies, the confidential information transmit covariance and the AN covariance are not simultaneously optimized. In particular, for design convenience, it is common to prefix the AN covariance as a specific kind of spatially isotropic covariance. This paper considers joint optimization of the transmit and AN covariances for secrecy rate maximization (SRM), with the design flexibility that the AN can take any spatial pattern. Hence, the proposed design has the potential to jam the eavesdroppers more effectively, based upon the channel state information (CSI). We derive an optimization approach to the SRM problem through both analysis and convex conic optimization machinery. We show that the SRM problem can be recast as a single-variable optimization problem, and that the resultant problem can be efficiently handled by solving a sequence of semidefinite programs. Our framework deals with a general setup of multiple multi-antenna eavesdroppers, and can cater for additional constraints arising from specific application scenarios, such as interference temperature constraints in interference networks. We also generalize the framework to an imperfect-CSI case where a worst-case robust SRM formulation is considered. A suboptimal but safe solution to the outage-constrained robust SRM design is also investigated. Simulation results show that the proposed AN-aided SRM design yields significant secrecy rate gains over an optimal no-AN design and the isotropic AN design, especially when there are more eavesdroppers.
    Comment: To appear in IEEE Trans. Signal Process., 201
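    To make the objective concrete, the sketch below evaluates the AN-aided achievable secrecy rate for a given transmit covariance and AN covariance against a MISO legitimate channel and several multi-antenna eavesdroppers; the SDP-based SRM optimization itself is not reproduced. The channels, dimensions, and the simple beamforming-plus-isotropic-AN power split are hypothetical choices made only to exercise the rate expression.

```python
# Sketch: achievable secrecy rate [C_bob - max_k C_eve_k]^+ for given transmit
# covariance W and AN covariance Sigma. Channels and power split are hypothetical.
import numpy as np

def secrecy_rate(h, G_list, W, Sigma, sigma2=1.0, sigma2_e=1.0):
    """h: 1 x Nt legitimate MISO channel; G_list: Ne x Nt eavesdropper channels."""
    c_bob = np.log2(1.0 + (h @ W @ h.conj().T).real.item()
                    / (sigma2 + (h @ Sigma @ h.conj().T).real.item()))
    c_eve = []
    for G in G_list:
        Ne = G.shape[0]
        noise = sigma2_e * np.eye(Ne) + G @ Sigma @ G.conj().T
        signal = G @ W @ G.conj().T
        _, logdet = np.linalg.slogdet(np.eye(Ne) + np.linalg.solve(noise, signal))
        c_eve.append(logdet / np.log(2.0))
    return max(c_bob - max(c_eve), 0.0)

rng = np.random.default_rng(1)
Nt, Ne, K, P = 4, 2, 3, 10.0
h = (rng.standard_normal((1, Nt)) + 1j * rng.standard_normal((1, Nt))) / np.sqrt(2)
G_list = [(rng.standard_normal((Ne, Nt)) + 1j * rng.standard_normal((Ne, Nt))) / np.sqrt(2)
          for _ in range(K)]

# Simple (suboptimal) choice: beamform along h with 80% of the power and spread
# the remaining 20% as isotropic AN -- only to exercise the rate expression.
w = h.conj().T / np.linalg.norm(h)
W = 0.8 * P * (w @ w.conj().T)
Sigma = 0.2 * P / Nt * np.eye(Nt)
print(f"secrecy rate: {secrecy_rate(h, G_list, W, Sigma):.3f} bits/s/Hz")
```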

    Design implications of the new harmonised probabilistic damage stability regulations

    In anticipation of the forthcoming new harmonised regulations for damage stability, SOLAS Chapter II-1, proposed in IMO MSC 80 and due for enforcement in 2009, a number of ship owners and, consequently, yards and classification societies are venturing to exploit the new degrees of freedom afforded by the probabilistic concept of ship subdivision. In this process, designers are finding it rather difficult to move away from the prescriptive mindset that has been deeply ingrained in their way of conceptualising, creating and completing a ship design. Total freedom, it appears, is hard to cope with, and a helping hand is needed to guide them in crossing the line from prescriptive to goal-setting design. This will be facilitated considerably by improved understanding of what the probabilistic concept entails and of its limitations and range of applicability. This paper represents an attempt in this direction, based on the results of a research study, financed by the Maritime and Coastguard Agency in the UK, to assess the design implications of the new harmonised rules on passenger and cargo ships.
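    For readers less familiar with the probabilistic concept, its core calculation is the attained subdivision index A, a probability-weighted sum of survival factors over damage cases, which must meet the required index R. The toy sketch below uses invented p_i, s_i and R values; real calculations follow the detailed SOLAS Ch. II-1 formulations for p_i, s_i and R.

```python
# Sketch: attained subdivision index A = sum_i p_i * s_i, compared against the
# required index R. Damage cases and all numbers are invented for illustration.
damage_cases = [
    # (p_i: probability that this damage case occurs,
    #  s_i: probability of surviving it, from residual stability)
    (0.12, 1.00),
    (0.09, 0.95),
    (0.15, 0.80),
    (0.20, 0.60),
    (0.18, 0.40),
    (0.26, 0.30),
]

A = sum(p * s for p, s in damage_cases)   # attained subdivision index
R = 0.55                                  # required index (ship- and rule-dependent)
print(f"A = {A:.3f}, R = {R:.2f}, compliant: {A >= R}")
```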