    Data-driven Approximation of Distributionally Robust Chance Constraints using Bayesian Credible Intervals

    The non-convexity and intractability of distributionally robust chance constraints make them challenging to handle. From a data-driven perspective, we propose formulating them as a robust optimization problem to ensure that the distributionally robust chance constraint is satisfied with high probability. To incorporate available data and prior distributional knowledge, we construct ambiguity sets for the distributionally robust chance constraint using Bayesian credible intervals. We establish the correspondence between the ambiguity set in Bayesian distributionally robust chance constraints and the uncertainty set in a specific robust optimization problem. In contrast to most existing uncertainty set construction methods, which are applicable only in particular settings, our approach provides a unified framework for constructing uncertainty sets under different marginal distribution assumptions, making it more flexible and widely applicable. Additionally, under a concavity assumption, our method provides strong finite-sample probability guarantees for optimal solutions. The practicality and effectiveness of our approach are illustrated with numerical experiments on portfolio management and queuing system problems. Overall, our approach offers a promising solution to distributionally robust chance-constrained problems and has potential applications in other fields.
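
    As an illustration of the ambiguity-set construction described in the abstract, the following Python sketch builds a Bayesian credible interval for an uncertain mean return and uses its worst-case endpoint in a robust surrogate of a chance constraint. It assumes a normal likelihood with known variance and a conjugate normal prior; all names, priors and numbers are illustrative assumptions, not values or notation taken from the paper.

    # Minimal sketch: a Bayesian credible interval for an uncertain mean return,
    # used as the interval-type ambiguity set of a robust chance constraint.
    # Assumes a normal likelihood with known variance and a conjugate normal
    # prior; everything here is illustrative, not the paper's formulation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.05, scale=0.02, size=50)   # observed asset returns

    # Conjugate normal-normal update for the unknown mean (known sigma).
    sigma = 0.02                  # assumed known observation std
    mu0, tau0 = 0.0, 0.1          # prior mean and prior std
    n = data.size
    post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + data.sum() / sigma**2)

    # 95% credible interval for the mean -> interval ambiguity set for the mean.
    lo, hi = stats.norm.interval(0.95, loc=post_mean, scale=np.sqrt(post_var))

    # Robust surrogate of the chance constraint P(w * r >= target) >= 1 - eps:
    # enforce it under the worst-case mean inside the credible interval.
    w, target, eps = 1.0, 0.0, 0.05
    worst_mean = lo if w >= 0 else hi
    z = stats.norm.ppf(1 - eps)
    feasible = w * worst_mean - z * abs(w) * sigma >= target
    print(f"credible interval for mean: [{lo:.4f}, {hi:.4f}], robustly feasible: {feasible}")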

    Short interval control for the cost estimate baseline of novel high value manufacturing products – a complexity based approach

    Novel high value manufacturing products by default lack the minimum a priori data needed for forecasting cost variance over time using regression-based techniques. Forecasts which attempt to do so therefore suffer from significant variance, which in turn places significant strain on budgetary assumptions and financial planning. The authors argue that for novel high value manufacturing products, short interval control through continuous revision is necessary until the context of the baseline estimate stabilises sufficiently for the revision intervals to be extended. Case study data from the United States Department of Defense Scheduled Annual Summary Reports (1986-2013) is used to exemplify the approach. In this respect it must be remembered that the context of a baseline cost estimate is subject to a large number of assumptions regarding future plausible scenarios, the probability of such scenarios, and various related requirements. These assumptions change over time, and the degree of their change is indicated by the extent to which cost variance follows a forecast propagation curve defined in advance. The presented approach determines the stability of this context by calculating the effort required to identify a propagation pattern for cost variance using the principles of Kolmogorov complexity. Only when that effort remains stable over a sufficient period of time can the revision periods for the cost estimate baseline be changed from continuous to discrete time intervals. The practical implication of the presented approach for novel high value manufacturing products is that attention shifts from the bottom-up or parametric estimation activity to the continuous management of the context of the cost estimate itself. This in turn enables a faster and more sustainable stabilisation of the estimating context, which then creates the conditions for reducing cost estimate uncertainty in an actionable and timely manner.
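
    The stability test can be sketched in the spirit of the approach described above: since Kolmogorov complexity is uncomputable, the compressed length of the discretised cost-variance series serves as a proxy for the effort needed to identify a propagation pattern, and revision intervals are relaxed only once that proxy stays stable. The window size, bin count and threshold below are illustrative assumptions rather than values from the paper.

    # Minimal sketch: compression length as a proxy for Kolmogorov complexity
    # of the cost-variance series; revisions are relaxed when the proxy is stable.
    import zlib
    import numpy as np

    def complexity_proxy(series, n_bins=16):
        """Approximate description length of a discretised cost-variance series."""
        bins = np.linspace(series.min(), series.max(), n_bins + 1)
        symbols = np.digitize(series, bins).astype(np.uint8).tobytes()
        return len(zlib.compress(symbols, level=9))

    def revision_can_be_discretised(variances, window=12, tol=0.05):
        """True when the complexity proxy has been stable over recent windows."""
        proxies = [complexity_proxy(np.asarray(variances[i:i + window]))
                   for i in range(0, len(variances) - window + 1, window)]
        if len(proxies) < 3:
            return False                                 # not enough history yet
        recent = np.array(proxies[-3:], dtype=float)
        return recent.std() / recent.mean() <= tol       # effort stable -> relax revisions

    # Illustrative use with synthetic cost-variance data.
    rng = np.random.default_rng(1)
    cost_variance = np.cumsum(rng.normal(0, 1, 120))
    print(revision_can_be_discretised(cost_variance))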

    Evaluation and management implications of uncertainty in a multispecies size-structured model of population and community responses to fishing

    1. Implementation of an ecosystem approach to fisheries requires advice on trade-offs among fished species and between fisheries yields and biodiversity or food web properties. However, the lack of explicit representation, analysis and consideration of uncertainty in most multispecies models has limited their application in analyses that could support management advice.
    2. We assessed the consequences of parameter uncertainty by developing 78,125 multispecies size-structured fish community models, with all combinations of parameters drawn from ranges that spanned parameter values estimated from data and the literature. This unfiltered ensemble was reduced to 188 plausible models, the filtered ensemble (FE), by screening outputs against fish abundance data and ecological principles such as requiring species' persistence.
    3. Effects of parameter uncertainty on estimates of single-species management reference points for fishing mortality (F_MSY, the fishing mortality rate providing MSY, the maximum sustainable yield) and biomass (B_MSY, the biomass at MSY) were evaluated by calculating probability distributions of the estimated reference points with the FE. There was a 50% probability that multispecies F_MSY could be estimated to within ±25% of its actual value, and a 50% probability that B_MSY could be estimated to within ±40% of its actual value.
    4. The signal-to-noise ratio was assessed for four community indicators when mortality rates were reduced from current rates to F_MSY. The slope of the community size spectrum showed the greatest signal-to-noise ratio, indicating that it would be the indicator most responsive to the change in fishing mortality F. Further, the power of an ongoing international monitoring survey to detect predicted responses of the size spectrum slope was higher than for other size-based metrics.
    5. Synthesis and applications: Application of the ensemble model approach allows explicit representation of parameter uncertainty and supports advice and management by (i) providing uncertainty intervals for management reference points, (ii) estimating working values of reference points that achieve a defined reduction in the risk of breaching the true reference point, (iii) estimating the responsiveness of population, community, food web and biodiversity indicators to changes in F, (iv) assessing the performance of indicators and monitoring programmes and (v) identifying priorities for data collection and changes to model structure to reduce uncertainty.
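
    A minimal sketch of the unfiltered/filtered ensemble workflow follows: every combination of parameter levels is generated, runs failing simple plausibility screens are discarded, and a reference point is summarised over the survivors. The toy model, the screening rules and the use of only three parameters (the study uses seven, giving 5^7 = 78,125 combinations) are illustrative assumptions.

    # Minimal sketch: build every combination of parameter levels, keep runs
    # passing simple plausibility screens, summarise a reference point over them.
    import itertools
    import numpy as np

    # Three parameters at five levels each -> 125 candidate models
    # (the paper uses seven parameters, giving 5**7 = 78,125 combinations).
    levels = {
        "growth":    np.linspace(0.2, 1.0, 5),
        "mortality": np.linspace(0.1, 0.5, 5),
        "carrying":  np.linspace(1.0, 5.0, 5),
    }

    def toy_model(growth, mortality, carrying):
        """Stand-in for the size-structured model: returns (biomass, F_MSY-like value)."""
        biomass = carrying * max(growth - mortality, 0.0) / growth
        fmsy = 0.5 * (growth - mortality)
        return biomass, fmsy

    unfiltered = list(itertools.product(*levels.values()))
    filtered = []
    for params in unfiltered:
        biomass, fmsy = toy_model(*params)
        if biomass > 0.1 and fmsy > 0:        # screens: persistence and a positive F_MSY
            filtered.append(fmsy)

    fmsy_values = np.array(filtered)
    print(f"{len(filtered)}/{len(unfiltered)} models retained")
    print("F_MSY 25th/50th/75th percentiles:", np.percentile(fmsy_values, [25, 50, 75]))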

    Multiobjective Tactical Planning under Uncertainty for Air Traffic Flow and Capacity Management

    We investigate a method to deal with congestion of sectors and delays in the tactical phase of air traffic flow and capacity management. It relies on temporal objectives given for every point of the flight plans and shared among the controllers in order to create a collaborative environment. This would facilitate the transition from the network view of flow management to the local view of air traffic control. Uncertainty is modeled at the trajectory level with temporal information on the boundary points of the crossed sectors, from which we infer the probabilistic occupancy count. We can therefore take the accuracy of the trajectory prediction into account in the optimization process and set safety margins accordingly. On the one hand, the more accurate our prediction, the tighter the safety margins and the more efficient the proposed solutions. On the other hand, when uncertainty is not negligible, the proposed solutions will be more robust to disruptions. Furthermore, a multiobjective algorithm is used to find the trade-off between delays and congestion, which are antagonistic in airspace with high traffic density. The flow management position can then choose the adequate solution, either manually or automatically with a preference-based algorithm. The method is tested on two instances, one with 10 flights and 5 sectors and one with 300 flights and 16 sectors.
    Comment: IEEE Congress on Evolutionary Computation (2013). arXiv admin note: substantial text overlap with arXiv:1309.391
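
    The probabilistic occupancy count mentioned above can be sketched as follows: each flight's sector entry time is treated as uncertain (here, normally distributed around its predicted value), and the occupancy at a given instant is the sum of independent presence probabilities, whose mean and Poisson-binomial variance indicate how tight the safety margins can be. The distributional choice and all numbers are illustrative assumptions, not the paper's model.

    # Minimal sketch: expected sector occupancy under uncertain entry times.
    import numpy as np
    from scipy import stats

    # (predicted entry time [min], time spent in sector [min], entry-time std [min])
    flights = [(10.0, 12.0, 2.0), (15.0, 8.0, 3.0), (18.0, 10.0, 1.5)]

    def presence_probability(t, entry, duration, std):
        """P(flight occupies the sector at time t) = P(entry <= t <= entry + duration)."""
        return (stats.norm.cdf(t, loc=entry, scale=std)
                - stats.norm.cdf(t - duration, loc=entry, scale=std))

    t = 20.0
    probs = np.array([presence_probability(t, e, d, s) for e, d, s in flights])
    expected_count = probs.sum()                      # mean of the occupancy distribution
    variance = (probs * (1 - probs)).sum()            # Poisson-binomial variance
    print(f"expected occupancy at t={t}: {expected_count:.2f} (+/- {np.sqrt(variance):.2f})")
    # Safety margins on sector capacity can then be tightened or relaxed
    # according to how concentrated this occupancy distribution is.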

    Risk assessment in life-cycle costing for road asset management

    Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies first need to optimise expenditure on data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Secondly, road agencies need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated reliably. Finally, the prediction of budgets for maintenance and rehabilitation must provide a certain degree of reliability. This paper presents the results of case studies using a probability-based method for an integrated approach: assessing the optimal costs of pavement strength data collection, calibrating deterioration prediction models to suit local conditions, and assessing risk-adjusted budget estimates for road maintenance and rehabilitation, in order to assess life-cycle budget estimates. The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates that have a known probability of success (e.g. a budget estimate for a project life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of risk mapping, in which the life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
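
    A risk-adjusted budget estimate of the kind described above can be sketched with a simple Monte Carlo simulation: annual maintenance and rehabilitation costs are sampled over the asset life, and the budget exceeded with only 5% probability is read off the simulated distribution. The cost distributions and parameters below are illustrative assumptions, not values from the case studies.

    # Minimal sketch: Monte Carlo life-cycle cost with a 5% exceedance budget.
    import numpy as np

    rng = np.random.default_rng(42)
    years, n_sims = 25, 10_000

    # Annual maintenance cost: lognormal around A$2m; rehabilitation: ~A$15m event
    # occurring in any year with 8% probability (toy deterioration assumption).
    maintenance = rng.lognormal(mean=np.log(2.0e6), sigma=0.3, size=(n_sims, years))
    rehab_events = rng.random((n_sims, years)) < 0.08
    rehab = rehab_events * rng.normal(15.0e6, 2.0e6, size=(n_sims, years))

    life_cycle_cost = (maintenance + rehab).sum(axis=1)
    p50, p95 = np.percentile(life_cycle_cost, [50, 95])
    print(f"median life-cycle cost: A${p50/1e6:.1f}m")
    print(f"budget with 5% probability of being exceeded: A${p95/1e6:.1f}m")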