121 research outputs found

    Modeling and Robust Design of Networks under Risk: The Case of Information Infrastructure

    Get PDF
    The study of network risks yields insights into methods for building robust networks, which are also critical elements of infrastructures of paramount importance to modern society. In this paper we show how modern quantitative modeling methodologies can be employed for the analysis of network risks and for the design of robust networks under uncertainty. This is done using the example of an important problem arising in building the information infrastructure: the provision of advanced mobile data services. We show how portfolio theory, developed in modern finance, can be used to design a robust provision network comprising independent agents. The modeling frameworks of Bayesian nets and Markov fields are then used to study several problems fundamental to the service adoption process, such as the sensitivity of networks, the direction of improvements, and the propagation of user attitudes on social networks.
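
    The portfolio-theoretic design of a provision network can be sketched with a minimal mean-variance computation. The sketch below applies the textbook minimum-variance portfolio formula to a hypothetical covariance matrix of three providers' revenues; all numbers are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical revenue covariance of three independent service providers
# (illustrative numbers, not from the paper).
cov = np.array([[0.10, 0.02, 0.01],
                [0.02, 0.08, 0.03],
                [0.01, 0.03, 0.12]])

# Minimum-variance portfolio of provision shares: w = C^{-1} 1 / (1' C^{-1} 1).
ones = np.ones(len(cov))
y = np.linalg.solve(cov, ones)
w = y / (ones @ y)

# The diversified portfolio's variance never exceeds that of any single provider.
portfolio_var = w @ cov @ w
```

    Splitting provision across agents in these proportions yields a lower revenue variance than relying on any single provider, which is the diversification effect exploited for robustness.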

    Induced Discounting and Risk Management

    Get PDF
    The goal of this paper is to specify and summarize assumptions and proofs for the new approaches to discounting proposed in our catastrophic risk management studies. The main issue is the justification of investments which may turn into benefits only over long and uncertain time horizons. For example, how can we justify mitigation efforts for an expected 300-year flood that may nevertheless occur next year? Discounting is supposed to impose time preferences that resolve this issue, but this view may be dramatically misleading. We show that any discounted infinite-horizon sum of values can be equivalently replaced by an undiscounted sum of the same values over a random finite time horizon. The expected duration of this stopping-time horizon for standard discount rates obtained from capital markets does not exceed a few decades, and therefore such rates may significantly underestimate the net benefits of long-term decisions. The alternative undiscounted random stopping-time criterion makes it possible to induce social discounting focused on the arrival times of potential extreme events rather than on the horizons of market interests. In general, induced discount rates are conditional on the degree of social commitment to mitigate risk. Random extreme events affect these rates, which alter the optimal mitigation efforts that, in turn, change the events. This endogeneity of the induced discounting precludes the exact evaluations necessary for traditional deterministic methods and calls for stochastic optimisation methods. The paper provides insights into the nature of discounting that are critically important for developing robust long-term risk management strategies.
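
    The core equivalence — a discounted infinite-horizon sum equals the expected undiscounted sum truncated at a geometric random stopping time τ with P(τ > t) = dᵗ — can be checked numerically. The sketch below (value stream and seed are arbitrary) compares the two evaluations by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 0.95                       # discount factor for a ~5% rate (illustrative)
T = 400
v = rng.uniform(0.5, 1.5, T)   # an arbitrary stream of future values

# Discounted sum (truncated at T; the tail d**t is negligible beyond that).
discounted = float(np.sum(d ** np.arange(T) * v))

# The same values summed *undiscounted* up to a geometric stopping time tau
# with P(tau > t) = d**t has the identical expectation, since
# E[sum_{t<tau} v_t] = sum_t P(tau > t) v_t = sum_t d**t v_t.
tau = rng.geometric(1 - d, size=200_000)      # E[tau] = 1/(1 - d) = 20 years
cums = np.concatenate([[0.0], np.cumsum(v)])  # prefix sums of v
stopped = float(cums[np.minimum(tau, T)].mean())
```

    With d = 0.95 the expected stopping time is only 1/(1 − d) = 20 years, which is the "few decades" horizon the abstract refers to.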

    Discounting and catastrophic risk management

    Get PDF
    The risk management of complex coupled human-environmental systems relies essentially on discounting future losses and gains to their present values. These evaluations are used to justify catastrophic risk management decisions which may turn into benefits only over long and uncertain time horizons. Misperception of the proper discounting rates critically affects these evaluations and may be rather misleading. Catastrophes are not properly treated within conventional economic theory, and the lack of proper evaluations dramatically increases the vulnerability of our society to human-made and natural disasters. Underestimation of rare, low-probability high-consequence potentially catastrophic scenarios (events) has led to the growth of buildings and industrial land and to sizable value accumulation in flood-prone (and other disaster-prone) areas without proper attention to flood mitigation. A challenge is that an extreme event, say a 300-year flood which occurs on average only once in 300 years, may never have occurred before in a given region. Purely adaptive policies relying on historical observations therefore provide no awareness of the risk, although a 300-year flood may occur next year. For example, the floods in Austria, Germany and the Czech Republic in 2002 were classified as 1000-, 500-, 250-, and 100-year events, and the Chernobyl nuclear disaster was evaluated as a 10^6-year event. Yet common practice is to ignore such events as improbable within a human lifetime. This paper analyzes the implications of potentially catastrophic events for the choice of discounting in long-term catastrophic risk management. It is shown that arbitrary discounting can be linked to "stopping time" events, which define a discount-related random horizon ("end of the world") for valuations. In other words, any discounting compares potential gains and losses only within a finite, random, discount-related stopping-time horizon. The expected duration of this horizon for standard discount rates obtained from capital markets does not exceed a few decades and, as such, these rates cannot properly evaluate the impacts of 1000-, 500-, 250-, and 100-year catastrophes. The paper demonstrates that the correct discounting can be induced by the concept of stopping time, i.e. by explicit modelling of arrival-time scenarios of potential catastrophes. In general, catastrophic events affect the induced discount rates, which alter the optimal mitigation efforts that, in turn, change the events. The paper shows that stopping-time-related discounting calls for the use of stochastic optimisation methods. Combined with explicit spatio-temporal catastrophe modelling, this induces a discounting that properly focuses risk management solutions on the arrival times of potential catastrophic events rather than on the horizons of capital markets.
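
    The "few decades" claim follows from simple arithmetic: with discount factor d = 1/(1 + r), the stopping-time horizon is geometric with expected duration 1/(1 − d) = (1 + r)/r years. The rates below are illustrative market-typical values:

```python
# Expected duration of the discount-related stopping-time horizon:
# P(tau > t) = d**t with d = 1/(1 + r) gives E[tau] = 1/(1 - d) = (1 + r)/r.
for r in (0.03, 0.05, 0.10):
    horizon = (1 + r) / r
    print(f"rate {r:.0%}: expected horizon {horizon:.1f} years")
```

    Even a 3% rate implies an expected horizon of about 34 years, far short of the 100- to 1000-year return periods of the catastrophes discussed above.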

    Integrated modeling approach to the analysis of food security and sustainable rural developments: Ukrainian case study

    Get PDF
    In Ukraine, the growth of intensive agricultural enterprises focused on fast profits contributes considerably to food insecurity and to increasing socio-economic and environmental risks. Ukraine has important natural and labor resources for effective rural development. For example, more than 50% of food production is still managed by small and medium farms, despite the difficulties associated with economic instability and the lack of proper policy support. The main issue for agricultural policy today is to use these resources in a sustainable way, ensuring robust long-term development of rural communities and agriculture. In this paper, we introduce a stochastic, geographically explicit model for designing forward-looking policies for robust resource allocation and the composition of agricultural production, enhancing food security and rural development. In particular, we investigate the role of investments in rural facilities in stabilizing and enhancing the performance of the agro-food sector in view of uncertainties and incomplete information. The security goals are introduced in the form of multidimensional risk indicators.

    Robust rescaling methods for integrated water, food, energy security management under uncertainty

    Get PDF
    The aim of the paper is to discuss robust non-Bayesian probabilistic cross-entropy-based disaggregation (downscaling) techniques, driven by the need to address local heterogeneities related to secure food, water, and energy provision consistent with available aggregate data and with projections of global and national development trends. For example, aggregate land use projections derived from global economic land use planning models give no insight into potentially critical heterogeneities of local processes. High-spatial-resolution land use and land cover change projections are also required as one of the crucial inputs into Global Circulation Models. Many practical studies analyzing regional developments use cross-entropy minimization as the underlying principle for estimating local processes consistent with available aggregate data. Traditional cross-entropy downscaling relies on a single prior distribution. In reality, prior distributions depend on various "environmental" parameters which may not be known exactly; therefore, instead of a uniquely defined prior there is, in general, a feasible set of such distributions. In this case, the estimation of local changes consistent with available aggregate data can be formulated as a probabilistic inverse problem (from aggregate to local data) in the form of a, in general, stochastic non-convex cross-entropy minimization model. A specific reparametrization convexifies the model, and duality relations yield a numerical procedure for local estimates robust with respect to all priors from the feasible set. The approach is illustrated by downscaling land use change projections of the regional GLOBIOM (Global Biosphere Management Model) model.
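
    With a single prior and only aggregate (group-sum) constraints, the cross-entropy minimizer has a simple closed form: the prior is rescaled within each region to match its reported total. The sketch below illustrates that single-prior baseline, which the paper generalizes to a feasible set of priors; all numbers are illustrative:

```python
import numpy as np

# Hypothetical prior land-use shares over 5 grid cells (illustrative).
prior = np.array([0.30, 0.20, 0.10, 0.25, 0.15])
group = np.array([0, 0, 0, 1, 1])     # cell -> aggregate region
aggregate = np.array([0.45, 0.55])    # reported regional totals

# Minimizing the cross-entropy sum(x * log(x / prior)) subject to the
# group-sum constraints sum_{i in g} x_i = aggregate_g has the closed
# form: rescale the prior within each region to match its aggregate.
prior_group_sum = np.bincount(group, weights=prior)
x = prior * aggregate[group] / prior_group_sum[group]
```

    The downscaled estimate x stays as close as possible (in cross-entropy) to the prior pattern of local heterogeneity while being exactly consistent with the aggregate data.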

    SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget

    Full text link
    In the context of industrial engineering, it is important to integrate efficient computational optimization methods into the product development process. Some of the most challenging simulation-based engineering design optimization problems are characterized by a large number of design variables, the absence of analytical gradients, highly non-linear objectives, and a limited function evaluation budget. Although a huge variety of optimization algorithms is available, the development and selection of efficient algorithms for problems with these industrially relevant characteristics remains a challenge. In this communication, a hybrid variant of Differential Evolution (DE) is introduced which combines aspects of Stochastic Quasi-Gradient (SQG) methods within the framework of DE in order to improve optimization efficiency on problems with the previously mentioned characteristics. The performance of the resulting derivative-free algorithm is compared with other state-of-the-art DE variants on 25 commonly used benchmark functions, under a tight function evaluation budget of 1000 evaluations. The experimental results indicate that the new algorithm performs excellently on the 'difficult' (high-dimensional, multi-modal, inseparable) test functions. The operations used in the proposed mutation scheme are computationally inexpensive and can easily be implemented in existing differential evolution variants or other population-based optimization algorithms with a few lines of program code, as a non-invasive optional setting. Besides the applicability of the presented algorithm by itself, the described concepts can serve as a useful and interesting addition to the algorithmic operators in the frameworks of heuristic and evolutionary optimization and computing.
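
    The exact SQG-DE mutation scheme is not given in the abstract; the following toy variant only illustrates the general idea of orienting a DE difference vector like a stochastic quasi-gradient, pointing from the worse toward the better of two random members. All parameter choices, and the scheme itself, are assumptions for illustration:

```python
import numpy as np

def sqg_de(f, dim=5, pop_size=20, iters=45, F=0.8, CR=0.9, seed=1):
    """Toy DE variant with a quasi-gradient-flavoured mutation
    (an illustrative assumption, not the paper's exact scheme)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(iters):
        best = pop[fit.argmin()].copy()
        for i in range(pop_size):
            a, b = rng.choice(pop_size, size=2, replace=False)
            d = pop[a] - pop[b]
            if fit[a] > fit[b]:          # orient the difference vector
                d = -d                   # from the worse toward the better
            mutant = best + F * d
            cross = rng.random(dim) < CR         # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial < fit[i]:                 # greedy selection
                pop[i], fit[i] = trial, f_trial
    return float(fit.min())

# 20 initial + 45 * 20 trial evaluations = 920, within a 1000-eval budget.
best = sqg_de(lambda x: float(np.sum(x * x)))    # 5-D sphere function
```

    The directed difference only re-signs an existing DE operand, so, as the abstract notes for the real scheme, it costs a few lines of code and can be switched off to recover plain DE.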

    Robust Management of Systemic Risks and Food-Water-Energy-Environmental Security: Two-Stage Strategic-Adaptive GLOBIOM Model

    Get PDF
    Critical imbalances and threshold exceedances can trigger a disruption in a network of interdependent systems. A shock that is insignificant at first glance can induce systemic risks with cascading catastrophic impacts. Systemic risks challenge traditional risk assessment and management approaches. These risks are shaped by systemic interactions, risk exposures, and the decisions of various agents. The paper discusses the need for a two-stage stochastic optimization (STO) approach that enables the design of a robust portfolio of precautionary strategic and operational adaptive decisions, making the interdependent systems flexible and robust with respect to risks of all kinds. We establish a connection between the robust quantile-based non-smooth estimation problem in statistics and the two-stage non-smooth STO problem of robust strategic-adaptive decision-making. The coexistence of complementary strategic and adaptive decisions induces systemic risk aversion in the form of Value-at-Risk (VaR) quantile-based risk constraints. The two-stage robust decision-making is implemented in the large-scale Global Biosphere Management (GLOBIOM) model, showing that robust management of systemic risks can be addressed by solving a system of probabilistic security equations. Selected numerical results emphasize that a robust combination of interdependent strategic and adaptive solutions yields qualitatively new policy recommendations compared to a traditional scenario-by-scenario decision-making analysis.
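
    The stated connection between quantile estimation and two-stage STO can be illustrated with the classical pinball-loss fact: the first-stage decision minimizing an asymmetric, non-smooth expected recourse cost is exactly the q-quantile (Value-at-Risk) of the uncertain outcome. A minimal sketch with simulated demand scenarios (the distribution and security level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
demand = rng.lognormal(mean=3.0, sigma=0.4, size=10_000)  # scenario sample
q = 0.95                                                  # security level

# Expected non-smooth recourse cost of a first-stage decision x:
# shortfall (demand above x) is penalized with weight q,
# surplus (demand below x) with weight 1 - q.
def expected_cost(x):
    u = demand - x
    return np.maximum(q * u, (q - 1) * u).mean()

# Minimize over a grid: the minimizer coincides with VaR_q of demand.
grid = np.linspace(demand.min(), demand.max(), 5_000)
x_star = grid[np.argmin([expected_cost(x) for x in grid])]
var_q = np.quantile(demand, q)
```

    Choosing the strategic decision by minimizing this two-stage cost thus implicitly enforces a VaR-type quantile constraint, which is the risk-aversion mechanism the paper builds on.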

    Dynamic Merge of the Global and Local Models for Sustainable Land Use Planning with Regard for Global Projections from GLOBIOM and Local Technical–Economic Feasibility and Resource Constraints

    Get PDF
    In order to conduct research at the required spatial resolution, we propose a model fusion involving interlinked calculations of regional projections by the global dynamic GLOBIOM (Global Biosphere Management Model) and a robust dynamic downscaling model, based on the cross-entropy principle, for deriving spatially resolved projections. The proposed procedure allows the incorporation of data from satellite images, statistics, and expert opinions, as well as data from global land use models. In numerous case studies in China and Ukraine, the approach made it possible to estimate local land use and land use change projections corresponding to real trends and expectations. The disaggregated data and projections were used in national models for planning sustainable land use and agricultural development.