
    Assessing current and future impacts of climate-related extreme events. The case of Bangladesh

    Extreme events and options for managing these risks are receiving increasing attention in research and policy. To cost these extremes, a standard approach is to use Integrated Assessment Models with global or regional resolution and to represent risk with add-on damage functions that are based on observed impacts and contingent on gradual temperature increase. Such assessments generally find that economic development and population growth are likely to be the major drivers of natural disaster risk in the future; yet little is said about changes in vulnerability, generally considered a key component of risk. Moreover, risk is represented by an estimate of average observed impacts using the statistical expectation. Explicitly accounting for vulnerability and using a fuller risk-analytical framework embedded in a simpler economic model, we study the case of Bangladesh, the most flood-prone country in the world, in order to critically examine the contribution of all drivers to risk. Specifically, we assess projected changes in riverine flood risk in Bangladesh up to the year 2050 and attempt to quantify the relative importance of climate change versus socio-economic change in current and future disaster risk. We find that, while flood frequency and intensity, based on regional climate downscaling, are expected to increase, vulnerability, based on observed behaviour in real events over the last 30 years, can be expected to decrease. Changes in vulnerability and hazard are roughly of similar magnitudes, while uncertainties are large. Overall, we interpret our findings as corroborating the need for a more risk-based approach when assessing extreme-event impacts and adaptation, cognizant of the large associated uncertainties and methodological challenges.
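The abstract's contrast between expectation-based costing and a fuller, distribution-based view of risk can be sketched with a toy Monte Carlo. All frequencies, vulnerabilities, and exposures below are invented for illustration and are not the paper's model; the point is only that the mean and the tail of a loss distribution can move differently when hazard rises while vulnerability falls.

```python
import random
import statistics

random.seed(42)

def simulate_annual_loss(freq, vulnerability, exposure, years=100_000):
    """Toy Monte Carlo of annual flood losses: a binomial event count
    (mean = freq) times an exposure- and vulnerability-scaled severity."""
    losses = []
    for _ in range(years):
        events = sum(1 for _ in range(10) if random.random() < freq / 10)
        loss = sum(random.expovariate(1.0) * vulnerability * exposure
                   for _ in range(events))
        losses.append(loss)
    return losses

# Hypothetical scenarios: hazard frequency and exposure rise,
# vulnerability falls (qualitatively as in the paper's findings).
baseline = simulate_annual_loss(freq=0.5, vulnerability=0.4, exposure=100)
future   = simulate_annual_loss(freq=0.7, vulnerability=0.3, exposure=130)

for name, losses in [("baseline", baseline), ("2050 scenario", future)]:
    losses.sort()
    mean = statistics.mean(losses)
    p99 = losses[int(0.99 * len(losses))]
    print(f"{name}: expected annual loss {mean:.1f}, 99th-percentile loss {p99:.1f}")
```

Reporting only the expectation (the "standard approach" criticized above) would hide how the two scenarios differ in their tails.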

    Incorporating model uncertainty into optimal insurance contract design

    In stochastic optimization models, the optimal solution heavily depends on the selected probability model for the scenarios. However, the scenario models are typically chosen on the basis of statistical estimates and are therefore subject to model error. We demonstrate here how the model uncertainty can be incorporated into the decision making process. We use a nonparametric approach for quantifying the model uncertainty and a minimax setup to find model-robust solutions. The method is illustrated by a risk management problem involving the optimal design of an insurance contract.
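A minimal sketch of the minimax idea, assuming a small discrete ambiguity set of scenario models and a hypothetical deductible-style contract priced against one "fitted" model. The paper's nonparametric construction of the model set is not reproduced here; the numbers are made up to show how the robust deductible can differ from the nominal one.

```python
import statistics

# Three plausible scenario models for a loss (hypothetical numbers;
# the paper derives such a set nonparametrically from data).
scenario_models = {
    "fitted":     [0, 0, 5, 10, 40],
    "heavy_tail": [0, 0, 5, 20, 90],
    "optimistic": [0, 0, 2, 8, 25],
}

LOADING = 1.2  # premium = loading * expected payout under the insurer's fitted model

def premium(deductible):
    payout = statistics.mean(max(l - deductible, 0)
                             for l in scenario_models["fitted"])
    return LOADING * payout

def expected_cost(deductible, losses):
    """Premium plus expected retained loss under one scenario model."""
    retained = statistics.mean(min(l, deductible) for l in losses)
    return premium(deductible) + retained

candidates = range(0, 101, 5)

# Nominal design: optimize against the fitted model alone.
nominal = min(candidates,
              key=lambda d: expected_cost(d, scenario_models["fitted"]))

# Minimax design: optimize the worst case over the whole ambiguity set.
robust = min(candidates,
             key=lambda d: max(expected_cost(d, m)
                               for m in scenario_models.values()))

print("nominal deductible:", nominal)
print("robust deductible:", robust)
```

Because the premium is priced on the fitted model, which underweights the heavy tail, the minimax solution buys more coverage (a lower deductible) than the nominal one.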

    Numerical indications of a q-generalised central limit theorem

    We provide numerical indications of the $q$-generalised central limit theorem that has been conjectured (Tsallis 2004) in nonextensive statistical mechanics. We focus on $N$ binary random variables correlated in a scale-invariant way. The correlations are introduced by imposing the Leibnitz rule on a probability set based on the so-called $q$-product with $q \le 1$. We show that, in the large-$N$ limit (and after appropriate centering, rescaling, and symmetrisation), the emerging distributions are $q_e$-Gaussians, i.e., $p(x) \propto [1-(1-q_e)\beta(N) x^2]^{1/(1-q_e)}$, with $q_e = 2 - 1/q$, and with coefficients $\beta(N)$ approaching finite values $\beta(\infty)$. The particular case $q = q_e = 1$ recovers the celebrated de Moivre-Laplace theorem.
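The limiting distribution can be evaluated directly. A small sketch of the $q$-exponential and the resulting (unnormalised) $q_e$-Gaussian defined in the abstract, including the $q \to 1$ limit in which the ordinary Gaussian $e^{-\beta x^2}$ is recovered; the specific $q$ values in the loop are arbitrary examples.

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: [1 + (1-q) x]^{1/(1-q)}, reducing to exp(x) at q = 1.
    Returns 0 where the bracket is non-positive (the distribution's support edge)."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, beta):
    """Unnormalised q-Gaussian p(x) proportional to exp_q(-beta x^2)."""
    return q_exponential(-beta * x * x, q)

# q_e = 2 - 1/q maps the correlation parameter q to the attractor's index.
for q in (0.7, 0.9, 1.0):
    qe = 2 - 1 / q
    print(f"q={q}: q_e={qe:.3f}, p(1)={q_gaussian(1.0, qe, beta=1.0):.4f}")
```

At $q = 1$ the map gives $q_e = 1$ and the printed value is simply $e^{-1}$, matching the de Moivre-Laplace limit mentioned above.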

    Emergence of long memory in stock volatility from a modified Mike-Farmer model

    The Mike-Farmer (MF) model was constructed empirically based on the continuous double auction mechanism in an order-driven market, which can successfully reproduce the cubic law of returns and the diffusive behavior of stock prices at the transaction level. However, the volatility (defined by absolute return) in the MF model does not show sound long memory. We propose a modified version of the MF model by including a new ingredient, that is, long memory in the aggressiveness (quantified by the relative prices) of incoming orders, which is an important stylized fact identified by analyzing the order flows of 23 liquid Chinese stocks. Long memory emerges in the volatility synthesized from the modified MF model with the DFA scaling exponent close to 0.76, and the cubic law of returns and the diffusive behavior of prices are also produced at the same time. We also find that the long memory of order signs has no impact on the long memory property of volatility, and the memory effect of order aggressiveness has little impact on the diffusiveness of stock prices.
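The DFA scaling exponent quoted above can be estimated with a short pure-Python routine. As a sanity check, the sketch below applies first-order DFA to uncorrelated noise, for which the exponent should be near 0.5; an exponent near 0.76, as reported for the model's volatility, would indicate long memory. This is a generic DFA implementation, not the authors' code.

```python
import math
import random

def dfa_exponent(series, window_sizes):
    """First-order detrended fluctuation analysis:
    slope of log F(n) versus log n over the given window sizes."""
    mean = sum(series) / len(series)
    # Integrate the demeaned series into a profile.
    profile, total = [], 0.0
    for v in series:
        total += v - mean
        profile.append(total)
    log_n, log_f = [], []
    for n in window_sizes:
        sq_fluct = []
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Least-squares linear detrend within the window.
            x_mean = (n - 1) / 2
            y_mean = sum(seg) / n
            var_x = sum((x - x_mean) ** 2 for x in range(n))
            slope = sum((x - x_mean) * (y - y_mean)
                        for x, y in enumerate(seg)) / var_x
            resid = sum((y - (y_mean + slope * (x - x_mean))) ** 2
                        for x, y in enumerate(seg))
            sq_fluct.append(resid / n)
        f_n = math.sqrt(sum(sq_fluct) / len(sq_fluct))
        log_n.append(math.log(n))
        log_f.append(math.log(f_n))
    # Slope of the log-log regression is the DFA exponent.
    lx = sum(log_n) / len(log_n)
    ly = sum(log_f) / len(log_f)
    return (sum((a - lx) * (b - ly) for a, b in zip(log_n, log_f))
            / sum((a - lx) ** 2 for a in log_n))

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(20_000)]
alpha = dfa_exponent(white_noise, [16, 32, 64, 128, 256])
print(f"DFA exponent for uncorrelated noise: {alpha:.2f}")  # expected near 0.5
```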

    Reflections on ten years of using economics games and experiments in teaching

    In this paper, the author reflects on his 10 years’ experience of using games and experiments and in the process develops a type of practitioner’s guide. The existing quantitative and qualitative evidence on the impact of using games on student learning is reviewed. On balance, a positive effect on measures of attainment is found in the literature. Given these findings, it is surprising that there is also evidence in the UK and US that they are not widely used. Some factors are discussed that might deter tutors from employing them. Unsurprisingly, one of these is the additional cost, which might make the use of online games seem more attractive, given the way results can be automatically recorded. However, some relatively low-cost paper-based games were found to have significant advantages. In particular, they appear to facilitate social interaction, which has a positive impact on student motivation and learning. One popular and effective paper-based game is discussed in some detail. A number of recommendations are provided on how to implement the game in order to maximise the learning benefits it can provide. Some ideas on how to maximise the learning benefits from using games more generally are also considered.

    Enhancing Resilience of Systems to Individual and Systemic Risk: Steps toward An Integrative Framework

    Individual events can trigger systemic risks in many complex systems, from natural to man-made. Yet analysts still usually treat these two types of risk separately. We suggest that, rather, individual risks and systemic risks represent two ends of a continuum and therefore should not be analyzed in isolation, but in an integrative manner. Such a perspective can further be related to the notion of resilience and opens up options for developing an integrated framework for increasing the resilience of systems to both types of risks simultaneously. Systemic risks are sometimes called network risks to emphasize the importance of inter-linkages, while, in contrast, individual risks originate from individual events that directly affect an agent and happen independently from the rest of the system. The two different perspectives on risk have major implications for strategies aiming at increasing resilience, and we therefore discuss how such strategies differ between individual risks and systemic risks. In doing so, we suggest that for individual risks, a risk-layering approach can be applied, using probability distributions and their associated measures. Following the risk-layering approach, agents can identify their own tipping points, i.e., the points in their loss distributions at which their operation would fail, and on this basis determine the most appropriate measures for decreasing their risk of such failures. This approach can rely on several well-established market-based instruments, including insurance and portfolio diversification. To deal with systemic risks, these individual tipping points need to be managed in their totality, because system collapses are triggered by individual failures. An additional and complementary approach is to adjust the network structure of the system, which determines how individual failures can cascade and generate systemic risks.
Instead of one-size-fits-all rules of thumb, we suggest that the management of systemic risks should be based on a careful examination of a system’s risk landscape. In particular, a node-criticality approach, which aims to induce a network restructuring based on the differential contributions of nodes to systemic risk, may be a promising way forward toward an integrated framework. Hence, we argue that tailor-made transformational approaches are needed, which take into account the specificities of a system’s network structure and thereby push it toward safer configurations for both individual risks and systemic risks.
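The risk-layering step described above, partitioning a loss distribution into layers up to a tipping point, can be sketched numerically. The loss distribution, quantile cut-offs, and capacity below are all hypothetical; the abstract does not specify particular values.

```python
import random

random.seed(1)

# Hypothetical annual loss distribution for one agent.
losses = sorted(random.lognormvariate(2.0, 1.0) for _ in range(100_000))

def quantile(sorted_losses, p):
    """Empirical quantile of an already-sorted sample."""
    return sorted_losses[int(p * (len(sorted_losses) - 1))]

capacity = 150.0  # assumed tipping point: losses above this mean failure

# Illustrative layer boundaries on the loss distribution:
retention_cap = quantile(losses, 0.80)   # frequent losses: absorb from reserves
insurance_cap = quantile(losses, 0.99)   # middle layer: transfer via insurance
# Losses beyond insurance_cap up to capacity would need other instruments;
# beyond capacity the agent fails.

prob_fail = sum(l > capacity for l in losses) / len(losses)
print(f"retain up to {retention_cap:.1f}, insure up to {insurance_cap:.1f}")
print(f"probability of exceeding the tipping point: {prob_fail:.4f}")
```

Managing systemic risk, in the abstract's terms, then amounts to keeping the ensemble of such individual failure probabilities, and the network paths between them, under control.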

    Family Variables and Reading

    Mothers of poor and average readers in Japan, Taiwan and the United States were interviewed about their child-rearing practices, attitudes, and beliefs, and their children's current and earlier experiences. Poor readers represented the lowest fifth percentile in reading scores; they were matched by classroom, sex, and age with average readers, i.e., children who obtained reading scores within one standard deviation of the mean. The groups seldom differed significantly according to environmental variables and parent-child interactions. Maternal ratings of cognitive and achievement variables differentiated both the children in the two groups and the mothers themselves. Maternal beliefs and descriptions of how children use time also differed between the two groups. Notable was the absence of significant interactions between country and reading level.