
    The application of componentised modelling techniques to catastrophe model generation

    In this paper we show that integrated environmental modelling (IEM) techniques can be used to generate a catastrophe model for groundwater flooding. Catastrophe models are probabilistic models built upon sets of events representing the hazard; the likelihood of each event is weighted by the impact of that event occurring, and the result is used to estimate future financial losses. These probabilistic loss estimates often underpin re-insurance transactions. Modelled loss estimates can vary significantly because of the assumptions used within the models. A rudimentary insurance-style catastrophe model for groundwater flooding has been created by linking seven individual components together. Each component is linked to the next using an open modelling framework (i.e. an implementation of OpenMI). Finally, we discuss how a flexible model integration methodology, such as the one described in this paper, facilitates a better understanding of the assumptions used within the catastrophe model by enabling the interchange of model components created using different, yet appropriate, assumptions.
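
    To make the componentised structure concrete, the following is a minimal Python sketch of a chain of linkable model components, assuming a simple rainfall -> groundwater depth -> financial loss pipeline. The component names, thresholds, and vulnerability curve are hypothetical, and this is not the OpenMI interface itself, only an illustration of how interchangeable components can be wired in sequence.

```python
# Schematic sketch of a componentised model chain (illustrative only; not the
# OpenMI API). Each component exposes a uniform interface so it can be swapped
# for an alternative built on different assumptions.

from typing import Any, Dict, List


class ModelComponent:
    """Minimal stand-in for a linkable model component."""

    def __init__(self, name: str):
        self.name = name

    def run(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # A real component would perform its own simulation here.
        raise NotImplementedError


class RainfallEventSet(ModelComponent):
    def run(self, inputs):
        # Hypothetical stochastic event set: (event id, rainfall depth in mm)
        return {"events": [(1, 120.0), (2, 85.0), (3, 200.0)]}


class GroundwaterFloodDepth(ModelComponent):
    def run(self, inputs):
        # Crude placeholder: flood depth (m) grows with rainfall above a threshold
        return {"depths": [(e, max(0.0, (r - 100.0) / 50.0))
                           for e, r in inputs["events"]]}


class FinancialLoss(ModelComponent):
    def run(self, inputs):
        # Placeholder vulnerability curve: loss proportional to flood depth
        return {"losses": [(e, 250_000.0 * d) for e, d in inputs["depths"]]}


def run_chain(components: List[ModelComponent]) -> Dict[str, Any]:
    """Pass each component's outputs to the next, as the linked framework does."""
    data: Dict[str, Any] = {}
    for component in components:
        data.update(component.run(data))
    return data


if __name__ == "__main__":
    result = run_chain([RainfallEventSet("rainfall"),
                        GroundwaterFloodDepth("groundwater"),
                        FinancialLoss("loss")])
    print(result["losses"])
```

    Because every component honours the same interface, swapping in an alternative groundwater component built on different assumptions only changes one element of the list passed to run_chain.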

    Toward Business Integrity Modeling and Analysis Framework for Risk Measurement and Analysis

    Financialization has contributed to economic growth but has caused scandals, mis-selling, rogue trading, tax evasion, and market speculation. To a certain extent, it has also created problems of social and economic instability. It is an important aspect of Enterprise Security, Privacy, and Risk (ESPR), particularly in risk research and analysis. In order to minimize the damaging impacts caused by the lack of regulatory compliance, governance, ethical responsibilities, and trust, we propose a Business Integrity Modeling and Analysis (BIMA) framework to unify business integrity with performance using big data predictive analytics and business intelligence. Comprehensive services include modeling risk and asset prices and, consequently, aligning them with business strategies, making our services, according to market trend analysis, both transparent and fair. The BIMA framework uses Monte Carlo simulation, the Black–Scholes–Merton model, and the Heston model for performing financial, operational, and liquidity risk analysis and presents outputs in the form of analytics and visualization. Our results and analysis demonstrate supplier bankruptcy modeling, risk pricing, high-frequency pricing simulations, London Interbank Offered Rate (LIBOR) rate simulation, and speculation detection results to provide a variety of critical risk analyses. Our approaches to tackling problems caused by financial services and operational risk clearly demonstrate that the BIMA framework, as the output of our data analytics research, can effectively combine integrity and risk analysis with overall business performance and can contribute to operational risk research.
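
    As a rough illustration of two of the tools named above, the sketch below prices a European call option with the Black–Scholes–Merton closed form and cross-checks it with a plain Monte Carlo simulation under geometric Brownian motion. The parameter values are assumptions for demonstration and are not drawn from the BIMA framework itself.

```python
# Minimal sketch: Black-Scholes-Merton closed-form call price and a Monte Carlo
# estimate of the same payoff under geometric Brownian motion.

import math
import random


def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def bsm_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes-Merton price of a European call (no dividends)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)


def mc_call(S: float, K: float, r: float, sigma: float, T: float,
            n_paths: int = 100_000, seed: int = 42) -> float:
    """Monte Carlo estimate of the same call price."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
        payoff_sum += max(ST - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths


if __name__ == "__main__":
    # Illustrative parameters, not taken from the paper
    print(bsm_call(S=100, K=105, r=0.02, sigma=0.25, T=1.0))
    print(mc_call(S=100, K=105, r=0.02, sigma=0.25, T=1.0))
```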

    Using Physical and Social Sensors in Real-Time Data Streaming for Natural Hazard Monitoring and Response

    Technological breakthroughs in computing over the last few decades have resulted in important advances in natural hazards analysis. In particular, integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better estimates of real-time hazard. The main goal of this work is to utilize innovative streaming algorithms for improved real-time seismic hazard analysis by integrating different data sources and processing tools into cloud applications. In streaming algorithms, a sequence of items from physical and social sensors can be processed in as little as one pass with no need to store the data locally. Massive data volumes can be analyzed in near-real time with reasonable limits on storage space, an important advantage for natural hazard analysis. Seismic hazard maps are used by policymakers to set earthquake-resistant construction standards, by insurance companies to set insurance rates and by civil engineers to estimate stability and damage potential. This research first focuses on improving probabilistic seismic hazard map production. The result is a series of maps for different frequency bands at significantly increased resolution with much lower latency time that includes a range of high-resolution sensitivity tests. Second, a method is developed for real-time earthquake intensity estimation using joint streaming analysis from physical and social sensors. Automatically calculated intensity estimates from physical sensors such as seismometers use empirical relationships between ground motion and intensity, while those from social sensors employ questionnaires that evaluate ground shaking levels based on personal observations. Neither is always sufficiently precise and/or timely. Results demonstrate that joint processing can significantly reduce the response time to a damaging earthquake and estimate preliminary intensity levels during the first ten minutes after an event. The combination of social media and network sensor data, in conjunction with innovative computing algorithms, provides a new paradigm for real-time earthquake detection, facilitating rapid and inexpensive risk reduction. In particular, streaming algorithms are an efficient method that addresses three major problems in hazard estimation by improving resolution, decreasing processing latency to near real-time standards and providing more accurate results through the integration of multiple data sets.
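
    The following is a minimal sketch of the one-pass idea, assuming a stream of (source, intensity) readings and a fixed weighting between physical and social sensors. The weights and the simple running averages are illustrative stand-ins, not the streaming algorithms developed in this work.

```python
# Illustrative single-pass combination of intensity estimates from physical and
# social sensors: readings are consumed once and never stored.

from typing import Iterable, Tuple


def streaming_joint_intensity(readings: Iterable[Tuple[str, float]],
                              w_physical: float = 0.7,
                              w_social: float = 0.3) -> float:
    """Process (source, intensity) pairs in one pass and return a weighted blend."""
    sums = {"physical": 0.0, "social": 0.0}
    counts = {"physical": 0, "social": 0}
    for source, intensity in readings:
        sums[source] += intensity          # running totals only; no buffering
        counts[source] += 1
    joint, total_weight = 0.0, 0.0
    for source, weight in (("physical", w_physical), ("social", w_social)):
        if counts[source]:
            joint += weight * (sums[source] / counts[source])
            total_weight += weight
    return joint / total_weight if total_weight else float("nan")


if __name__ == "__main__":
    # Hypothetical intensity readings arriving in the minutes after an event
    stream = [("physical", 5.8), ("social", 6.0), ("physical", 6.1),
              ("social", 5.0), ("physical", 5.9)]
    print(streaming_joint_intensity(stream))
```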

    A review of the internet of floods: near real-time detection of a flood event and its impact

    Worldwide, flood events frequently have a dramatic impact on urban societies. Time is key during a flood event in order to evacuate vulnerable people at risk, minimize the socio-economic, ecological and cultural impact of the event and restore a society from this hazard as quickly as possible. Detecting a flood in near real-time and assessing the risks relating to these flood events on the fly is therefore of great importance, and there is a need to search for the optimal way to collect data in order to detect floods in real time. The Internet of Things (IoT) is the ideal method to bring together data from sensing equipment or identifying tools with networking and processing capabilities, allowing them to communicate with one another and with other devices and services over the Internet to accomplish the detection of floods in near real-time. The main objective of this paper is to report on the current state of research on the IoT in the domain of flood detection. Current trends in IoT are identified, and academic literature is examined. The integration of IoT would greatly enhance disaster management and will therefore be of greater importance in the future.

    Leveraging Technology and Innovation for Disaster Risk Management and Financing

    The Asia-Pacific Economic Cooperation (APEC) region is highly exposed to disaster and climate risks, accounting for more than 80% of global economic losses from disaster events in the last 20 years. The destruction and disruption that usually follow disaster events pose an important challenge to economic development and can perpetuate vulnerability. Despite substantial investment in reducing risk across the region, economic losses from disaster events continue to increase at a much faster rate than gross domestic product, implying that the relative economic burden is increasing over time. Efforts to enhance the reach of insurance and other financial protection tools have not significantly reduced the share of economic losses borne by households, businesses, and governments, which often lack the capacity to absorb these impacts. A changing climate, as well as continued population growth and asset accumulation in areas exposed to disaster and climate risks, is expected to exacerbate these challenges, with particular implications for vulnerable groups with limited economic resources. Enhancing resilience in the face of increasing natural hazards, exposure, and vulnerability will require investments in reducing the economic, social, and financial impacts of disasters by improving risk and impact assessment and leveraging those improvements to invest in risk reduction, preparedness, and response. APEC finance ministers have long recognized the need to build financial resilience to disaster risks and have included this objective in their work for a number of years. The Cebu Action Plan, approved by APEC finance ministers in 2015, aims to enhance financial resilience against economic shocks, including by “developing innovative disaster risk financing and insurance mechanisms (including micro insurance) to enable APEC economies exposed to natural hazards to increase their financial response to disasters and reduce their fiscal burden” (APEC 2015). Referenced by APEC finance ministers in their 2019 Joint Ministerial Statement, this report aims to contribute to this objective by supporting efforts to reduce underlying risk and develop tools to manage the financial consequences.

    A discourse on geospatial technology applications in predictive analytics and evidence-based decision support for disaster research and management

    Continued population growth and development in vulnerable locations across the world are creating a new geography of hazards and disasters. Increasing storm frequencies coupled with unrelenting efforts to control flooding through structural means will undoubtedly intensify the intersection between flood hazards and humans. Accordingly, the baseline capacity of places to adequately prepare for and rebound from disaster events is negatively impacted. Hurricane Katrina brought this reality to the forefront of disaster science and management in 2005. Concurrent with the increased awareness of evolving hazardscapes has been the identification of deficiencies in how components of disasters are studied and managed. The topic of recovery represents one of the least understood elements in hazards geography, owing most of its existing catalogue of knowledge to the social sciences and public administration. This dissertation summarizes an effort to develop a spatial metric which quantifies recovery from flood events, as well as an evaluation of applying these research-based methods in practical environments. The study theorizes that recovery can be measured by assessing the proximity of critical elements within the built environment. These elements (buildings) represent hubs of social activity necessary for social networks to flourish in post-disaster settings. It goes on to evaluate and apply this metric in both New Orleans, LA and Carinthia, Austria, in order to identify cultural bias in model design prior to conducting a case study where research-based predictive analytics are used in a real-world mitigation plan. The outcome of the study suggests that recovery is indeed measurable spatially and is heavily influenced by culture and scale. By integrating this new understanding of recovery into potential mitigation strategies, planning for risk reduction expenditures can more appropriately consider the drivers of place-specific vulnerability.
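
    As a loose illustration of a proximity-based recovery measure, the sketch below averages each building's distance to the nearest reopened social hub. The formulation, the coordinates, and the reading that lower values indicate stronger recovery are assumptions made for demonstration, not the metric developed in the dissertation.

```python
# Simplified proximity metric: mean distance from each building to the nearest
# functioning hub of social activity (e.g. school, shop, clinic).

import math
from typing import List, Tuple

Point = Tuple[float, float]


def mean_distance_to_nearest_hub(buildings: List[Point],
                                 open_hubs: List[Point]) -> float:
    """Lower values suggest residents are closer to restored social hubs."""
    if not buildings or not open_hubs:
        return float("inf")
    total = 0.0
    for bx, by in buildings:
        total += min(math.hypot(bx - hx, by - hy) for hx, hy in open_hubs)
    return total / len(buildings)


if __name__ == "__main__":
    homes = [(0.0, 0.0), (1.0, 2.0), (3.0, 1.0)]   # hypothetical coordinates (km)
    hubs = [(1.0, 1.0), (4.0, 0.0)]                # hubs reopened post-disaster
    print(mean_distance_to_nearest_hub(homes, hubs))
```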

    ICT for Disaster Risk Management: The Academy of ICT Essentials for Government Leaders


    Essays in Quantitative Risk Management for Financial Regulation of Operational Risk Models

    An extensive amount of evolving guidance and rules is provided to banks by financial regulators. A particular set of instructions outlines requirements to calculate and set aside loss-absorbing regulatory capital to ensure the solvency of a bank. Mathematical models are typically used by banks to quantify sufficient amounts of capital. In this thesis, we explore areas that advance our knowledge in regulatory risk management. In the first essay, we explore an aspect of operational risk loss modeling using scenario analysis. An actuarial modeling method is typically used to quantify a baseline capital value, which is then layered with a judgemental component in order to account for and integrate what-if future potential losses into the model. We propose a method from digital signal processing that views the problem as the blending of two signals using the convolution operator. That is, a baseline loss distribution obtained from the modeling of frequency and severity of internal losses is combined with a probability distribution obtained from scenario responses to yield a final output that integrates both sets of information. In the second essay, we revisit scenario analysis and the potential impact of catastrophic events at the enterprise level of a bank. We generalize an algorithm to account for multiple levels of event intensity together with unique loss profiles depending on the business units affected. In the third essay, we investigate the problem of allocating aggregate capital across sub-portfolios in a fair manner when there are various forms of interdependencies. Relevant to areas of market, credit and operational risk, the multivariate shortfall allocation problem quantifies the optimal amount of capital needed to ensure that the expected loss under a convex loss penalty function remains bounded by a threshold. We first provide an application of the existing methodology to a subset of high-frequency loss cells. Lastly, we provide an extension using copula models which allows for the modeling of joint fat-tailed events or asymmetries in the underlying process.
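
    To illustrate the convolution idea from the first essay, the sketch below blends a discretised baseline loss distribution with a scenario-derived distribution via np.convolve and reads off a capital-style quantile. The bin width, probabilities, and quantile level are invented for demonstration and are not taken from the thesis.

```python
# Blending two discretised loss distributions with a discrete convolution.

import numpy as np

# Probability mass on equally spaced loss bins (assumed bins: 0, 1, 2, ... units)
baseline_pmf = np.array([0.50, 0.30, 0.15, 0.05])   # internal-loss (frequency/severity) model
scenario_pmf = np.array([0.70, 0.20, 0.08, 0.02])   # scenario-analysis responses

# Convolution yields the distribution of the combined loss amount
combined_pmf = np.convolve(baseline_pmf, scenario_pmf)
combined_pmf /= combined_pmf.sum()                   # guard against rounding drift

bin_width = 1.0                                      # one bin = one unit of loss (assumption)
losses = np.arange(combined_pmf.size) * bin_width

# A simple capital-style summary: the 99.9th percentile of the combined loss
cdf = np.cumsum(combined_pmf)
var_999 = losses[np.searchsorted(cdf, 0.999)]

print(dict(zip(losses.tolist(), combined_pmf.round(4).tolist())))
print("99.9% quantile of combined loss:", var_999)
```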