868 research outputs found

    Mathematical Methods in Cybersecurity: Catastrophe Theory

    The improvement of information protection systems is based on the introduction and use of a mathematical apparatus. Ensuring the confidentiality, integrity and availability of information is an urgent and important problem in the modern world. Crisis processes are characteristic phenomena in security systems, so stochastic models cannot always describe their functioning and provide a solution. An effective tool for addressing this problem can be dynamic models based on the provisions of catastrophe theory. This study analyses modern approaches to applying the basic provisions of catastrophe theory in cybersecurity systems. The work presents a brief historical view of the development of the theory and highlights its main definitions: bifurcations, attractors, catastrophes. Elementary catastrophes, their forms and features are characterized. A review of the literature on the use of catastrophe theory in information and cyber security was carried out. The analysis showed that the theory has not yet been widely adopted, although there are isolated scientific developments applying it to the detection of network anomalies in cloud environments. The approaches considered can be used to train specialists in specialty 125 Cybersecurity in the course of research work.
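The "elementary catastrophes" mentioned are Thom's seven canonical forms. As a concrete illustration (standard textbook material, not taken from this paper), the cusp catastrophe is governed by the potential

```latex
V(x;\,a,b) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a x^{2} + b x,
\qquad
\frac{\partial V}{\partial x} = x^{3} + a x + b = 0,
```

where a and b are the control parameters. The bifurcation set, along which equilibria merge and the system state can jump discontinuously, is given by $4a^{3} + 27b^{2} = 0$; this kind of discontinuous transition is what motivates applying the theory to crisis processes in security systems.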

    A new unified intrusion anomaly detection in identifying unseen web attacks

    The global usage of more sophisticated web-based application systems is growing very rapidly. Major usage includes the storing and transporting of sensitive data over the Internet. This growth has consequently opened up a serious need for more secure network and application security protection devices. Security experts normally equip their databases with a large number of signatures to help in the detection of known web-based threats. In reality, it is almost impossible to keep updating the database with newly identified web vulnerabilities; as such, new attacks remain invisible to signature-based detection. This research presents a novel approach to Intrusion Detection Systems (IDS) for detecting unknown attacks on web servers, the Unified Intrusion Anomaly Detection (UIAD) approach. The unified approach consists of three components (preprocessing, statistical analysis, and classification). Initially, the process starts with the removal of irrelevant and redundant features using a novel hybrid feature selection method. Thereafter, a statistical approach is applied to identify traffic abnormality. We used the Relative Percentage Ratio (RPR) coupled with Euclidean Distance Analysis (EDA) and the Chebyshev Inequality Theorem (CIT) to calculate the normality score and generate an optimal threshold. Finally, Logitboost (LB) is employed with Random Forest (RF) as its weak classifier, with the aim of minimising the final false alarm rate. The experiments demonstrated that our approach successfully identifies unknown attacks with a detection rate above 95% and a false alarm rate below 1% on both the DARPA 1999 and ISCX 2012 datasets.
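The appeal of the Chebyshev inequality here is that it bounds the false-alarm rate without assuming the normality scores follow any particular distribution. A minimal sketch of that thresholding step (illustrative only; the function names and the false-alarm bound are assumptions, not the UIAD code):

```python
import math
import statistics

def chebyshev_threshold(scores, false_alarm_bound=0.01):
    """Upper threshold on a normality score via the Chebyshev inequality.

    P(|X - mu| >= k*sigma) <= 1/k**2 for any distribution, so choosing
    k = 1/sqrt(p) bounds the fraction of normal traffic flagged as
    anomalous by at most p.
    """
    mu = statistics.fmean(scores)
    sigma = statistics.pstdev(scores)
    k = 1.0 / math.sqrt(false_alarm_bound)
    return mu + k * sigma

def euclidean_score(sample, centroid):
    """Euclidean distance of a feature vector from the normal-traffic centroid."""
    return math.dist(sample, centroid)
```

At detection time, a connection whose `euclidean_score` exceeds the threshold learned from attack-free traffic would be flagged as anomalous.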

    On the puzzling feature of the silence of precursory electromagnetic emissions

    It has been suggested that fracture-induced MHz-kHz electromagnetic (EM) emissions, which emerge from a few days up to a few hours before the main seismic shock occurrence, permit real-time monitoring of the damage process during the last stages of earthquake preparation, as happens at the laboratory scale. Despite fairly abundant evidence, EM precursors have not been adequately accepted as credible physical phenomena. These negative views are enhanced by the fact that certain "puzzling features" are repetitively observed in candidate fracture-induced pre-seismic EM emissions. More precisely, EM silence in all frequency bands appears before the main seismic shock occurrence, as well as during the aftershock period. Indeed, the view that acceptance of "precursory" EM signals without convincing co-seismic signals should not be expected seems reasonable. In this work we focus on this point. We examine whether the aforementioned features of EM silence are really puzzling or instead reflect well-documented characteristic features of the fracture process, in terms of: universal structural patterns of the fracture process, recent laboratory experiments, numerical and theoretical studies of fracture dynamics, critical phenomena, percolation theory, and the micromechanics of granular materials. Our analysis shows that these features should not be considered puzzling. (arXiv admin note: text overlap with arXiv:cond-mat/0603542 by other authors.)

    Automatic Building of a Powerful IDS for the Cloud Based on a Deep Neural Network by Using a Novel Combination of a Simulated Annealing Algorithm and an Improved Self-Adaptive Genetic Algorithm

    Cloud computing (CC) is the fastest-growing data hosting and computational technology and stands today as a satisfactory answer to the problem of data storage and computing. Thereby, most organizations are now migrating their services into the cloud due to its appealing features and tangible advantages. Nevertheless, providing privacy and security to protect cloud assets and resources is still a very challenging issue. To address these issues, we propose a smart approach to automatically construct an efficient and effective anomaly-based network IDS built on a Deep Neural Network (DNN), using a novel hybrid optimization framework, “ISAGASAA”. The ISAGASAA framework combines our new self-adaptive heuristic search algorithm, the “Improved Self-Adaptive Genetic Algorithm” (ISAGA), with a Simulated Annealing Algorithm (SAA). Our approach uses ISAGASAA to seek the optimal or near-optimal combination of the most pertinent values of the parameters involved in building the DNN-based IDS or impacting its performance, guaranteeing a high detection rate, high accuracy and a low false alarm rate. The experimental results show the capability of our IDS to uncover intrusions with high detection accuracy and a low false alarm rate, and demonstrate its superiority over state-of-the-art methods.
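To give a feel for the simulated-annealing half of such a hybrid search, here is a generic sketch (illustrative only; the function, parameters and cooling schedule are assumptions, not the paper's ISAGASAA framework):

```python
import math
import random

def simulated_annealing(score, neighbors, start, t0=1.0, cooling=0.95,
                        steps=200, seed=0):
    """Generic simulated-annealing search over a configuration space.

    `score` maps a configuration to a value to MAXIMISE (e.g. validation
    detection rate of a candidate DNN-based IDS); `neighbors` returns a
    random neighbouring configuration. Worse moves are accepted with
    probability exp(delta / T), which shrinks as the temperature cools,
    letting the search escape local optima early on.
    """
    rng = random.Random(seed)
    current, best = start, start
    t = t0
    for _ in range(steps):
        cand = neighbors(current, rng)
        delta = score(cand) - score(current)
        if delta >= 0 or rng.random() < math.exp(delta / t):
            current = cand
        if score(current) > score(best):
            best = current
        t *= cooling  # geometric cooling schedule
    return best
```

In a hybrid scheme like the one described, the genetic algorithm would explore the parameter space broadly while a routine of this kind refines promising candidates.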

    An adaptive, fault-tolerant system for road network traffic prediction using machine learning

    This thesis has addressed the design and development of an integrated system for real-time traffic forecasting based on machine learning methods. Although traffic prediction has been the driving motivation for the thesis development, a great part of the proposed ideas and scientific contributions are generic enough to be applied to any other problem that can be framed, ideally, as the flow of information in a graph-like structure. Such application is of special interest in environments susceptible to changes in the underlying data generation process. Moreover, the modular architecture of the proposed solution makes it easy to adapt individual components to a broader range of problems. On the other hand, certain specific parts of this thesis are strongly tied to traffic flow theory. The focus is on a macroscopic perspective of the traffic flow, where the individual road traffic flows are correlated to the underlying traffic demand. The short-term forecasts include the road network characterization in terms of the corresponding traffic measurements –traffic flow, density and/or speed–, the traffic state –whether a road is congested or not, and its severity–, and anomalous road conditions –incidents or other non-recurrent events–. The main traffic data used in this thesis comes from detectors installed along the road networks. Nevertheless, other kinds of traffic data sources could be equally suitable with the appropriate preprocessing. This thesis has been developed in the context of Aimsun Live –a simulation-based traffic solution for real-time traffic prediction developed by Aimsun–. The methods proposed here are planned to be linked to it in a mutually beneficial relationship where they cooperate and assist each other.
An example is when an incident or non-recurrent event is detected with the methods proposed in this thesis; the simulation-based forecasting module can then simulate different strategies to measure their impact. Part of this thesis has also been developed in the context of the EU research project "SETA" (H2020-ICT-2015). The main motivation guiding the development of this thesis has been to address the weak points and limitations previously identified in Aimsun Live, for which the research found in the literature is not especially extensive. These include:
• Autonomy, both in the preparation and real-time stages.
• Adaptation, to gradual or abrupt changes in traffic demand or supply.
• Informativeness, about anomalous road conditions.
• Forecasting accuracy, improved with respect to the previous methodology at Aimsun and a typical forecasting baseline.
• Robustness, to deal with faulty or missing data in real time.
• Interpretability, adopting modelling choices towards more transparent reasoning and understanding of the underlying data-driven decisions.
• Scalability, using a modular architecture with emphasis on parallelizable exploitation of large amounts of data.
The result of this thesis is an integrated system –Adarules– for real-time forecasting which makes the best of the available historical data while also leveraging the theoretically unbounded size of data in a continuous streaming scenario. This is achieved through online learning and change detection, along with the automatic discovery and maintenance of patterns in the network graph. In addition to the Adarules system, another result is a probabilistic model that characterizes a set of interpretable latent variables related to the traffic state, based on the traffic data provided by the sensors along with optional prior knowledge provided by the traffic expert, following a Bayesian approach.
On top of this traffic-state model, a probabilistic spatiotemporal model is built that learns the dynamics of traffic-state transitions in the network, with automatic incident detection among its objectives.
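A common building block for the change-detection capability described above is the Page-Hinkley test, which flags a shift in the mean of a data stream. The sketch below is illustrative of that general technique, not the Adarules implementation:

```python
class PageHinkley:
    """Page-Hinkley test for detecting an upward shift in a streaming mean.

    An illustrative online change detector of the kind a traffic-forecasting
    system can use to notice abrupt changes in demand (default parameter
    values here are arbitrary, not tuned for any real network).
    """
    def __init__(self, delta=0.005, threshold=5.0):
        self.delta = delta          # tolerated drift per observation
        self.threshold = threshold  # alarm level
        self.mean = 0.0             # running mean of the stream
        self.n = 0
        self.cum = 0.0              # cumulative deviation from the mean
        self.min_cum = 0.0          # smallest cumulative deviation seen

    def update(self, x):
        """Feed one observation; return True when a change is detected."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.threshold
```

When the detector fires, a system of this kind would typically discard or down-weight its stale model and relearn from the post-change data.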

    Valuing adaptation under rapid change

    AbstractThe methods used to plan adaptation to climate change have been heavily influenced by scientific narratives of gradual change and economic narratives of marginal adjustments to that change. An investigation of the theoretical aspects of how the climate changes suggests that scientific narratives of climate change are socially constructed, biasing scientific narratives to descriptions of gradual as opposed rapid, non-linear change. Evidence of widespread step changes in recent climate records and in model projections of future climate is being overlooked because of this. Step-wise climate change has the potential to produce rapid increases in extreme events that can cross institutional, geographical and sectoral domains.Likewise, orthodox economics is not well suited to the deep uncertainty faced under climate change, requiring a multi-faceted approach to adaptation. The presence of tangible and intangible values range across five adaptation clusters: goods; services; capital assets and infrastructure; social assets and infrastructure; and natural assets and infrastructure. Standard economic methods have difficulty in giving adequate weight to the different types of values across these clusters. They also do not account well for the inter-connectedness of impacts and subsequent responses between agents in the economy. As a result, many highly-valued aspects of human and environmental capital are being overlooked.Recent extreme events are already pressuring areas of public policy, and national strategies for emergency response and disaster risk reduction are being developed as a consequence. 
The methods used to plan adaptation to climate change have been heavily influenced by scientific narratives of gradual change and economic narratives of marginal adjustments to that change. An investigation of the theoretical aspects of how the climate changes suggests that scientific narratives of climate change are socially constructed, biasing them towards descriptions of gradual as opposed to rapid, non-linear change. Because of this, evidence of widespread step changes in recent climate records and in model projections of future climate is being overlooked. Step-wise climate change has the potential to produce rapid increases in extreme events that cross institutional, geographical and sectoral domains.

Likewise, orthodox economics is not well suited to the deep uncertainty faced under climate change, requiring a multi-faceted approach to adaptation. Tangible and intangible values range across five adaptation clusters: goods; services; capital assets and infrastructure; social assets and infrastructure; and natural assets and infrastructure. Standard economic methods have difficulty giving adequate weight to the different types of values across these clusters, and do not account well for the inter-connectedness of impacts and subsequent responses between agents in the economy. As a result, many highly valued aspects of human and environmental capital are being overlooked.

Recent extreme events are already pressuring areas of public policy, and national strategies for emergency response and disaster risk reduction are being developed as a consequence. However, the potential for an escalation of total damage costs due to rapid change requires a coordinated approach at the institutional level, involving all levels of government, the private sector and civil society. One of the largest risks of maladaptation is the potential for un-owned risks, as risks propagate across domains and responsibility for their management is poorly allocated between public and private interests, and between the roles of the individual and civil society. Economic strategies developed by the disaster community for disaster response and risk reduction provide a base to work from, but many gaps remain.

We have developed a framework for valuing adaptation with the following aspects: the valuation of impacts, thus estimating values at risk; the evaluation of different adaptation options and strategies based on cost; and the valuation of benefits, expressed as a combination of the benefits of avoided damages and a range of institutional values such as equity, justice, sustainability and profit.

The choice of economic methods and tools used to assess adaptation depends largely on the ability to constrain uncertainty around problems (predictive uncertainty) and solutions (outcome uncertainty). Orthodox methods can be used where both are constrained, portfolio methodologies where problems are constrained, and robust methodologies where solutions are constrained. Where both are unconstrained, process-based methods utilising innovation methods and adaptive management are most suitable. All methods should involve stakeholders where possible. Innovative process methods that enable transformation will be required in some circumstances, to allow institutions, sectors and communities to prepare for anticipated major change.

Please cite this report as: Jones, RN, Young, CK, Handmer, J, Keating, A, Mekala, GD, Sheehan, P 2013, Valuing adaptation under rapid change, National Climate Change Adaptation Research Facility, Gold Coast, pp. 192.
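The abstract's mapping from uncertainty to method choice amounts to a two-by-two decision rule. A minimal sketch of that rule (the function name and label strings are mine, not the report's):

```python
# Map the two uncertainty dimensions from the abstract -- whether problem
# (predictive) uncertainty and solution (outcome) uncertainty can be
# constrained -- to the family of assessment methods it names.
# Illustrative only: the function name and labels are assumptions.

def select_method(problem_constrained: bool, solution_constrained: bool) -> str:
    if problem_constrained and solution_constrained:
        return "orthodox methods"
    if problem_constrained:
        return "portfolio methodologies"
    if solution_constrained:
        return "robust methodologies"
    return "process-based methods (innovation and adaptive management)"

# Deep uncertainty on both axes falls through to the last case:
print(select_method(False, False))  # process-based methods (innovation and adaptive management)
```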

    Nonextensive statistics: Theoretical, experimental and computational evidences and connections

    Full text link
    The domain of validity of standard thermodynamics and Boltzmann-Gibbs statistical mechanics is discussed and then formally enlarged in order to hopefully cover a variety of anomalous systems. The generalization concerns {\it nonextensive} systems, where nonextensivity is understood in the thermodynamical sense. This generalization was first proposed in 1988, inspired by the probabilistic description of multifractal geometries, and has been intensively studied during this decade. In the present effort, after introducing some historical background, we briefly describe the formalism and then exhibit the present status of theoretical, experimental and computational evidence and connections, as well as some perspectives for the future. In addition, here and there we point out various (possibly) relevant questions whose answers would certainly clarify our current understanding of the foundations of statistical mechanics and its thermodynamical implications. Comment: 15 figures
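For context (standard material, not quoted from this abstract): the 1988 generalization referred to is the Tsallis entropy, which for a probability distribution $\{p_i\}$ and entropic index $q$ reads

```latex
S_q = k\,\frac{1-\sum_i p_i^{\,q}}{q-1},
\qquad
\lim_{q\to 1} S_q = -k\sum_i p_i \ln p_i = S_{\mathrm{BG}}.
```

For two independent subsystems $A$ and $B$ it is nonextensive in the sense the abstract means: $S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\,S_q(A)\,S_q(B)$, recovering ordinary additivity only at $q = 1$.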