    Pathways to Coastal Resiliency: the Adaptive Gradients Framework

    Current and future climate-related coastal impacts such as catastrophic and repetitive flooding, increasing hurricane intensity, and sea level rise necessitate a new approach to developing and managing coastal infrastructure. Traditional “hard” or “grey” engineering solutions are proving both expensive and inflexible in the face of a rapidly changing coastal environment. Hybrid solutions that incorporate natural, nature-based, structural, and non-structural features may better achieve a broad set of goals such as ecological enhancement, long-term adaptation, and social benefits, but broad consideration and uptake of these approaches have been slow. One barrier to their widespread implementation is the lack of a relatively quick but holistic evaluation framework that places these broader environmental and societal goals on an equal footing with the more traditional goal of exposure reduction. To respond to this need, the Adaptive Gradients Framework was developed and pilot-tested as a qualitative, flexible, and collaborative process guide that helps organizations understand, evaluate, and potentially select more diverse kinds of infrastructural responses, ideally including natural, nature-based, and regulatory/cultural approaches, as well as hybrid designs that combine multiple approaches. The framework enables rapid expert review of project designs based on eight metrics called “gradients”: exposure reduction, cost efficiency, institutional capacity, ecological enhancement, adaptation over time, greenhouse gas reduction, participatory process, and social benefits. It was conceptualized and developed in phases: relevant factors and barriers were collected from practitioners and experts by survey; these factors were ranked by importance and used to develop the initial framework; several case studies were evaluated iteratively using this technique; and the framework was finalized for implementation. The article presents the framework and a pilot test of its application, along with resources that would enable wider application of the framework by practitioners and theorists.
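
    To illustrate how ratings on the eight gradients might be organized during an expert review, the sketch below scores a hypothetical hybrid design and reports its average and weakest gradient. The dataclass, the 0-5 rating scale, and the simple aggregation are illustrative assumptions, not the framework's actual scoring rules.

        from dataclasses import dataclass
        from statistics import mean

        # The eight gradients named in the framework.
        GRADIENTS = [
            "exposure_reduction", "cost_efficiency", "institutional_capacity",
            "ecological_enhancement", "adaptation_over_time",
            "greenhouse_gas_reduction", "participatory_process", "social_benefits",
        ]

        @dataclass
        class DesignReview:
            """Expert ratings for one infrastructure design (scale is assumed)."""
            name: str
            ratings: dict  # gradient -> rating on a hypothetical 0-5 scale

            def summary(self):
                missing = [g for g in GRADIENTS if g not in self.ratings]
                if missing:
                    raise ValueError(f"unrated gradients: {missing}")
                return {
                    "design": self.name,
                    "mean_rating": mean(self.ratings[g] for g in GRADIENTS),
                    "weakest": min(GRADIENTS, key=self.ratings.get),
                }

        # A hypothetical hybrid design: living shoreline plus setback levee.
        hybrid = DesignReview("living shoreline + setback levee", {
            "exposure_reduction": 4, "cost_efficiency": 3,
            "institutional_capacity": 2, "ecological_enhancement": 5,
            "adaptation_over_time": 4, "greenhouse_gas_reduction": 3,
            "participatory_process": 4, "social_benefits": 4,
        })
        print(hybrid.summary())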

    A New Approach in Risk Stratification by Coronary CT Angiography.

    For a decade, coronary computed tomographic angiography (CCTA) has been used as a promising noninvasive modality for assessing coronary artery disease (CAD) as well as cardiovascular risk. CCTA can provide comprehensive information on the presence, extent, and severity of CAD; on coronary plaque burden; and on plaque characteristics, all of which correlate highly with findings on invasive coronary angiography. Moreover, recent CCTA techniques allow assessment of the hemodynamic significance of CAD. CCTA may therefore potentially substitute for other invasive or noninvasive modalities. This review summarizes risk stratification based on the anatomical and hemodynamic information about CAD, coronary plaque characteristics, and plaque burden observed on CCTA.

    Badger: Complexity Analysis with Fuzzing and Symbolic Execution

    Hybrid testing approaches that combine fuzz testing and symbolic execution have shown promising results in achieving high code coverage and uncovering subtle errors and vulnerabilities in a variety of software applications. In this paper we describe Badger, a new hybrid approach for complexity analysis, with the goal of discovering vulnerabilities that occur when the worst-case time or space complexity of an application is significantly higher than the average case. Badger uses fuzz testing to generate a diverse set of inputs that aim to increase not only coverage but also a resource-related cost associated with each path. Since fuzzing may fail to execute deep program paths due to its limited knowledge of the conditions that influence those paths, we complement the analysis with symbolic execution, which is likewise customized to search for paths that increase the resource-related cost. Symbolic execution is particularly good at generating inputs that satisfy various program conditions but by itself suffers from path explosion. Badger therefore uses fuzzing and symbolic execution in tandem, to leverage their benefits and overcome their weaknesses. We implemented our approach for the analysis of Java programs, based on Kelinci and Symbolic PathFinder. We evaluated Badger on Java applications, showing that our approach is significantly faster at generating worst-case executions than fuzzing or symbolic execution on their own.
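
    A minimal sketch of the cost-guided search the abstract describes, under stated assumptions: the target is a toy insertion sort instrumented to report a branch signature and a resource cost, inputs are kept when they add coverage or raise the maximum observed cost, and a sorted-input stub stands in for the Symbolic PathFinder side of the real tool. All names here are illustrative, not Badger's API.

        import random

        def target(xs):
            """Toy program under test: insertion sort; returns (coverage, cost)."""
            cov, cost = set(), 0
            a = list(xs)
            for i in range(1, len(a)):
                j = i
                while j > 0 and a[j - 1] > a[j]:
                    a[j - 1], a[j] = a[j], a[j - 1]
                    j -= 1
                    cost += 1                # resource-related cost: swaps
                cov.add(("inserted_at_front", j == 0))  # crude branch signature
            return cov, cost

        def mutate(xs):
            """Greybox-style mutation: flip one byte of the input."""
            ys = list(xs)
            ys[random.randrange(len(ys))] = random.randrange(256)
            return ys

        def concolic_step(xs):
            """Stand-in for the symbolic-execution side: steer toward the
            cost-maximizing branch (descending order) that fuzzing may miss."""
            return sorted(xs, reverse=True)

        def badger_like(seed, rounds=2000):
            corpus, seen_cov, max_cost = [seed], set(), 0
            for r in range(rounds):
                parent = random.choice(corpus)
                # occasionally hand a corpus input to the "symbolic" side
                child = concolic_step(parent) if r % 500 == 499 else mutate(parent)
                cov, cost = target(child)
                if cov - seen_cov or cost > max_cost:  # new coverage OR higher cost
                    corpus.append(child)
                    seen_cov |= cov
                    max_cost = max(max_cost, cost)
            return max_cost

        random.seed(0)
        seed = [random.randrange(256) for _ in range(16)]
        print("worst observed cost:", badger_like(seed))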

    Harvey: A Greybox Fuzzer for Smart Contracts

    We present Harvey, an industrial greybox fuzzer for smart contracts, which are programs managing accounts on a blockchain. Greybox fuzzing is a lightweight test-generation approach that effectively detects bugs and security vulnerabilities. However, greybox fuzzers randomly mutate program inputs to exercise new paths, which makes it challenging to cover code that is guarded by narrow checks satisfied by no more than a few input values. Moreover, most real-world smart contracts transition through many different states during their lifetime, e.g., for every bid in an auction. To explore these states and thereby detect deep vulnerabilities, a greybox fuzzer would need to generate sequences of contract transactions, e.g., by creating bids from multiple users, while at the same time keeping the search space and test suite tractable. In this experience paper, we explain how Harvey alleviates both challenges with two key fuzzing techniques and distill the main lessons learned. First, Harvey extends standard greybox fuzzing with a method for predicting new inputs that are more likely to cover new paths or reveal vulnerabilities in smart contracts. Second, it fuzzes transaction sequences in a targeted and demand-driven way. We have evaluated our approach on 27 real-world contracts. Our experiments show that the underlying techniques significantly increase Harvey's effectiveness in achieving high coverage and detecting vulnerabilities, in most cases orders of magnitude faster; they also reveal new insights about contract code.
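
    The sketch below illustrates the two ideas in miniature, with a toy auction standing in for a real contract: input prediction proposes the exact operand that flips a narrow equality check recorded during earlier runs, and transaction sequences are fuzzed so that a stateful bug (here, closing an auction that never received a bid) becomes reachable. The contract stub, instrumentation format, and constants are assumptions for illustration only.

        import random

        class ToyAuction:
            """Stand-in for a smart contract: state persists across transactions."""
            def __init__(self):
                self.highest = 0
                self.observed = []          # instrumented comparison operands

            def bid(self, amount):
                self.observed.append(("bid_gt", amount, self.highest))
                if amount > self.highest:
                    self.highest = amount

            def close(self, key):
                self.observed.append(("key_eq", key, 0x5eed))
                if key == 0x5eed:           # narrow check: one value in 2**16
                    assert self.highest > 0, "closed an auction that had no bids"

        def predict(observed):
            """Simplified input prediction: from operands recorded at a failed
            equality check, propose the exact value that would flip it."""
            for op, actual, expected in reversed(observed):
                if op.endswith("_eq") and actual != expected:
                    return expected
            return None

        def fuzz(rounds=200):
            for _ in range(rounds):
                auction = ToyAuction()
                # demand-driven sequence fuzzing (toy): vary the tx sequence
                txs = random.choices(["bid", "close"], k=random.randint(1, 4))
                for tx in txs:
                    arg = random.randrange(2**16)
                    if tx == "close":
                        guess = predict(auction.observed)
                        if guess is not None:
                            arg = guess     # predicted input satisfies the check
                    try:
                        getattr(auction, tx)(arg)
                    except AssertionError as e:
                        print("bug found:", e, "| tx sequence:", txs)
                        return

        random.seed(1)
        fuzz()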

    On Synchronous and Asynchronous Monitor Instrumentation for Actor-based systems

    We study the impact of synchronous and asynchronous monitoring instrumentation on runtime overheads in the context of a runtime verification framework for actor-based systems. We show that, in such a context, asynchronous monitoring incurs substantially lower overhead costs. We also show how, for certain properties that require synchronous monitoring, a hybrid approach can be used that ensures timely violation detection for the important events while incurring lower overhead costs, closer to those of asynchronous instrumentation. (In Proceedings FOCLASA 2014, arXiv:1502.0315)
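
    A small sketch of the two instrumentation styles, with Python threads standing in for actors: the synchronous variant runs the property check inline on the actor's own thread before execution proceeds, while the asynchronous variant merely enqueues the event and defers checking to a separate monitor thread, so detection may lag. The property and event format are illustrative assumptions.

        import queue, threading, time

        def check(event, state):
            """Toy property: an actor must not 'reply' before it 'request'ed."""
            kind, actor = event
            if kind == "request":
                state.add(actor)
            elif kind == "reply" and actor not in state:
                print("violation:", event)

        # Synchronous instrumentation: the check runs inline, on the actor's
        # own thread, so the actor blocks until the monitor is done.
        def sync_emit(event, state, lock):
            with lock:
                check(event, state)

        # Asynchronous instrumentation: the actor only enqueues the event,
        # which costs almost nothing on the actor's thread.
        def async_emit(event, q):
            q.put(event)

        def async_monitor(q):
            state = set()
            for event in iter(q.get, None):  # run until a None sentinel arrives
                check(event, state)          # detection may lag behind the actor

        # Synchronous demo: violation is detected immediately.
        state, lock = set(), threading.Lock()
        sync_emit(("reply", "actor1"), state, lock)

        # Asynchronous demo: violation is detected later, off-thread.
        q = queue.Queue()
        threading.Thread(target=async_monitor, args=(q,), daemon=True).start()
        async_emit(("reply", "actor2"), q)
        q.put(None)
        time.sleep(0.1)                      # give the monitor time to drain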

    Earthquake risk assessment using an integrated Fuzzy Analytic Hierarchy Process with Artificial Neural Networks based on GIS: A case study of Sanandaj in Iran

    Earthquakes are natural phenomena that pose a serious hazard to urban areas, despite significant advances in retrofitting urban buildings and in the knowledge and ability of experts in natural disaster control. Iran is one of the most seismically active countries in the world. The purpose of this study was to evaluate and analyze the extent of earthquake vulnerability in relation to demographic, environmental, and physical criteria. An earthquake risk assessment (ERA) map with five vulnerability classes was created using a Fuzzy Analytic Hierarchy Process coupled with an Artificial Neural Network (FAHP-ANN) model. Combining the FAHP-ANN with a geographic information system (GIS) made it possible to assign weights to the layers representing the earthquake vulnerability criteria. The model was applied to Sanandaj City in Iran, located in the seismically active Sanandaj-Sirjan zone, which is frequently affected by devastating earthquakes. The Multilayer Perceptron (MLP) model was implemented in the IDRISI software, and 250 points were validated for grades 0 and 1. The validation process revealed that the proposed model can produce an earthquake probability map with an accuracy of 95%. A comparison of the results obtained with the FAHP, AHP, and MLP models shows that the hybrid FAHP-ANN model is flexible and reliable when generating the ERA map. The FAHP-ANN model accurately identified the highest earthquake vulnerability in densely populated areas with dilapidated building infrastructure. The findings of this study provide decision makers with a scientific basis for developing earthquake risk management strategies.
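
    To make the criterion-weighting step concrete, the sketch below derives AHP-style weights from a pairwise comparison matrix via the row geometric mean approximation, checks consistency, and applies the weights as a GIS-style weighted overlay on tiny score rasters. The criteria, comparison values, and raster are illustrative assumptions; the fuzzification and MLP components of the paper's hybrid model are not reproduced here.

        import numpy as np

        # Pairwise comparisons among three example criteria
        # (values are illustrative, not taken from the study).
        criteria = ["population_density", "building_condition", "distance_to_fault"]
        A = np.array([
            [1.0, 2.0, 3.0],
            [1/2, 1.0, 2.0],
            [1/3, 1/2, 1.0],
        ])

        # Row geometric mean: a standard approximation of the AHP
        # principal eigenvector.
        gm = A.prod(axis=1) ** (1.0 / A.shape[1])
        weights = gm / gm.sum()

        # Consistency ratio (CR < 0.1 is the usual AHP acceptance threshold).
        lam = (A @ weights / weights).mean()
        ci = (lam - A.shape[0]) / (A.shape[0] - 1)
        cr = ci / 0.58                     # random index for n = 3
        print(dict(zip(criteria, weights.round(3))), "CR:", round(cr, 3))

        # Weighted overlay on a tiny raster: one normalized score layer
        # per criterion, combined into a per-cell vulnerability score.
        layers = np.array([
            [[0.9, 0.2], [0.4, 0.7]],      # population_density
            [[0.8, 0.1], [0.3, 0.9]],      # building_condition
            [[0.6, 0.5], [0.2, 0.8]],      # distance_to_fault
        ])
        vulnerability = np.tensordot(weights, layers, axes=1)
        print(vulnerability.round(2))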

    Neuro-Fuzzy Based Software Risk Estimation Tool

    Developing secure software is one of the major concerns in the software industry. To make the task of finding and fixing security flaws easier, software developers should integrate security at all stages of the Software Development Life Cycle (SDLC). In this paper, a software risk prediction tool based on a neuro-fuzzy approach is presented. First, a fuzzy inference system is created; then the neural network is trained with three different algorithms: Bayesian Regularization (BR), Backpropagation (BP), and Levenberg-Marquardt (LM). The results show that, for software risk estimation, BR performs best and achieves greater accuracy than the other algorithms.
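
    A minimal sketch of the neuro-fuzzy idea: crisp risk factors are fuzzified with triangular membership functions, and a small network is trained on the resulting membership degrees. Plain batch gradient descent stands in for the BR, BP, and LM training algorithms used in the paper, and the two input factors and synthetic data are assumptions for illustration.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function: peak at b, support [a, c]."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def fuzzify(likelihood, impact):
            """Fuzzify two crisp risk factors (0..1) into low/medium/high degrees."""
            return np.array([
                tri(likelihood, -0.5, 0.0, 0.5), tri(likelihood, 0.0, 0.5, 1.0),
                tri(likelihood, 0.5, 1.0, 1.5),
                tri(impact, -0.5, 0.0, 0.5), tri(impact, 0.0, 0.5, 1.0),
                tri(impact, 0.5, 1.0, 1.5),
            ])

        # Synthetic training data: risk grows with likelihood * impact.
        rng = np.random.default_rng(0)
        L, I = rng.random(200), rng.random(200)
        X = np.stack([fuzzify(l, i) for l, i in zip(L, I)])
        y = L * I

        # One-layer network on the fuzzified features, trained by plain batch
        # gradient descent (a stand-in for the paper's BP/LM/BR training).
        w, b = np.zeros(6), 0.0
        for _ in range(3000):
            err = X @ w + b - y
            w -= 0.1 * X.T @ err / len(y)
            b -= 0.1 * err.mean()

        # High likelihood/impact should score well above low likelihood/impact.
        print("risk(0.9, 0.8):", float(fuzzify(0.9, 0.8) @ w + b))
        print("risk(0.1, 0.2):", float(fuzzify(0.1, 0.2) @ w + b))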