
    Autonomic computing architecture for SCADA cyber security

    Cognitive computing refers to intelligent computing platforms based on the disciplines of artificial intelligence, machine learning, and other innovative technologies. These technologies can be used to design systems that mimic the human brain in learning about their environment and can autonomously predict an impending anomalous situation. IBM first used the term ‘Autonomic Computing’ in 2001 to combat the looming complexity crisis (Ganek and Corbi, 2003). The concept is inspired by the human biological autonomic system. An autonomic system is self-healing, self-regulating, self-optimising and self-protecting (Ganek and Corbi, 2003). Such a system should therefore be able to protect itself against both malicious attacks and unintended operator mistakes.
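    The self-protecting behaviour described above can be illustrated with a minimal monitor-analyse-plan-execute loop. This is a hedged sketch, not the paper's architecture: the sensor readings, the 3-sigma threshold, and the response names are invented for illustration.

    ```python
    # Minimal sketch of an autonomic monitoring loop: watch a sensor stream,
    # flag statistically anomalous readings, and plan a protective response.
    # All thresholds and action names below are hypothetical.
    from statistics import mean, stdev

    def detect_anomaly(history, reading, k=3.0):
        """Flag a reading more than k standard deviations from the recent mean."""
        if len(history) < 2:
            return False
        mu, sigma = mean(history), stdev(history)
        return sigma > 0 and abs(reading - mu) > k * sigma

    def autonomic_step(history, reading):
        """One loop pass: monitor, analyse, then plan a response."""
        if detect_anomaly(history, reading):
            action = "isolate-and-alert"   # self-protecting response
        else:
            action = "normal-operation"
        history.append(reading)
        return action

    history = [10.1, 10.3, 9.9, 10.0, 10.2]
    print(autonomic_step(history, 10.1))   # in-range reading
    print(autonomic_step(history, 55.0))   # anomalous reading
    ```

    In a real SCADA setting, the "analyse" stage would be a learned model rather than a fixed statistical test, but the loop structure is the same.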

    Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters

    Demands on the disaster response capacity of the European Union are likely to increase, as the impacts of disasters continue to grow both in size and frequency. This has resulted in intensive research on issues concerning spatially-explicit information and modelling and their multiple sources of uncertainty. Geospatial support is one of the forms of assistance frequently required by emergency response centres, along with hazard forecasting and event management assessment. Robust modelling of natural hazards requires dynamic simulations under an array of multiple inputs from different sources. Uncertainty is associated with the meteorological forecast and the calibration of model parameters. Software uncertainty also derives from the data transformation models (D-TM) needed for predicting hazard behaviour and its consequences. On the other hand, social contributions have recently been recognized as valuable in raw-data collection and mapping efforts traditionally dominated by professional organizations. Here an architecture overview is proposed for adaptive and robust modelling of natural hazards, following the Semantic Array Programming paradigm to also include the distributed array of social contributors, called the Citizen Sensor, in a semantically-enhanced strategy for D-TM modelling. The modelling architecture proposes a multicriteria approach for assessing the array of potential impacts, with qualitative rapid assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema complementing more traditional and accurate a-posteriori assessment. We discuss the computational aspects of environmental risk modelling using array-based parallel paradigms on High Performance Computing (HPC) platforms, so that the implications of urgency can be introduced into the system (Urgent-HPC). Comment: 12 pages, 1 figure, 1 text box, presented at the 3rd Conference of Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay
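    The POLFC idea referenced above can be sketched in a few lines: at each stage the controller re-plans over an ensemble of forecast scenarios using the information available so far, applies the best action, and repeats when new observations arrive. The flood-barrier example, scenario values and cost function below are hypothetical, not taken from the paper.

    ```python
    # Illustrative POLFC-style decision step: pick the action minimising
    # expected cost over an ensemble of uncertain forecast scenarios.
    def polfc_step(scenarios, actions, cost):
        """Choose the action with the lowest expected cost over the ensemble."""
        def expected_cost(a):
            return sum(cost(a, s) for s in scenarios) / len(scenarios)
        return min(actions, key=expected_cost)

    # Toy example: choose a flood-barrier height against uncertain water levels.
    water_levels = [1.2, 1.8, 2.5]        # ensemble of forecast scenarios (m)
    barrier_heights = [1.0, 2.0, 3.0]     # candidate actions (m)

    def cost(height, level):
        damage = 100.0 if level > height else 0.0   # flooding penalty
        return damage + 10.0 * height               # construction cost

    print(polfc_step(water_levels, barrier_heights, cost))
    ```

    Re-running the step as each new forecast arrives gives the adaptive, feedback-driven behaviour the abstract describes.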

    The valuation tool user guide: monetizing Cradle to Cradle®

    This User Guide outlines the objectives, scope and expected deliverables of the Valuation Tool component of the Cradle to Cradle® (C2C) BIZZ project. It describes the compendium of subtools that have been developed, comprising: i) an overview of funding tools; ii) a C2C investment appraisal tool; and iii) a C2C value indexing tool. The underpinning methodologies, as well as their inherent strengths and limitations, are also described. The C2C BIZZ project as a whole aims specifically to promote and enhance the implementation of C2C methods in business site development within North Western Europe (NWE) (PAD, p.14). It is intended to infuse C2C notions into conventional site development, restructuring and management. The primary focus of the project is on the planning, building and managing of business sites with C2C credentials (PAD, p.18), using sites in Lille Metropole (La Lainiere), London (London Sustainable Industries Park) and Luxembourg (Ecoparc Windhof) as experimental fields. C2C BIZZ is not concerned with the internal operations and activities of occupiers or users of the developed site. Accordingly, the scope of the valuation tool is confined to the planning, building and management of C2C sites. The deliverable from this component is a compendium of subtools (see Figure 1 below) that may be used to analyse the financial performance of C2C credentials in business sites, to aid making a business case for such developments, and to evaluate the financial incentives for particular C2C site development projects. This entire work is premised on the argument that the wider adoption of C2C principles within the built environment depends on the rate of uptake by the private sector. The private sector, being profit driven, is likely to engage in C2C site development if convinced of its capacity to contribute to its business goals, which is ultimately a return on investment.
    The tool development described in this document attempts to provide a framework for collating an evidence base that can assist in articulating the business case for C2C in business site developments.
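    The core calculation behind an investment appraisal of this kind is a discounted cash-flow comparison. The sketch below shows a generic net-present-value computation; the cash flows and discount rate are invented for illustration and are not figures from the guide.

    ```python
    # Generic NPV sketch of the kind of calculation an investment appraisal
    # tool performs when testing whether a C2C site feature pays back.
    # All numbers below are hypothetical.
    def npv(rate, cashflows):
        """Discount a series of yearly cash flows (year 0 first) to present value."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    # Hypothetical C2C feature: upfront cost, then five years of savings.
    flows = [-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]
    print(round(npv(0.05, flows), 2))
    ```

    A positive NPV at the investor's required rate of return is the usual quantitative core of the business case the guide aims to support.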

    Big data analytics: Computational intelligence techniques and application areas

    Get PDF
    Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications in real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study for intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and deployment.
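    As a flavour of CI-style analytics on a smart-city transport stream, the toy sketch below smooths noisy speed readings with an exponentially weighted moving average and flags congestion. The data and threshold are illustrative only; the paper's HSTSM methodology is far richer than this.

    ```python
    # Toy smart-city transport analytics: EWMA smoothing of noisy sensor
    # speeds, flagging congestion when the smoothed value drops below a
    # threshold. Readings, alpha and threshold are all hypothetical.
    def ewma_stream(speeds, alpha=0.3):
        """Yield the EWMA of a stream of average-speed readings (km/h)."""
        avg = None
        for s in speeds:
            avg = s if avg is None else alpha * s + (1 - alpha) * avg
            yield avg

    def congestion_alerts(speeds, threshold=30.0):
        """Return indices of readings where the smoothed speed signals congestion."""
        return [i for i, avg in enumerate(ewma_stream(speeds)) if avg < threshold]

    readings = [55, 52, 50, 20, 15, 12, 48, 53]
    print(congestion_alerts(readings))
    ```

    Smoothing before thresholding is what keeps a single noisy reading from triggering a false alert, at the cost of reacting a step or two late.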