
    Software-Defined Cloud Computing: Architectural Elements and Open Challenges

    The variety of existing cloud services creates a challenge for service providers to enforce reasonable Service Level Agreements (SLAs) stating the Quality of Service (QoS) and penalties in case QoS is not achieved. To avoid such penalties while the infrastructure operates with minimum energy and resource wastage, constant monitoring and adaptation of the infrastructure are needed. We refer to Software-Defined Cloud Computing, or simply Software-Defined Clouds (SDC), as an approach for automating the process of optimal cloud configuration by extending the virtualization concept to all resources in a data center. An SDC enables easy reconfiguration and adaptation of physical resources in a cloud infrastructure, to better accommodate the demand for QoS, through software that can describe and manage the various aspects comprising the cloud environment. In this paper, we present an architecture for SDCs on data centers with emphasis on mobile cloud applications. We present an evaluation showcasing the potential of SDC in two use cases, QoS-aware bandwidth allocation and bandwidth-aware, energy-efficient VM placement, and discuss the research challenges and opportunities in this emerging area. Comment: Keynote Paper, 3rd International Conference on Advances in Computing, Communications and Informatics (ICACCI 2014), September 24-27, 2014, Delhi, India
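The first use case, QoS-aware bandwidth allocation, can be illustrated as a two-step policy: honour each flow's SLA minimum first, then share the spare link capacity in proportion to flow priority. This is a minimal sketch under assumed inputs, not the paper's actual algorithm; the flow fields `name`, `min`, and `priority` are hypothetical.

```python
def allocate_bandwidth(capacity, flows):
    """Grant each flow its SLA-guaranteed minimum, then split the
    remaining capacity in proportion to flow priority.

    flows: list of dicts with hypothetical keys 'name', 'min'
    (guaranteed Mbps from the SLA) and 'priority' (relative weight).
    """
    guaranteed = sum(f["min"] for f in flows)
    if guaranteed > capacity:
        raise ValueError("link cannot honour all SLA minimums")
    spare = capacity - guaranteed
    total_priority = sum(f["priority"] for f in flows)
    return {
        f["name"]: f["min"] + spare * f["priority"] / total_priority
        for f in flows
    }
```

For example, on a 100 Mbps link with flow a (min 20, priority 1) and flow b (min 30, priority 3), the 50 Mbps of spare capacity is split 1:3, yielding 32.5 and 67.5 Mbps respectively.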

    Change Detection in Graph Streams by Learning Graph Embeddings on Constant-Curvature Manifolds

    The space of graphs is often characterised by a non-trivial geometry, which complicates learning and inference in practical applications. A common approach is to use embedding techniques to represent graphs as points in a conventional Euclidean space, but non-Euclidean spaces have often been shown to be better suited for embedding graphs. Among these, constant-curvature Riemannian manifolds (CCMs) offer embedding spaces suitable for studying the statistical properties of a graph distribution, as they provide ways to easily compute metric geodesic distances. In this paper, we focus on the problem of detecting changes in stationarity in a stream of attributed graphs. To this end, we introduce a novel change detection framework based on neural networks and CCMs that takes into account the non-Euclidean nature of graphs. Our contribution in this work is twofold. First, via a novel approach based on adversarial learning, we compute graph embeddings by training an autoencoder to represent graphs on CCMs. Second, we introduce two novel change detection tests operating on CCMs. We perform experiments on synthetic data, as well as two real-world application scenarios: the detection of epileptic seizures using functional connectivity brain networks, and the detection of hostility between two subjects using human skeletal graphs. Results show that the proposed methods are able to detect even small changes in a graph-generating process, consistently outperforming approaches based on Euclidean embeddings. Comment: 14 pages, 8 figures

    Skewed Evolving Data Streams Classification with Actionable Knowledge Extraction using Data Approximation and Adaptive Classification Framework

    Skewed evolving data stream (SEDS) classification is a challenging research problem for online streaming data applications. The fundamental challenges in streaming data classification are class imbalance and concept drift. While these two problems have recently received considerable attention, both independently and together, data redundancy during stream mining and classification remains unexplored. Moreover, existing solutions for the classification of SEDSs have focused on solving concept drift and/or class imbalance using the sliding window mechanism, which leads to higher computational complexity and data redundancy problems. To this end, we propose a novel Adaptive Data Stream Classification (ADSC) framework for solving the concept drift, class imbalance, and data redundancy problems with higher computational and classification efficiency. Data approximation, adaptive clustering, classification, and actionable knowledge extraction are the major phases of ADSC. In the data approximation phase, we employ the Flajolet-Martin (FM) algorithm to approximate the number of unique items in the data stream during pre-processing. The periodically approximated tuples are grouped into distinct classes using an adaptive clustering algorithm to address the problems of concept drift and class imbalance. In the classification phase, supervised classifiers are employed to classify unknown incoming data streams into one of the classes discovered by the adaptive clustering algorithm. We then extract actionable knowledge from the classified skewed evolving data stream information for the end user's decision-making process. The ADSC framework is empirically assessed on two streaming datasets with respect to classification and computational efficiency. The experimental results show that the proposed ADSC framework is more efficient than existing classification methods.
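The Flajolet-Martin algorithm referenced above estimates the number of distinct items in a single pass over a stream using constant memory per hash function. The sketch below is a simplified variant (averaging the maximum trailing-zero rank over several salted hashes); the parameters are illustrative and not those used by ADSC.

```python
import hashlib

def fm_estimate(stream, num_hashes=32):
    """Estimate the number of distinct items in a one-pass stream.

    For each salted hash function, track the maximum number of trailing
    zero bits seen over all items; duplicates hash identically, so they
    do not change the maxima. Average the maxima across hash functions
    and apply the classic correction constant phi ~= 0.77351.
    """
    max_r = [0] * num_hashes
    for item in stream:
        for salt in range(num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            h = int.from_bytes(digest[:8], "big")
            if h == 0:
                continue  # an all-zero hash carries no rank information
            trailing_zeros = (h & -h).bit_length() - 1
            max_r[salt] = max(max_r[salt], trailing_zeros)
    avg_r = sum(max_r) / num_hashes
    return 2 ** avg_r / 0.77351
```

The estimate is probabilistic: it is accurate to within a small constant factor of the true cardinality, which is sufficient for the approximation phase a framework like ADSC describes.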

    ‘Teach in’ on energy and existing homes: restoring neighbourhoods and slowing climate change.

    Homes that have already been built account for 99% of our total housing stock. We estimate that 86% of the current stock will still be in use in 2050. Building new homes is carbon intensive and implies many wider environmental impacts, but the existing stock can be made more efficient, at a reasonable cost, to realise many environmental and social gains. Homes are responsible for 27% of our total CO2 emissions through their energy use and for half of public water use, and they generate a large share of total UK waste; construction waste alone contributes 33% of the total UK waste stream. Large savings can be achieved using technologies that are readily available, cost effective and cheaper than many alternatives. LSE Housing held two workshops in June 2008 to explore how to retrofit the existing stock, specifically looking at the links between neighbourhood renewal, social cohesion and energy conservation. Participants included managers of existing homes, regeneration companies, local authorities and housing associations, as well as policy makers. The aim of the workshops was to share experience on how to make the existing stock both more attractive and more energy efficient, with big gains for the environment and communities. Tackling resource efficiency in existing homes requires a comprehensive package of measures to deliver a step change, but the payback from implementing these changes will be great. This report summarises the aims of the workshops, together with the views of participants on the main barriers to retrofitting the existing stock, and key ideas on ‘where to start’.

    An Artificial Immune System Strategy for Robust Chemical Spectra Classification via Distributed Heterogeneous Sensors

    The timely detection and classification of chemical and biological agents in a wartime environment is a critical component of force protection in hostile areas. Moreover, the possibility of toxic agent use in heavily populated civilian areas has risen dramatically in recent months. This thesis effort proposes a strategy for identifying such agents via distributed sensors in an Artificial Immune System (AIS) network. The system may be used to complement electronic nose (E-nose) research being conducted in part by the Air Force Research Laboratory Sensors Directorate. In addition, the proposed strategy may facilitate fulfillment of a recent mandate by the President of the United States to the Office of Homeland Defense for the provision of a system that protects civilian populations from chemical and biological agents. The proposed system is composed of networked sensors and nodes, communicating via wireless or wired connections. Measurements are continually taken by dispersed, redundant, and heterogeneous sensors strategically placed in high-threat areas. These sensors continually measure and classify air or liquid samples, alerting personnel when toxic agents are detected. Detection is based upon the Biological Immune System (BIS) model of antigens and antibodies, and alerts are generated when a measured sample is determined to be a valid toxic agent (antigen). Agent signatures (antibodies) are continually distributed throughout the system to adapt to changes in the environment or to new antigens. Antibody features are determined via data mining techniques in order to improve system performance and classification capabilities. Genetic algorithms (GAs) are a critical part of the process, namely in antibody generation and feature subset selection calculations. Demonstrated results validate the utility of the proposed distributed AIS model for robust chemical spectra recognition.
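The antigen/antibody scheme described above resembles classical negative selection: detectors (antibodies) are generated at random and kept only if they do not match any known-safe (self) spectrum, so surviving detectors cover the non-self region. The sketch below is a minimal, hypothetical illustration using Euclidean matching with a fixed radius; the thesis's actual feature extraction and GA-based antibody generation are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_detectors(self_samples, n_detectors=60, radius=0.15):
    """Negative selection: keep only randomly generated detectors that
    lie farther than `radius` from every known-safe spectrum, so the
    detector set covers the non-self region of feature space."""
    detectors = []
    dim = self_samples.shape[1]
    while len(detectors) < n_detectors:
        candidate = rng.random(dim)
        if np.min(np.linalg.norm(self_samples - candidate, axis=1)) > radius:
            detectors.append(candidate)
    return np.array(detectors)

def is_antigen(sample, detectors, radius=0.15):
    """Flag a measurement as a potential toxic agent (antigen) when it
    falls within `radius` of any detector."""
    return bool(np.any(np.linalg.norm(detectors - sample, axis=1) <= radius))
```

By construction no detector matches a self sample, so safe measurements are never flagged, while measurements in detector-covered regions raise an alert.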

    Implementing privacy-preserving filters in the MOA stream mining framework

    Four privacy-preserving filters implementing several SDC (statistical disclosure control) methods have been developed for the MOA stream mining software. The algorithms have been adapted from well-known solutions to enable their use in streaming settings. Finally, they have been benchmarked to assess their quality in terms of disclosure risk and information loss.
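One classic SDC method such filters can implement is additive noise. The sketch below is a hypothetical streaming version, not MOA's API: the class name and `scale` parameter are illustrative. It scales Gaussian noise to a running (Welford) estimate of the attribute's spread, which is the knob that trades disclosure risk against information loss.

```python
import random

class NoiseAdditionFilter:
    """Streaming SDC sketch: perturb each numeric value with Gaussian
    noise proportional to a running estimate of the stream's standard
    deviation. Larger `scale` lowers disclosure risk but raises
    information loss."""

    def __init__(self, scale=0.1):
        self.scale = scale
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford)

    def process(self, value):
        # update the running variance so the noise tracks the stream
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        std = (self.m2 / self.n) ** 0.5 if self.n > 1 else 0.0
        return value + random.gauss(0.0, self.scale * std)
```

Because the noise is zero-mean, aggregate statistics of the released stream stay close to the originals while individual values are masked.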

    Towards implementing climate services in Peru – The project CLIMANDES

    CLIMANDES is a pilot twinning project between the National Weather Services of Peru and Switzerland (SENAMHI and MeteoSwiss), developed within the Global Framework for Climate Services of the World Meteorological Organization (WMO). Split into two modules, CLIMANDES aims at improving education in meteorology and climatology in support of the WMO Regional Training Center in Peru, and at introducing user-tailored climate services in two pilot regions in the Peruvian Andes. Four areas were prioritized in the first phase of CLIMANDES, lasting from 2012 to 2015, to introduce climate services in Peru. First, a demand study identified user needs for climate services and showed that climate information must be reliable, high-quality, and precise; the information should be accessible and timely, understandable, and applicable to users' specific needs. Second, the quality of climate data was enhanced through the establishment of quality control and homogenization procedures at SENAMHI. Specific training and application of the implemented methods at stations in the pilot regions were promoted to ensure the sustainability of the work. Third, this work on climate data enabled the creation of a webpage to disseminate climate indicators among users. The fourth priority of the project enhanced the broad communication strategy of SENAMHI through the creation of a specialized network of journalists, diverse climate forums, and the establishment of a user database. The efforts accomplished within CLIMANDES improved the quality of the climate services provided by SENAMHI. The project hence contributed successfully to higher awareness of and confidence in the climate information provided by SENAMHI.