7 research outputs found

    Optimizing Sensor Network Reprogramming via In-situ Reconfigurable Components

    Wireless reprogramming of sensor nodes is a critical requirement in long-lived Wireless Sensor Networks (WSNs) for several reasons, such as fixing bugs, upgrading the operating system and applications, and adapting application behavior to the physical environment. On such resource-poor platforms, the ability to efficiently delimit and reconfigure only the necessary portion of the sensor software, instead of updating the full binary image, is of vital importance. However, most existing approaches in this field have not been widely adopted to date, due either to extensive use of WSN resources or to a lack of generality. In this article, we therefore consider WSN programming models and run-time reconfiguration models as two interrelated factors, and we present an integrated approach to efficient reprogramming in WSNs. The middleware solution we propose, RemoWare, mitigates the cost of post-deployment software updates on sensor nodes via the notion of in-situ reconfigurability and provides a component-based programming abstraction to facilitate the development of dynamic WSN applications. Our evaluation results show that RemoWare imposes a very low energy overhead in code distribution and component reconfiguration, and consumes approximately 6% of the total code memory on a TelosB sensor platform.
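
    The in-situ, component-level reconfiguration idea above can be made concrete with a small sketch. The registry and component names below are hypothetical illustrations rather than RemoWare's actual API; the point is that a single component binding is replaced at run time instead of re-flashing the full firmware image.

        # Minimal sketch of in-situ component reconfiguration (hypothetical
        # interface, not RemoWare's actual API): components are looked up
        # indirectly through a registry, so one component can be swapped at
        # run time without rebuilding or redistributing the full image.

        class TemperatureSamplerV1:
            def sample(self) -> float:
                return 21.0  # placeholder reading

        class TemperatureSamplerV2:
            """Updated component, e.g. with a calibration fix."""
            def sample(self) -> float:
                return 21.0 + 0.5  # calibrated reading

        class ComponentRegistry:
            def __init__(self):
                self._components = {}

            def bind(self, name, component):
                self._components[name] = component

            def lookup(self, name):
                return self._components[name]

            def swap(self, name, new_component):
                # In-situ update: only this binding changes; every other
                # component and all application state stay untouched.
                self._components[name] = new_component

        registry = ComponentRegistry()
        registry.bind("temperature", TemperatureSamplerV1())
        print(registry.lookup("temperature").sample())   # old behaviour

        registry.swap("temperature", TemperatureSamplerV2())
        print(registry.lookup("temperature").sample())   # new behaviour, no reboot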

    Runtime variability for dynamic reconfiguration in wireless sensor network product lines

    Runtime variability is a key technique for the success of Dynamic Software Product Lines (DSPLs), as certain applications demand reconfiguration of system features and execution plans at runtime. In this emerging research work we address the problem of dynamic changes in feature models in sensor network product families, where nodes of the network demand dynamic reconfiguration at post-deployment time.
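
    As a rough illustration of runtime variability, the sketch below toggles features of a deployed node while checking simple feature-model constraints. The feature names and constraint encoding are invented for illustration and are not taken from the paper.

        # Illustrative sketch of run-time feature reconfiguration in a dynamic
        # software product line. Feature names and constraints are invented.

        FEATURE_MODEL = {
            "requires": {"multi_hop_routing": {"radio"}},          # A requires B
            "excludes": {"low_power_listening": {"always_on_radio"}},
        }

        def valid(selection: set) -> bool:
            for feature, required in FEATURE_MODEL["requires"].items():
                if feature in selection and not required <= selection:
                    return False
            for feature, excluded in FEATURE_MODEL["excludes"].items():
                if feature in selection and selection & excluded:
                    return False
            return True

        def reconfigure(current: set, enable=(), disable=()) -> set:
            candidate = (current | set(enable)) - set(disable)
            if not valid(candidate):
                raise ValueError(f"invalid configuration: {candidate}")
            return candidate

        config = {"radio", "multi_hop_routing", "always_on_radio"}
        # Post-deployment reconfiguration: switch the node to a low-power profile.
        config = reconfigure(config, enable={"low_power_listening"},
                             disable={"always_on_radio"})
        print(config)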

    IntegraDos: facilitating the adoption of the Internet of Things through the integration of technologies

    The Internet of Things (IoT) was a new concept introduced by K. Ashton in 1999 to refer to an identifiable set of objects connected through RFID. Today, the IoT is a ubiquitous technology present in a large number of areas, such as critical infrastructure monitoring, traceability systems, and assisted healthcare systems. The IoT is increasingly present in our daily lives, covering a wide range of possibilities in order to optimise the processes and problems society faces. For this reason, the IoT is a promising technology that keeps evolving thanks to continuous research and the large number of devices, systems, and components that emerge every day. However, the devices involved in the IoT are typically embedded devices with storage and processing limitations, as well as memory and power constraints. Moreover, the number of objects or devices connected to the Internet is forecast to grow strongly in the coming years, with around 500 billion connected objects expected by 2030. Therefore, to accommodate global IoT deployments, in addition to overcoming the existing limitations, new systems and paradigms that ease the adoption of this field need to be brought in. The main objective of this doctoral thesis, known as IntegraDos, is to facilitate the adoption of the IoT through its integration with a series of technologies. On the one hand, it addresses how the management of sensors and actuators on physical devices can be simplified without having to access and program the development boards. On the other hand, it defines a system for programming portable, adaptable, personalised IoT applications that are decoupled from the devices. Furthermore, the components required for an integration of the IoT and cloud computing have been analysed, resulting in the Lambda-CoAP architecture. Finally, the challenges of integrating the IoT and Blockchain have been analysed, together with an evaluation of the ability of IoT devices to host Blockchain nodes. The contributions of this doctoral thesis help bring the adoption of the IoT closer to society and thus foster the expansion of this prominent technology. Thesis defence date: 17 December 2018.
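
    The goal of managing sensors and actuators without reprogramming the boards, and of writing applications decoupled from the devices, can be sketched as a small resource abstraction. The class, resource paths, and wiring below are assumptions made for illustration; they are not the actual IntegraDos or Lambda-CoAP interfaces.

        # Illustrative sketch (not the IntegraDos API): an application addresses
        # sensors and actuators through generic, CoAP-style resource paths, so
        # the same application code can run against different boards.

        class Device:
            """Registry of resources exposed by one physical board."""
            def __init__(self, name):
                self.name = name
                self._readers = {}
                self._writers = {}

            def expose_sensor(self, path, read_fn):
                self._readers[path] = read_fn

            def expose_actuator(self, path, write_fn):
                self._writers[path] = write_fn

            def get(self, path):
                return self._readers[path]()

            def put(self, path, value):
                self._writers[path](value)

        # Device-specific wiring (done once per board, not per application).
        board = Device("node-01")
        board.expose_sensor("/sensors/temperature", lambda: 22.5)
        board.expose_actuator("/actuators/led", lambda v: print(f"LED -> {v}"))

        # Portable application logic: only resource paths, no board-specific code.
        if board.get("/sensors/temperature") > 22.0:
            board.put("/actuators/led", "on")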

    Enabling Model-Driven Live Analytics For Cyber-Physical Systems: The Case of Smart Grids

    Advances in software, embedded computing, sensors, and networking technologies will lead to a new generation of smart cyber-physical systems that will far exceed the capabilities of today's embedded systems. They will be entrusted with increasingly complex tasks like controlling electric grids or autonomously driving cars. These systems have the potential to lay the foundations for tomorrow's critical infrastructures, to form the basis of emerging and future smart services, and to improve the quality of our everyday lives in many areas. In order to solve their tasks, they have to continuously monitor and collect data from physical processes, analyse this data, and make decisions based on it. Making smart decisions requires a deep understanding of the environment, the internal state, and the impacts of actions. Such deep understanding relies on efficient data models to organise the sensed data and on advanced analytics. Considering that cyber-physical systems control physical processes, decisions need to be taken very fast, which makes it necessary to analyse data live, as opposed to conventional batch analytics. However, the complex nature of such systems, combined with the massive amount of data they generate, imposes fundamental challenges. While data in the context of cyber-physical systems shares some characteristics with big data, it holds a particular complexity: it describes complicated physical phenomena, which makes it difficult to extract a model able to explain such data and its various multi-layered relationships. Existing solutions fail to provide sustainable mechanisms to analyse such data live.

    This dissertation presents a novel approach, named model-driven live analytics. The main contribution of this thesis is a multi-dimensional graph data model that brings raw data, domain knowledge, and machine learning together in a single model, which can drive live analytic processes. This model is continuously updated with the sensed data and can be leveraged by live analytic processes to support the decision-making of cyber-physical systems. The presented approach has been developed in collaboration with an industrial partner and, in the form of a prototype, applied to the domain of smart grids. The addressed challenges are derived from this collaboration as a response to shortcomings in the current state of the art. More specifically, this dissertation provides solutions for the following challenges.

    First, data handled by cyber-physical systems is usually dynamic (data in motion as opposed to traditional data at rest) and changes frequently and at different paces. Analysing such data is challenging, since data models usually can only represent a snapshot of a system at one specific point in time. A common approach consists in discretisation, which regularly samples and stores such snapshots at specific timestamps to keep track of the history; continuously changing data is then represented as a finite sequence of snapshots. Such a representation is very inefficient to analyse, since it requires mining the snapshots, extracting a relevant dataset, and finally analysing it. For this problem, this thesis presents a temporal graph data model and storage system that treat time as a first-class property. A time-relative navigation concept makes it possible to analyse frequently changing data very efficiently.

    Secondly, making sustainable decisions requires anticipating what impacts certain actions would have. In complex cyber-physical systems, situations can arise where hundreds or thousands of such hypothetical actions must be explored before a solid decision can be made. Every action leads to an independent alternative from which a set of other actions can be applied, and so forth. Finding the sequence of actions that leads to the desired alternative requires efficiently creating, representing, and analysing many different alternatives. Given that every alternative has its own history, this creates a very high combinatorial complexity of alternatives and histories, which is hard to analyse. To tackle this problem, this dissertation introduces a multi-dimensional graph data model (an extension of the temporal graph data model) that makes it possible to efficiently represent, store, and analyse many different alternatives live.

    Thirdly, complex cyber-physical systems are often distributed, but to fulfil their tasks they typically need to share context information between computational entities. This requires analytic algorithms to reason over distributed data, which is a complex task since it relies on the aggregation and processing of various distributed and constantly changing data. To address this challenge, this dissertation proposes an approach to transparently distribute the presented multi-dimensional graph data model in a peer-to-peer manner and defines a stream processing concept to efficiently handle frequent changes.

    Fourthly, to meet future needs, cyber-physical systems need to become increasingly intelligent. To make smart decisions, these systems have to continuously refine the behavioural models that are known at design time with what can only be learned from live data. Machine learning algorithms can help to capture this unknown behaviour by extracting commonalities over massive datasets. Nevertheless, a single coarse-grained common behaviour model can be very inaccurate for cyber-physical systems, which are composed of completely different entities with very different behaviour; for these systems, fine-grained learning can be significantly more accurate. However, modelling, structuring, and synchronising many fine-grained learning units is challenging. To tackle this, this thesis presents an approach to define reusable, chainable, and independently computable fine-grained learning units, which can be modelled together with, and on the same level as, domain data. This allows machine learning to be woven directly into the presented multi-dimensional graph data model.

    In summary, this thesis provides an efficient multi-dimensional graph data model to enable live analytics of the complex, frequently changing, and distributed data of cyber-physical systems. This model can significantly improve data analytics for such systems and empower cyber-physical systems to make smart decisions live. The presented solutions combine and extend methods from model-driven engineering, [email protected], data analytics, database systems, and machine learning.
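
    A central idea above is a graph data model that treats time as a first-class property, so that an attribute can be resolved for any timestamp without mining a sequence of snapshots. The sketch below illustrates only this time-relative resolution, with invented class and attribute names; the dissertation's model additionally covers alternatives, peer-to-peer distribution, and embedded learning units.

        # Minimal sketch of time-relative attribute resolution in a temporal
        # graph node (invented names; the actual model is far richer).
        import bisect

        class TemporalNode:
            def __init__(self):
                # attribute -> sorted list of (timestamp, value) pairs
                self._history = {}

            def set(self, attribute, timestamp, value):
                series = self._history.setdefault(attribute, [])
                bisect.insort(series, (timestamp, value))

            def resolve(self, attribute, timestamp):
                """Return the value valid at `timestamp` (latest value at or before it)."""
                series = self._history.get(attribute, [])
                i = bisect.bisect_right(series, (timestamp, float("inf")))
                if i == 0:
                    return None  # no value known yet at that time
                return series[i - 1][1]

        meter = TemporalNode()
        meter.set("consumption_kw", 10, 1.2)
        meter.set("consumption_kw", 20, 1.9)
        meter.set("consumption_kw", 30, 0.7)

        print(meter.resolve("consumption_kw", 25))  # -> 1.9 (value valid at t=25)
        print(meter.resolve("consumption_kw", 5))   # -> None (before first sample)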